
NVIDIA ‘Ampere’ 8nm Graphics Cards

Remember that last time around Nvidia pulled a switcheroo: the much-anticipated Volta turned out to be a data-centre part, and Turing was launched as the consumer part instead.

I imagine it's the datacentre variant of Ampere we'll be hearing about at GTC, with consumer cards to come at some unspecified point afterwards.
 
20 months between 970 and 1070, then about 30 months between 1070 and 2070. Even September this year is still only 24 months.




Autumn 2016 release for the RX 400 series cards, followed by the RX 500 series in April 2017, so only about 8 months apart.

Then Vega in August 2017, only about 4 months after the RX 500 series, and Radeon VII in February 2019.

Then RX5700 series in July 2019 only 5 to 6 months after Radeon VII.


We're overdue an AMD release by now; it's been over 10 months since the RX5700 series. Not really overdue an NVIDIA release yet.
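
If anyone wants to sanity-check the gaps, here's a quick Python sketch of the month arithmetic. The launch months are taken from the posts in this thread (day of month is a placeholder) and the "now" date is my assumption for roughly when this was written, so treat the exact figures as approximations rather than official dates:

```python
from datetime import date

# Launch months as quoted above; the day of month is a placeholder.
launches = [
    ("RX 400 series",  date(2016, 10, 1)),  # "Autumn 2016"
    ("RX 500 series",  date(2017, 4, 1)),
    ("Vega",           date(2017, 8, 1)),
    ("Radeon VII",     date(2019, 2, 1)),
    ("RX 5700 series", date(2019, 7, 1)),
]

def months_between(a: date, b: date) -> int:
    """Whole calendar months between two dates, ignoring the day."""
    return (b.year - a.year) * 12 + (b.month - a.month)

# Gap between each release and the one before it
for (prev_name, prev_date), (name, launch_date) in zip(launches, launches[1:]):
    print(f"{prev_name} -> {name}: ~{months_between(prev_date, launch_date)} months")

# Months since the last release, assuming "now" is roughly May 2020
print(f"Since RX 5700 series: ~{months_between(launches[-1][1], date(2020, 5, 1))} months")
```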


I haven't upgraded since the RX480, so I'm eagerly awaiting the next releases.
 
That's good then.

I already pimped out my gaming HTPC with a 3950X, 32GB RAM and a 1TB Samsung Evo Plus, using a 2070S as a stopgap.

I have £1500 saved for the 3080 Ti, so if it turns out to be that price then my wallet is ready!

I've also been saving and have a similar amount. I skipped the 1080 Ti and 2080 Ti mainly due to budget, but I'm hoping to get a new one this time.

Pray it's not any more... but we'll see.
 
Which games? I mean, with any game you kind of have to be sensible with settings. Toggle on RTX or HairWorks or some ridiculously intensive effect that does very little, and it can cause any GPU a few issues.

If you're insistent on going ultra on every single game... then sure, as with any other gen, you'll need the top-of-the-line card for that.

My RTX 2080 has troubled me with two games: KCD and RDR2. With settings optimisation, RDR2 got very, very playable.

I imagine an RTX 2080 equivalent GPU in a console can easily power games to 60fps given they can optimise better and consoles normally hover around the low to medium settings territory for graphics.

The ones mentioned in the post above yours. And no, I'm not talking about overkill settings like Super Sampling, 8xMSAA or RT, and not even 4K; quite a bit less. :)

Try running RDR 2 at native 4K resolution on that 780 Ti. :)

I've played on triple 1080p screens for a while as well (I prefer the larger FoV and can always fall back on a single 1080p when performance isn't enough or the game doesn't support multiple displays), and that's about 75% of the pixels of 4K. There are quite a lot of new and relatively older games that require tweaking to various degrees to keep it stable at 60fps - and by stable I mean no dropping into the 50s or lower, no bouncing between 50 and 70fps and calling it 60. And no, no "dirty" HairWorks (aka The Witcher 3) or RT of any kind. For instance, RDR 2 has the majority of its settings on low or disabled; never mind "medium" across the board, as that would dip constantly into the 50s (and perhaps lower).
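
For anyone wondering where the "75% of 4K" figure comes from, it's just pixel-count arithmetic; a minimal sketch:

```python
# Pixel-count comparison: triple 1080p surround vs a single native 4K panel.
triple_1080p = 3 * 1920 * 1080   # 6,220,800 pixels
native_4k    = 3840 * 2160       # 8,294,400 pixels

print(f"Triple 1080p: {triple_1080p:,} pixels")
print(f"Native 4K:    {native_4k:,} pixels")
print(f"Ratio:        {triple_1080p / native_4k:.0%}")   # 75%
```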

Anthem, Crysis 3, Deus Ex MD, Just Cause 4, Kingdom Come Deliverance (if I remember correctly), Metro Exodus (without RT), Tom Clancy's The Division 2 (and the first one, I think), Watch Dogs 2 and Quantum Break (I think) are just some that require various adjustments - some minor, some significant (as in "it bothers me that I have to lower settings that much, or the difference is noticeable"). Increase the resolution further to 4K and the drop will be more significant, to the point where it would probably bother me a lot.

Sure, each to their own; some won't mind 30-40fps or even lower - after all, there are millions of people playing games that dip into the 20fps range and don't have the best image quality (plenty of examples on consoles), or who just drop settings as low as needed, so wanting a constant 60fps at relatively high settings (not maximum, mind you!) may be seen as... ridiculous? :) Still, in my book, if it's X resolution at Y fps, then that Y fps has to be maintained and frame rates should not drop lower. Everything else is just marketing.

Can the next consoles do 4K@60fps now, at native resolution? Sure, for the most part it should be no problem, but in the future... well, that's another story.
 
20 months between 970 and 1070, then about 30 months between 1070 and 2070. Even September this year is still only 24 months.


Autumn 2016 release for the RX 400 series cards, followed by the RX 500 series in April 2017, so only about 8 months apart.

They were the same die; even the 590 was just on a Samsung version of GloFo's 14nm, marketed as 12nm.


Then Vega in August 2017, only about 4 months after the RX 500 series, and Radeon VII in February 2019.

Not only was the VII just a die shrink of Vega (so the amount of engineering needed was a lot less), the die used was already 4-month-old cast-offs from the Radeon Instinct MI50/60.

Then RX5700 series in July 2019 only 5 to 6 months after Radeon VII.


We're overdue an AMD release by now; it's been over 10 months since the RX5700 series. Not really overdue an NVIDIA release yet.

Given AMD's GPU release pattern over the past 5 years, I think it's wishful thinking that there will be a significant release (i.e. a new architecture) within 12 months of the last one.
 
@KillBoY_UK yeah, I appreciate the intermediate releases are rehashes of existing GPUs. I guess the Super variants of the Nvidia cards fall into the same category, and they only came out last year.

I don't follow the GPU release schedules like many here do, but it's kinda frustrating because whenever I do think it's time for a new GPU there's always a new gen coming 'in a few months'. It makes me not want to spend any money on old tech (although it's really not that old), and prices have gone stupid.

My RX480 was comparatively good value considering what we have available now. I could have upgraded it 4 or 5 times since 2016 when I bought it, and I don't think I'd be much better off; wallet-wise I'd certainly be far worse off.
 
Why do they even need so many increments of pricing?

Produce a £200 card that's good enough for most gaming and then a £600 top-of-range card, and that's it.
There needs to be a fair range to choose from to cover so many different monitor resolutions & refresh rates

for example
1080p 60Hz
1440p 120Hz
4K 120Hz

Plus, say, the 1440p 120Hz person can't afford to spend £600 but can afford £300 to £400.
 
There needs to be a fair range to choose from to cover so many different monitor resolutions & refresh rates

for example
1080p 60Hz
1440p 120Hz
4K 120Hz

Plus, say, the 1440p 120Hz person can't afford to spend £600 but can afford £300 to £400.

They've kind of overdone it at the moment though. There's a card for nearly every £50 increment.
 