NVIDIA ‘Ampere’ 8nm Graphics Cards

@Twinz OK, fair enough. Obviously there are multiple benchmarks available and, as you say, driver iterations. All I can do is pick a consistent benchmark across all cards, otherwise it would be a complicated mess.

With respect to the 1080 Ti, it shows that that card was good value. If it wasn't for that one card, there would be a more visible shift to the right between Pascal and Turing. It's the 1080 Ti that was the outlier here.

There is then a clear step to the right between the original Turing range and the Super range. The Supers clearly offer better value here.

If the Ti cards were removed from this chart, the patterns would be much clearer. It's demonstrated to me that I don't need to buy a Ti, and if I don't, then price and performance changes between generations are in fact very predictable and consistent, and the Super range is clearly much better than the 1st Turing release. The Super release is what the 1st Turing release should have been. Forget the Ti; it doesn't fit this pattern, and you're clearly just paying for the very best, which is only 10% ahead of the 2080 Super (in this benchmark).

[image: rfpAweE.png]

If you want to show the *historic* progress of given price points, you need to bring in the 700 and 900 price points and scores.

It's not in chart form, but it is pretty clear here:


https://www.reddit.com/r/hardware/c...i_the_rtx_3080_has_about_20_increase/fyrfh3a/

For reference here is information about previous releases of Nvidia:

GTX 780 Ti - $699 (7.11.2013)

GTX 980 - $550 (18.9.2014) - 7% faster than 780 Ti, $150 cheaper, 10 months after the release of 780 Ti

GTX 980 Ti - $650 (1.6.2015) - 30% faster than 780 Ti, $50 cheaper, 18 months after the release of 780 Ti

GTX 1080 - $600 at launch (27.5.2016), cut to $500 (1.3.2017) - 27% faster than 980 Ti, $150 cheaper, 12 months after the release of 980 Ti

GTX 1080 Ti - $700 (5.3.2017) - 43% faster than 980 Ti, $50 more expensive, 21 months after the release of 980 Ti

(EDIT: Anandtech puts the 1080 Ti at +74% at 4K and +68% at 1440p over the 980 Ti)

RTX 2080 - $700 (20.9.2018) - 8% faster than 1080 Ti, same price, 18 months after the release of 1080 Ti

RTX 2080 Ti - $1000 (27.9.2018) - 29% faster than 1080 Ti, $300 more expensive, 18 months after the release of 1080 Ti

RTX 3080 - $??? (9.2020?) - 20% faster than 2080 Ti (?), unknown price, almost 24 months after the release of 2080 Ti

RTX 3080 Ti - $??? (9.2020?) - 35-40% faster than 2080 Ti (?), unknown price, almost 24 months after the release of 2080 Ti

Performance figures are from techpowerup.com.
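As a rough sanity check, the flagship figures quoted above can be chained into a single performance index and divided by launch price. This is only a sketch: treating the quoted generational uplifts as cumulative multipliers is an assumption, and the prices are the launch MSRPs listed above.

```python
# Sketch: chain the quoted flagship uplifts (each card's "% faster than
# the previous Ti") into one performance index, 780 Ti = 100, then
# compute performance per dollar at launch price.
cards = [
    # (name, launch price in $, % faster than the previous flagship Ti)
    ("GTX 780 Ti",   699,  0),
    ("GTX 980 Ti",   650, 30),
    ("GTX 1080 Ti",  700, 43),
    ("RTX 2080 Ti", 1000, 29),
]

index = 100.0
prev_value = None
for name, price, uplift in cards:
    index *= 1 + uplift / 100          # cumulative performance index
    value = index / price              # performance per dollar
    change = "" if prev_value is None else f" ({value / prev_value - 1:+.0%} value vs predecessor)"
    print(f"{name}: perf index {index:.0f}, {value:.3f} perf/${change}")
    prev_value = value
```

On these numbers the 980 Ti and 1080 Ti both improve perf/$ over their predecessors, while the 2080 Ti comes out roughly 10% *worse* than the 1080 Ti, which is the value regression the thread is complaining about.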

Given the whole 2 years from Turing to Ampere, 20% is hardly impressive. Pricing will decide the fate of these cards. It's also the second time Nvidia has left performance on the table by going with an inferior node, just as with Turing.
 
If prices are just as high as with the current series are those looking to buy still going to buy?

I have the money, easy peasy, but spending more than £500 on a GPU is crazy.

Same, mate. Although I have a young tribe to feed these days, and the wife balked at me spending £300 on the Vega; if only she knew what the RTX cards were asking...

It's gonna be a toss-up with what Nvidia can release around the £500-600 mark, then pray AMD have something substantial to trigger an offer frenzy.

If this fails to materialise, I am very, very tempted to just get a PS5 on release and not think about it for a year or so.
 
I might wait until Cyberpunk releases in Nov before I buy. That way I buy a card that will deliver the performance I want at 1440p in that game.

I want a 3080, but over £600 on a GPU is getting extravagant.
 
780Ti to 980Ti was +30% performance for *less* money.
980Ti to 1080Ti was +43% for a single digit percentage increase in price.

Heck the 1080 non-Ti was 27% faster than the 980Ti for *less* money.
 
Same, mate. Although I have a young tribe to feed these days, and the wife balked at me spending £300 on the Vega; if only she knew what the RTX cards were asking...

It's gonna be a toss-up with what Nvidia can release around the £500-600 mark, then pray AMD have something substantial to trigger an offer frenzy.

If this fails to materialise, I am very, very tempted to just get a PS5 on release and not think about it for a year or so.

That's what I think a lot of people are gonna do. But I've seen a report today saying the consoles are gonna be $599+.
 
If you want to show the *historic* progress of given price points, you need to bring in the 700 and 900 price points and scores.

Ok here we are.

[image: 5B09LeQ.png]


It's pretty obvious which generation is the odd one out here. Remove the 1st release of Turing and you get a really consistent set of steps between generations, and a consistent set of prices between generations.

The 2080 Ti is clearly an odd one out here, almost, I would say, to the extent that it does not belong in this comparison at all.

The Super range is clearly where the 1st release should have been, and the equivalent card to the 1080 Ti is not the 2080 Ti, but the 2080 Super.
 
Yep, nice one there, @danlightbulb!

If you look at the *060 cards on Turing, they have crept up. OK, the Super has a good score, but you would be blinkered not to acknowledge that the lower cards have been milked upwards. Even the 2070s show this too.
 
A number of films that I have seen (and enjoyed) have been filmed on a Red camera. Without diving into the behind-the-scenes stuff, I can't say with 100% certainty they were shot in 8K.
As far as I am aware, most Hollywood films are shot on Arri cameras.

Why do you ask?
Because capture and display format have absolutely no bearing on artistic merit.
 
Yep, nice one there, @danlightbulb!

If you look at the *060 cards on Turing, they have crept up. OK, the Super has a good score, but you would be blinkered not to acknowledge that the lower cards have been milked upwards. Even the 2070s show this too.

Yes, there has been some price creep across equivalent cards between generations, but at the same time there have been substantial performance increases between generations. If we eliminate the 1st release of Turing, then the jump from the 1070 to the 2070 S is massive, as is the jump from the 1060 to the 2060 S. The 2060 Super is the same performance as the 2070 for two thirds the price, and incidentally the same price as the 770 was back in Kepler.

You can easily see there which cards offered the best value in each generation. It was the 970 or 980 Ti on Maxwell, the 1070 or 1080 Ti on Pascal, and now the 2070 S on Turing.

I'm amazed how clear it is that something went badly wrong with the 1st release of Turing here. It just doesn't fit with the previous pattern, but the Super range does, and appears to have rectified the issue.


780Ti to 980Ti was +30% performance for *less* money.
980Ti to 1080Ti was +43% for a single digit percentage increase in price.

Heck the 1080 non-Ti was 27% faster than the 980Ti for *less* money.

Sure, I get that. However, what that chart shows is that the 2080 Ti is not the successor card to the 1080 Ti. If you follow the historical pattern, the 2080 Super is the successor, and that's the same price.
 
Had the 2080 Ti launched at ~$700, it would have been a "good" generational improvement and in line with previous generations.

Yeah, but they saw an opportunity to hype up ray tracing, and the price increased because of the emphasis they placed on it for the bulk of their presentation. That, and the new cooler, which was "meh" at best, also contributed to the price bump.
 
It's certainly refreshing to be talking about it properly, and the RTX cards were promising something that has yet to be worth that hike. I'm hoping they warm customers with more sensible prices, knowing consoles are going to be in their faces, so they will want to appeal to that demographic, which will drift into 2021 sales too.
 
It'll probably be £3000 in 2 years' time with Nvidinflation.



I'm going to sit tight till Xmas for the dust to settle, as all the big cards will be out by then, so I can check the benches/feedback/driver support etc. and make an informed decision.
That's the way to go; there's no value in pre-ordering, and Xmas/early 2021 will see significant financial woes in the G7.
 
The Super range is clearly where the 1st release should have been.

Edit: The 2080 Super would have been less progress than previous generations, but at least it shows *some* progress.

If you look at the 600-800 row, you can see that something is missing where that row meets the 2100-2200 column. Following the pattern, something should be there, (I don't care what they want to call it) but there's nothing there with Turing.
 
Right, I've made some more modifications which, to me, make sense as to how to represent the progression.

Firstly, you may not agree, but I have eliminated the 1st release of Turing completely. It just doesn't fit with the historical pattern, and if you consider the Super range as what Turing should have been, things make much more sense.

Secondly, I have eliminated the 2080 Ti. Again, it just does not fit, and the progression from Pascal to the Turing Super range makes much more sense. In fact, the successor to the 1080 Ti is clearly the 2080 Super, not the Ti; the successor to the 1080 is the 2070 Super; and the successor to the 1070 is the 2060 Super. You can see what I have done in my chart annotations below.

[image: iRXtpKj.png]


My final comment is that whilst the percentage increases are reducing from generation to generation, the increments in raw scores are actually pretty similar. As anyone who works with percentages knows, you do have to be careful interpreting them because as the numbers get larger, the percentages look smaller for a given increase. This is exactly what we're seeing here.
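The point about raw increments versus percentages can be made concrete with a tiny sketch. The scores here are hypothetical, not taken from the chart; they just show that a constant absolute gain reads as a shrinking percentage as the baseline grows.

```python
# Illustration with hypothetical scores: each generation adds the same
# raw increment (+40), yet the percentage gain shrinks every step.
scores = [100, 140, 180, 220]
for prev, cur in zip(scores, scores[1:]):
    pct = (cur - prev) / prev
    print(f"{prev} -> {cur}: +{cur - prev} raw, {pct:+.0%}")
# prints +40%, then +29%, then +22% - same raw gain each time
```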

Also, the Turing Super release is cheaper, along the upgrade path I have indicated, than Pascal was.


If you look at the 600-800 row, you can see that something is missing where that row meets the 2100-2200 column. Following the pattern, something should be there, (I don't care what they want to call it) but there's nothing there with Turing.

Yes, I agree with you; the increment between the 1080 Ti and the 2080 Super was poor, as you can see in my latest chart above. We would have expected it to be further to the right. However, then it would start interacting with the dud card, the 2080 Ti, which is a major outlier in this sequencing. I think that is the crux of the matter. 1st Turing was clearly awful.
 
It's certainly refreshing to be talking about it properly, and the RTX cards were promising something that has yet to be worth that hike. I'm hoping they warm customers with more sensible prices, knowing consoles are going to be in their faces, so they will want to appeal to that demographic, which will drift into 2021 sales too.

The problem with that is we haven't really seen anything super impressive regarding RT on the consoles. If the consoles don't match 70%+ of the RT quality and effects of Ampere, Nvidia marketing will make the gap seem even bigger, and they get to keep their prices unless RDNA 2 on the desktop can make a dent. I could be wrong, but I doubt Nvidia has any warmth for anything other than cold hard cash (like most companies).
 