NVIDIA ‘Ampere’ 8nm Graphics Cards

Even something like the 2080 Ti doesn't saturate PCIe 3.0 x8


Actually, yes it does. Reviewers of Horizon Zero Dawn found a 20% increase in performance going from x8 to x16, and streaming from RAM or NVMe will fully saturate a 5700 XT in a PCIe 3.0 x16 slot.
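For anyone who wants the raw numbers behind that, here's a quick back-of-the-envelope sketch of theoretical PCIe link bandwidth (one direction, ignoring protocol overhead, so real-world figures come out a bit lower):

```python
# Rough PCIe link bandwidth: per-lane transfer rate (GT/s) scaled by
# the line-code efficiency, times the number of lanes.
# Gen 1/2 use 8b/10b encoding; Gen 3/4 use 128b/130b.
GENS = {
    1: (2.5, 8 / 10),
    2: (5.0, 8 / 10),
    3: (8.0, 128 / 130),
    4: (16.0, 128 / 130),
}

def pcie_bandwidth_gbps(gen, lanes):
    """Approximate one-direction bandwidth in GB/s (ignores protocol overhead)."""
    rate_gt, efficiency = GENS[gen]
    return rate_gt * efficiency / 8 * lanes  # Gb/s per lane -> GB/s, then scale

for gen, lanes in [(3, 8), (3, 16), (4, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{pcie_bandwidth_gbps(gen, lanes):.1f} GB/s")
# PCIe 3.0 x8:  ~7.9 GB/s
# PCIe 3.0 x16: ~15.8 GB/s
# PCIe 4.0 x16: ~31.5 GB/s
```

So x16 has roughly double the headroom of x8, which is where the gap shows up for a game streaming assets heavily over the bus.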
 
I think showing it this way highlights that Turing 1 was a dud release - it's not covering it up at all. Then it shows that they fixed it with Turing 2, which I think is a fair assessment. Yes, benefit of hindsight and all that.

Here is my prediction of where the Ampere series will land. No science to it, just lined it up visually by eye.

[attached chart: predicted price/performance points for the Ampere lineup]
Your prediction seems pessimistic to me, considering the pattern shows there's something missing at the $700/14,500 block *this generation*. The next generation should have something in the area of $700/17,000 - if you treat the 2080 Ti as the out-of-pattern aberration that it is.
 

If you fancy giving me around 5 data points, I'll plot your line on too?
 
Actually, yes it does. Reviewers of Horizon Zero Dawn found a 20% increase in performance going from x8 to x16, and streaming from RAM or NVMe will fully saturate a 5700 XT in a PCIe 3.0 x16 slot.

It is only one game out of many, and not necessarily one of the best-performing ports either. In the vast majority of games it doesn't make much, if any, difference. Sure, that may change with the new cards and future games, but I'm not expecting it to.

Sure, PCIe 4.0 seems the far more logical choice to buy into at this moment in time. And in the next few generations 3.0 x16 won't be fully capable, just like 2.0 is today.

I'd be interested to see the testing when the new cards are out, though.
 
If you fancy giving me around 5 data points, I'll plot your line on too?

780 Ti >>>>>>>>>>>-- 980 Ti >>>>>>>>>>>>>>>>>>>>++ 1080 Ti >>>>>>>>>>>>>> missing Turing card (price ++ if you move the performance gain more to the right like the 1080 Ti did, or price -- if the performance gain moves further right like the 980 Ti did), then do the same thing for the new Ampere card.
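For anyone who wants to plot their own line, here's a minimal Python/matplotlib sketch - all the prices and scores below are hypothetical placeholders, not real benchmark data, so substitute your own points:

```python
import matplotlib.pyplot as plt

# Hypothetical launch-price / benchmark-score points per generation.
# Placeholder values only - swap in the real numbers from the chart.
points = {
    "780 Ti":         (700, 5000),
    "980 Ti":         (650, 8000),
    "1080 Ti":        (700, 11500),
    "missing Turing": (700, 14500),  # the $700/14,500 gap this generation
    "Ampere?":        (700, 17000),  # extrapolated next step
}

prices = [price for price, _ in points.values()]
scores = [score for _, score in points.values()]

plt.plot(scores, prices, marker="o")
for name, (price, score) in points.items():
    plt.annotate(name, (score, price))
plt.xlabel("Benchmark score")
plt.ylabel("Launch price ($)")
plt.title("Price vs performance by generation (placeholder data)")
plt.show()
```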
 
Turing 1 and Turing 2 need to be separated because it's pretty obvious from the chart progressions that Turing 1 was a dud. They have, however, fixed it for Turing 2 and restored the historical improvement pattern in all but the very top cards.

If you were on a 1080, moving to a 2070S got you 23% more (the same +3000 score increase as previous jumps) for less money. If it turns out that the upgrade to the 2070S is named the 3060, well, so what if it offers another 25% jump? You seem focused on the naming or tiers rather than what you get for your money - you've got to see past that marketing.
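To put that in rough numbers - the scores and prices here are illustrative placeholders, not real benchmark results - the tier-name-versus-value point looks like this:

```python
# Hypothetical launch price / benchmark score pairs - illustrative only.
gtx_1080  = {"price": 599, "score": 13000}
rtx_2070s = {"price": 499, "score": 16000}  # one 'naming tier' down

uplift = (rtx_2070s["score"] / gtx_1080["score"] - 1) * 100
diff = rtx_2070s["price"] - gtx_1080["price"]
print(f"Performance uplift: {uplift:.0f}%")  # ~23%, the historical jump
print(f"Price change: {diff:+} dollars")     # -100: cheaper despite the jump

# Performance per dollar is what matters, not the tier name:
for name, card in [("1080", gtx_1080), ("2070S", rtx_2070s)]:
    print(f"{name}: {card['score'] / card['price']:.1f} points per dollar")
```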

Turing 2 was only a price hike if you followed the naming: if you went from Ti to Ti, or from 1080 to 2080, then yes, prices went up. But you didn't need to do that to get the same performance jump as historically occurred - you could drop a 'naming tier' and still get it.

It looks to me like it simply wasn't worth upgrading from a 1080 Ti in the new gen, but it probably will be in the next, because that will be two generations away and you'll get the jump - likely by dropping down a tier or two as well.
In that case it becomes really important to ask why "Turing 2" was released.

If we conclude that it was only released because of AMD pressure, then we can perhaps expect the same from Ampere.

If AMD aren't able to push nVidia, Ampere might end up with the same "Turing 1" magnitude price hikes.

We could even see the same thing again - Ampere 1 and Ampere 2. Depends how blatant nV want to be in their milking operation :p
 
It was released because Turing 1 wasn't selling well, and Nvidia don't want to be seen to massively lower prices across their range and admit that they were wrong.
 
OK, so what do we know so far, guys, about the 3080 Ti or 3090 Ti? (Do we even know the name of it yet? :confused:)

What's the price of it?
What's the performance of it?
How much memory does it have?
How much power does it draw?
Does it have full HDMI 2.1 support?
What power inputs does it need (6-pin, 8-pin, etc.)?
Are they even going to release the high-end Ti version straight away?
Etc.
Etc.
 
By the sounds of it, the 3090 will probably cost two grand, which is utterly, utterly insane.
https://www.guru3d.com/news-story/rumor-geforce-rtx-3090-pricing-to-sit-at-2000.html
 
Anyone who honestly thinks there isn't a VERY realistic chance the 3090 will be £2K+ obviously hasn't been paying attention for the past few years. It's far more likely than not at this point.
 