
NVIDIA ‘Ampere’ 8nm Graphics Cards

Yeah, you are right. I didn't mention the Ti's (like from the old 750 days) as I didn't want to confuse things even further now that Ti refers to the top cards, so I referred only to the current "Super" editions.

One of the only smart buys at the beginning of a generation is the top-end Ti version which stays performant and retains a lot of value during the lifecycle. I hope Ampere doesn't buck this trend.
I was thinking of the 1050 and 1660?

Still so many questions. Will the 80 Ti be released alongside the 80 cards like Turing, or will it come months down the line as with previous releases?
 
You don't see how my post relates to yours... really? Like, at all? Phew... an intellectual discussion with you is not easy, chuk. :D

Your post talked about Super editions like they were a new thing. They're not. It obviously happened with Turing, and Nvidia clearly now see it as a viable mid-cycle refresh strategy to maintain the performance lead and to compensate for any advancements in the manufacturing process since the initial cards were released.

I know they exist :p I was more surprised that people are already expecting Nvidia to short-change us, and was double-checking that I wasn't misunderstanding the reason for their existence.
I guess I could have worded it better.

If Nvidia release most of Ampere on the Samsung 8nm process, then we can realistically expect a 7nm refresh with improved clocks and specs next year. This is because, as we should all know by now, Nvidia failed to secure TSMC production capacity and had to switch to Samsung's inferior process in order to fulfil production requirements. However, even if Nvidia did release everything on 7nm, I would still realistically expect a Super refresh next year, because Nvidia now know that this mid-cycle strategy works and was accepted by the community, who voted with their wallets.

So in this regard, unless you buy a 3080 Ti, you are very likely going to have a card that will be superseded next year. That said, at the moment it would not surprise me if they refreshed the Ti too in some form, though I hope that's not the case, as historically this card was the golden child and remained top dog until the end of the cycle.

On the following two points:
1. Refresh based on TSMC 7nm
While I agree that Nvidia will want to hold on to the crown no matter what, we are making assumptions about how much of a clock bump they will get and how much difference it will make to performance. I was under the impression that the biggest difference between the two nodes is power consumption.

2. "Nvidia now know that this mid-cycle strategy works and was accepted by the community, who voted with their wallets."
Isn't the reason it "worked" that everything below a 2080 Ti was a failure in terms of price to performance?
 
Are you new to PC tech (joke) :p
It's nothing new. Nvidia have done it with Ti versions lower down the stack. AMD have just done it with XT CPUs.

It does bring up the old conundrum of waiting for the next release: sit tight for the S series, and then the 4000 series is only 12 months away after that.

If you always wait you can save a packet of money :p
Depends on how you define new :p

I might as well wait till RDNA 3 and Hopper to make my decision.;)
 
On the following two points:
1. Refresh based on TSMC 7nm
While I agree that Nvidia will want to hold on to the crown no matter what, we are making assumptions about how much of a clock bump they will get and how much difference it will make to performance. I was under the impression that the biggest difference between the two nodes is power consumption.

2. "Nvidia now know that this mid-cycle strategy works and was accepted by the community, who voted with their wallets."
Isn't the reason it "worked" that everything below a 2080 Ti was a failure in terms of price to performance?

Nvidia released the best performing cards they had at the time and then refreshed them later when they had better and more refined cards to release. Anything else, such as views on them being price/performance "failures", is pure subjectivity. Judging by the overall sales, they were not failures. Also, the Super cards were around 10-15% faster on average, which is perhaps not so significant as to make purchasers of the original cards feel 'too' hard done by. It would of course still be very annoying to some people, though.
 
Nvidia released the best performing cards they had at the time and then refreshed them later when they had better and more refined cards to release. Fact.

Anything else, such as views on them being price/performance "failures", is pure subjectivity. Judging by the overall sales, they were not failures. Also, the Super cards were around 10-15% faster... on average, which is perhaps not so significant as to make purchasers of the original cards feel 'too' hard done by. It would of course still be very annoying to some people, though.

I mean ... isn't this just how everything works?

Sony release the PS4; a year or so later there's a price drop and a 'slim' redesign, as they work out how to do it better and cheaper.
 
I mean ... isn't this just how everything works?

Within reason, yes, though it varies how aggressive companies are in their refresh cycles. I guess people take issue with the assumption that Nvidia are holding back the better designs on purpose, when there is very little evidence to suggest that is the case.
 
It's also a generational thing.

Those of us still middle-aged or younger (I'm 40) are used to the idea of augmenting yourself with glasses or hearing aids, and not only is it not a big deal, you'd bend over backwards to find a device that could restore your sight/hearing (when the time comes). In other words, we embrace the idea that devices can help.

I've found some people of our parents' generation are much more resistant to doing anything to improve their "natural" senses. I don't know why; it's something of a mindset, perhaps mixed with some stubbornness in accepting the inevitable.

Me, when my hearing starts to go, I'll be researching the best hearing aids and doing whatever I can to keep my hearing.
I don't think you can generalise like that. I have a load of mates in their 40s who have really poor eyesight but simply refuse to countenance wearing glasses. They say moronic things like "I'm the only one in my family who doesn't wear glasses", yet are unable to read the labels on packaging for a Sunday roast, etc. This isn't one or two; it's most of my friends of that age. They simply can't accept that eyesight declines with age. My uncorrected vision is better than a lot of theirs, yet I also own reading glasses.
 
To be honest, even my 2080 struggled at 4K60. Although I would be happy to compromise if all games supported DLSS.
A 2080 Ti struggles at 4K 60 in a lot of games if you want a decent level of MSAA, all ultra, etc.; it's simply not quite fast enough. The situation will only get worse with newer AAA games.
 
A 2080 Ti struggles at 4K 60 in a lot of games if you want a decent level of MSAA, all ultra, etc.; it's simply not quite fast enough. The situation will only get worse with newer AAA games.

Have you actually played at 4K60? I can tell you that MSAA is not nearly as important at 4K as it is at 1080 or 1440.

It's a personal choice, though I far prefer the extra detail/sharpness I get at 4K. 4K with no AA looks far superior to 1440p with 4x AA.
 
A 2080 Ti struggles at 4K 60 in a lot of games if you want a decent level of MSAA, all ultra, etc.; it's simply not quite fast enough. The situation will only get worse with newer AAA games.
If 4K becomes the standard for next gen games, then I suspect it shouldn't be an issue with the high end 3000 cards.
 
+1 Same, I'm not in a rush, and we could see an X1800 XT / 7800 GTX 512MB / X1900 XT situation, although hopefully not. When the dust settles I'll make a decision.

Yeah, I am happy to wait until November now that Cyberpunk got delayed. That said, if they price the RTX 3070 well from day one, I may buy it to play Flight Simulator 2020. If not, I'll wait.

Also, we should have custom-cooled cards by November hopefully, so I could go for a Zotac, which has a 5-year warranty and usually good coolers. We'll see, though I must admit I may not be able to resist getting one on day one, as I have been waiting a long time, skipped the 2000 series, and do fancy something new :D
 
A 2080 Ti struggles at 4K 60 in a lot of games if you want a decent level of MSAA, all ultra, etc.; it's simply not quite fast enough. The situation will only get worse with newer AAA games.

I don't really get the relationship between game image quality and GPUs. The 2080 Ti is the top consumer gaming GPU, so what's the point in any current-gen game using image quality settings that are too much for that GPU to handle? It's a waste of development time making settings and textures for max IQ if no GPU can run it.

It feels backwards that game development would be pushing the GPU manufacturers forward; it should really be the other way around, as hardware is clearly the ultimate limiting factor. Common sense says the max settings of all current games should align with the performance of the 2080 Ti and no higher.

The only time it seems to pay off is when you come back to an older game several years down the line and can run it all on max settings no problem. However, image/texture quality always moves on, so a newer game on medium settings tends to still look better than an older game on max.
 