NVIDIA RTX 50 SERIES - Technical/General Discussion

Think I disagree with that; the uplift is marginal at best, and very poor on anything other than the 5090.

You just have to look at how much the cards are cut back; the 5080 is more like a 70-tier card.

That's the big problem. The difference between the 5080 and 5090 is crazy: 50% faster at 4K! That's such a massive gap between tiers in the same generation.

I remember when the very top, silly-priced card was only a bit faster than the one below it (the Ultra cards etc.).

They absolutely could have had a decent new stack with good progress across the board. Instead they've cut down the 5080 and below so much that they are barely upgrades over the 4xxx series.
 
That's the big problem. The difference between the 5080 and 5090 is crazy: 50% faster at 4K! That's such a massive gap between tiers in the same generation.

I remember when the very top, silly-priced card was only a bit faster than the one below it (the Ultra cards etc.).

They absolutely could have had a decent new stack with good progress across the board. Instead they've cut down the 5080 and below so much that they are barely upgrades over the 4xxx series.

Even with the 5090's uplift over the 4090, they increased the price, so you're still paying extra for the uplift you should have been getting without a price increase to go with it.
 
Think I disagree with that; the uplift is marginal at best, and very poor on anything other than the 5090.

You just have to look at how much the cards are cut back; the 5080 is more like a 70-tier card.

Good point, and something I missed in my post. The gulf between the 5080 and 5090 is huge, and it's obvious a Ti or Super is on the cards.

And yeah, they are being disingenuous with how they have cut the cards back and named them relative to their actual performance.

But again, if they were priced more fairly, the price-to-performance metrics would be easier to stomach imo.

Maybe they will unlaunch the 5080 and below like the 4080 12GB lol. Bar the 5090, they all need unlaunching and rebranding tbh, but that will never happen.

Until they get real competition they will never change. I really hope the 9070 XT takes a big chunk of the market away from them and makes them think again, but unfortunately I doubt it.
 
Think I disagree with that; the uplift is very poor on anything other than the 5090, and even the 5090's generational uplift is something you're paying extra for.

You just have to look at how much the cards are cut back; the 5080 is more like a 70-tier card.

The power use on the 5090 is silly too, and that needs to be factored into all of this. The melting connectors are still a disaster as well. So even with the 5090's uplift: the silly price, 32-bit PhysX and 32-bit CUDA support being removed, and of course the scalping by AIBs/retailers/scalpers have killed this whole generation.
 
I watched a DF vid on the 5080 where the very first thing they said was “the important thing is that it beats the 4080”, or something similar, and it’s like… no ****? That’s the absolute minimum it needs to do. Ridiculous way to open a discussion on it.
 
We will probably never see large performance increases between generations again unless some revolutionary new tech comes along in how raster performance works.

Disagree, the 4090 and 5090 show it’s possible.

In the old days the performance difference between, say, a 780 Ti and a Titan was in the single digits, but there was a huge price difference.

Now they have widened the performance difference between the 90 part and the rest of the stack to try to funnel as many people as possible into buying the 90 card, at a hugely inflated cost relative to the other parts (profit).

One of the offshoots of this is that all the other cards in the stack are compressed into a smaller performance window.

It's pure greed on Nvidia's part.
 
Disagree, the 4090 and 5090 show it’s possible.

In the old days the performance difference between, say, a 780 Ti and a Titan was in the single digits, but there was a huge price difference.

Now they have widened the performance difference between the 90 part and the rest of the stack to try to funnel as many people as possible into buying the 90 card, at a hugely inflated cost relative to the other parts (profit).

One of the offshoots of this is that all the other cards in the stack are compressed into a smaller performance window.

It's pure greed on Nvidia's part.
And it would be a great strategy if they actually had 5090s for sale.
 
For anyone that cares LTT Labs did a scan of the 5090 connector...


Pretty obvious from the scan why they're treating what should be three or four separate 150/200W rails as a single 600W rail, and what's causing cables to melt.
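The danger of a single shared rail comes down to simple arithmetic: with no per-pin current limiting, nothing stops one contact from carrying far more than its share, and resistive heating scales with the square of the current. A rough back-of-envelope sketch (my own illustrative numbers, not LTT's measurements; the contact resistance is an assumed typical value):

```python
# Rough sketch: why one 600W "rail" across six 12V pins melts
# when the current doesn't split evenly between pins.
# R_CONTACT is an assumed typical value, not a measured figure.

PINS = 6
VOLTAGE = 12.0        # volts
TOTAL_POWER = 600.0   # watts
R_CONTACT = 0.006     # ohms per pin contact (assumption)

total_current = TOTAL_POWER / VOLTAGE   # 50 A total
balanced = total_current / PINS         # ~8.3 A per pin if evenly shared

# Heat dissipated in a contact: P = I^2 * R
heat_balanced = balanced ** 2 * R_CONTACT

# With no per-rail limiting, degraded contacts elsewhere can push
# half the total current through a single pin:
skewed = total_current / 2              # 25 A on one pin
heat_skewed = skewed ** 2 * R_CONTACT

print(f"balanced: {balanced:.1f} A/pin -> {heat_balanced:.2f} W per contact")
print(f"skewed:   {skewed:.1f} A on one pin -> {heat_skewed:.2f} W at that contact")
print(f"ratio:    {heat_skewed / heat_balanced:.0f}x the heat in one contact")
```

Because heating goes as I², tripling the current through one pin puts roughly nine times the heat into that single contact, which is the melting-cable failure mode in a nutshell.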
 
For anyone that cares LTT Labs did a scan of the 5090 connector...


Pretty obvious from the scan why they're treating what should be three or four separate 150/200W rails as a single 600W rail, and what's causing cables to melt.

No such trouble with the 3090 Ti. Can only assume they changed the design to cut costs on the board.
 
Disagree, the 4090 and 5090 show it’s possible.

In the old days the performance difference between, say, a 780 Ti and a Titan was in the single digits, but there was a huge price difference.

Now they have widened the performance difference between the 90 part and the rest of the stack to try to funnel as many people as possible into buying the 90 card, at a hugely inflated cost relative to the other parts (profit).

One of the offshoots of this is that all the other cards in the stack are compressed into a smaller performance window.

It's pure greed on Nvidia's part.

It was possible, and there was a large increase between the 3090 and 4090, but there was a large node gap to account for that increase. The 3090 was on 8nm and the 4090 on 4nm, and it was something like a 70% performance increase. The 5090 is still on a 4nm node with some architectural changes, and it had roughly a 25% increase over the 4090, but it took 25% more power draw to get there. TSMC themselves say node-to-node performance increases will not be massive. The 6090 will more than likely be on a 3nm node, and TSMC's own documents say "At the same power consumption, the 3nm node could potentially offer a 10-15% performance increase over the 4nm node". So you can either give it more power for more gains, or take the gain in efficiency; at the same wattage it will be roughly 15% better than a 5090.

The next step is 3nm, and we can't jump 4nm in a single generation like the 3090 to 4090 did; it just isn't technically possible anymore.

Price to performance sucks, even for the 5090, and it's only going to get worse due to their greed.
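The scaling argument above can be sketched numerically. This is a rough model, not a prediction: the `power_scaling` exponent (perf ~ power^0.5, i.e. diminishing returns from extra wattage) is my own assumption, while the 10-15% iso-power figure is the TSMC quote from the post:

```python
# Back-of-envelope model of generational uplift: an iso-power node gain
# combined with extra power draw under diminishing returns.
# power_scaling = 0.5 is an assumed exponent, not a measured figure.

def projected_uplift(iso_power_gain, extra_power, power_scaling=0.5):
    """Fractional perf uplift from a node gain plus a power increase."""
    return (1 + iso_power_gain) * (1 + extra_power) ** power_scaling - 1

# Hypothetical 6090 on 3nm at the same wattage as a 5090 (TSMC's 10-15%):
same_watt = [projected_uplift(g, 0.0) for g in (0.10, 0.15)]
print(f"same wattage: {same_watt[0]:.0%} to {same_watt[1]:.0%} over a 5090")

# With another ~25% power bump, like the 4090 -> 5090 step:
more_watt = [projected_uplift(g, 0.25) for g in (0.10, 0.15)]
print(f"+25% power:   {more_watt[0]:.0%} to {more_watt[1]:.0%} over a 5090")
```

Under these assumptions, matching the 5090's ~25% uplift again would require burning another 25% more power on top of the node gain, which is exactly the "give it more power for more gains" trade-off described above.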
 
Until they get real competition they will never change. I really hope the 9070 XT takes a big chunk of the market away from them and makes them think again, but unfortunately I doubt it.

It won't.

It's doubtful it will be cheap enough, and they won't get the OEM market share to make a difference.

AMD and RDNA 4 are not going to be the saviours of gaming GPUs.
 
The hate for DF is so childish :cry:. They literally show you all the graphs on screen with the exact performance, and because they say performance is "fine" they're shills. Why don't you just look at the graphs then?

The first 30 seconds or so of this video is what I was specifically referring to in my post (before yours).


:o - and even the remainder of the info is curiously phrased. Oh yes, remember the ‘good old days’ when the 80 series would beat the previous 90 series…
 