
NVIDIA 5000 SERIES

One thing I've not seen mentioned much with regards to the 50-series is the DP2.1 support.

With all the "uber" monitors being announced recently with bonkers resolutions and/or refresh rates, it could be that DP2.1 support is a reason to upgrade in itself before long.

There was a degree of surprise when the 40-series stuck with DP1.4 but, to be fair, there wasn't really any need for anything better a couple of years ago. That's changing.
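For a sense of scale, required video bandwidth is roughly pixels × refresh rate × bits per pixel. A rough sketch below, using the commonly quoted effective link rates for DP1.4 (HBR3) and DP2.1 (UHBR20); the monitor mode is just an illustrative assumption, and real links add blanking overhead or use DSC:

```python
# Rough uncompressed-bandwidth check (no DSC, no blanking overhead).
# Link budgets are the commonly quoted effective rates; treat them
# as approximations rather than official figures.
DP14_EFFECTIVE_GBPS = 25.92   # 4 lanes x 8.1 Gbps, 8b/10b coding
DP21_EFFECTIVE_GBPS = 77.37   # 4 lanes x 20 Gbps, 128b/132b coding

def video_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Uncompressed pixel data rate in Gbps (10-bit RGB by default)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Illustrative "uber monitor" mode: 4K at 240 Hz, 10-bit colour.
needed = video_gbps(3840, 2160, 240)
print(f"{needed:.1f} Gbps needed")      # ~59.7 Gbps
print(needed <= DP14_EFFECTIVE_GBPS)    # False: DP1.4 needs DSC
print(needed <= DP21_EFFECTIVE_GBPS)    # True: fits on DP2.1
```

The numbers make the point: 4K at 240 Hz blows past DP1.4's budget uncompressed but fits comfortably within DP2.1.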
 
I was really thinking of getting a 5090, not for workloads but for gaming. But the way it's designed (with AI in mind) I might pass on it. Nvidia clearly wasn't making a gaming GPU this time, and I hope that gives AMD an opening to pull out something competitive next time.
AMD will never be able to match the xx90 series.

It's just a 20-30% performance increase, and it will likely be the same every generation going forwards.

The same is happening with CPUs: as the nanometer process shrinks, it's getting too hard to deliver a big jump each generation, so they focus their marketing on other things.

Process node by generation:
1080 Ti – 16nm
2080 Ti – 12nm
3090 – 8nm
4090 – 5nm
5090 – 4nm

If you are on a 40 series it doesn't make sense, but coming from a 30 series or below it's at least a 100% jump in performance.
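That "100% jump" falls out of compounding per-generation uplifts. A quick illustration; the per-gen percentages below are assumptions for the sake of the arithmetic, not measured benchmark figures:

```python
# Illustrative compounding of per-generation uplifts.
# The per-gen gains are assumed round numbers, not benchmarks.
def compound_uplift(per_gen_gains):
    """Multiply per-generation speedups (e.g. 1.30 = +30%)."""
    total = 1.0
    for g in per_gen_gains:
        total *= g
    return total

# Two modest +30% generations already compound to roughly +69%:
print(compound_uplift([1.30, 1.30]))  # ~1.69

# With a larger 30->40 series jump (say +65%) followed by +30%,
# a 30-series owner sees better than a 2x uplift:
print(compound_uplift([1.65, 1.30]))  # ~2.15
```

So even with "boring" 30% generations, skipping one generation roughly delivers the doubling the post describes.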
 
Going back to old 40-series threads from around release is pretty fun.

69% said no to upgrading to a 40 series.

Trying to open Kaapstad's owners' thread almost breaks my laptop with the amount of pictures in it.
 

This one is also a good read if you go back to October 12th 2022, the 4090 launch day, and watch the chaos unfold, if anyone's wondering how the launch might go in a few weeks' time :D
 
If "4090 +34%" on 3DMark Speed Way turns out to be true, this is what the comparison would look like:

[image: 3DMark Speed Way comparison chart]
 
AIBs' margins have been trimmed to the utmost (largely by NV) over the past few years. My guess is the lowest-priced AIB cards will start at £2,200 RRP for a 5090 and £1,200 for a 5080. Then there will be a bit of reseller pork belly added, so we're looking at £2,300 and £1,300 for a bottom-rung Zotac or PNY.

One of the reasons NV made the prices somewhat lower than expected was to allow AIBs to add some on top.
I never saw a single AIB 4090 below £2,000. I think it will be even worse this time.
 
+30% gen on gen is a good improvement IMO. Even we 4090 users used frame gen on Cyberpunk and Alan Wake 2 etc., so it will come in useful.
People seem to forget it will still have insane raw performance as well as improved frame gen etc.
The way many people are talking on forums they seem to think the 5090 is less powerful than a 4080 or something.
 
I've seen a couple of comments, I'm pretty sure even on here in the past week, about how the 5090 is cut down and overpriced. I don't know the die size and don't remember the spec sheet, but I remember reading it and thinking that it looked absolutely fucken' buckwild bonkers. It's just too much money for me to justify, especially to my family, and my friends would rip me apart... but it seems like it probably fits its price from the specs?

What were people talking about? Is it a small die or something?
 
+30% gen on gen is a good improvement IMO. Even we 4090 users used frame gen on Cyberpunk and Alan Wake 2 etc., so it will come in useful.
30% at a given price point would be an improvement, but if a Blackwell pound isn't buying any more speed than an Ada pound did, it's stagnation in my view.

I think the real generational improvement this gen is GDDR7. The 80 card should get a nice bump from the bandwidth, even in the absence of a meaningful increase in CUDA cores. Now that Nvidia moved the goalposts on pricing last gen, the 5080 comparatively looks like a decent generational improvement this gen.

The 90 card looks like Nvidia had to go the brute-force route for a generational bump and passed on the extra cost to us.

I also think these cards will be productivity beasts with the extra bandwidth.
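As a rough illustration of where that bandwidth bump comes from: peak memory bandwidth is just bus width times per-pin data rate. The figures below (256-bit bus for both the 4080 and 5080, ~22.4 Gbps GDDR6X vs ~30 Gbps GDDR7) are assumptions based on commonly reported specs, a back-of-the-envelope sketch rather than official numbers:

```python
# Back-of-the-envelope peak bandwidth from assumed, commonly
# reported specs (not official figures).
def mem_bandwidth_gbps(bus_width_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth in GB/s: one pin per bus bit,
    each moving data_rate gigabits per second."""
    return bus_width_bits * data_rate_gbps_per_pin / 8  # bits -> bytes

# Assumed: 4080 = 256-bit GDDR6X @ 22.4 Gbps,
#          5080 = 256-bit GDDR7  @ 30 Gbps
print(mem_bandwidth_gbps(256, 22.4))  # ~716.8 GB/s
print(mem_bandwidth_gbps(256, 30.0))  # ~960.0 GB/s
```

On those assumptions the 5080 gets roughly a third more bandwidth from the memory swap alone, with no change to the bus width.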
 
It's not really, because last gen they charged £1,200+ for the 4080 right out of the gate, and I think that's why they came in at £999 this time.
Yeah, that part makes sense to me, but they made the 80 class even further apart from the 90 class to justify that price, IMO. If it was only +10%, no way would they charge as much for the 5090. Would anyone here really pay £1,000 more for a 10% uplift?
 
So what is it that triggers the bigger improvements? Is it when they shrink the die, or whatever? If that's due for the 60 series then maybe that's the one worth upgrading to.
 
Yeah, that part makes sense to me, but they made the 80 class even further apart from the 90 class to justify that price, IMO. If it was only +10%, no way would they charge as much for the 5090. Would anyone here really pay £1,000 more for a 10% uplift?

It's the upsell. They read all you guys saying the 4090 was the best value and doubled down on this tactic. :p

Regarding the latter, I skipped Ada, so coming from Ampere it's not so bad. As usual, though, I'd prefer any gen improvement and more VRAM over the SKUs. The Supers will probably be the best value, but again, mid-cycle.
 
+75W from the PCIe slot. What did the extra 150W give in performance, out of interest?
Honestly, it was like 3-5% from what I remember; I've kept my 4090 undervolted since. But the point here is that the sheer number of cores is why the card is clocked lower than the 4090. What I'm thinking is that if you could somehow unlock the power limit, there's a chance you could push this card to similar clocks as the 4090.
 
As Parisv said, you can get a little more from the PCIe slot.

However, compare core count and TDP and divide the two numbers: the out-of-the-box wattage per core works out to roughly 0.027 W for both the 4090 and 5090. That suggests there are no efficiency gains and both cards have been pushed about as hard as each other. We know the 4090 was pushed quite hard by Nvidia, hence the tiny overclocking headroom, so in summary, yes, I concur the 5090 will have little to no OC headroom.
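That watts-per-core claim is easy to sanity-check. The spec figures below (450 W / 16,384 CUDA cores for the 4090, 575 W / 21,760 for the 5090) are the commonly reported numbers, assumed here purely for illustration:

```python
# Back-of-the-envelope watts-per-core check using commonly
# reported spec-sheet figures (assumed, not official).
cards = {
    "4090": {"tdp_w": 450, "cuda_cores": 16384},
    "5090": {"tdp_w": 575, "cuda_cores": 21760},
}

for name, spec in cards.items():
    w_per_core = spec["tdp_w"] / spec["cuda_cores"]
    print(f"{name}: {w_per_core:.4f} W/core")

# 4090: ~0.0275 W/core, 5090: ~0.0264 W/core -- within a few
# percent of each other, i.e. both pushed similarly hard.
```

On those figures the two cards land within a few percent of each other on power per core, which is the basis for the "no OC headroom" argument.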
ah that sums it up nicely, thanks
 