
NVIDIA 5000 SERIES

The cooler design is a marvel; even just the heatsink on the FE 5090 is amazing and different to how anyone else makes heatsinks. Nvidia tried everything and spared no expense in designing this nearly 600W card, from fitting it into two slots to fitting it into small PC cases.

 
I'd want at least +50% if I was spending 1k and probably closer to 100% for 2k
IDK I'm trying to be reasonable and fair.

3070 vs 4070 is like 30%. Those are considered 2 good cards. Seems like a reasonable uplift, IDK.

I did trust UserBenchmark's numbers though, just since I figured it was Nvidia vs Nvidia, but now that I'm looking around, it seems like their numbers are nonsense.

Like they say the 3080 is like 25% faster than the 2080, but if I check benchmark videos, it seems like a bigger uplift than that. More like 35-40%. I thought as long as AMD wasn't in the mix, they were vaguely reliable.


The whole conversation is moot though, since I agree things are too expensive for what they bring to the table. But the market is willing to accept 30%, I think.
 
Mine are the same, which is why I went from 1080ti to 3080, and am looking for my next upgrade. If you upgraded to something like the 4090, then you sort of signed up to having a limited upgrade path i.e. 5090 or nothing.
I went exactly the same path. You'd think two years between generations would mean better progress, but obviously not, because it's the same node, so it was to be expected to be honest. I don't expect the reviews to show anything different to what we already know, because if there was, Nvidia would have made a song and dance about it at CES, which they obviously did not, apart from the 1x and 2x stuff and the bullcrap lol.
 
IDK I'm trying to be reasonable and fair.

3070 vs 4070 is like 30%. Those are considered 2 good cards. Seems like a reasonable uplift, IDK.
Those are under $600 though
I did trust UserBenchmark's numbers though, just since I figured it was Nvidia vs Nvidia, but now that I'm looking around, it seems like their numbers are nonsense.

Like they say the 3080 is like 25% faster than the 2080, but if I check benchmark videos, it seems like a bigger uplift than that. More like 35-40%. I thought as long as AMD wasn't in the mix, they were vaguely reliable.
Try more like +70% for the 3080; it even blew the 2080 Ti away. This was why people went nuts for it, especially with an initial MSRP of £650.
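
(Rough sketch, for anyone who wants to sanity-check this kind of uplift figure: it's just the ratio of average FPS minus one. The FPS values below are made up purely for illustration, not taken from any benchmark.)

# Illustrative only: how a generational "uplift" percentage falls out of
# average FPS figures. These FPS values are hypothetical.
old_avg_fps = 60.0    # older card in a given game/settings
new_avg_fps = 102.0   # newer card, same game/settings

uplift_pct = (new_avg_fps / old_avg_fps - 1) * 100
print(f"Uplift: {uplift_pct:.0f}%")   # 70% in this made-up example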

 
3090 wasn't the halo card though, the 3090ti was. I thought the 5090 wasn't out yet?

No it wasn't; the 3090 was the "halo" card. The 3090 Ti was a cynical release 16 months later (and only 8 months before the 40-series launched), priced at £2k and barely any faster than a 3090 anyway.

It was just there to milk those daft enough to fork out for it, and most people ignore it and pretend it never existed :)
 
It definitely will be slower.

The 4090 has 16,384 shaders pulling 450W. The 5080 has only 10,752 shaders pulling 360W. Remember, both are on the TSMC 4nm process.
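
(Back-of-the-envelope ratios from those quoted specs, and nothing more; a sketch that ignores clocks, memory bandwidth, cache and scheduling, so it's not a performance prediction.)

# Rough ratios from the figures quoted above; purely illustrative,
# ignoring clocks, memory bandwidth and architectural changes.
rtx_4090 = {"shaders": 16_384, "board_power_w": 450}
rtx_5080 = {"shaders": 10_752, "board_power_w": 360}

shader_ratio = rtx_5080["shaders"] / rtx_4090["shaders"]
power_ratio = rtx_5080["board_power_w"] / rtx_4090["board_power_w"]
print(f"5080 has {shader_ratio:.0%} of the 4090's shaders")      # ~66%
print(f"5080 has {power_ratio:.0%} of the 4090's board power")   # 80%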

Weren't they saying the old scheduler left loads of the 4090 cores unused though? And all the benchmarks where the 4090 was only 30% ahead of the 4080.

I'm not saying the 5080 WILL match it, but it might get close!
 
Weren't they saying the old scheduler left loads of the 4090 cores unused though? And all the benchmarks where the 4090 was only 30% ahead of the 4080.

I'm not saying the 5080 WILL match it, but it might get close!

I mean der8auer has already said it's worse.
 

Gonna be interesting on all series.

Nice. It's taken a while but it's basically a better version of this (good video, worth watching). More controlled frame timing and AI filling.


Nvidia have been slow to implement this. AMD could have beaten them to the punch if they wanted.
 
Amidst all the fervour surrounding frame generation, I've been thinking...

Are we caught in a kind of "FPS arms race"? I'm reminded of the daft megapixel races that have affected cameras (and phones) in the past.
Everything these days seems to be about ridiculous FPS numbers. CES is full of new monitors each boasting higher refresh rates than the previous models. Now all of Nvidia's 50-series marketing is touting framerates in the 200+ range.

Now I get that there are competitive gamers for whom framerate is everything but frankly I feel this is a relatively small demographic.
For the rest of us, such ludicrously high framerates really aren't necessary at all. No, I'm not one of those claiming "60 FPS is all you need" and I have been a proponent of high & variable refresh rates since the advent of G-Sync & Freesync many years ago.
I'm saying that high frame rates aren't the be-all and end-all to enjoying a game. In many cases, the variable refresh rates that G-Sync/Freesync facilitate actually negate the need for high refresh rates as the monitor can adapt better to lower FPS rather than needing to maintain FPS above the monitor's fixed refresh at all times.

As an example, take Cyberpunk, which has been used a lot in Nvidia's marketing, clearly because it's such a difficult game to run and therefore a great showcase for their new features. Throw DLSS Performance mode and 4x MFG at it and hey presto, you've got 4K at 200+ FPS. Sadly, in that scenario only 6.25% of what you're seeing was actually rendered by the game engine.
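
(Spelling out that 6.25% figure, assuming DLSS Performance renders at half the output resolution on each axis, i.e. a quarter of the pixels, and 4x MFG displays one engine-rendered frame out of every four.)

# Share of displayed pixels that come straight from the game engine,
# assuming DLSS Performance (1/2 width x 1/2 height = 1/4 of the pixels)
# combined with 4x multi frame generation (1 rendered frame in every 4).
pixel_fraction = 0.5 * 0.5   # DLSS Performance internal resolution
frame_fraction = 1 / 4       # 4x MFG: one engine frame per four displayed
print(f"{pixel_fraction * frame_fraction:.2%}")   # 6.25%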

Personally, I run Cyberpunk on my 3440x1440 screen with full path tracing and DLSS Quality mode and average in the 75 FPS region with my 4090.
Nvidia would have us believe that this isn't good enough and that I should be using even stronger DLSS with MFG to get my framerates up much higher.

Frankly the game feels smooth enough for me, and G-Sync keeps the monitor synced to the fluctuating FPS from the GPU.
The trade-off from using a stronger form of DLSS just isn't worth it to me, let alone using frame generation.

What I'm saying is that 75-ish FPS is absolutely fine for me and I suspect it would be for many others, despite what Nvidia would have us believe.
 
I think it's more a resolution arms race that has got out of control. Because people want to render a 4K image at max settings at 60 or 120 fps, this is the route gaming has gone down, even though in reality it is always better for game developers to build better artwork, textures etc. than to ensure it runs at 4K 60/120. It started with checkerboard rendering on the PlayStation 4.
 