NVIDIA 5000 SERIES

For me DLSS comes down to being able to control the sharpness setting. Too much and the image looks artificial, too little and it looks soft compared to native. There is a sweet spot if you truly want the best image quality.
Sharpness doesn't add detail though, it just makes things more... sharp. :) Not the same thing. Just playing CP2077, I can see the new model brings out considerably more detail and image stability, which reduces the need to sharpen the image much. It's still FAR from perfect, and far from the amount of detail native 1440p can bring, especially with animated textures, transparency etc., but it's still a very good improvement for sure.
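To make the distinction concrete, here's a minimal unsharp-mask sketch in Python (illustrative only; not the filter NVIDIA actually ships). Sharpening amplifies contrast that's already in the image; it can't invent detail that was never rendered:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Unsharp mask - the classic sharpening filter. Note it only amplifies
# contrast that is already in the image; it cannot recover detail that
# was never rendered, which is the point above.
def sharpen(img, amount=0.5, radius=1.5):
    blurred = gaussian_filter(img, sigma=radius)
    return img + amount * (img - blurred)  # boost edges, leave flat areas

img = np.random.default_rng(1).random((64, 64))  # stand-in for a frame
too_much = sharpen(img, amount=2.0)    # halos, the "artificial" look
sweet_spot = sharpen(img, amount=0.4)  # subtle edge boost
```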
 
Helldivers 2 doesn't have DLSS or FSR; it's a proprietary upscaler that is indeed complete pants.
As far as I've read, it's just TAA used to upscale by simple temporal accumulation, with zero AI and the like. Hence, old tech and old results. :)
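Roughly, "simple temporal accumulation" means something like this toy Python sketch (the general idea only, not Arrowhead's actual code): jitter the low-res render each frame and blend it into a full-resolution history buffer with an exponential moving average, no AI anywhere:

```python
import numpy as np

def accumulate(history, low_res, jitter, scale=2, alpha=0.1):
    # Nearest-neighbour upsample of this frame's low-res render.
    current = np.repeat(np.repeat(low_res, scale, axis=0), scale, axis=1)
    # The sub-pixel jitter decides which target pixels were sampled;
    # a roll approximates that offset here.
    current = np.roll(current, shift=jitter, axis=(0, 1))
    # Old history decays, new samples accumulate over time.
    return (1.0 - alpha) * history + alpha * current

rng = np.random.default_rng(0)
history = np.zeros((64, 64))
for frame in range(32):
    low_res = rng.random((32, 32))          # stand-in for a rendered frame
    jitter = (frame % 2, (frame // 2) % 2)  # cycle sub-pixel offsets
    history = accumulate(history, low_res, jitter)
```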
 

I mean, I know we likely already realized this stuff, but this guy's been fairly good at data consolidation/analysis, so I figured he's worth sharing.
Plus he got me thinking about where the 5080 will stack up in the lineup of 4080 / Super / 4090 / 5090... and it looks like it'll be significantly slower than the 4090, only giving us 5-10% (10% is the best-case scenario) over the 4080 Super. It's basically just confirming our earlier predictions that the 5090 will be the only card with any real uplift :/

(Also, pretty much confirming our suspicions that the 5080 is really just a mislabelled 5070)

I’m not defending the 5080; it looks like distinctly terrible price/perf compared to the existing 4080. But I do need to ask how being ~10% slower than a 4090 is “significant”, while being 10% faster than a 4080 is “terrible”? Technically neither is significant.

A 4080 (non super) is 25% slower than a 4090 as it is.

Like I say, not saying the 5080 is good, but even 10-15% slower than a 4090 for a lot less money is still better price/perf than a 5090.
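To put rough numbers on that claim (launch MSRPs assumed at $999 / $1,999, and the 5090's uplift over the 4090 is a placeholder guess):

```python
# Rough price/perf, indexed to the 4090 = 1.0. MSRPs are assumed launch
# prices and the 5090's uplift is a placeholder - swap in real street
# prices and benchmarks as they appear.
cards = {
    "5080, if 10% slower than 4090": (0.90, 999),
    "5080, if 15% slower than 4090": (0.85, 999),
    "5090, if ~30% faster than 4090": (1.30, 1999),
}
for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 1000:.2f} perf per $1000")
# The 5080 comes out ahead on price/perf even in the 15%-slower case.
```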
 
Sharpness doesn't add detail though, it just makes things more... sharp. :) Not the same thing. Just playing CP2077, I can see the new model brings out considerably more detail and image stability, which reduces the need to sharpen the image much. It's still FAR from perfect, and far from the amount of detail native 1440p can bring, especially with animated textures, transparency etc., but it's still a very good improvement for sure.
I do think you need to add a bit of sharpness back to an upscaled image, otherwise it looks good but not as good as native IMO. Obviously too much goes beyond that and adds "detail" that's not meant to be there.
 
Sharpness doesn't add detail though, it just makes things more... sharp. :) Not the same thing. Just playing CP2077, I can see the new model brings out considerably more detail and image stability, which reduces the need to sharpen the image much. It's still FAR from perfect, and far from the amount of detail native 1440p can bring, especially with animated textures, transparency etc., but it's still a very good improvement for sure.
I'm keen to see the changes, I've not played CP2077 since the latest patch but I've been playing through on DLSS performance so it'll be interesting!
 
I'm keen to see the changes, I've not played CP2077 since the latest patch but I've been playing through on DLSS performance so it'll be interesting!

It looks absolutely superb after the latest patch.
 
I’m not defending the 5080; it looks like distinctly terrible price/perf compared to the existing 4080. But I do need to ask how being ~10% slower than a 4090 is “significant”, while being 10% faster than a 4080 is “terrible”? Technically neither is significant.

A 4080 (non super) is 25% slower than a 4090 as it is.

Like I say, not saying the 5080 is good, but even 10-15% slower than a 4090 for a lot less money is still better price/perf than a 5090.


This will really be game-specific.

For example, if I look at Cyberpunk with RT on at 4K, adding 10% performance to the 4080 still leaves it ~30% behind the 4090. The more demanding the game, the wider the gap between these cards, as the 4090 has more bandwidth and more cores.
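The arithmetic is easy to sanity-check yourself; a quick Python sketch, using the Tom's Hardware 4K RT averages quoted in a later post (Cyberpunk specifically sits wider):

```python
# Where does "4080 + 10%" land relative to the 4090?
def pct_behind(base_fps, top_fps, uplift=0.10):
    """How far behind top_fps after gaining `uplift` over base_fps."""
    return (1 - base_fps * (1 + uplift) / top_fps) * 100

# Tom's Hardware 4K ray-tracing averages (quoted in a later post)
print(f"{pct_behind(37.8, 55.9):.1f}% behind the 4090")  # ~25.6%
# Heavier RT loads like Cyberpunk stretch this toward ~30%.
```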
 
And here's me, happy with 100Hz! :D:P
Tbh, depending on the game I'm more than happy with 60Hz as long as frame pacing is absolutely perfect :D My main screen is an LG OLED at 4K/120 and I'm more than happy on that 90% of the time. The 480Hz monitor is a second screen for when I fancy playing at much higher fps. There is a very, very noticeable difference in motion clarity between the two, but I'll only notice it in a back-to-back test :cry:
 
I’m not defending the 5080; it looks like distinctly terrible price/perf compared to the existing 4080. But I do need to ask how being ~10% slower than a 4090 is “significant”, while being 10% faster than a 4080 is “terrible”? Technically neither is significant.

A 4080 (non super) is 25% slower than a 4090 as it is.

Like I say, not saying the 5080 is good, but even 10-15% slower than a 4090 for a lot less money is still better price/perf than a 5090.
Yeah - it's this kind of detail that could be painted in a far more positive light if the 5080 had more VRAM.
 
All y'all defending the 5080 for a 10% gen-on-gen price/performance improvement need to take a hard look in the mirror.

But compared to the 5090, where there's literally zero (or even negative) price/performance change, I suppose it's understandable. This is what it's come to.
 
I’m not defending the 5080; it looks like distinctly terrible price/perf compared to the existing 4080. But I do need to ask how being ~10% slower than a 4090 is “significant”, while being 10% faster than a 4080 is “terrible”? Technically neither is significant.

A 4080 (non super) is 25% slower than a 4090 as it is.

Like I say, not saying the 5080 is good, but even 10-15% slower than a 4090 for a lot less money is still better price/perf than a 5090.

Because you're making the mistake of assuming that 10% = 10%. It all depends on where you're taking your index: X being 10% more than Y does not mean Y is 10% less than X.

Let's plug some figures in. We'll pull these from https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html and use the 4K RT numbers (if you're buying this high end, chances are you aren't wanting to play 1080p low :p )

4090 is listed at 55.9fps
4080 is listed at 37.8fps

So let's add 10% to the 4080 (so 10% faster than a 4080) = 41.58fps
Now let's take 10% off the 4090 (so 10% slower than the 4090) = 50.31fps <<<< Notice how different.

Let's do it for 1440p - just to give the 4080 a better shot.

103.9fps vs 76.0fps
76 + 10% = 83.6fps
103.9 - 10% = 93.51fps <<< Still very different.

This is where you made the mistake. Being 10% quicker than a 4080 does not mean it's 10% slower than a 4090. The difference is more "significant" than 10% ;)
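Same thing in a few lines of Python, if numbers are easier to follow than prose:

```python
# "+10%" and "-10%" use different baselines, so they land in very
# different places - the asymmetry walked through above.
fps_4090, fps_4080 = 55.9, 37.8    # Tom's Hardware 4K RT averages

up_from_4080   = fps_4080 * 1.10   # 41.58 fps
down_from_4090 = fps_4090 * 0.90   # 50.31 fps

# What "4080 + 10%" means measured against the 4090:
print(f"{(1 - up_from_4080 / fps_4090) * 100:.1f}% behind")  # ~25.6%
```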
 
Tbh depending on the game I'm more than happy with 60hz as long as frame pacing is absolutely perfect :D my main screen is a LG OLED at 4k/120 and more than happy on that 90% of the time, the 480hz monitor is a 2nd screen and used for when I fancy playing at much higher fps although there is a very very noticeable difference in motion clarity between the 2 but will only notice that on a back to back test :cry:
I can easily deal with 60Hz on my 55-inch OLED telly; it's surprisingly good, but I sit much further away from that. It has to be at least 100Hz on a monitor (mine does 165Hz), but upwards of 100-120Hz my old eyes don't see much difference at all.
 
Because you're making the mistake of assuming that 10% = 10%. It all depends on where you're taking your index: X being 10% more than Y does not mean Y is 10% less than X.

Let's plug some figures in. We'll pull these from https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html and use the 4K RT numbers (if you're buying this high end, chances are you aren't wanting to play 1080p low :p )

4090 is listed at 55.9fps
4080 is listed at 37.8fps

So let's add 10% to the 4080 (so 10% faster than a 4080) = 41.58fps
Now let's take 10% off the 4090 (so 10% slower than the 4090) = 50.31fps <<<< Notice how different.

Let's do it for 1440p - just to give the 4080 a better shot.

103.9fps vs 76.0fps
76 + 10% = 83.6fps
103.9 - 10% = 93.51fps <<< Still very different.

This is where you made the mistake. Being 10% quicker than a 4080 does not mean it's 10% slower than a 4090. The difference is more "significant" than 10% ;)

lol, my baseline was the 4080, I know how percentages work. ~10% faster than a 4080 is indeed not great, but neither is the 4090 “significantly” faster than a 5080 if the difference is ~12%.

My apologies; I’m not calling out the percentages specifically, but the hyperbole: the 5080 being only 10% faster than a 4080 is “OMG terrible”, yet at the same time it's “significantly slower” than a 4090.
 
lol, my baseline was the 4080, I know how percentages work. ~10% faster than a 4080 Super is indeed not great, but neither is the 4090 “significantly” faster than a 5080 if the 5080 is 12% slower.

I'd call both terrible (on the 5080's part): 10% faster than two-year-old hardware, and 10-15% slower than the card just one tier up from two years ago. Then consider that the 4080 is about 20% faster than the 3090; if the 5080 is *only* 10% slower than the 4090, the generational delta is VERY much significant. Instead of a decent % uplift we're getting a notable... downlift? My mind's just gone cooey... Drop?
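In rough numbers (the 20% figure is from the comparison above; the 10% deficit is the rumoured one):

```python
# x80-class cards, each measured against the previous generation's halo:
ada_vs_3090       = 1.20  # 4080 ~20% faster than a 3090
blackwell_vs_4090 = 0.90  # 5080 ~10% slower than a 4090 (rumoured)

swing = (ada_vs_3090 - blackwell_vs_4090) * 100
print(f"~{swing:.0f} percentage-point swing between generations")  # ~30
```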
 
I'd call both terrible (on the 5080's part): 10% faster than two-year-old hardware, and 10-15% slower than the card just one tier up from two years ago. Then consider that the 4080 is about 20% faster than the 3090; if the 5080 is *only* 10% slower than the 4090, the generational delta is VERY much significant. Instead of a decent % uplift we're getting a notable... downlift? My mind's just gone cooey... Drop?

Oh no I totally agree, the 5000 series really isn’t great at all IMHO. Just clarifying that if 10% is seen as “poor”, then 15% is hardly “significant” in comparison. Even if it is technically 50% better ;)
 
Oh no I totally agree, the 5000 series really isn’t great at all IMHO. Just clarifying that if 10% is seen as “poor”, then 15% is hardly “significant” in comparison.

In the context of what the % represents, I believe it is. Again, it's already two-year-old hardware... and compare the 3090 vs 4080 data point against 4090 vs 5080.

If you don't agree, that's cool. It's just what I said and how I feel :p
 