NVIDIA 5000 SERIES

I think it's more a resolution arms race that has got out of control. Because people want to render a 4k image at max settings at 60 or 120 fps, this is the route gaming has gone down, even though in reality it is always better for game developers to build better artwork, textures etc. rather than ensuring a game runs at 4k60/120. It started with checkerboard rendering on the PlayStation 4.
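
To make the checkerboard idea concrete, here's a minimal toy sketch (my illustration, not the PS4's actual implementation, which also uses motion vectors and ID buffers): each frame shades only half the pixels in an alternating checkerboard pattern and reuses the other half from the previous frame.

```python
import numpy as np

def checkerboard_frame(prev_frame, shade, parity):
    """Toy checkerboard rendering: shade only the pixels where
    (x + y) % 2 == parity this frame; reuse everything else from
    the previous reconstructed frame (no reprojection here)."""
    h, w = prev_frame.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (xx + yy) % 2 == parity         # half the pixels, checkerboard layout
    out = prev_frame.copy()                # carry over last frame's pixels
    out[mask] = shade(xx[mask], yy[mask])  # shade only the new half
    return out

# Shading cost per frame is roughly halved; parity alternates each frame.
shade = lambda x, y: np.ones_like(x, dtype=float)
f0 = np.zeros((4, 4))
f1 = checkerboard_frame(f0, shade, parity=0)  # 8 of 16 pixels freshly shaded
f2 = checkerboard_frame(f1, shade, parity=1)  # the remaining 8 shaded
assert f1.sum() == 8 and f2.sum() == 16
```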
4k has been around for years though.
 
4k has been around for years though.

So? A 4k game from 5 years ago looks way worse than a 1440p game created today. This is the problem: the focus on resolution is out of control.

The Indiana Jones game at 1440p is breathtaking at max settings. Console developers have already worked this out and don't sacrifice image quality for resolution.
 
Amidst all the fervour surrounding frame generation, I've been thinking...

Are we caught in a kind of "FPS arms race"? I'm reminded of the daft megapixel races that have affected cameras (and phones) in the past.
Everything these days seems to be about ridiculous FPS numbers. CES is full of new monitors each boasting higher refresh rates than the previous models. Now all of Nvidia's 50-series marketing is touting framerates in the 200+ range.

Now I get that there are competitive gamers for whom framerate is everything, but frankly I feel this is a relatively small demographic.
For the rest of us, such ludicrously high framerates really aren't necessary at all. No, I'm not one of those claiming "60 FPS is all you need"; I have been a proponent of high and variable refresh rates since the advent of G-Sync and Freesync many years ago.
I'm saying that high frame rates aren't the be-all and end-all of enjoying a game. In many cases, the variable refresh rates that G-Sync/Freesync facilitate actually negate the need for high refresh rates, as the monitor can adapt to lower FPS rather than the GPU needing to maintain FPS above the monitor's fixed refresh at all times.
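
To illustrate that adaptive-sync point with a toy calculation (assumed numbers: a GPU steadily delivering ~75 FPS against a fixed 60 Hz baseline; a simplification, not a measurement):

```python
# Toy frame-pacing comparison: fixed 60 Hz + vsync vs. variable refresh.
FRAME_MS = 1000 / 75    # GPU finishes a frame every ~13.3 ms (~75 FPS)
REFRESH_MS = 1000 / 60  # fixed monitor refreshes every ~16.7 ms

def fixed_refresh(n):
    """With vsync on a fixed-refresh monitor, each frame is shown at the
    next refresh tick after it's ready, so presentation quantises to
    16.7 ms steps and some frames collide on the same tick."""
    return [-(-(i * FRAME_MS) // REFRESH_MS) * REFRESH_MS for i in range(1, n + 1)]

def vrr(n):
    """With G-Sync/Freesync the monitor refreshes when the frame is ready."""
    return [i * FRAME_MS for i in range(1, n + 1)]

f, v = fixed_refresh(6), vrr(6)
print([round(b - a, 1) for a, b in zip(f, f[1:])])  # [16.7, 16.7, 16.7, 0.0, 16.7]
print([round(b - a, 1) for a, b in zip(v, v[1:])])  # [13.3, 13.3, 13.3, 13.3, 13.3]
# The 0.0 gap is a frame landing on an already-taken refresh tick (judder),
# which is exactly what VRR avoids by adapting the refresh to the FPS.
```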

As an example, take Cyberpunk, which has been used a lot in Nvidia's marketing, clearly because it's such a difficult game to run and therefore a great showcase for their new features. Throw DLSS Performance mode and 4x MFG at it and hey presto, you've got 4K at 200+ FPS. Sadly, this scenario means only 6.25% of what you're seeing was actually generated by the game engine.
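
For anyone wondering where that 6.25% comes from, it's just two fractions multiplied (assuming DLSS Performance renders at half resolution per axis and 4x MFG displays four frames per rendered one):

```python
# DLSS Performance: half the resolution on each axis -> 1/4 of the pixels.
pixel_fraction = 0.5 * 0.5  # 0.25
# 4x multi-frame generation: 1 rendered frame per 4 displayed frames.
frame_fraction = 1 / 4      # 0.25
print(f"{pixel_fraction * frame_fraction:.2%}")  # 6.25%
```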

Personally, I run Cyberpunk on my 3440x1440 screen with full path tracing and DLSS Quality mode and average in the 75 FPS region with my 4090.
Nvidia would have us believe that this isn't good enough and that I should be using even stronger DLSS with MFG to get my framerates up much higher.

Frankly, the game feels smooth enough for me, and G-Sync keeps the monitor synced to the fluctuating FPS from the GPU.
The trade-off from using a stronger form of DLSS just isn't worth it to me, let alone using frame generation.

What I'm saying is that 75-ish FPS is absolutely fine for me and I suspect it would be for many others, despite what Nvidia would have us believe.
I think it’s better for 5070 buyers; at least they can have smoother gameplay if they want, but at a latency cost. The 4090/5090 ‘enthusiasts’ will want raw performance.
 
The cooler design is a marvel; even just the heatsink on the FE 5090 is amazing and different to how anyone else makes heatsinks. Nvidia tried everything and spared no expense in designing this nearly 600 W card to fit in two slots and small PC cases.

After watching GN’s video of the 4 slot prototype and this one, it definitely seems lessons were learnt: 1) the flow-through design, and 2) multiple PCBs. I am really curious about how it performs. Does the flow-through design really negate the need for a 3.5 slot monster?
 
I think it’s better for 5070 buyers; at least they can have smoother gameplay if they want, but at a latency cost. The 4090/5090 ‘enthusiasts’ will want raw performance.
I agree, although the key to the success of frame gen is having a high base frame rate, so it may not be as useful on a low-end card like a 5050. More generally, I believe having more options is better when it comes to DLSS (ignoring the challenge that can create for developers). I don’t have an issue with DLSS4, just the marketing of it. It’s as if we buyers are all idiots who don’t know the difference between raw performance and image smoothness.
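
A rough sketch of why the base frame rate matters so much for frame gen (my simplification: input latency is approximated as one base-frame interval, ignoring the extra buffering real frame generation adds):

```python
# Frame generation multiplies displayed FPS, but responsiveness still
# tracks the rendered (base) frame rate. Simplified model, not measured.
def with_frame_gen(base_fps, multiplier=4):
    displayed_fps = base_fps * multiplier
    approx_latency_ms = 1000 / base_fps  # latency follows rendered frames
    return displayed_fps, approx_latency_ms

for base in (30, 60, 120):
    shown, lat = with_frame_gen(base)
    print(f"base {base:>3} fps -> shown {shown:>3} fps, ~{lat:.1f} ms latency")
# base  30 fps -> shown 120 fps, ~33.3 ms latency (looks smooth, feels laggy)
# base 120 fps -> shown 480 fps, ~8.3 ms latency
```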
 
After watching GN’s video of the 4 slot prototype and this one, it definitely seems lessons were learnt: 1) the flow-through design, and 2) multiple PCBs. I am really curious about how it performs. Does the flow-through design really negate the need for a 3.5 slot monster?

If you watched both like I did, then yes. It's a really good design. The GN temps were amazing.

Would like to know the fan noise, as the 3090 FE was horrific.

With nothing in the way, the noise should be far less, although in a cramped case it might be less impressive.
 
So? A 4k game from 5 years ago looks way worse than a 1440p game created today. This is the problem: the focus on resolution is out of control.

The Indiana Jones game at 1440p is breathtaking at max settings. Console developers have already worked this out and don't sacrifice image quality for resolution.
What is a 4K game?
 
If Nvidia's fake frames really take off, it's going to come back to burn them.

For example, let's say you have a 4k 240 Hz OLED monitor, the best on the market right now, so all you need is 240 fps. Nvidia comes out and says the 5070 or 5070 Ti will give you 240 fps. Great, right? But then what is the point of the 5080 and 5090?

So where we're headed is that, in future, Nvidia will sell us small, weak GPUs that create hundreds of fake frames, no one will buy high-end GPUs anymore, and the high-end silicon will be kept and sold to data centers.
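
As a quick sanity check on that scenario (my arithmetic, assuming 4x multi-frame generation with no overhead): a card only needs to render 60 fps to present 240 fps, which is exactly how a mid-range part could "fill" a 240 Hz panel on paper.

```python
# Base frame rate needed to saturate a 240 Hz monitor with 4x MFG,
# assuming no frame-generation overhead (an idealisation).
target_hz = 240
mfg_multiplier = 4
print(target_hz / mfg_multiplier)  # 60.0 rendered fps fills the panel
```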
 
If you watched both like I did, then yes. It's a really good design. The GN temps were amazing.



With nothing in the way, the noise should be far less, although in a cramped case it might be less impressive.
Sorry, to be clear, I mean how the 5090 FE performs given the 575 W power draw (although I doubt it hits that in gaming) and the two-slot design. That 4 slot prototype cooled very well, as expected!
 
FPS depends on the game. I play Control at a locked 85 fps, for example, as with other third-person games that don't require super fast movements, as this means I can game silently; if it were at, say, 120 fps, the GPU fans are a bit louder, the GPU load is a bit higher and the CPU is working a bit harder, all to sustain 120 fps in a game that shows no obvious visual advantage over 85 fps, or 100 fps, or whatever the figure is for each game.

One thing is clear though now that I'm on 240 Hz: 60 fps, even if locked, does not feel like the smooth 60 fps it once was on a lower refresh rate monitor; actually, it's been that way since going OLED. The only time this isn't the case is when using a controller to play those games. Otherwise, with a mouse and keyboard, 60 fps isn't a completely pleasant experience on a modern high-performance monitor.

The problem is gamers: too many casual PC gamers expect to be able to render games at high fps at 4k native, yet seemingly have no clue about the tech involved in getting everything aligned well enough for that experience to be a good one rather than a poor one. It only takes a casual glance at the Steam forums any time a UE5 game launches to see this :cry:
 
Sorry, to be clear, I mean how the 5090 FE performs given the 575 W power draw (although I doubt it hits that in gaming) and the two-slot design. That 4 slot prototype cooled very well, as expected!

Yes, I see. Unfortunately, the thinner two-slot cooler would be great if the generational advancement had meant a smaller die (like the 30 > 40 series jump); with less surface area and more watts, it's going to be a poor adjustment. However, the 80 and 70 Ti, if they use the same cooler design, should be a little better.
 
If Nvidia's fake frames really take off, it's going to come back to burn them.

For example, let's say you have a 4k 240 Hz OLED monitor, the best on the market right now, so all you need is 240 fps. Nvidia comes out and says the 5070 or 5070 Ti will give you 240 fps. Great, right? But then what is the point of the 5080 and 5090?

So where we're headed is that, in future, Nvidia will sell us small, weak GPUs that create hundreds of fake frames, no one will buy high-end GPUs anymore, and the high-end silicon will be kept and sold to data centers.
What if you have a 6k or 8k monitor?

4k is a pretty standard res in today's society; most TVs sold now, for example, are 4k.

4k has been around in our society for a very long time now.

It should pretty much be the baseline now.
 
Any game rendered at 4k?
'A 4k game from 5 years ago looks way worse than a 1440p game created today.'

I didn't know what you were getting at with the wording above, but I presume now that you just mean newer games look better than older games, even with the newer ones at a lower resolution.
 
What if you have a 6k or 8k monitor?

4k is a pretty standard res in today's society; most TVs sold now, for example, are 4k.

4k has been around in our society for a very long time now.

It should pretty much be the baseline now.

So sacrifice image quality just to hit 4k?

Or are you complaining that your inflated expectations are rising more quickly than technology can actually deliver?

If Nvidia (or AMD) had never introduced RT and PT, or made true GI possible, etc., you'd have your Timespy-quality games running at 4k 120 fps.

'A 4k game from 5 years ago looks way worse than a 1440p game created today.'

I didn't know what you were getting at with the wording above, but I presume now that you just mean newer games look better than older games, even with the newer ones at a lower resolution.

I think everyone knew what I meant.

You did as well.
 