NVIDIA 5000 SERIES

The most interesting part of that HUB video is the native rendering rate declining with frame gen enabled. I thought if you had, say, a 100fps native rate, then that would persist across all levels of frame gen, but it declines the more frame gen you use. So at 4x your base fps is nearer 80.

The benefits just aren't there for me yet.
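To put rough numbers on it, here's a quick sketch (the overhead percentages are just my guesses to illustrate the effect, not figures from the video):

```python
# Back-of-the-envelope: what frame gen overhead does to base and displayed fps.
# Overhead fractions are illustrative guesses, not measured values.

def effective_fps(native_fps, mfg_factor, overhead):
    """Return (base fps after FG overhead, displayed fps)."""
    base = native_fps * (1.0 - overhead)   # FG processing eats into the native rate
    return base, base * mfg_factor         # each rendered frame yields mfg_factor displayed frames

for factor, overhead in [(1, 0.00), (2, 0.10), (3, 0.15), (4, 0.20)]:
    base, shown = effective_fps(100, factor, overhead)
    print(f"{factor}x: base {base:.0f} fps -> displayed {shown:.0f} fps")

# 4x: base 80 fps -> displayed 320 fps, not the naive 400;
# and input response tracks the 80 fps base, not the 320 shown.
```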
 
Frame gen must create some overhead, which is lowering the native framerate.
 
It’s the same with Lossless Scaling, although with that you can offset the overhead by using either a second GPU or the CPU’s integrated graphics to offload the FG processing.
 
Definitely, so it's not even "free" motion smoothing.

You have to ask - when adding all this hardware to facilitate upscaling and frame gen, at what point does it make more sense to just use that hardware for better native rendering?

Upscaling and frame gen are great for aging cards that can't keep up natively - DLSS has been a godsend for my 3080 - but for newer cards, how are they going to age if DLSS is such a big factor in performance now?
 

I mean, I know we likely already realized this stuff, but... this guy's been fairly good at data consolidation/analysis, so I figured he's worth sharing.
Plus he got me thinking about where the 5080 will stack up in the lineup of 4080 / Super / 4090 / 5090... and it looks like it'll be significantly slower than the 4090, only giving us 5-10% (10% being the best case) over the 4080 Super. It's basically just confirming our earlier predictions that the 5090 will be the only card that actually has any real uplift :/

(Also, pretty much confirming our suspicions that the 5080 is really just a mislabelled 5070)

The 5080 having nearly the same core count as the 4080, and benchmarks showing it's only 5-8% faster, confirms this: the 5080 really is a 5070, and the 5070 is a 5060.
 
It's clear MFG marketing has been at best misleading and at worst blatant lies. I would hazard a guess and say most single-player gamers are fine in the 60-90fps range. Considering that's the baseline for MFG to get a manageable frametime, what's the point for most people?

I absolutely understand enthusiasts wanting to max their hz and for them, MFG may be great. I do, however, think it is kinda useless for the average person. Of course the kind of person on a PC hardware forum isn't average, but the marketing wasn't aimed just at us, it was aimed at everyone.
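To put rough numbers on that 60-90 baseline (illustrative, and assuming responsiveness tracks the base rate, not the displayed one):

```python
# Frametime and felt responsiveness at various MFG base rates.
for base_fps in (30, 60, 90):
    frametime_ms = 1000 / base_fps   # time per real rendered frame
    print(f"base {base_fps} fps = {frametime_ms:.1f} ms/frame; "
          f"4x MFG displays {base_fps * 4} fps but still feels like {base_fps} fps")
```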
 
Why does the left-hand image show 77 FPS and ~31ms if there's no frame gen? 77 FPS should be ~13ms.

Not to dunk on your post; I've seen the same YouTube video so I know where it's from, but it's curious to see those numbers.
If you have an NV card you can enable the overlay and have it show PC Latency, which in theory should give you an idea of the total latency from input to display output rather than how long it takes to render a frame. 30ms isn't too bad, but closer to 20 is ideal for me. Some games are better than others in this respect, and if you disable Reflex in Cyberpunk the latency is awful (which I'm sure NV use to inflate the improvements with DLSS on :p)
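A quick sanity check on those numbers (assuming the ~31ms figure is end-to-end PC Latency rather than render frametime):

```python
# Checking the 77 fps vs ~31 ms discrepancy from the video.
fps = 77
frametime_ms = 1000 / fps
print(f"render frametime at {fps} fps: {frametime_ms:.1f} ms")        # ~13.0 ms

shown_ms = 31
print(f"31 ms is ~{shown_ms / frametime_ms:.1f} frames of pipeline")  # ~2.4 frames
# ~2.4 frames' worth of delay looks like input-to-display PC Latency
# (input sampling + render queue + scanout), not per-frame render time.
```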
 
Seems self-defeating maxing out the frame rate but not getting the main benefit of that, i.e. better input latency. IMO, high-refresh monitors, or at least the 4K ones, are kinda pointless until GPUs can match that performance using native raster/AI upscaling. It can be done at 1080p & 1440p... but not 4K.
 
Yes? He said exactly what I said too - devs will always go for the easier/cheaper option for them, that's no surprise. However, I would suggest looking into the actual GPU requirements for that game, and we see it will be playable on a wide variety of GPUs, which is good. The more interesting thing I see there is the vRAM requirements - it feels like some of these GPUs could manage higher details if not for the vRAM. In addition, RT-capable cards can be listed in the requirements for a variety of reasons - they have a bunch of other capabilities built in beyond actual RT (mesh shaders etc.) that could be the source of the requirements.
 
Returning to my role as the ‘MFG defender’, I guess :o :p

Addressing several points raised…

Yes, Nvidia told a big porky pie with the 5070 = 4090 thing. Criticism of that sales pitch is fair. The raw raster uplifts of the 50 series aren’t as good as the 40 series’, as we all know.

Beyond that, everyone has said they’ve been impressed by the MFG tech whilst noting that it would be incorrect to suggest that it’s perfect. At least a couple have outright said that they ‘prefer it’.

The only professional reviewer I’ve heard say he doesn’t intend to use it was Linus, remarking that he’s an absolute demon when it comes to pixel peeping because it’s his professional job. It was quite funny hearing about how he could instantly spot the differences in a blind upscaling test, which upset Nvidia back in the day :p

My personal minimum for ‘good FPS’ is 90. If I’m hitting that at 4k, I’m happy. My new monitor is going to be able to push 240fps at 4k. So regardless of whether it’s ‘needed’, I’m personally pretty excited to try this out and experience it, which is completely impossible without FG in demanding titles. The major thing that everyone was critical of in the first place was the latency issue, which appears to have been practically solved.
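Quick math on that, using my own numbers (90fps floor, 240Hz panel):

```python
# How much frame gen is needed to drive a 240 Hz 4k panel from a 90 fps base.
target_hz, base_fps = 240, 90
print(f"multiplier needed: {target_hz / base_fps:.2f}x")  # ~2.67x, so 3x or 4x MFG
print(f"base needed at 4x MFG: {target_hz / 4:.0f} fps")  # 60 fps saturates 240 Hz
```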

Really, I think Nvidia have shot themselves in the foot on this, because it was THEM (not the consumer) that started comparing it to ‘real frames’.

At any specific settings, the real choices for everyone are:

(1) raw raster = frames
(2) raw raster + upscaling = more frames (1.2x) with some artefacts
(3) raw raster + upscaling + FG = many more frames (1.8 - 3.8x) with some more artefacts.

There is no choice of “1.8 - 3.8x” frames without FG, so I think treating those frame rates at 4k as if they were a ‘real choice’ for anyone is misguided. Put another way, despite Nvidia’s flub, IMO people shouldn’t dismiss MFG on the basis of ‘imaginary options’. I think it has some value as a choice.
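To make that concrete with rough numbers (reading the multipliers above as relative to raw raster, which is my assumption):

```python
# Illustrative comparison of the three options from a 60 fps raw raster baseline.
raster = 60
upscaled = raster * 1.2                        # option (2)
fg_low, fg_high = raster * 1.8, raster * 3.8   # option (3) range
print(f"(1) raster only:       {raster} fps")
print(f"(2) + upscaling:       {upscaled:.0f} fps")
print(f"(3) + upscaling + FG:  {fg_low:.0f} - {fg_high:.0f} fps")
# 108-228 fps at 4k simply isn't on the menu without FG in demanding titles.
```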

But if you would rather take lower res and lower frame rates to avoid the artefacts (like Linus), that’s fine. It’s a case of picking the personal compromise that bothers you the least (on a per-game basis, I guess).

I’ll do my best to be honest with myself and avoid ‘wow what I bought is the bestest!’-syndrome if I think it’s **** or distracting.
 
Very nice! Perfect for MFG, haha. I need that but at 4K. Maybe in a few years :D
Ah, I have a few games that will run it native, albeit older titles, but I still play and enjoy them. Black Ops 6 with DLSS was at 330-350fps map dependent on my 14900KS / 4090... I had a 240Hz previously which would screen tear above 240, hence the upgrade. I'm not too fussed about MFG but will give it a try on single-player games for sure :D
 
DLSS Perf will give more than 20% without FG.
 
I would love to try such a monitor. I think I could get not too far off that in Insurgency: Sandstorm (my fav FPS). I get a near-locked 240 in that, Blops 6 and Delta Force currently, so no complaints. No 5090 needed just yet ;)
 
Yes, I was just simplifying.

Personally I only use DLSS ‘quality’ because I think there is usually a significant drop in image quality below that.
I can see why some would go with that. For me, 2160p DLSS Perf can look okay and gets the frame boost needed. :) I've also seen a few people claim that the new DLSS transformer model at Perf mode is as good as the old CNN at Quality mode. Not yet convinced, but I have only tried it in CP2077 and FF7 Rebirth.
 