
NVIDIA 5000 SERIES

Reading some comments elsewhere on the fake frames / latency stuff got me thinking… dangerous, I know :o

A common criticism I’m seeing is ‘lol, it’s double the frames and they’re all fake!’, but this comparison assumes that you’re OK with, and would use, ‘fake frames’ in the first place.

Totally ignoring comparisons with other cards, the true comparison of using / not using FG tech is actually ‘four’ times the frames, at the cost of latency.

If the 5090 is pulling 200fps with full path tracing (4xMFG), then turning off the MFG will plonk you back down to somewhere in the region of 50-60 FPS.

The true choice is:
- 50-60 FPS with slightly better latency.
- 200 FPS with slightly worse latency.

That is a pretty massive difference. The Cyberpunk vid shared the other day showed it at 240fps maxed out, with less than 55ms latency, which isn’t non-existent but it’s pretty impressive. As an aside, the artefacts mostly seem to come from DLSS and not MFG, unless I’m wildly mistaken.

I would initially think that the sweet spot lies somewhere in the middle, but apparently there is only a very small latency increase between 2x and 4x… at which point you might as well go with 4x if your monitor allows it.

As always, it remains to be seen how good I’ll find the FG stuff (I’ll only know by trying), but I think I’d be willing to accept some latency penalty for going from ~55 to a whopping 200 fps, in some games.
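
To put that trade-off into rough numbers, here’s a quick back-of-the-envelope sketch (the fps figures are the assumed ones from this post, not measurements):

```python
# Rough sketch of the MFG trade-off described above.
# Assumed numbers: ~50 fps rendered with path tracing, 4x multi frame generation.

def frame_time_ms(fps: float) -> float:
    """How long each displayed frame is on screen, in milliseconds."""
    return 1000.0 / fps

base_fps = 50        # rendered frame rate with MFG off (assumed)
mfg_factor = 4       # 4x MFG: 1 rendered frame + 3 generated frames

displayed_fps = base_fps * mfg_factor   # ~200 fps on screen
print(f"MFG off: {base_fps} fps, {frame_time_ms(base_fps):.1f} ms per frame")
print(f"4x MFG:  {displayed_fps} fps, {frame_time_ms(displayed_fps):.1f} ms per frame")

# Motion looks roughly 4x smoother, but inputs are still only sampled at the
# rendered rate (plus a bit of extra delay from generating and holding frames),
# which is where the latency cost comes from.
```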
 
I'm not technical enough, but my assumption (which could be totally wrong) is that DLSS can cause the artifacting, and the frame generation then makes it worse because it's using that as the base for the frames it creates.

We'll have to see how it is; we have limited knowledge so far. To me Cyberpunk generally looks good from what we've seen in the videos, but when watching PC Centric, he said in the live chat that he wasn't very impressed with Black Myth: Wukong and that the artifacting was bad with the hair.
 

On your first paragraph I think you’re probably right, although the impact seems to have been reduced by DLSS4 which all the cards will benefit from.

On the second one, yup, I saw that myself - he made a similar comment about the hair in Cyberpunk. The question is whether it’s bad enough that you need to turn FG and DLSS off entirely. It could have been the case that it was using DLSS performance mode.

I only tend to bother with DLSS on quality mode, if I do use it. Balanced if I’m struggling… never performance mode though.
 
Might it be because you are at 240Hz? Won't a 4090 struggle at 4K at such a high refresh rate? What I'm getting at is that your monitor is refreshing 240 times per second but your GPU is only outputting, say, 200 frames per second, so you have a 40fps difference. Try setting your monitor to 120Hz and see if you feel the latency is better with FG on. Irrespective of AI technologies being on or off, if you want smooth gameplay your FPS needs to match, or be very close to, your monitor's refresh rate, otherwise you are going to experience problems.
That shouldn't be a factor, unless I'm misunderstanding or being very dense (which is quite possible). Even going from 100 to 160fps using DLSS 3 frame gen in Stalker 2 feels awful. The extra latency, no matter the base fps or the resulting fps, feels terrible in a first-person game. I get very smooth framerates without DLSS 3 in any game I play (150-240fps, no frame gen, with G-Sync), so I don't have the extra latency making me feel like I'm playing at 30fps on a PS5.
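
For what it's worth, here's a tiny sketch of the refresh/fps mismatch being described above (240Hz and 200fps are just the example numbers from that post):

```python
# Illustration of a 240Hz monitor being fed ~200 fps (example numbers only).

refresh_hz = 240
gpu_fps = 200

refresh_interval_ms = 1000 / refresh_hz   # ~4.17 ms between monitor refreshes
frame_interval_ms = 1000 / gpu_fps        # ~5.00 ms between new GPU frames

print(f"Refresh every {refresh_interval_ms:.2f} ms, new frame every {frame_interval_ms:.2f} ms")

# With a fixed refresh the monitor occasionally repeats the previous frame,
# which reads as judder; with VRR (G-Sync / FreeSync) the refresh follows the
# frame rate instead, which is why capping fps inside the VRR range is the
# usual advice.
```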
I know, I've had my 4090 pulling over 600W for stability testing. A power-limited 5090 at 575W?!
At least my gas bill will go down in winter going forward.
 
OK, so pulling some screenshots from the PC Centric vid: this is Cyberpunk maxed out with DLSS4 on quality mode, 4xMFG. 48ms of latency.

[screenshot]


This is it with DLSS on but MFG off. 35ms of latency.
[screenshot]


So a 13ms (thirteen thousandths of a second) penalty for 140 fake frames.

I mean… it’s not ‘bad’ is it?
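
As a quick sanity check of those two screenshots (the latency figures are the ones shown on screen; the fps values are assumed to be roughly 200 with MFG and 60 without, to match the ‘140 fake frames’ remark):

```python
# Sanity check of the figures quoted from the two screenshots above.
# Latencies are from the video overlay; the fps values are assumptions.

latency_mfg_ms = 48     # DLSS4 quality + 4x MFG
latency_no_fg_ms = 35   # DLSS4 quality, MFG off

fps_mfg = 200           # assumed
fps_no_fg = 60          # assumed

penalty_ms = latency_mfg_ms - latency_no_fg_ms          # 13 ms of extra input lag
extra_frames = fps_mfg - fps_no_fg                      # ~140 generated frames per second
frame_time_gain = 1000 / fps_no_fg - 1000 / fps_mfg     # ~11.7 ms shorter frame times

print(f"{penalty_ms} ms more latency buys ~{extra_frames} extra frames per second")
print(f"Each displayed frame is on screen ~{frame_time_gain:.1f} ms less")
```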
 
Problem is that it's not 'slightly better' latency, and we can already do even 10x MFG using a third-party app on any GPU in most games (Lossless Scaling, which uses AI for its frame gen too). We know how this works and feels, and it's really not great if your input is only 60fps. The same has been said about FG in general: it's good for making already well-running games more fluid on high-refresh screens, not for improving performance.
 
This is pretty funny… here’s the latency with no DLSS and no frame gen… for all those anti-fake frames elite gamerz that want the best graphics without the bells and whistles.

[screenshot]


:o - so this actually has the worst latency.

Obviously, I’m joking a bit in my spiel at the top of this post, as nobody that’s chasing super-low latency is going to be playing games with mega ray tracing on etc. You’d optimise your settings for better latency.
 
The biggest problem with fake frames is the very concept. The whole point of turning up resolution, effects, level of detail, shadow quality, lighting quality, and so on, is so that each rendered and displayed frame looks better.

Why would I then insert garbage frames in between them? How much of that extra fidelity is kept in each artificial frame? By definition it's lower quality than the rendered frame.

And since when did latency become OK to compromise on? Around the time I bought my first 144Hz monitor 11-ish years ago, with FreeSync, the gaming community wouldn't shut up about latency: 'set the frame limiter to your max refresh minus 3 frames for the best latency', 'it's all about frametime', and frame pacing was in there as well (SLI was still around).

Maybe I'm getting old, but I don't get it anymore. It seems more important to play at 'Max' settings with entirely inappropriate real-time lighting, at fake resolutions with fake frames...

(I'm not a bitter old man, I swear :cry:)
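
That old frame limiter rule of thumb, written out (the 'minus 3' is just the community convention being quoted above, not anything official):

```python
# The frame-cap rule of thumb mentioned above, purely for illustration.

def vrr_frame_cap(refresh_hz: int, margin: int = 3) -> int:
    """Suggested fps cap to stay inside the variable refresh rate window."""
    return refresh_hz - margin

for hz in (120, 144, 240):
    print(f"{hz} Hz monitor -> cap at {vrr_frame_cap(hz)} fps")
```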
 

It's not rocket science. When you turn 'DLSS' off, you have a higher render resolution, therefore a lower (real) framerate, therefore higher latency. It's not an apples-to-apples comparison of frame gen.

NVIDIA are geniuses, they've raised the complexity to the point that people are completely unable to figure it out for themselves :rolleyes:
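
For anyone wondering what that render resolution difference actually looks like, here's a rough sketch at 4K. The scale factors are the commonly cited per-axis ratios for the DLSS presets, not something taken from the video, so treat them as assumptions:

```python
# Rough illustration of why turning DLSS off costs framerate (and therefore
# latency): the GPU has to shade far more pixels per frame.
# Per-axis scale factors are the commonly cited preset values (assumed).

output_w, output_h = 3840, 2160   # 4K output

presets = {
    "Off (native)": 1.0,
    "Quality":      0.667,
    "Balanced":     0.58,
    "Performance":  0.50,
}

for name, scale in presets.items():
    w, h = int(output_w * scale), int(output_h * scale)
    print(f"{name:13s} renders {w}x{h} ({w * h / 1e6:.1f} MP) before upscaling")
```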
 

Uh, from what you quoted:

Obviously, I’m joking a bit in my spiel at the top of this post, as nobody that’s chasing super-low latency is going to be playing games with mega ray tracing on etc. You’d optimise your settings for better latency.

Maybe you had better get those ‘bitter old man’ eyes to specsavers :p
 

Now turn off the RT so it runs at 100fps and check latency
 

Yeah, exactly. Nobody in their right mind would play at sub-30fps.

Is playing without ray tracing how you’d opt to play Cyberpunk / Indy Jones? I’ve never played either myself, but I’ll probably give them a whirl when I have a new card and monitor.
 