
NVIDIA 5000 SERIES

Yeah, I did laugh at this video earlier, 28fps vs 200fps. I think it's just a nice feature that makes it look impressive on paper, but if it bothers you, turn it off. I'm using DLSS for the Indy game and it seems fine to me. I disabled it in RDR2 because it may be using an older version, as it has smearing that I couldn't deal with.

If you haven't come from CRT gaming you won't know what input lag is ;)


 
At this time in 14 days, the DPD man will be driving away, having taken a photo of my front door without even attempting to deliver my new 5090, and I'll be punching the walls and screaming blue murder (happens every time!)
Do what I do: redirect it to the local post office. I've done this for years and it's never been a problem. The minor hassle of picking it up beats dealing with lazy drivers.
 
I'm not technical enough, but my assumption (which could be totally wrong) is that DLSS can cause the artifacting, and frame generation then makes it worse because it uses that as the base for the frames it creates.

We'll have to see how it is; we have limited knowledge. To me, Cyberpunk generally looks good from what we've seen in the videos, but when watching PC Centric, in the live chat he said he wasn't very impressed with Black Myth: Wukong and that the artifacting was bad with the hair.
DLSS upscaling is really good, especially when running Quality and, to a lesser extent, Balanced. You render the game at a lower resolution and it then upscales to a higher res.

Frame gen uses AI to add frames in between two normally rendered frames; the AI tries to predict what happens in between and drops in extra frames. The time gap between the two normally rendered frames, which is where your latency and normal FPS come from, will actually increase slightly because inserting the extra frames has a slight cost. The AI frames are pretty bad and have a lot of artifacts, but if your normal frame rate is high enough this will mask the artifacts to some extent.
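Rough back-of-envelope version of that, if it helps (the 60fps base and the ~3ms frame gen cost are just numbers I'm assuming to illustrate the point, not anything official):

```python
# Toy model of 2x frame generation: every real frame gets one AI frame
# inserted before it, and the insertion itself costs a little time.
base_fps = 60                  # assumed real rendered frame rate
fg_overhead_ms = 3.0           # assumed per-frame cost of frame generation

real_ms = 1000 / base_fps                  # 16.7ms between real frames
real_ms_with_fg = real_ms + fg_overhead_ms # real frames now arrive slightly slower
presented_fps = 2 * (1000 / real_ms_with_fg)

print(f"Real frames: {1000 / real_ms_with_fg:.0f}fps, presented: {presented_fps:.0f}fps")
print(f"But input is still only sampled every ~{real_ms_with_fg:.1f}ms")
```

So the counter says ~100fps, but the game still responds at roughly the old 60fps cadence (a touch worse, even).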
 
DLSS upscaling is really good, especially when running Quality and, to a lesser extent, Balanced. You render the game at a lower resolution and it then upscales to a higher res.

Frame gen uses AI to add frames in between two normally rendered frames; the AI tries to predict what happens in between and drops in extra frames. The time gap between the two normally rendered frames, which is where your latency and normal FPS come from, will actually increase slightly because inserting the extra frames has a slight cost. The AI frames are pretty bad and have a lot of artifacts, but if your normal frame rate is high enough this will mask the artifacts to some extent.
Yeah, I understand how DLSS and frame gen work separately. I just wasn't sure whether frame gen generates frames from the native image or the DLSS image when both are turned on. I assume it generates frames from the DLSS image, which, if it has any artifacting/errors, would mean it gets even worse with FG, on top of the artifacting FG causes by itself anyway.
 
DLSS upscaling is really good, especially when running Quality and, to a lesser extent, Balanced. You render the game at a lower resolution and it then upscales to a higher res.
That in itself doesn't say much unless you also state the resolution. :) DLSS Quality at 1440p is just 960p, which has plenty of issues in modern games that DLSS can't fix (TAA smearing details on effects like foliage, textures etc., heavy noise and lacking detail in RT, etc.). Anything below Quality and you quickly end up below 720p, which is rather tragic IMHO. :) 4K is a different story.
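For anyone wondering where the 960p figure comes from, quick maths with the standard DLSS scale factors (games can tweak these, so treat it as a rough guide):

```python
# Internal render resolution for the usual DLSS presets.
scales = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Perf": 1 / 3}
outputs = {"1440p": (2560, 1440), "4K": (3840, 2160)}

for label, (w, h) in outputs.items():
    for mode, s in scales.items():
        print(f"{label} {mode:>11}: {round(w * s)}x{round(h * s)}")
```

Which lines up with the point above: Quality at 1440p is 960p internally, Balanced is ~835p and Performance is already at 720p, whereas at 4K even Performance still renders a full 1080p.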
Frame gen uses AI to add frames in between two normally rendered frames; the AI tries to predict what happens in between and drops in extra frames. The time gap between the two normally rendered frames, which is where your latency and normal FPS come from, will actually increase slightly because inserting the extra frames has a slight cost. The AI frames are pretty bad and have a lot of artifacts, but if your normal frame rate is high enough this will mask the artifacts to some extent.
This also depends on resolution - if DLSS is upscaling from 960p (or lower!), already semi-smearing and with lower detail, and you then add FG on top, it just amplifies these issues and the resulting quality is... quite meh, latency aside. At 4K it's a different story, and there the latency hit is the biggest problem.
 
Yeah, I understand how DLSS and frame gen work separately. I just wasn't sure whether frame gen generates frames from the native image or the DLSS image when both are turned on. I assume it generates frames from the DLSS image, which, if it has any artifacting/errors, would mean it gets even worse with FG, on top of the artifacting FG causes by itself anyway.
It's from the DLSS image, as frame gen is applied after DLSS has already done its job, to the final image. Then again, Nvidia said they will give all DLSS-enabled cards a new algorithm that should much improve the quality - we'll see how that works.
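Roughly this order, in other words (a minimal sketch; the function names are made up to show where frame gen sits in the chain, not a real API):

```python
# One iteration of the render/present loop with DLSS upscaling + frame gen.
def present_frame(prev_upscaled, render_internal, dlss_upscale, fg_interpolate, display):
    raw = render_internal()        # game renders at the lower internal resolution
    upscaled = dlss_upscale(raw)   # DLSS super resolution brings it to output res
    if prev_upscaled is not None:
        # Frame gen interpolates between two DLSS outputs, so any upscaling
        # artifacts are already baked into the frames it works from.
        display(fg_interpolate(prev_upscaled, upscaled))
    display(upscaled)              # then the real (upscaled) frame
    return upscaled                # kept as prev_upscaled for the next iteration
```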
 

One thing to take into account, though, is that the 5090 will be running DLSS 4.0, which improves FPS by 15% or so even for 2x FG, vs the 4090 running the older DLSS 3.0.
I was thinking more about the DLSS package in general: being able to be upgraded and work all the way back to the 2xxx series, at least for the main functions it had at the time. The game coverage, ease of upgrading and constant improvement represent (to me) that fine wine.

I know that some SKUs are a mix of fine wine and sour milk when it comes to VRAM, but overall... pretty darn good from a company that could get away with far worse consumer practices than it has, thanks to its market share/position - stuff like changing the socket/motherboard with every gen or two of processors, with marginal improvements.
 
The biggest problem with fake frames is the very concept. The whole point of turning up resolution, effects, level of detail, shadow quality, lighting quality, and so on, is so that each rendered and displayed frame looks better.

Why would I then insert garbage frames in between them? How much of that extra fidelity is kept in each artificial frame? By definition it's lower quality than the rendered frame.

And since when did latency become OK to compromise on? Around the time I bought my first 144Hz monitor 11-ish years ago, with FreeSync, the gaming community wouldn't shut up about latency. 'Set the frame limiter to your max refresh minus 3 frames for the best latency', 'it's all about frametime', and frame pacing was in there as well (SLI was still around).
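(For the youngsters: that old rule of thumb was just about staying inside the VRR window so V-Sync never kicks in - something like this, with the 144Hz figure being my example:)

```python
# The old advice: cap FPS a few frames below your refresh rate so you stay
# in the FreeSync/G-Sync range and avoid the V-Sync latency penalty.
refresh_hz = 144              # example monitor refresh rate
fps_cap = refresh_hz - 3      # "max refresh minus 3"
print(f"Cap at {fps_cap}fps -> ~{1000 / fps_cap:.1f}ms per frame")
```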

Maybe I'm getting old, but I don't get it anymore. It seems more important to play at 'Max' settings with entirely inappropriate real-time lighting, at fake resolutions with fake frames...

(I'm not a bitter old man, I swear :cry:)

I'm interested in a comparison of latency at true 200fps vs frame generated 200fps. Most reviewers fail to highlight the fact that there will be a pretty big difference. While you may see 200 frames, the game will be similar in feel to the base frame rate. A true 200fps will feel much more responsive.
 
It's from the DLSS image, as frame gen is applied after DLSS has already done its job, to the final image. Then again, Nvidia said they will give all DLSS-enabled cards a new algorithm that should much improve the quality - we'll see how that works.
Thanks for confirming, I assumed it was that way but wasn't 100% sure :)
 
I'm interested in a comparison of latency at true 200fps vs frame generated 200fps. Most reviewers fail to highlight the fact that there will be a pretty big difference. While you may see 200 frames, the game will be similar in feel to the base frame rate. A true 200fps will feel much more responsive.
Who cares! 200fps is 200fps. Bigger bar better. WOO!
 
I'm interested in a comparison of latency at true 200fps vs frame generated 200fps. Most reviewers fail to highlight the fact that there will be a pretty big difference. While you may see 200 frames, the game will be similar in feel to the base frame rate. A true 200fps will feel much more responsive.
60fps is 16.7ms per frame, 144fps is 6.9ms, and 240fps is 4.2ms. FG to 240fps with a 60fps base will be around 20ms, so almost a 5x increase in latency compared to running native 240Hz.
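Same numbers worked through, if anyone wants to check them (the ~3ms frame gen overhead is an assumption on my part):

```python
# Frametime as a rough proxy for input latency.
def frametime_ms(fps):
    return 1000 / fps

for fps in (60, 144, 240):
    print(f"Native {fps}fps: {frametime_ms(fps):.1f}ms")

base_fps = 60
fg_overhead_ms = 3.0  # assumed cost of generating and inserting the AI frames
fg_ms = frametime_ms(base_fps) + fg_overhead_ms
print(f"FG to 240fps from {base_fps}fps: ~{fg_ms:.0f}ms, "
      f"about {fg_ms / frametime_ms(240):.1f}x native 240fps")
```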
 
60fps is 16.7ms per frame, 144fps is 6.9ms, and 240fps is 4.2ms. FG to 240fps with a 60fps base will be around 20ms, so almost a 5x increase in latency compared to running native 240Hz.

I got those figures too for the non-FG frame rates but 20ms for the 240fps FG seems generous when some of the pics posted earlier in this thread show much higher latency.


 
60fps is 16.7ms per frame, 144fps is 6.9ms, and 240fps is 4.2ms. FG to 240fps with a 60fps base will be around 20ms, so almost a 5x increase in latency compared to running native 240Hz.

It would matter to me if I played FPS games online or something. For single-player games, many of which I play with a controller, it's not so much of an issue.

I come from 8-bit cartridge console games where everything was crap. 50ms latency really is not that bad IMO.
 
It would matter to me if I played FPS games online or something. For single-player games, many of which I play with a controller, it's not so much of an issue.

I come from 8-bit cartridge console games where everything was crap. 50ms latency really is not that bad IMO.
Play a first person shooter on your AW3225QF with 50ms latency on M&K and tell me, with a straight face, that it feels 'not that bad'. Go on.
 
It would matter to me if I played FPS games online or something. For single-player games, many of which I play with a controller, it's not so much of an issue.

I come from 8-bit cartridge console games where everything was crap. 50ms latency really is not that bad IMO.
Probably helped that TVs back in the 50s also had a ton of delay & latency too.

50ms latency is getting to the point where you might as well be using a PlayStation Portal...
 
God, my head hurts. I don't watch the numbers, I adjust the game till it feels right. For example, CP2077 I played on pretty high settings at 4K and had a great time. No lag.
 
I'm interested in a comparison of latency at true 200fps vs frame generated 200fps. Most reviewers fail to highlight the fact that there will be a pretty big difference. While you may see 200 frames, the game will be similar in feel to the base frame rate. A true 200fps will feel much more responsive.

I do think this is an important thing and interesting as you say (“in the name of SCIENCE”) but it’s worth keeping in mind the practicalities: if you can already get a true 200fps without DLSS and frame gen anyway, then you aren’t going to be turning on either of those technologies. At least at 4k.

No game with demanding ray tracing can get anywhere near 200fps at 4k, so we’d be talking an oldish game.

In one sense, it’s really interesting and would be good to know. In another way, it’s meaningless… because there is no possible way of running the likes of Cyberpunk and Indy Jones maxed at 200fps with ‘true frames’.

If 200+fps with low latency is what's really desirable, then out of all the choices available today you'd be better off with a low-res potato monitor and old / undemanding games… for which you wouldn't even need a 40-series card.

… I suppose this post is a really long-winded way of saying that there are better ways of getting amazing latency than having the latest and greatest graphics card - it's a compromise either way.
 