Reading some comments elsewhere on the fake frames / latency stuff got me thinking… dangerous, I know
A common criticism I’m seeing is “lol it’s double the frames and they are all fake!”, but this comparison assumes that you are OK with and use ‘fake frames’ in the first place.
Totally ignoring comparisons with other cards, the true comparison of using / not using FG tech is actually ‘four’ times the frames, at the cost of latency.
If the 5090 is pulling 200fps with full path tracing (4xMFG), then turning off the MFG will plonk you back down to somewhere in the region of 50-60 FPS.
The true choice is:
- 50-60 FPS with slightly better latency.
- 200 FPS with slightly worse latency.
That is a pretty massive difference. The Cyberpunk vid shared the other day showed it at 240fps maxed out, with less than 55ms latency, which isn’t non-existent but is pretty impressive. As an aside, the artefacts mostly seem to come from DLSS and not MFG, unless I’m wildly mistaken.
I would initially think that the sweet spot lies somewhere in the middle, but apparently there is only a very small latency increase between 2x and 4x… at which point you might as well go with 4x if your monitor allows it.
As always, it remains to be seen how good the FG stuff actually is (I’ll only know by trying it), but I think I’d be willing to accept some latency penalty to go from 55 to a whopping 200 fps, in some games.
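To put some rough numbers on that trade-off, here's a quick back-of-the-envelope sketch I knocked up. The 10ms frame-gen overhead and the simple frame multiplication are just my own assumptions for illustration, not measured figures from Nvidia or any review:

```python
# Rough back-of-the-envelope sketch of the MFG trade-off.
# All numbers are assumptions for illustration, not measured values.

def mfg_estimate(base_fps: float, multiplier: int, fg_overhead_ms: float = 10.0):
    """Estimate displayed fps and a crude latency figure with frame generation.

    base_fps       : frames the GPU actually renders per second
    multiplier     : MFG factor (1 = off, 2/3/4 = frames shown per rendered frame)
    fg_overhead_ms : assumed extra delay from holding/interpolating frames
    """
    displayed_fps = base_fps * multiplier
    render_time_ms = 1000.0 / base_fps
    # Very rough: latency still follows the *rendered* frame time,
    # plus an assumed fixed cost whenever frame generation is on.
    latency_ms = render_time_ms + (fg_overhead_ms if multiplier > 1 else 0.0)
    return displayed_fps, latency_ms

for mult in (1, 2, 4):
    fps, lat = mfg_estimate(base_fps=55, multiplier=mult)
    print(f"{mult}x: ~{fps:.0f} fps displayed, ~{lat:.0f} ms render-side latency")
```

Which, if those assumptions are anywhere near right, matches the claim above: barely any latency difference between 2x and 4x, but a big jump in displayed frames.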
Yes but users shouldn't have been trying to cram them in cases where they didn't fit properly.
I'm not technical enough but my assumption which could be totally wrong is that DLSS can cause the artifacting but the frame generation then makes it worse because it is using that as its base for the frames it creates.
We'll have to see how it is; we have limited knowledge. To me Cyberpunk generally looks good from what we've seen in the videos, but watching PC Centric, in the live chat he said he wasn't very impressed with Black Myth Wukong and that the artifacting was bad with the hair.
It is different in a way that this time Nvidia claim the 5090D is not impacted in gaming at all in comparison to the 5090.
The 4090D was 6% slower in fps than the 4090, so not sure if that's the same case.
Clearly you are too slow with the barbed wire.
At this time in 14 days, the DPD man will be driving away, having taken a photo of my front door without even attempting to deliver my new 5090, and I'll be punching the walls and screaming blue murder (happens every time!)
I read it as "it's bare minimum, not best experience".
Yes, that's true. I just think what they recommend is on the low side of usability, ngl.
That shouldn't be a factor, unless I'm misunderstanding or very dense (which is quite possible). Even going from 100-160fps using DLSS 3 in Stalker 2 feels awful. The extra latency, no matter the base fps or the resulting fps, feels terrible in a first-person game. I get very smooth fps without DLSS 3 in any game I play, 150-240 fps with no frame gen and G-Sync, as I don't have the extra latency making me feel like I'm playing at 30fps on a PS5.
Might it be because you are at 240Hz? Won't a 4090 struggle at 4K at such a high refresh rate? What I'm getting at is that your monitor is refreshing 240 times per second but your GPU is only outputting, say, 200 frames per second, so you have a 40fps difference. Try setting your monitor to 120Hz and see if the latency feels better with FG on. Irrespective of AI technologies being on or off, if you want smooth gameplay your FPS needs to match or be very close to your monitor's refresh rate, otherwise you are going to experience problems. There's a rough sketch of what I mean below.
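Example figures only, and it ignores VRR/G-Sync (which exists precisely to smooth this out), but this is the mismatch I'm describing:

```python
# Quick illustration of the refresh-rate vs frame-rate mismatch.
# Example numbers only; swap in your own monitor/GPU figures.

refresh_hz = 240          # monitor refresh rate
output_fps = 200          # frames the GPU actually delivers

refresh_interval_ms = 1000 / refresh_hz   # ~4.2 ms between refreshes
frame_interval_ms = 1000 / output_fps     # ~5.0 ms between new frames

# Without VRR, roughly this many refresh cycles per second repeat an old frame.
repeated_per_second = refresh_hz - output_fps

print(f"Refresh every {refresh_interval_ms:.1f} ms, new frame every {frame_interval_ms:.1f} ms")
print(f"~{repeated_per_second} refreshes/sec show no new frame at a fixed {refresh_hz} Hz")
```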
At least my gas bill will go down during the winters moving forwards.
I know, I've had my 4090 pulling over 600W for stability testing. Power limited 5090 at 575W??!
Yes, but the future is in the future and largely unknown. What we have now isn't that.
Eventually we'll get to a point where frame generation has minimal to no latency, probably courtesy of asynchronous reprojection or some other workarounds.
Problem is that it's not slightly better latency, and we can already do even 10x MFG using a 3rd party app on any GPU and in most games (Lossless Scaling - it uses AI for FG too). We know how this works and feels, and it's really not great if you only have a 60fps input. The same has been said about FG - it's good for making already well-running games more fluid on high-refresh screens, not for improving performance.
This is pretty funny… here’s the latency with no DLSS and no frame gen… for all those anti-fake frames elite gamerz that want the best graphics without the bells and whistles.
- so this actually has the worst latency.
Obviously, I’m joking a bit in my spiel at the top of this post as nobody that’s wanting mega latency is going to be playing games with mega ray tracing on etc. You’d optimise your settings for better latency.
It's not rocket science. When you turn DLSS off, you have a higher render resolution, therefore a lower (real) framerate, therefore higher latency. It's not an apples-to-apples comparison of frame gen.
NVIDIA are geniuses; they've raised the complexity to the point that people are completely unable to figure it out for themselves.
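Very crudely, the chain looks something like this. The linear scaling of frame time with rendered pixel count, and the ~30fps native path-traced baseline, are just illustrative assumptions; real scaling is messier than this:

```python
# Crude sketch of why turning DLSS off raises latency in this comparison:
# more pixels rendered -> longer frame time -> higher baseline latency.
# The linear pixel-count scaling is a simplification, not a real model.

def est_frame_time_ms(native_frame_time_ms: float, render_scale: float) -> float:
    """Scale frame time by the fraction of native pixels actually rendered."""
    return native_frame_time_ms * render_scale

native_ft = 1000 / 30  # assume ~30 fps at native 4K with heavy ray tracing
for scale, label in [(1.0, "DLSS off (native 4K)"),
                     (0.25, "DLSS performance (1080p internal)")]:
    ft = est_frame_time_ms(native_ft, scale)
    print(f"{label}: ~{ft:.0f} ms per frame -> ~{1000/ft:.0f} fps, latency floor ~{ft:.0f} ms+")
```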
Uh, from what you quoted: "Obviously, I'm joking a bit in my spiel at the top of this post…"
Maybe you had better get those 'bitter old man' eyes to Specsavers
Now turn off the RT so it runs at 100fps and check latency