The RT Related Games, Benchmarks, Software, Etc Thread.

Yeah, having read a fair amount of "user" reviews on several forums and subreddits now, FG/DLSS 3 sounds extremely good, especially with Nvidia tackling the latency issues and sorting out the vsync/gsync problems. Looking to be another very nice-to-have feature.
 
Yep. It will only get better with next gen cards too. Just like DLSS did.
 
Being lazy here - have Nvidia explained anywhere how they tackle the latency issues? It seems impossible to me unless the true frame rate is very high to begin with; it'll be interesting to see how they're approaching it.
 
My approach is to just ignore it; most of the problems it has will be solved by the time the next-gen cards are out, which is when I'll be upgrading next :D

Just look at how far DLSS came in a couple of years.
 
It may well be; I'm just interested in how. DLSS I understand, as it's essentially projecting a result from a known starting point. Latency is much more difficult to overcome (I imagine), as it would need to predict user input.
 
It is impossible - decoupling the framerate from the engine means that there's no way to achieve a halving of input latency by doubling the fps. What Nvidia *can* do is further optimize Reflex to squeeze every last latency advantage they can out of the system - these won't be huge gains though - a 30fps game using DLSS 3.0 to reach 60fps will still have 30fps-ish input latency. This'll be more evident once the lower end 40-series start appearing.

[Image: Spider-Man Remastered latency comparison - native 4K vs DLSS 3]

Reflex is mandatory with frame generation to mitigate (somewhat) the latency hit. You can see here that at native 4k with Reflex off we're getting 39ms. Turning on DLSS 2.x (still with Reflex off) has the engine create more (real) frames and the latency drops proportionately. Turning on frame generation (and Reflex) basically gets you the same input latency as native 4k - worse than just DLSS 2.x (this is perhaps where Nvidia can make some small gains).
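
For a rough illustration of the arithmetic (the numbers below are invented for the example - the 20 ms pipeline overhead and the "one extra real frame of buffering" model are assumptions, not Nvidia's documented pipeline): the displayed frame rate doubles, but input latency still tracks the real frame rate.

```python
# Rough, illustrative model only - not Nvidia's actual pipeline.
# Assumptions: a fixed pipeline overhead, input latency scaling with the real
# frame time, and frame generation buffering roughly one extra real frame
# (it needs the next real frame before it can interpolate). Reflex claws some
# of this back, but it can't change the real-frame cadence itself.

def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

def latency_without_fg(real_fps: float, pipeline_ms: float = 20.0) -> float:
    # Fixed overhead plus one real frame of render time.
    return pipeline_ms + frame_time_ms(real_fps)

def latency_with_fg(real_fps: float, pipeline_ms: float = 20.0) -> float:
    # Displayed fps doubles, but latency still tracks the real frame rate,
    # plus roughly one extra real frame of buffering for the interpolation.
    return pipeline_ms + 2 * frame_time_ms(real_fps)

for real_fps in (30, 60, 120):
    displayed = 2 * real_fps  # what the fps counter shows with frame generation
    print(f"{real_fps:>3} real fps -> {displayed:>3} shown: "
          f"~{latency_without_fg(real_fps):.0f} ms without FG, "
          f"~{latency_with_fg(real_fps):.0f} ms with FG")
```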
 
So it will look, but not feel smoother? That is my understanding as well, but it'll be interesting to see how it does improve as it's a great idea in theory if latency can be overcome.

From a cynical viewpoint, latency doesn't matter when creating a sales video showing fps numbers though, does it...
 
Exactly - but if you're already hitting over 60fps (likely in pretty much every game at every resolution with a 4080/90), how much you notice the input latency is likely to be extremely subjective (heck, some people don't notice shader compilation stutter, and that's far more intrusive). As to what Nvidia can do - I think the best we can expect is to get the latency down to DLSS 2.x levels (which would definitely be an improvement, especially for lower-end hardware).

Re: marketing - yeah - the '2-4x the performance of a 3090 Ti' for the 4090 is very shady marketing - it's kinda true, but only currently for one game (Cyberpunk) and only with DLSS frame generation - the actual average improvement is 1.6-1.7x. Still a great generational uplift though.
 
I'm curious: what would GPU usage be with DLSS Quality and Frame Generation enabled if you set a cap of 60 fps in a game where you can easily get double that?
Since frame generation is a lot more efficient than rendering actual frames, I'd imagine GPU usage and power draw would be greatly reduced.
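
A back-of-the-envelope sketch of that idea (purely illustrative - the assumption that a generated frame costs ~10% of a rendered one is a guess, as is the 120 fps uncapped figure):

```python
# Purely illustrative - the 10% cost of a generated frame and the 120 fps
# uncapped figure below are guesses, not measurements.

def relative_gpu_load(fps_cap: float, uncapped_real_fps: float,
                      fg_enabled: bool, gen_cost: float = 0.1) -> float:
    """Fraction of full render throughput needed to hold the cap."""
    if fg_enabled:
        real = fps_cap / 2        # half the displayed frames are generated
        generated = fps_cap / 2
    else:
        real = fps_cap
        generated = 0.0
    work = real + generated * gen_cost   # in real-frame equivalents per second
    return work / uncapped_real_fps

# Example: a game that would run ~120 real fps uncapped, capped to 60.
print(f"60 fps cap, FG off: ~{relative_gpu_load(60, 120, False):.0%} of full load")
print(f"60 fps cap, FG on:  ~{relative_gpu_load(60, 120, True):.0%} of full load")
```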
 
Just read that apparently The Witcher 3's upgrade patch will include Frame Generation. Should be interesting - I'm most interested in trying Frame Generation with a 60 fps cap, DLSS and an undervolt; the results could be quite good.
 
As expected, delayed till Dec 8th :p But it looks great - seems they have changed the lighting, colouring etc. to fit more with the original style of the series, given the complaints about the previous footage.


Without FG, I'm expecting maybe 50-60 fps with DLSS Quality.
 
Most of these Nvidia RT/PT mods we've seen have had texture updates; that's because the tools are smart enough to know whether a texture is wood or metal, matte or gloss, etc., so if you want more visible light reflections off walls, the wall textures need to be of a more reflective material.
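
As a loose illustration of that (a toy PBR-style calculation, not any particular tool's implementation - the function and the numbers are invented for the example):

```python
# Toy PBR-style illustration only - not any particular tool's implementation.
# The point: roughness/metalness stored in a material's textures decide how
# mirror-like its reflections look, so the wall texture itself has to change.

def specular_visibility(roughness: float, metalness: float) -> float:
    """Crude 0..1 score for how visible a mirror-like reflection would be."""
    base_reflectance = 0.04 + 0.96 * metalness  # dielectrics ~4%, metals far more
    sharpness = (1.0 - roughness) ** 2          # rough surfaces blur reflections away
    return base_reflectance * sharpness

print(f"matte wooden wall:   {specular_visibility(roughness=0.9, metalness=0.0):.2f}")
print(f"glossy painted wall: {specular_visibility(roughness=0.2, metalness=0.0):.2f}")
print(f"polished metal wall: {specular_visibility(roughness=0.1, metalness=1.0):.2f}")
```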

Something I still want to see just for lols is a game or mod where we have multiple mirrors opposite one another
 