Yes, you will need a 4090/4080 to play it using fake frames
If I can't visually tell the difference I don't care
Yeah, having read a fair amount of "user" reviews on several forums and subreddits now, FG/DLSS 3 sounds extremely good, especially with Nvidia tackling the latency issues and sorting out the vsync/gsync issues. Looks to be another very nice-to-have feature.
Being lazy here - have Nvidia explained anywhere how they tackle the latency issues? It seems impossible to me unless the true frames being generated are very high to begin with; it'll be interesting to see how they're approaching it.
It is impossible - decoupling the framerate from the engine means there's no way to halve input latency by doubling the fps. What Nvidia *can* do is further optimise Reflex to squeeze every last bit of latency advantage they can out of the system, but those won't be huge gains: a 30fps game using DLSS 3.0 to reach 60fps will still have 30fps-ish input latency. This'll be more evident once the lower-end 40-series cards start appearing.
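To make the arithmetic above concrete, here's a back-of-envelope sketch (my own simplified model, not Nvidia's actual pipeline): generated frames raise the displayed fps, but input is only sampled on real engine frames, so latency tracks the base framerate.

```python
# Toy model of frame generation: displayed fps doubles, but input is only
# sampled when a real frame is rendered, so input latency still tracks the
# base (engine) framerate. Numbers are illustrative, not measured.

def frame_time_ms(fps):
    """Time per frame in milliseconds."""
    return 1000.0 / fps

def with_frame_generation(base_fps):
    """Displayed fps doubles; input latency stays tied to the base fps."""
    displayed_fps = base_fps * 2
    input_latency_ms = frame_time_ms(base_fps)  # still one real frame's worth
    return displayed_fps, input_latency_ms

for base in (30, 60, 120):
    shown, latency = with_frame_generation(base)
    print(f"{base:>3} fps engine -> {shown:>3} fps displayed, "
          f"~{latency:.1f} ms input latency")
```

So 30fps boosted to 60fps still carries ~33ms of input latency, which is why starting from a high base framerate matters.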
Exactly - but if you're already hitting over 60fps (likely in pretty much every game at every resolution with a 4080/90), how much you notice the input latency is likely to be extremely subjective (heck, some people don't notice shader compilation stutter, and that's far more intrusive). As for what Nvidia can do, I think the best we can expect is for latency to come down to DLSS 2.x levels, which would definitely be an improvement, especially on lower-end hardware.

So it will look, but not feel, smoother? That's my understanding as well, but it'll be interesting to see how it does improve; it's a great idea in theory if the latency can be overcome.
From a cynical viewpoint, latency doesn't matter when creating a sales video showing fps numbers though, does it...
Since frame generation is a lot more efficient than rendering actual frames, I'd imagine GPU usage and power draw would be greatly reduced.

I'm curious: what would GPU usage be with DLSS Quality and Frame Generation enabled if you set a cap of 60fps in a game where you can easily get double that?
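As a rough sketch of why the cap should cut load (my assumption about how a cap interacts with FG, not confirmed behaviour): if FG inserts one generated frame per real frame, a 60fps cap means only ~30 frames per second are fully rendered.

```python
# Rough sketch: with frame generation alternating real/generated frames,
# a 60 fps cap means only half the displayed frames are fully rendered.
# Assumes exactly one generated frame per real frame, and ignores the
# (nonzero) cost of generating the interpolated frames.

def real_frames_needed(fps_cap, fg_enabled):
    """Real frames the GPU must render per second to hit the cap."""
    return fps_cap // 2 if fg_enabled else fps_cap

def rough_gpu_load(fps_cap, uncapped_fps, fg_enabled):
    """Fraction of the GPU's uncapped rendering throughput actually used."""
    return real_frames_needed(fps_cap, fg_enabled) / uncapped_fps

# e.g. a game where the card manages 120 fps uncapped, capped to 60:
print(rough_gpu_load(60, 120, fg_enabled=False))  # 0.5 -> ~50% load
print(rough_gpu_load(60, 120, fg_enabled=True))   # 0.25 -> ~25% load
```

Real-world usage won't scale this cleanly, since frame generation itself costs GPU time, but the direction of the saving should hold.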
Without FG, I'm expecting maybe 50-60fps with DLSS Quality.
At 4K on a 3080?
Possibly... probably lower 50s/high 40s on the whole.
Depends largely on whether this will have 40xx-specific RT optimisation, i.e. an SER-based RT workload.
I'll be playing at 3440x1440, so I'm hoping for decent performance with DLSS Balanced.
I'm curious if it's just an RT lighting upgrade or if actual textures and materials have been given a polish.

Materials and textures also had to be updated, I think; it was necessary to get RT working properly with light bouncing.