Watched about 5 minutes. That guy sounds sensible to me. A rare thing online these days with so many people taking sides.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
More catching on to how great DLDSR is
Worth a click or two to enable wouldn't you say?
Yeah it's crazy how much better it looks than native/dlss quality. Such an underrated feature by nvidia.
Now you're seeing what I've been saying all along about 1440p being potato and 4K being so much better. DLDSR is what made this monitor's archaic resolution bearable for me; I use it whenever I can in games.
I wouldn't say 1440p is potato; it still looks great. I actually think it looks better than my 4K 55" OLED, and with DLDSR it's miles better.
Also, 4K was absolutely naff back when it first came about, as no games had high-resolution assets/textures; all it was good for was anti-aliasing. It's only in the last 3 years that it's become more noticeable, especially because of TAA adoption: higher resolutions work and look better with TAA, which is why DLDSR and DLSS Performance work so well.
Last I tested in CP77, while the FPS is similar, latency is higher while downsampling, so it's not a clear-cut win. At least it wasn't in that case.
Can't say I have noticed it tbh, will check again at some point.
It took a while to make this video. Many frames had to be manually separated, and my editing program crashed multiple times; splitting the original raw video of over 11 minutes turned it into about 40,000 individual frames. The intent of this video was to see how a playback of only generated frames would look, and now that we have FSR 3, how it will stack up against DLSS Frame Gen.
Yesterday, "fake frames" referred to classical black-box TV interpolation. It's funny how the mainstream calls them "fake frames";
But, truth be told, GPUs are already metaphorically "faking" photorealistic scenes by drawing polygons/triangles, textures, and shaders. Reprojection-based workflows are just another method of "faking" frames, much like the MPEG/H.26x video standards "fake it" via I-frames, B-frames, and P-frames.
That's why video goes "kablooey" and turns into garbage with artifacts during a bit of data loss: if a mere 1 bit gets corrupted in a predicted/interpolated frame of an MPEGx/H.26x video stream, the damage persists until the next full non-predicted/interpolated frame comes in (1-2 seconds later).
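The I-frame/P-frame point can be shown with a toy delta codec. This is a purely illustrative sketch (not any real codec's API): keyframes carry full values, predicted frames carry only deltas, so one flipped bit in a delta garbles every frame until the next keyframe resynchronises the decoder.

```python
# Toy delta-coded "video": frames are ints standing in for images.
# "I" entries store the full frame; "P" entries store only the
# difference from the previous frame (like MPEG predicted frames).
KEYFRAME_INTERVAL = 4

def encode(frames):
    stream = []
    for i, f in enumerate(frames):
        if i % KEYFRAME_INTERVAL == 0:
            stream.append(("I", f))                 # full keyframe
        else:
            stream.append(("P", f - frames[i - 1]))  # delta only
    return stream

def decode(stream):
    out, prev = [], 0
    for kind, v in stream:
        prev = v if kind == "I" else prev + v
        out.append(prev)
    return out

frames = list(range(10, 20))
stream = encode(frames)
assert decode(stream) == frames          # lossless round trip

# Flip one bit in a single predicted frame's delta.
stream[1] = ("P", stream[1][1] ^ 1)
garbled = decode(stream)
# garbled[1:4] are now wrong, but garbled[4:] recover, because the
# keyframe at index 4 resets the decoder's state.
```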
Over the long term, 3D rendering is transitioning to a multitiered workflow too, just as digital video did over 30 years ago out of the sheer necessity of bandwidth budgets. Now our sheer necessity is the Moore's Law slowdown: we can no longer get much extra performance via the traditional "faking-it-via-polygons" methods, so we need a shortcut around Moore's Law.
The litmus test is going lagless and artifactless, much like the various interpolated-frame subtypes already built into your streaming habits: Netflix, Disney, Blu-ray, E-Cinema, and other current video compression standards that use prediction in their compression systems.
Just as compressors have knowledge of the original material, modern GPU reprojection can gain knowledge via z-buffers and between-frame input reads, and "fake it" perceptually flawlessly, unlike 1993's artifacty MPEG-1. Even the reprojection-based double-image artifacts disappear!
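The z-buffer point can be sketched with a toy 1D reprojection. This is an illustrative simplification (a pinhole-style parallax model, not any engine's actual reprojection shader): given a pixel's depth and the camera's sideways movement, you can shift the previous frame's pixel to its new screen position instead of re-rendering it, and nearby points shift more than distant ones.

```python
# Toy depth-aware reprojection of a single screen coordinate.
# All names and the focal-length setup are illustrative assumptions.

def reproject_x(x, depth, camera_dx, focal=1.0):
    """New screen x of a point after the camera moves camera_dx
    sideways: parallax scales with focal/depth, so near geometry
    moves across the screen more than far geometry."""
    return x - camera_dx * focal / depth

# Two points at the same screen position but different depths:
near = reproject_x(0.5, depth=1.0, camera_dx=0.1)   # shifts a lot
far = reproject_x(0.5, depth=10.0, camera_dx=0.1)   # barely shifts
```

This per-depth shift is exactly the information a black-box TV interpolator never has, and why GPU-side reprojection can avoid the classic double-image artifact.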
TL;DR: Faking frames isn’t bad anymore if you remove the “black box” factor, and make it perceptually lagless and lossless relative to other methods of “faking frames” like drawing triangles and textures
Amazing how one method of doubling (or more) your performance, multi-GPU, is still left out. Good read, this:
Frame Generation Essentials: Interpolation, Extrapolation, and Reprojection
More frame rate in gaming through various frame generation technologies. There is also talk about lagless and artifactless frame generation technologies.
blurbusters.com
NVIDIA DLSS 3 Frame Generation was announced late last year along with the GeForce RTX 40-series "Ada" graphics cards, to some skepticism, since we'd seen interpolation techniques before, and knew them to be a lazy way to increase frame-rates, especially in a dynamic use-case like gaming, with drastic changes to the scene, which can cause ghosting. NVIDIA debuted its RTX 40-series last year with only its most premium RTX 4090 and RTX 4080 SKUs, which didn't really need Frame Generation given their product category. A lot has changed over the course of 2023, and we've seen DLSS 3 become increasingly relevant, especially in some of the lower-priced GPU SKUs, such as the RTX 4060 Ti, or even the RTX 4070. NVIDIA spent the year trying to polish the technology along with game developers, to ensure the most obvious holdout of this technology, ghosting, is reduced, as is its whole-system latency impact, with Reflex being engaged by default. We've had a chance to enjoy DLSS 3 Frame Generation with a plethora of games over 2023, including Cyberpunk 2077, Alan Wake 2, Hogwarts Legacy, Marvel's Spider-Man Remastered, Naraka: Bladepoint and Hitman 3.
Wasn't sure about the Nvidia FG at first (tried it on CP w/PT, fps too low and it felt sluggish), but since trying it on both Starfield and The Finals (combined with great DLSS on both) I'm beginning to be a really big fan!
Truth
You're just justifying your 40xx purchase!!!!!