
DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now

Don't talk BS, it's open source! If it was being blocked, it would probably take the devs/community seconds to find and fix/reverse it!

I know you really hate AMD but come on that's low even for you.

Or was it a ploy to get a rise out of the usual people on here?

BS? Have you got anything to prove that AMD aren't blocking the use of other upscaling tech in the official integrations so far? As so far, it's only the mod where FSR 3/FG works with other upscaling tech..... Sorry that I don't fall for the "white knights" PR image that AMD have you falling for.

Starting with that rubbish of "hating AMD" again? Even though I've probably owned more AMD hardware than most people here :rolleyes: If you can't stick to the discussion point without resorting to that rubbish, don't bother responding in the first place.

PS. I'll leave this here:


But I suppose that's a lie and a coincidence, right.....
 
So you mean when sponsored?

Not blocked via the software stack of FSR?

Yes, what I said here:

Well, the fact that you can only enable FSR 3/frame gen when FSR 2 is enabled, and that the frame gen option is greyed out/disabled when other upscaling solutions are active, proves it is being blocked. And we now have a mod showing that FSR 3 frame gen works with all upscaling tech.

So AMD, from their end of the "sponsorship", are purposely locking FSR 3 (aka AMD's frame gen) to be used only with their upscaling solution. Either that, or the devs are the ones blocking it, but I'm not sure what they have to gain from that when they are now including all upscaling options....

Side Q: when have I done that before?

Not really aimed at you specifically, more just that when pointing out something AMD "potentially" do wrong/bad, that rhetoric gets thrown about.
 
Still failing to see how they can lock it to FSR 2 when the code is available to all.

Kind of makes your statement moot too, when I've just read on here that mods enable AMD's frame gen with DLSS, does it not?

I think you're maybe missing the point I'm putting across here, as you have just stated what I said about the modded FSR 3 FG hack method, so let's change the approach here:

Why, in AMD-sponsored games with FSR 3/frame gen, can we not use other non-AMD upscaling tech with FSR 3/FG?

EDIT:

And regarding that mod, it is technically injecting AMD's FSR 3/FG into Nvidia's FG toggle option. AFAIK, no one has got the AMD FSR 3/FG toggle/option to work with other non-AMD upscaling tech, but it's now proven that the different technologies can work together through hacky methods.
 
I see what you are meaning; however, if they were blocking it, it wouldn't work even via injecting when a non-AMD card is being used, would it?

That's not the point I'm making :p It clearly does work, as I have said; however, in sponsored titles, it's not even an option because "reasons".
 
Has AMD actually stated this then?

Always thought FSR and DLSS were in some games together already, or have they removed the frame gen bit if so?

I think again either you are missing something or I'm not getting your point :p

FSR and DLSS upscaling coexist in most games now. In AMD's sponsored games with FSR 3, i.e. Forspoken, Immortals and now Avatar, you get both FSR 3/frame gen and DLSS; however, FSR 3/frame gen can only be enabled if you are using FSR upscaling. FSR 3/FG cannot be enabled when using DLSS.

The mod going around now basically injects FSR 3/FG via Nvidia's FG method, so in effect you are enabling Nvidia's frame generation, but it is really using AMD's FSR 3/FG method.

No one has gotten DLSS to work with the FSR 3/FG setting/toggle in the sponsored AMD games yet....
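To make the claim concrete, here's a purely hypothetical Python sketch of the kind of menu gating being described. This is not actual game or FSR 3 SDK code and every name in it is made up; it just shows a toggle that only becomes selectable while FSR upscaling is active, which matches the greyed-out behaviour in the shipped AMD-sponsored titles. The mod effectively sidesteps a check like this.

```python
# Hypothetical illustration only; not actual game or FSR 3 SDK code.
from enum import Enum, auto

class Upscaler(Enum):
    NATIVE = auto()
    FSR = auto()
    DLSS = auto()
    XESS = auto()

def frame_gen_selectable(active: Upscaler) -> bool:
    """The FSR 3 frame gen toggle greys out unless FSR upscaling is active."""
    return active is Upscaler.FSR

print(frame_gen_selectable(Upscaler.FSR))   # True  -> toggle available
print(frame_gen_selectable(Upscaler.DLSS))  # False -> toggle greyed out
```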
 
I see, though how do we know it's not a stipulation of DLSS usage? As in, only Nvidia's FG is allowed to work with DLSS, and thus it blocks third parties from enabling DLSS if another vendor's frame gen is in use.

Some of that makes no sense (I blame the poor naming convention of this tech!). If that were the case, then it would mean that Intel would also be blocking the enablement of AMD's frame gen......
 
Now you are seeing what I have been saying all along about 1440p being potato and 4K being so much better. DLDSR is what made this monitor's archaic resolution bearable for me. I use it whenever I can in games :)

I wouldn't say 1440p is potato, it still looks great; I actually think it looks better than my 4K 55", and with DLDSR, miles better than the 4K 55" OLED :)

Also, 4K was absolutely naff back when it first came about, as no games had high-resolution assets/textures; all it was good for was anti-aliasing. It's only in the last 3 years that it's become more noticeable, especially because of TAA adoption, where high res works/looks better with the TAA method, which is why DLDSR and DLSS Performance work so well.
 
Last I tested in Cyberpunk 2077, while the FPS is similar, latency is higher while downsampling, so it's not a clear-cut win. At least it wasn't in that case.

Can't say I have noticed it tbh, will check again at some point.

For those who don't want to use DLSSTweaks, you can switch to DLSS 2.5.1 to get rid of the ghosting in Avatar.
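For anyone curious what the swap actually involves under the hood, here's a rough sketch; tools like DLSS Swapper just automate this. The paths are placeholders, and I'm assuming the game keeps its nvngx_dlss.dll in the install folder, which varies by title.

```python
# Minimal sketch of a manual DLSS DLL swap; back up the original first.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")  # placeholder install path
new_dll = Path(r"C:\Downloads\nvngx_dlss_2.5.1\nvngx_dlss.dll")  # placeholder

target = game_dir / "nvngx_dlss.dll"              # location varies by title
shutil.copy2(target, target.with_suffix(".bak"))  # keep the original safe
shutil.copy2(new_dll, target)
```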
 

It took a while to make this video; many frames had to be manually separated, and my editing program crashed multiple times splitting the originals. The raw video of over 11 minutes turned into about 40,000 individual frames. The intent of this video was to see how a playback of only generated frames would look, and, now that we have FSR 3, how it will stack up against DLSS Frame Gen.
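As a rough sketch of that workflow: assuming frame gen interpolates 1:1 (every other frame is generated, as with DLSS 3 and FSR 3 frame gen), something like this OpenCV script would dump only the odd-numbered frames. The filename and the even/odd phase of the generated frames are assumptions.

```python
# Sketch: split a capture into frames and keep only every other frame,
# on the assumption that generated frames alternate 1:1 with rendered ones.
import cv2

cap = cv2.VideoCapture("capture.mp4")  # placeholder filename
index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if index % 2 == 1:  # keep only the odd (assumed interpolated) frames
        cv2.imwrite(f"generated_{index:06d}.png", frame)
    index += 1
cap.release()
```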
 
Good read this:



Yesterday, “fake frames” was meant to refer to classical black-box TV interpolation. It is funny how the mainstream calls them “fake frames”;
But, truth be told, GPUs are already metaphorically “faking” photorealistic scenes by drawing polygons/triangles, textures, and shaders. Reprojection-based workflows are just another method of “faking” frames, much like an MPEG/H.26x video standard “fakes it” via I-frames, B-frames and P-frames.
That’s why, during a bit of data loss, video goes “kablooey” and turns into garbage with artifacts if a mere 1 bit gets corrupted in a predicted/interpolated frame of an MPEG-x/H.26x video stream, until the next full non-predicted/interpolated frame comes in (1-2 seconds later).
Over the long term, 3D rendering is transitioning to a multi-tiered workflow too (just like digital video did over 30 years ago, out of the sheer necessity of bandwidth budgets). Now our sheer necessity is the Moore’s Law slowdown bottleneck: we are unable to get much extra performance via traditional “faking-it-via-polygons” methods, so we need a shortcut around Moore’s Law.
The litmus test is going lagless and artifactless, much like the various interpolated frame subtypes built into your streaming habits: Netflix, Disney, Blu-ray, E-Cinema, and other current video standards that use prediction systems in their compression.
Just as compressors have knowledge of the original material, modern GPU reprojection can gain knowledge via z-buffers and between-frame input reads, and “fake it” perceptually flawlessly, unlike year 1993’s artifacty MPEG-1. Even the reprojection-based double-image artifacts disappear too!
TL;DR: Faking frames isn’t bad anymore if you remove the “black box” factor and make it perceptually lagless and lossless, relative to other methods of “faking frames” like drawing triangles and textures.
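For anyone wondering what the classical "black box" interpolation being contrasted here looks like in its crudest form, here's a minimal NumPy sketch that blends two frames into a fake middle frame. Real DLSS 3/FSR 3 frame gen uses motion vectors/optical flow instead, which is how it avoids the double-image ghosting a naive blend like this produces on anything moving.

```python
# Crudest possible "fake frame": a straight 50/50 blend of two rendered frames.
# Moving edges will ghost; motion-vector-aware frame gen avoids this.
import numpy as np

def naive_midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Average two uint8 RGB frames of identical shape."""
    mid = (frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2
    return mid.astype(np.uint8)
```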
 
TPU voted DLSS 3/FG as the best gaming technology of the year:


NVIDIA DLSS 3 Frame Generation was announced late last year along with the GeForce RTX 40-series "Ada" graphics cards, to some skepticism, since we'd seen interpolation techniques before, and knew it to be a lazy way to increase frame-rates, especially in a dynamic use-case like gaming, with drastic changes to the scene, which can cause ghosting. NVIDIA debuted its RTX 40-series last year with only its most premium RTX 4090 and RTX 4080 SKUs, which didn't really need Frame Generation given their product category. A lot has changed over the course of 2023, and we've seen DLSS 3 become increasingly relevant, especially in some of the lower-priced GPU SKUs, such as the RTX 4060 Ti, or even the RTX 4070. NVIDIA spent the year trying to polish the technology along with game developers, to ensure the most obvious holdout of this technology—ghosting—is reduced, as is its whole-system latency impact, with Reflex being engaged by default. We've had a chance to enjoy DLSS 3 Frame Generation with a plethora of games over 2023, including Cyberpunk 2077, Alan Wake 2, Hogwarts Legacy, Marvel's Spider-Man Remastered, Naraka: Bladepoint and Hitman 3.
 
Imagine paying for 120Hz and only using it at 60Hz :cry:

Sure, when FSR 3 launched, many said they didn't need VRR in the end :cry:

Do hope we see more solutions like this though, as it will keep AMD and Nvidia on their toes; more choice is only a good thing.
 
Yes, but at the same time it's totally subjective, and to get a final warning for such a small throwaway remark was a tad OTT imo. I do appreciate that this sub-forum had its share of idiots in the past, so they had to do something.

Just a shame it seems to have come at the cost of having a fraction of the gfx forum activity we used to; you only have to see the complete lack of posts during a gfx card launch. This place used to be *thriving*! Like a morgue now in comparison :(

I mean, it's gotten that bad that even Gibbo rarely has a new launch post now, even for the 4070S, which is forecast to be a BIG seller!! That was unheard of even a couple of years ago...

I'd say it's more because this gen has been complete and utter **** from both sides, with no real competition.
 

For those who do use DLSS at 1080p, I recommend you update to later versions via DLSS Swapper and use DLSSTweaks to change the preset. Unfortunately, most newer games using 3.5.0+ use preset D, which generally has ghosting issues; C is the best most of the time....

Preset A: Intended for Performance/Balanced/Quality modes. An older variant best suited to combat ghosting for elements with missing inputs, such as motion vectors.

Preset B: Intended for Ultra Performance mode. Similar to Preset A.

Preset C (most preferred): Intended for Performance/Balanced/Quality modes. Generally favors current frame information; well suited for fast-paced game content.

Preset D (2nd preferred): Default preset for Performance/Balanced/Quality modes; generally favors image stability.

Preset E: A development model that is not currently used.

Preset F: Default preset for Ultra Performance and DLAA modes.

Either that, or if you have the grunt, use DLDSR, get a higher-res monitor, or, if possible, turn off TAA entirely :p

Also, always make sure you turn off post-processing effects; as evidenced so many times (and which HUB fails to highlight in these videos), they can impact the output quality and cause artifacting, especially regarding DOF.
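If it helps, this is roughly how you'd script the preset override. I'm assuming from memory that DLSSTweaks reads a dlsstweaks.ini next to the game exe with a [DLSSPresets] section; check the ini bundled with your DLSSTweaks version, as the exact section/key names may differ.

```python
# Sketch of forcing DLSS preset C via DLSSTweaks' ini; section/key names
# are from memory and may differ between DLSSTweaks versions.
from configparser import ConfigParser

ini = ConfigParser()
ini.optionxform = str  # preserve key casing when writing back
ini.read("dlsstweaks.ini")  # assumed to sit next to the game exe
if not ini.has_section("DLSSPresets"):
    ini.add_section("DLSSPresets")
ini.set("DLSSPresets", "Global", "C")  # force preset C for all quality modes
with open("dlsstweaks.ini", "w") as f:
    ini.write(f)
```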
 