DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now

Alan Wake 2 has had some Mega Geometry update or something - I don't own it, so hopefully we will see some pics in the high res thread soon :D
Also been told Red Dead 2 supports DLSS 4.
 
Edit* From your phone video you are moving the camera around really fast, which goes far beyond "normal" gameplay camera movement
What? I am not moving the camera AT ALL in my video. I am just looking down (and not even fully down, just a bit more towards the ground) and walking forward, back, left and right. :) And I'm looking down only to show it better with the foliage - in normal gameplay it's visible as well, just walking around with NO camera movement at all. That said, I've checked a bunch of other games and IJ is the only one in which I can see this behaviour so far, so I suspect it's engine related in some way.
 
If he's moving very fast, you stay very still :)) It is amusing to me how every flaw is turned up to 11 for RT/PT/FG/RR, while raster flaws are basically forgotten or forgiven, or even older games are praised for how "good" they were / are.
The noticed flaw comes from a direct comparison of the 2 DLSS models (new Transformer vs old CNN) - the new one has it visible, the old one doesn't, though aside from that the new one generally shows considerably better quality. It's just not perfect, and NVIDIA themselves said there's more work to be done to correct all the issues with it. Till that's done, I wouldn't call it better than native. :) This is also why they whitelist only a bunch of games so far to do the swap, not all of them, and why a newer DLL has already shown up with further fixes. You could take off your bias glasses now and then and look at things a bit closer, but you do you. :)

And yes, rendering quality (accounting for the age of the technology) was often better in older games, as gamers were putting pressure on developers to do a better job. Not so much these days, so there are a lot of regressions in modern games introduced by lazy devs, masked by bad TAA (and yes, in raster even more than with RT). However, I am not getting into that discussion with you again, as enough has been shown and said by plenty of people already, with real examples.
 
Bear in mind IJ is the only game using this engine and will remain so, plus it has not had the ray reconstruction update yet, which is a core function of DLSS 4 path tracing.
 
Is frame gen really unusable for multiplayer games?
There are plenty of non-twitchy MP games, and in those FG shouldn't be causing any issues. That said, I noticed earlier today that in Diablo 4 cutscenes FG makes sound effects go totally out of sync with the image, though dialogue is still ok. Very weird. Turned FG off and everything works fine again. This is with the NVIDIA App override to the newest version, not sure how it works without it. On the other hand, D4 is in general a weird game visually - RT changes close to nothing visually (but kills FPS), FG introduces a stutter every 10+ seconds that's not there without it (plus the cutscene issues), etc. Blizz should really do better than mess it up like that.
 
Bear in mind IJ is the only game using this engine and will remain so, plus it has not had the ray reconstruction update yet, which is a core function of DLSS 4 path tracing.
This might well be it, yes. RR could be the cause of this more than DLSS itself. Hard to say at the moment.
 
I just checked War Thunder with the new DLSS model override active and damn, it looks SO MUCH better. Biggest visual improvement out of all games so far (aside from the animated textures in CP2077), as the game finally looks normal and sharp, not like a blur-fest (with both TAA and the CNN model it looked awfully fuzzy). I could actually play it now. That said, they also added RT, which works very well on the 4090 - not that much of an FPS drop even with DLAA (still getting over 100 FPS on average at native res with DLAA and everything on max settings). Though it's a beta DX12 engine and it crashed a few times whilst playing, so they need to work on it more. However, good optimisation for sure! :)
 
The noticed flaw comes from a direct comparison of the 2 DLSS models (new Transformer vs old CNN) - the new one has it visible, the old one doesn't, though aside from that the new one generally shows considerably better quality. It's just not perfect, and NVIDIA themselves said there's more work to be done to correct all the issues with it. Till that's done, I wouldn't call it better than native. :) This is also why they whitelist only a bunch of games so far to do the swap, not all of them, and why a newer DLL has already shown up with further fixes. You could take off your bias glasses now and then and look at things a bit closer, but you do you. :)

And yes, rendering quality (accounting for the age of the technology) was often better in older games, as gamers were putting pressure on developers to do a better job. Not so much these days, so there are a lot of regressions in modern games introduced by lazy devs, masked by bad TAA (and yes, in raster even more than with RT). However, I am not getting into that discussion with you again, as enough has been shown and said by plenty of people already, with real examples.
It's not about the "better than native" - through DLAA, I guess DLSS can be. It's the whole RT/PT argument and how it falls short.

As for older games... as a simple example, those NPCs look rather plastic, like mannequins.

 
It's not about the "better than native"
So, you didn't read the previous posts and what my response was to, got it. :) Hint: exactly that statement, made by mrk :)

Regarding the older games example: /whoosh! As in, you still don't get what the argument is about, it seems. Hint: it's not about how old games look in 2025. It's about how well they use the technology available in their respective time. This is why, for example, Crysis 2 received whole long articles and thousands of posts about the devs' abuse of tessellation for no good reason at all. After that, other games did not repeat that mistake. These days a lot of devs abuse TAA as a masking technique for artefacts caused by low resolution effects and we get a blurry image in games for no good reason - but hardly anyone raised their voice, so it got widespread.
 
So, you didn't read the previous posts and what my response was to, got it. :) Hint: exactly that statement, made by mrk :)

Regarding the older games example: /whoosh! As in, you still don't get what the argument is about, it seems. Hint: it's not about how old games look in 2025. It's about how well they use the technology available in their respective time. This is why, for example, Crysis 2 received whole long articles and thousands of posts about the devs' abuse of tessellation for no good reason at all. After that, other games did not repeat that mistake. These days a lot of devs abuse TAA as a masking technique for artefacts caused by low resolution effects and we get a blurry image in games for no good reason - but hardly anyone raised their voice, so it got widespread.
It's better than the "usual" native in that it can be sharper and resolve AA better, partially due to the "problematic" TAA as well. At the same time it's not as good, since it has its own issues, ergo my comment about people focusing on its shortcomings.

Old games had their own problems in their time. Tessellation was an issue outside of Crysis too - TW3 had similar problems with HairWorks. PhysX was used to show fluid dynamics and some general physics, while you could already do that through DX - see Stalker Clear Sky.

I've added the screenshot with the "mannequin look" since it was rather popular, and yet now people complain that people in IJ look like plastic :))

And it could have been done better - see the level of detail in the characters of the original Crysis.
 
It's better than the "usual" native in that it can be sharper and resolve AA better, partially due to the "problematic" TAA as well. At the same time it's not as good, since it has its own issues, ergo my comment about people focusing on its shortcomings.

According to Nvidia's own words (and DF as well), the CNN model in DLSS and DLAA 3.5 introduces aliasing to the image in places where there was none, instead of antialiasing everything. TI has also shown examples of that in his videos, showing why DLAA is not a good replacement for TAA. Only now does Nvidia hope the Transformer model will finally fix it (not fully yet, but fewer such cases, and they said it's a work in progress). TAA never had such issues and, if implemented properly, looked just fine (sadly, more often than not it was implemented badly). The issue with TAA isn't the tech itself - the tech works very well for AA - it's the devs that messed up its implementation way too many times. Ergo, compared to a proper implementation of TAA at 4k resolution, DLAA is still not better at AA, it falls short. Its strengths lie elsewhere. That was generally my biggest disappointment with DLAA on the CNN model and I really hope Transformer will finally resolve it. After all, AI AA was what Nvidia advertised DLSS as first, before upscaling was a thing (hence the name!). And it's still not fully there, after so many years.

Old games had their own problems in their time. Tessellation was an issue outside of Crysis too - TW3 had similar problems with HairWorks. PhysX was used to show fluid dynamics and some general physics, while you could already do that through DX - see Stalker Clear Sky.

All the examples you mention are just Nvidia's shoddy marketing trying to punish their competition to make themselves look better, sadly. Not even devs just being bad, but actually malicious behaviour, as there was no other reason to do it this way. That's not an argument showing problems in old games, that's an argument against Nvidia's shoddy practices in sponsored games.

I've added the screenshot with the "mannequin look" since it was rather popular, and yet now people complain that people in IJ look like plastic :))

Faces in IJ are very bad imho too - it's a bad case of the uncanny valley, which human brains are very, very sensitive to (part of our survival instinct to recognise such things). It's very similar to the deepfakes shown by Nvidia recently too - horrible things, I hope they will never be implemented in games like that. A lot of people feel literally freaked out by such things, which is how most of us evolved to feel about them - uneasy, scared, not trusting etc. The closer we get to reality without being fully there, the more people will be put off by said uncanny valley. I'd rather have faces like in, for example, the Witcher or CoD games or Horizon Forbidden West. Very well made faces, but far enough from reality to not trigger negative reactions.
 
According to Nvidia's own words (and DF as well), the CNN model in DLSS and DLAA 3.5 introduces aliasing to the image in places where there was none, instead of antialiasing everything. TI has also shown examples of that in his videos, showing why DLAA is not a good replacement for TAA. Only now does Nvidia hope the Transformer model will finally fix it (not fully yet, but fewer such cases, and they said it's a work in progress). TAA never had such issues and, if implemented properly, looked just fine (sadly, more often than not it was implemented badly). The issue with TAA isn't the tech itself - the tech works very well for AA - it's the devs that messed up its implementation way too many times. Ergo, compared to a proper implementation of TAA at 4k resolution, DLAA is still not better at AA, it falls short. Its strengths lie elsewhere. That was generally my biggest disappointment with DLAA on the CNN model and I really hope Transformer will finally resolve it. After all, AI AA was what Nvidia advertised DLSS as first, before upscaling was a thing (hence the name!). And it's still not fully there, after so many years.



All the examples you mention are just Nvidia's shoddy marketing trying to punish their competition to make themselves look better, sadly. Not even devs just being bad, but actually malicious behaviour, as there was no other reason to do it this way. That's not an argument showing problems in old games, that's an argument against Nvidia's shoddy practices in sponsored games.



Faces in IJ are very bad imho too - it's a bad case of the uncanny valley, which human brains are very, very sensitive to (part of our survival instinct to recognise such things). It's very similar to the deepfakes shown by Nvidia recently too - horrible things, I hope they will never be implemented in games like that. A lot of people feel literally freaked out by such things, which is how most of us evolved to feel about them - uneasy, scared, not trusting etc. The closer we get to reality without being fully there, the more people will be put off by said uncanny valley. I'd rather have faces like in, for example, the Witcher or CoD games or Horizon Forbidden West. Very well made faces, but far enough from reality to not trigger negative reactions.
RDR2, for instance, worked pretty well with a combination of high resolution and DLSS to solve the aliasing issue on vegetation, at least for me. I had good results in other games too, more so since I was playing mostly at either 3x1080p or just 1080p - that was before getting the 4080.
Another aspect of "better than native" comes from the DLSS use itself. In order to get the same performance as I would with DLSS, I had to scale down details to a really low level, where it would look worse than DLSS (but at least it was native :)) ). So it could, indeed, also solve the aliasing issue if I were otherwise forced to play at a lower than native resolution. 720p "native" (without upscalers) looked worse and with more aliasing than DLSS. DLAA at lower resolutions (as in around 1080p) also worked fine for me, compared to other "native" solutions. Again, this was my use case.

At the same time, stuff like some fog/smoke could be pixelated, as I guess DLSS didn't work that well on it, and you'd also lose a bit of sharpness, which you could get partially back with a sharpening filter. Some of this got solved with later iterations and, I think, RR.

That mythical TAA could work decently at native, I suppose. What happens when you need to upscale from a lower resolution, does it hold up the same as DLSS? I don't think so. Also, since it's mythical, it matters less what it could be and more what the state of affairs actually is.

Those were quick examples, valid, since nvidia didn't hold a gun to their head. But also take Bethesda, with their games, Starfield in particular where it sucked even with amd's support. Far Cry series, where the 2nd instalment worked fine in Eyefinity/Surround, while from 3 onwards it was broken and never fixed, even though it says Surround/Eyefinity when selecting the required resolution - the FOV was probably around 40-50... vomit inducing. And that was regardless of who sponsored the game - amd or nvidia.
See also Vampire: Bloodlines vs HL2, both made on the Source engine. See the original Crysis, which was single threaded, meaning you ran into CPU limited scenarios - and that wasn't properly fixed even with the remake. ArmA 3 was also mostly single threaded, only very recently (with its current beta branch) getting some multithreaded support that boosted performance. GTA4? :))

All the games from Red Faction onwards that didn't use proper physics, at most just "baked" ones (Battlefield). Or amd's Froblins demo, where you have thousands of AI running around, doing their thing, on basically ancient hardware, but it didn't get traction. Mantle, which, although it was a performance boosting solution, didn't get the support that it should have.

As for Indy, I don't see anything completely wrong with the faces.
 
RDR2, for instance, worked pretty well with a combination of high resolution and DLSS to solve the aliasing issue on vegetation, at least for me. I had good results in other games too, more so since I was playing mostly at either 3x1080p or just 1080p - that was before getting the 4080.
Another aspect of "better than native" comes from the DLSS use itself. In order to get the same performance as I would with DLSS, I had to scale down details to a really low level, where it would look worse than DLSS (but at least it was native :)) ). So it could, indeed, also solve the aliasing issue if I were otherwise forced to play at a lower than native resolution. 720p "native" (without upscalers) looked worse and with more aliasing than DLSS. DLAA at lower resolutions (as in around 1080p) also worked fine for me, compared to other "native" solutions. Again, this was my use case.
Well, your use case is super sampling, isn't it? Which is an ideal working environment for DLSS-like algorithms, but not how NVIDIA advertises its use nor how the majority of gamers use it. That anyone with a high end GPU needs to go to 720p native to get proper FPS would be (or rather is) a sad state of affairs, though - lazy devs and badly made games. ;)

Now, fun fact - the TI dude actually advocates a lot in his latest vid for DLDSR (or a similar solution) to be standard in games, instead of badly implemented AA, DLAA and the like. It looks better, works well and produces a very clear image without artefacts, little to no ghosting etc. He showed proper examples of the quality differences etc. too. And he doesn't seem to like Transformer DLAA much (it still has issues, which he presented).
At the same time, stuff like some fog/smoke could be pixelated, as I guess DLSS didn't work that well on it, and you'd also lose a bit of sharpness, which you could get partially back with a sharpening filter. Some of this got solved with later iterations and, I think, RR.
Improved, yes. Solved - no. NVIDIA themselves said it: their AI models need work and further improvements and will get better over time, but they're not there yet - and that was about the Transformer model, which already does a much better job than the CNN one in most cases.
That mythical TAA could work decently at native, I suppose. What happens when you need to upscale from a lower resolution, does it hold up the same as DLSS? I don't think so. Also, since it's mythical, it matters less what it could be and more what the state of affairs actually is.
TAA based upscalers exist - it's the default one in UE5 (TSR). It's not great, though. What would help (and already exists) is DLDSR+DLSS, as you know, but that's fiddly and most people will just not use it unless it's built into games natively. But game devs follow what they think gamers will like, so if enough pressure appeared, they'd do it. Yet NVIDIA doesn't advertise it, so they don't, as hardly anyone knows it exists.
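For anyone who wants to poke at the UE5 one themselves, this is roughly the Engine.ini override I'd try - just a sketch, and whether a given game actually honours these cvars varies from title to title:

    [SystemSettings]
    ; 4 = TSR in UE5, 2 = the old TAA
    r.AntiAliasingMethod=4
    ; internal render resolution in percent; TSR upscales to the output resolution
    r.ScreenPercentage=67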
Those were quick examples, valid, since nvidia didn't hold a gun to their head.
Figuratively they did - it's called sponsorship. They do as told, or they don't get monies/help.
But also take Bethesda, with their games, Starfield in particular where it sucked even with amd's support.
Why do you always bring up abominations? :D It's also very consistent with Bethesda's track record, sadly.
Far Cry series, where the 2nd instalment worked fine in Eyefinity/Surround, while from 3 onwards it was broken and never fixed, even though it says Surround/Eyefinity when selecting the required resolution - the FOV was probably around 40-50... vomit inducing.
Yep, I couldn't play any of the later ones without modding for FOV. :/
And that was regardless of who sponsored the game - amd or nvidia.
The thing about NVIDIA and their sponsorship is a bit different - their tactic for many GPU generations now has been to create tech that only works on their hardware, push it into games through sponsorship and try to persuade everyone that it's the only way forward. That way, they are the only ones with hardware support for it and instantly win the market. They failed many times, till they realised they would never get there through raster tech, as it's supported by all players and well known, so they pushed towards AI (including DLSS) and RT. It fit their enterprise R&D perfectly too. One can't really work well without the other. Still, it didn't take proper hold till consoles supported it well enough, but on the PC at least it finally worked. And here we are, a monopoly. :) FOV (and many other things) is not part of their tech, hence they don't care.

Fun fact here as well - the TI dude again mentioned that he sees only NVIDIA having any vision for the future of graphics evolution. It doesn't mean he agrees with their choices (the AI push), but it's the only choice that exists currently, as Intel and AMD only fight over which of the 2 will be a better alternative to NVIDIA, following the same steps. Neither of them actually has any vision for the future of graphics in games. Then again, AMD is a consoles and enterprise producer, with PC gaming being a super niche that they do just as a by-product of their APUs (for consoles and not only).
All the games from Red Faction onwards that didn't use proper physics, at most just "baked" ones (Battlefield).
I still remember a bunch of devs blaming it on difficulties with lighting implementation in games. The sad bit - that's been resolved years ago with new algorithms, hybrid RT or even full RT. And? And still no physics. :P Lazy devs and bad excuses. At least the BF devs admitted at some point that they don't know how to make a well balanced, fun multiplayer game with proper physics, completely ignoring that people LOVED Bad Company 2 and full map destruction.
Or amd's Froblins demo, where you have thousands of AI running around, doing their thing, on basically ancient hardware, but it didn't get traction. Mantle, which, although it was a performance boosting solution, didn't get the support that it should have.
AMD made a bunch of really good looking and ahead-of-their-time demos, like this one:

It still looks good today in 4k, even though it's full of shortcuts and optimisations. But they never had much traction with game devs, and NVIDIA often didn't support many of these technologies well enough to push for them to be used, so it took years before we could see some of them appear in games (like tessellation etc.). NVIDIA's marketing has always been so much better, not just with gamers but also with game devs.
As for Indy, I don't see anything completely wrong with the faces.
Every person is different - most people might not like uncanny valley faces, but not everyone reacts the same. Also, playing a lot of games and realising it's computer graphics and not reality helps a lot in suppressing such instincts. I see them as abominations, though. :)
 
Fun fact here as well - the TI dude again mentioned that he sees only NVIDIA having any vision for the future of graphics evolution. It doesn't mean he agrees with their choices (the AI push), but it's the only choice that exists currently, as Intel and AMD only fight over which of the 2 will be a better alternative to NVIDIA, following the same steps. Neither of them actually has any vision for the future of graphics in games. Then again, AMD is a consoles and enterprise producer, with PC gaming being a super niche that they do just as a by-product of their APUs (for consoles and not only).
I think he summarised this very well. Nvidia really are dictating the direction. What irks me is that when enthusiasts point out the compromises and why they would prefer a different direction, the fans will say something along the lines of "this is the future, get used to it, because nvidia/DF/HUB/LTT said so" - now their opinion becomes fact :D
I'm convinced that if RT was an add-in card and couldn't be upsold as part of a GPU, it wouldn't have so much hype and would've gone the way of VR for uptake.
 
Apologies if this is a stupid question, but can I use DLSS 4 on my 3080, and can I force frame gen somehow now via the NVIDIA App? Previously I was using the DLSS Swapper thing from GitHub to enable frame gen on CP2077 and Jedi Survivor.
 
Apologies if this is a stupid question, but can I use DLSS 4 on my 3080, and can I force frame gen somehow now via the NVIDIA App? Previously I was using the DLSS Swapper thing from GitHub to enable frame gen on CP2077 and Jedi Survivor.
No, you can't. Only 50 series cards allow forcing frame gen in the app, currently. I still use DLSS Swapper (it now works with XeSS and FSR too) plus Profile Inspector for global DLSS profile settings (but those only work with games supported by the app, or the ones whose DLSS version I swap using DLSS Swapper).
 