
DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now

Saw this on Reddit, thought someone may be interested.

DLSS Super Resolution SDK 3.1.13 released


nvngx_dlss.dll for v3.1.13 can be found at https://github.com/NVIDIA/DLSS/raw/main/lib/Windows_x86_64/rel/nvngx_dlss.dll (will likely auto-download)
Changes from programming PDF:
  • Add DLAA as a Performance Quality Mode
  • Add NVSDK_NGX_DLSS_Hint_Render_Preset_G Enum
Seems DLAA was added as a new PerfQuality option next to the existing Quality/Balanced/Performance/UltraPerformance ones, so games can set it like a regular quality level now (previously DLAA was invoked by games simply ignoring the suggested DLSS render resolutions and rendering at full res).
Interesting change; exposing it as one of the existing levels might help get more devs to implement it.

(The UI guidelines PDF doesn't seem to include DLAA as part of any quality-choice UI yet though; it seems they still want devs to add it as a separate option outside of the DLSS level choices. A bit of a shame, since that likely still means more work to add it and therefore less motivation to include it...)
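For context on what this looks like from the game side, here's a rough sketch of how a quality menu could now map straight onto the SDK's PerfQuality enum, with DLAA as just another entry instead of a special "ignore the suggested render size and go full-res" code path. The enum value names are assumed from nvsdk_ngx_defs.h, and the DLAA one in particular is going off the 3.1.13 changelog, so treat it as illustrative rather than copy-paste ready:

```cpp
// Rough sketch: map a game's quality-menu choice to the NGX PerfQuality value
// that gets passed when creating the DLSS feature. With 3.1.13, DLAA can be
// just another entry instead of a separate full-resolution code path.
// Enum names are assumed from nvsdk_ngx_defs.h; the DLAA value name is taken
// on faith from the 3.1.13 changelog.
#include <nvsdk_ngx_defs.h>

enum class MenuQuality { UltraPerformance, Performance, Balanced, Quality, DLAA };

static NVSDK_NGX_PerfQuality_Value ToNgxPerfQuality(MenuQuality q)
{
    switch (q)
    {
    case MenuQuality::UltraPerformance: return NVSDK_NGX_PerfQuality_Value_UltraPerformance;
    case MenuQuality::Performance:      return NVSDK_NGX_PerfQuality_Value_MaxPerf;
    case MenuQuality::Balanced:         return NVSDK_NGX_PerfQuality_Value_Balanced;
    case MenuQuality::Quality:          return NVSDK_NGX_PerfQuality_Value_MaxQuality;
    case MenuQuality::DLAA:             return NVSDK_NGX_PerfQuality_Value_DLAA; // new in 3.1.13
    }
    return NVSDK_NGX_PerfQuality_Value_Balanced;
}
```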
Preset G is mentioned later on in the PDF as "Unused".

The GitHub release page also mentions "Bug Fixes & Stability Improvements", so hopefully it includes some model/preset updates too; looking forward to comparisons with it.
 

Frame generation also leaves a very positive impression on the RTX 4060 Ti/8G. It is remarkable how much DLSS 3.0 can increase frame rates; with correctly chosen settings and, ideally, a suitable frame limit, the result is extremely pleasing. From our position as continuous testers and critics, it is also clearly visible that the AI is actually capable of learning. Disturbing DLSS 3.0 artifacts have been significantly reduced over time, and we were able to see this in Cyberpunk 2077 and Forza Horizon. The HUD in both games is currently much less error-prone with DLSS 3.0 than at the time of our earlier tests. We recently criticized annoying artifacts in Forza when testing the DLSS 3.0 capabilities of the RTX 4070; in the current RTX 4060 Ti test, this artefact formation is already significantly reduced. Bravo! In the image test video you can still see some of these errors in Cyberpunk 2077, where the quest icon is particularly noticeable, but this too is far less conspicuous today.


In Diablo IV, both DLSS and FSR really shine, because both techniques avoid the usual problems while delivering the expected advantages in full. As a result, DLSS and FSR in the "Quality" setting can resolve details in some objects a little more finely than the native resolution, and otherwise show identical image sharpness at higher FPS. Sharpness only drops slightly with the Performance preset, but remains at a high level.


Image stability is also consistently better with DLSS and FSR than at native resolution. It doesn't matter whether Ultra HD or Full HD is the target resolution, or whether you are using Quality or Performance mode: the temporal upsampling from AMD and Nvidia delivers a calmer picture than the game's own TAA. In UHD, the result can be described as virtually flicker-free in both cases.

DLSS and FSR already perform very well compared to the native target resolution; when compared at the same render resolution, both technologies are far superior. Ultra HD with FSR/DLSS set to "Performance" (i.e. rendered in Full HD before upsampling) looks far better than native Full HD, which is fuzzy and flickers on top of that. Even compared to native WQHD, the upsampled image is better.
DLSS 2 and FSR 2 are in a close duel in Diablo IV. In Quality mode, the two technologies can be regarded as equivalent. DLSS performs a little better in Performance mode, but that is nitpicking at a high level. In this hack 'n' slash too, the fewer pixels there are to work with, the more DLSS pulls ahead, but the differences in Diablo IV are surprisingly small.

Diablo IV also features Nvidia DLAA, which runs DLSS at the native resolution, i.e. as anti-aliasing without upscaling. AMD theoretically allows the same with FSR 2, but does not offer it as a separately named mode, and Diablo IV does not expose it anyway.

As a result, DLAA has the best image quality: the image does not flicker at all, and sharpness is very good. Nevertheless, the difference compared to DLSS on "Quality" is minimal, especially in Ultra HD.
AMD FSR 2 and Nvidia DLSS 2 should definitely be enabled in Diablo IV if the graphics card supports the technology. The Quality setting of both methods offers a better picture than the native resolution, regardless of the target resolution. Compared at the same render resolution the advantage is very large, and there are no significant disadvantages either.


Performance mode also works very well and can sometimes be used in Ultra HD without any visible loss of quality, even though internal rendering is only in Full HD. At the same render resolution, FSR and DLSS again produce the significantly better image. Temporal upsampling really gets to show off in Diablo IV - which is probably due to the game's isometric perspective.

Nvidia DLSS 3 (Frame Generation) analysis

Aside from DLSS Super Resolution (aka DLSS 2), Diablo IV also offers DLSS Frame Generation (aka DLSS 3) on GeForce RTX 4000 graphics cards. ComputerBase took a look at this technology as well, but the look turned out shorter than planned.

There were no problems with image quality; the AI-generated frames could not be distinguished from the rendered ones. However, there was a problem with the FPS measurements with DLSS 3 on the test system, across different GeForce RTX 4000 cards.

If Frame Generation was activated without Super Resolution, everything was fine. The GeForce RTX 4070 used for testing delivered 22 percent more FPS thanks to the generated frames - at least a small increase.

But when Frame Generation was combined with DLSS Super Resolution, FPS fell instead of rising. DLSS on "Quality" became 3 percent slower with FG, and DLSS on "Performance" lost 6 percent. That should not be possible, and on the test system it affected not only the GeForce RTX 4070 but also the GeForce RTX 4090 - so the initially suspected lack of memory can be ruled out.
It is currently completely unclear why Frame Generation works on its own on the test system but no longer does in combination with DLSS Super Resolution. It is probably a bug in the game, which, as has since been found, does not only affect the editorial team's test system.



Don't know about anyone else but I'm having a blast with Diablo 4, plays so well and I'm using DLAA, basically 120+ fps most of the time. Puts every other triple-A title released this year to shame!
 

Don't know about anyone else but I'm having a blast with Diablo 4, plays so well and I'm using DLAA, basically 120+ fps most of the time. Puts every other triple-A title released this year to shame!

Nice. How has your vram been holding up? Did you run out at any point? :D
 
"Looks like it is still paying of being a RTX gamer" seems bit misleading given than 2xxx and 3xxx cards cant use DLSS3.

The curious part of me wonders if these games get nerfs to non-DLSS mode with the updates; notably, Cyberpunk 2077 on day 1 ran faster without DLSS than the newest version does.
 
Nice. How has your vram been holding up? Did you run out at any point? :D

Still smooth sailing here :D :cry: Diablo 4 allocates 9-10GB but only 5GB is in use, that's at 3440x1440 maxed with DLAA, game looks great.

"Looks like it is still paying of being a RTX gamer" seems bit misleading given than 2xxx and 3xxx cards cant use DLSS3.

The curious in me wonders if these games get nerfs to non DLSS mode with the updates, notably cyberpunk 2077 on day 1 runs faster than the newest version without DLSS.

I think they are referring to the DLSS quality it provides as well as DLAA, which in itself is definitely a very nice pro. It certainly would be nice to get frame generation too, given how highly sites are now rating it.

Not sure what other games you are referring to, but CP2077 has had graphical improvements, especially with ray tracing, e.g. adding locally cast RT shadows to objects.
 
Still smooth sailing here :D :cry: Diablo 4 allocates 9-10GB but only 5GB is in use, that's at 3440x1440 maxed with DLAA, game looks great.

Played it on a mate's 4090 rig using his Alienware OLED, and at 3440x1440 max settings it was using 15.1GB out of a total of 15.80GB allocated.

The devs definitely put some good work into memory management and texture streaming.
 
Played it on a mate's 4090 rig using his Alienware OLED, and at 3440x1440 max settings it was using 15.1GB out of a total of 15.80GB allocated.

The devs definitely put some good work into memory management and texture streaming.

Defo not seeing anything like that here:

[screenshots]


Shock horror that kind of thing can be "optimised" to work on a variety of hardware :p :D

Game was made for OLED, the dark areas look soooo good, may fire it up on the 55" 4K OLED later on but don't want to use a controller for it! Will have to also give DLDSR a go, no doubt it will really shine in this game, especially with plenty of perf left on the table.
 
Defo not seeing anything like that here:

Shock horror that kind of thing can be "optimised" to work on a variety of hardware :p :D

Game was made for OLED, the dark areas look soooo good, may fire it up on the 55" 4K OLED later on but don't want to use a controller for it! Will have to also give DLDSR a go, no doubt it will really shine in this game, especially with plenty of perf left on the table.

It's odd seeing a new game that not only looks very pretty but is actually well optimised and works well across different hardware... other studios need to take a page out of Blizzard's book.
 
Played it on a mate's 4090 rig using his Alienware OLED, and at 3440x1440 max settings it was using 15.1GB out of a total of 15.80GB allocated.

The devs definitely put some good work into memory management and texture streaming.
Yesterday I was playing D4 and had 21GB allocated and 15GB in use according to Afterburner, with the ultra texture pack.
 
I wonder what the devs have done; maybe something in the background detects GPU VRAM, as I barely see mine go above 7GB allocated, let alone above 5GB dedicated usage (also got the high-res pack installed).
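For what it's worth, detecting the VRAM budget doesn't need anything exotic: Windows already exposes a per-process budget and current usage through DXGI, which a texture-streaming system can size its pool against. A minimal sketch of the query below - purely illustrative of what an engine could do, not a claim about what Blizzard actually does:

```cpp
// Query the OS-provided VRAM budget/usage for this process and derive a
// texture-streaming pool size from it. Illustrative only (error handling
// omitted, first adapter assumed to be the game GPU).
#include <dxgi1_4.h>
#include <wrl.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

UINT64 QueryTexturePoolBudget()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter1;
    factory->EnumAdapters1(0, &adapter1);      // first (usually primary) GPU

    ComPtr<IDXGIAdapter3> adapter3;
    adapter1.As(&adapter3);

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("budget %llu MB, in use %llu MB\n",
                info.Budget >> 20, info.CurrentUsage >> 20);

    // Leave headroom so the process stays under budget even during spikes;
    // the 75% figure is an arbitrary example, not anything the game uses.
    return info.Budget * 3 / 4;
}
```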
 
That's why imma holding out for the 5090 Super Duper Ti 48GB model, 24GB cards just ain't gonna be enough soon.

You joke, but unless PC starts getting proper DirectStorage utilisation, and with games continuing to come out broken, 24GB won't be enough for launch-day gaming :)
 
You joke, but unless PC starts getting proper DirectStorage utilisation, and with games continuing to come out broken, 24GB won't be enough for launch-day gaming :)

Only half joking! :p

I'm old enough to remember when 4MB cards came out, Diamond (IIRC) released a 6MB version and there were some musings on the interwebz that it had too much VRAM!

That aged like sour milk!

My next GPU will have at least 24GB, but I'd sure be happier if it had double that.
 
Yesterday I was playing D4 and had 21GB allocated and 15GB in use according to Afterburner, with the ultra texture pack.

I wonder what the devs have done; maybe something in the background detects GPU VRAM, as I barely see mine go above 7GB allocated, let alone above 5GB dedicated usage (also got the high-res pack installed).


The reason modern games have such terrible bang-per-buck when it comes to VRAM usage is that in D3D11 and earlier there were at most 128 shader resource slots, so for every draw call the system knew exactly which textures it would reference, and the OS could page textures in and out of VRAM accordingly.

Then came D3D12 and bindless APIs, and developers became responsible for this process, known as residency management. Almost everyone seems to have dropped the ball here: they just keep all textures resident in VRAM all the time, and you end up with this...

The games certainly have enough information about which textures might be referenced by which draw calls, but without actively informing the OS via Evict() / MakeResident(), everything stays in VRAM all the time and artists end up lowering texture resolution to hit performance targets.
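For anyone who hasn't touched that part of D3D12, below is a bare-bones sketch of what those two hooks look like in practice: a trivial "evict what hasn't been referenced lately, restore what's about to be used" pass. It's illustrative only - a real residency manager batches these calls, fences against in-flight GPU work, and handles MakeResident() failing when the budget is exceeded (there's a residency helper in Microsoft's DirectX samples that wraps all of this):

```cpp
// Bare-bones illustration of explicit residency management in D3D12:
// demote heaps backing textures that haven't been referenced recently,
// and bring back anything that is about to be referenced again.
// Illustrative only; real engines batch, fence and handle failures.
#include <d3d12.h>
#include <vector>

struct StreamedTexture
{
    ID3D12Pageable* heap;          // heap (or committed resource) backing the texture
    UINT64          lastUsedFrame; // last frame a draw call referenced it
    bool            resident;
};

void UpdateResidency(ID3D12Device* device,
                     std::vector<StreamedTexture>& textures,
                     UINT64 currentFrame, UINT64 evictAfterFrames)
{
    std::vector<ID3D12Pageable*> toEvict, toRestore;

    for (auto& t : textures)
    {
        const bool idle = (currentFrame - t.lastUsedFrame) > evictAfterFrames;
        if (t.resident && idle)        { toEvict.push_back(t.heap);   t.resident = false; }
        else if (!t.resident && !idle) { toRestore.push_back(t.heap); t.resident = true;  }
    }

    // Tell the OS it is allowed to page these out of VRAM...
    if (!toEvict.empty())
        device->Evict(static_cast<UINT>(toEvict.size()), toEvict.data());

    // ...and make sure anything about to be referenced is actually in VRAM.
    if (!toRestore.empty())
        device->MakeResident(static_cast<UINT>(toRestore.size()), toRestore.data());
}
```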

It seems that the game programmer answer to "how do you fit a large dataset in memory" is "buy more memory lol". We're all suffering as a result.
 