nvngx_dlss.dll for v3.1.13 can be found at https://github.com/NVIDIA/DLSS/raw/main/lib/Windows_x86_64/rel/nvngx_dlss.dll (will likely auto-download)
Changes from programming PDF:
Seems DLAA was added as a new PerfQuality option alongside the existing Quality/Balanced/Performance/UltraPerformance ones, so games can now set it like a regular quality level (previously DLAA was invoked by games simply ignoring the DLSS render resolutions and rendering at full res).
- Add DLAA as a Performance Quality Mode
- Add NVSDK_NGX_DLSS_Hint_Render_Preset_G Enum
Interesting change; hopefully adding it as part of the existing quality levels will help get more devs to implement it.
(The UI guidelines PDF doesn't seem to include DLAA as part of any quality-choice UI yet though; it seems they still want devs to add it as a separate option outside of the DLSS level choices. A bit of a shame, since that likely still means more work to add it = less motivation to include it...)
Preset G is mentioned later on in the PDF as "Unused".
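For anyone poking at the SDK themselves, here's a rough sketch of how a game might request the new DLAA quality value through the NGX parameter block. The exact identifier names are my guesses based on the usual NVSDK_NGX naming scheme (check nvsdk_ngx_defs.h in the 3.1.13 package), not something I've pulled from the PDFs:

```cpp
// Hedged sketch only: game-side snippet selecting the new DLAA PerfQuality value
// via the NGX parameter block. Identifier names follow the usual NVSDK_NGX naming
// pattern and are assumptions; verify against nvsdk_ngx_defs.h in the 3.1.13 SDK.
#include <nvsdk_ngx.h>         // NGX core API (assumed include layout)
#include <nvsdk_ngx_params.h>  // NVSDK_NGX_Parameter_SetI/SetUI helpers

void RequestDlaa(NVSDK_NGX_Parameter* params)
{
    // DLAA as a regular PerfQuality value, instead of the old workaround of
    // ignoring the suggested render resolution and rendering at full res.
    NVSDK_NGX_Parameter_SetI(params, NVSDK_NGX_Parameter_PerfQualityValue,
                             NVSDK_NGX_PerfQuality_Value_DLAA);  // assumed enum value

    // Preset G now exists in the enum but is documented as "Unused", so the
    // render-preset hints would normally stay on Default for now.
    NVSDK_NGX_Parameter_SetUI(params, NVSDK_NGX_Parameter_DLSS_Hint_Render_Preset_Quality,
                              NVSDK_NGX_DLSS_Hint_Render_Preset_Default);
}
```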
The GitHub release page also mentions "Bug Fixes & Stability Improvements", so it might have some model/preset updates too; looking forward to comparisons with it.
Frame Generation also leaves a very positive impression on the RTX 4060 Ti/8G. It is remarkable how much DLSS 3.0 can increase frame rates; with correctly chosen settings and, ideally, a suitable frame limit, the result is extremely pleasing. From our position as continual testers/critics, it is also clearly visible that the AI is actually capable of learning. Distracting DLSS 3.0 artifacts are significantly reduced over time, and we could see this in Cyberpunk 2077 and Forza Horizon. The HUD in both games is now far less error-prone with DLSS 3.0 than at the time of earlier tests. We recently criticized annoying artifacts in Forza when testing the DLSS 3.0 capabilities of the RTX 4070; now, with the RTX 4060 Ti, this artifact formation is already significantly reduced. Bravo! In the image-quality test video you can still see some of these errors in Cyberpunk 2077, where the quest icon is particularly noticeable, but even that is much less conspicuous today.
In Diablo IV, both DLSS and FSR really shine, because both techniques avoid the usual problems while delivering the expected benefits in full. As a result, DLSS and FSR in the "Quality" setting can resolve details in some objects a little more finely than the native resolution and otherwise show identical image sharpness at higher FPS. Sharpness drops only minimally to slightly with the Performance preset, but remains at a high level.
Image stability is also consistently better with DLSS and FSR than at native resolution. It doesn't matter whether Ultra HD or Full HD is the target resolution or whether Quality or Performance mode is used: the temporal upsampling from AMD and Nvidia delivers a calmer image than the game's own TAA. In UHD, the result can be described as virtually flicker-free in both cases.
DLSS and FSR already perform very well against the native target resolution, and at the same render resolution both technologies are far superior: Ultra HD with FSR/DLSS set to "Performance" (i.e. rendered in Full HD before upsampling) looks far better than native Full HD, which is fuzzy and flickers on top of that. Even compared to native WQHD, the upsampling comes out ahead.
DLSS 2 and FSR 2 are locked in a close duel in Diablo IV. In Quality mode, both technologies can be regarded as equivalent. DLSS then does a little better in Performance mode, but that is nitpicking at a high level. In this hack 'n' slash, too, the fewer pixels there are, the better DLSS fares, but the differences in Diablo IV are surprisingly small.
Diablo IV also features Nvidia DLAA, which runs DLSS at native resolution without upscaling. AMD theoretically allows the same with FSR 2, but does not name the mode separately, and Diablo IV does not offer it anyway.
As a result, DLAA has the best image quality. The image does not flicker at all, and the image sharpness is very good. Nevertheless, the difference compared to DLSS on "Quality" is minimal, especially in Ultra HD.
AMD FSR 2 and Nvidia DLSS 2 should definitely be enabled in Diablo IV if the graphics card supports the technology. The Quality setting of both methods offers a better picture than native resolution, regardless of the resolution at which upsampling is used. At the same render resolution the advantage is very large, and at the same time there are no significant disadvantages.
Performance mode also works very well and can sometimes be used in Ultra HD without any loss of quality, even though internal rendering is only in Full HD. At the same render resolution, FSR and DLSS again produce the significantly better image. So temporal upsampling puts on a real showcase performance in Diablo IV, which is probably helped by the game's isometric perspective.
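Quick aside on the render resolutions the review keeps referring to: the quality modes map to roughly fixed per-axis scale factors (about 67% Quality, 58% Balanced, 50% Performance, 33% Ultra Performance, numbers from memory rather than from the article), so a quick sketch like this shows why "Performance at Ultra HD" means a Full HD internal render:

```cpp
#include <cstdio>

// Approximate per-axis scale factors commonly cited for DLSS/FSR 2 quality modes;
// treat them as ballpark numbers, not values taken from the review above.
struct Mode { const char* name; double scale; };

int main()
{
    const Mode modes[] = {
        {"Quality",           2.0 / 3.0},  // ~67%
        {"Balanced",          0.58},
        {"Performance",       0.50},
        {"Ultra Performance", 1.0 / 3.0},  // ~33%
    };

    const int outW = 3840, outH = 2160;  // Ultra HD output resolution
    for (const Mode& m : modes)
    {
        // e.g. Performance at 3840x2160 lands at 1920x1080, i.e. a Full HD render.
        std::printf("%-17s -> %.0f x %.0f\n", m.name, outW * m.scale, outH * m.scale);
    }
    return 0;
}
```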
Nvidia DLSS 3 (Frame Generation) in analysis
Aside from DLSS Super Resolution (aka DLSS 2), Diablo IV also offers DLSS Frame Generation (aka DLSS 3) on GeForce RTX 4000 graphics cards. ComputerBase also took a look at this technology, but the analysis turned out shorter than planned.
There were no problems with image quality; the AI-generated frames could not be distinguished from the rendered ones. However, there was a problem with the FPS measurements with DLSS 3 on the test system, across different GeForce RTX 4000 cards.
If Frame Generation was activated without Super Resolution, everything was fine: the GeForce RTX 4070 used for testing delivered 22 percent more FPS through the generated frames, at least a small increase.
But when Frame Generation was combined with DLSS Super Resolution, FPS fell instead of rising: DLSS on "Quality" became 3 percent slower with FG, and DLSS on "Performance" lost 6 percent. That really shouldn't happen, and on the test system it affected not only the GeForce RTX 4070 but also the GeForce RTX 4090, so an initially suspected lack of memory can be ruled out.
It is currently completely unclear why Frame Generation works on its own on the test system but suddenly no longer works in combination with DLSS Super Resolution. It is probably a bug in the game which, as has since been found, does not only affect the editorial team's test system.
DLSS 3.0, Frame Generation and the RTX 4060 Ti - Frametimes and Latencies
As we had announced, we want to take another, more detailed look at DLSS 3.0, Frame Generation and the performance of the RTX 4060 Ti. (pcgameshardware.de)
Diablo IV tech review: AMD FSR 2 and Nvidia DLSS 2 & 3 in detail / AMD FSR 2 and Nvidia DLSS 2 in analysis (computerbase.de)
Don't know about anyone else, but I'm having a blast with Diablo 4. It plays so well and I'm using DLAA, basically 120+ fps most of the time. Puts every other triple-A title released this year to shame!
Nice. How has your VRAM been holding up? Did you run out at any point?
"Looks like it is still paying of being a RTX gamer" seems bit misleading given than 2xxx and 3xxx cards cant use DLSS3.
The curious part of me wonders if these games get nerfs to non-DLSS mode with the updates; notably, the day-one version of Cyberpunk 2077 runs faster without DLSS than the newest version does.
Still smooth sailing here: Diablo 4 allocates 9-10GB but only 5GB is in use. That's at 3440x1440 maxed with DLAA, and the game looks great.
Played it on a mate's 4090 rig using his Alienware OLED, and at 3440x1440 max settings it was using 15.1GB out of a total of 15.80GB allocated.
The devs definitely put some good work into memory management and texture streaming.
Defo not seeing anything like that here:
Shock horror, that kind of thing can be "optimised" to work on a variety of hardware.
Game was made for OLED, the dark areas look soooo good. May fire it up on the 4K 55" OLED later on, but I don't want to use a controller for it! Will have to also give DLDSR a go; no doubt it will really shine in this game, with plenty of perf on the table.
Yesterday I was playing D4 and had 21GB allocated and 15GB in use according to Afterburner, with the ultra texture pack.
Kinda scary to think that not long ago a game using 4GB was seen as huge; now we're seeing usage creeping past 20GB.
That's why I'm holding out for the 5090 Super Duper Ti 48GB model, 24GB cards just ain't gonna be enough soon.
You joke, but unless PC gaming starts to get proper DirectStorage utilisation, and with games continuing to come out broken, 24GB won't be enough for launch-day gaming.
Some screenshot comparisons between DLAA and DLSS Quality at 3440x1440. Ignore some of the warpy effects on the ground; they're because of one of my skills/powers.
X-Man?
Yesterday I was playing D4 and had 21GB allocated and 15GB in use according to Afterburner, with the ultra texture pack.
I wonder what the devs have done, maybe something in the background to detect GPU VRAM, as I barely see mine go above 7GB, let alone 5GB of dedicated usage (also got the high-res pack installed).
The reason modern games have terrible bang-per-buck when it comes to VRAM usage is that in D3D11 and earlier there were at most 128 shader resource slots, so for every draw call the system knew which textures it would reference, and the OS would page textures in and out as needed.
Then came D3D12 and bindless APIs, and developers became responsible for this process, known as residency management. Almost everyone seems to have dropped the ball here: they just make all textures resident in VRAM all the time, and you end up with this...
Games certainly have enough info about which textures might be referenced by which draw calls, but without actively informing the OS about this via Evict()/MakeResident(), everything stays in VRAM all the time and artists end up lowering texture resolution to hit performance targets.
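For the curious, a bare-bones sketch of what that residency management looks like in D3D12. MakeResident()/Evict() are the real ID3D12Device calls, but the surrounding function and the question of how the engine decides what is actually needed are placeholders of mine:

```cpp
#include <d3d12.h>
#include <vector>

// Illustrative only: evict texture heaps the renderer knows won't be referenced by
// upcoming draw calls and bring back the ones that will be. Deciding which set is
// which is the hard part and is just a placeholder parameter here; error handling
// is omitted for brevity.
void UpdateResidency(ID3D12Device* device,
                     const std::vector<ID3D12Pageable*>& neededThisFrame,
                     const std::vector<ID3D12Pageable*>& notNeededSoon)
{
    if (!notNeededSoon.empty())
    {
        // Tell the OS these heaps/resources may be paged out of VRAM.
        device->Evict(static_cast<UINT>(notNeededSoon.size()), notNeededSoon.data());
    }
    if (!neededThisFrame.empty())
    {
        // Ensure everything the next draw calls reference is actually resident
        // before the command lists that use it are submitted.
        device->MakeResident(static_cast<UINT>(neededThisFrame.size()),
                             neededThisFrame.data());
    }
}
```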
It seems that the game programmer answer to "how do you fit a large dataset in memory" is "buy more memory lol". We're all suffering as a result.