
DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now

Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
Again, if AMD are the white knights people believe them to be, then why not support a solution that benefits everyone..... I refuse to use FSR because of how awful it is, and until that changes, AMD are literally punishing the likes of myself and other Nvidia/Intel gamers by forcing us to use their lazy, inferior upscaling solution. It'll be the same when their fake-frame tech arrives, i.e. pointless claiming anyone can use it when it will be **** and most people won't use it because of that :cry:
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
Yup, it's borderline terrible now and badly in need of a soft re-launch imo.

Can't count on AMD to do that though, as it's not their style, i.e. they don't want to be on the hook for anything, hence why they always throw things over the fence for devs/the community to take care of. The best thing AMD could do is have some form of QC/approval check like they eventually did with FreeSync, i.e. FreeSync 2 and then Premium, given the **** show that the first wave of FreeSync monitors was :o

Still makes me laugh when I think of all the "DLSS is dead" articles and posts when FSR 1 and then 2 launched. Aged well, that.... :p

You say it works on many GPUs, but what good is that when, one, DLSS is in far more games and, two, AMD can't be bothered to make sure that when they pay to lock out Nvidia they ship a top-notch FSR implementation in said game?

We already know that, when done right, FSR can be just as good as DLSS. So why don't they make sure of that in sponsored titles?

Would I like to see FSR win out and be in all games? Hell yeah! But not in its current form. They need to make sure it is in all the big games and is consistently good.

Yeah, it's laughable to claim FSR is "free" and a big win because anyone can use it, when the reality is:

1. it's the only option AMD have, being last to market with an inferior solution
2. if you have halfway decent eyesight, you'll want to switch FSR off entirely, so the point about it being usable by everyone is completely moot

:cry:

I'm holding off for some patches before continuing with Jedi Survivor, as I don't want to play the game without ray tracing but I can't use FSR 2 because of how much it worsens IQ. Had DLSS been in this game, I could have used whatever works best for my hardware, but nah, AMD have decided what is best for me.....
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
Agreed. I test FSR in most games I play where it's available and I've never kept it on, because the results are always poor. I'm currently playing the new Star Wars game and would rather lose 30% performance than enable FSR, because even on my 4K TV, FSR in its highest quality mode looks like a dog's breakfast, and if you dare to need more performance and use FSR Performance mode, lmao, objects and textures in the background literally just disappear.

Yup, I tried it out again on the first planet you land on, but it was even worse than the opening mission because of all the foliage, so shimmering/fuzziness galore. And don't get me started on the pools of water around the planet; they looked beyond bad :o
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
Yup. AMD keep shooting themselves in the foot.

Taking that into account, how can one blame AMD for missing an open goal left by Nvidia? :cry:

Funny thing is, when it was first shown in Deathloop, it was a good starting point, and I probably wouldn't have missed DLSS that much had it....

- had consistently good implementations that improved each time.... instead..... it has arguably got worse with time
- seen far quicker uptake.... supposedly we were going to see loads of games adopting FSR 2+ instantly, including on consoles, but in reality again.... it's in what, 2 games on console? And whilst uptake on PC has improved now, it took a long time, and even then a lot of the implementations are stuck on a <2.2 version with no way to update like we can with DLSS....

Don't even get me started on the lower presets, where DLSS Balanced, Performance and Ultra Performance modes nuke the FSR equivalents from orbit and then some :p

As said above though, it's more on the developers to do better than it is on AMD, but AMD need to start having a certification badge or something to ensure standards are met, especially if they are going to remove people's option to use superior methods for their hardware.
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
With the way Tommy bigs up amd solutions, I would have said he has no standards but given he keeps buying nvidia, he must actually have some :p :D

FSR is better in some games purely because of implementation, the same as DLSS. Last of Us has sharper image quality using FSR, and DLSS has broken AA in it. FSR also has higher fps gains than DLSS at the same quality setting. That's just one example.

So don't discount any upscaler on its own merits alone, because it all comes down to how it is implemented by the dev.

Disagree on TLOU FSR; it's also terrible in my experience. Both DLSS and FSR have sharpening galore; thankfully the last patch fixed DLSS sharpening so I could drop it right down to 0.

And then there is this:


Can't say I have ever noticed a great deal of performance difference, if any; in my experience it has been < 5 fps in either DLSS's or FSR's favour.

I think the only game where I would happily use FSR is CP 2077; it's the one good implementation I've tried out of several games. Still not as good as DLSS, but it's usable.
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
It is AMD that benefits from their technology working on Pascal and Maxwell. They didn't do it out of the kindness of their hearts.

Yup, if they really wanted to benefit devs and all gamers, they would have contributed to Streamline....

[image: Streamline diagram]

Can hardware vendor 3 step forward please

 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
Yeah, FSR is 3 fps higher; nothing to write home about, especially when you're at 100 fps+ anyway :cry:

But there is broken AA using DLSS in last of us (see my screenshot comparisons in the thread), which is why FSR looks better.

And native, even though the framerate remains the same due to the lack of optimisation, looks softer than both FSR and DLSS.

Isn't that with a patch which broke aa/dlss?
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
For whoever is playing that abomination of a game:


The in-game TAA solution has a very blurry overall image across all resolutions, even at 4K, and very poor rendering of small object-detail—thin steel objects and power lines, tree leaves, and vegetation in general. Also, the in-game TAA solution has shimmering issues on the whole image, even when standing still, and it is especially visible at lower resolutions like 1080p, for example. All of these issues with the in-game TAA solution were resolved as soon as DLAA, DLSS or XeSS were enabled, due to better quality of their built-in anti-aliasing solution. Also, the sharpening filters in the DLAA and DLSS render path helped. With DLSS you can expect an improved level of detail rendered in vegetation and tree leaves in comparison to the in-game TAA solution, and small details in the distance, such as wires or thin steel objects, are rendered more correctly and completely in all Quality modes. With DLAA enabled, the overall image quality improvement goes even further, rendering additional details compared to the in-game TAA solution and DLSS. However, DLSS has some issues that the in-game TAA solution does not. At lower resolutions such as 1080p and 1440p for example, the DLSS implementation has ghosting when flying birds are in the player's view and during evening or night time these birds even have black smearing behind them, which can be quite distracting. However, these ghosting issues can be fixed if you manually update the DLSS version to 3.1 instead of version 2.3, which is used by this game natively.

Speaking of FSR 2.1 image quality, there are a few important issues of note. In Redfall, sometimes the trees are in motion due to dynamic winds and other weather effects. The in-game TAA solution, DLAA, DLSS and XeSS implementations are handling moving trees and vegetation just fine, but the FSR 2.1 implementation is completely different. FSR 2.1 temporal stability completely falls apart when moving trees are in the player's view, as if motion blur effects were enabled at the highest value, which even our screenshots reveal, and it is visible even when standing still across all resolutions and quality modes. Thin steel objects, wires and power lines are also losing temporal stability at medium and far distances and create noticeable shimmering issues or disappear completely at lower resolutions.

The NVIDIA DLSS Frame Generation implementation is excellent in this game, the overall image quality is quite impressive. Even small flying particle effects, such as different varieties of the player's magic abilities, are rendered correctly, even during fast movement. Many other DLSS Frame Generation games that we've tested had issues with the in-game on-screen UI, which had a very jittery look—the DLSS Frame Generation implementation in Redfall does not have this issue. Also, the DLSS Frame Generation implementation in this game does not force you to enable DLSS Super Resolution first in order to utilize DLSS Frame Generation, as some other Unreal Engine 4 games do, and you can use DLAA and DLSS Frame Generation without any issues if you want to maximize your image quality.

Don't you just love to see it, "better than native" :cool: Sorry, I forgot, it's such a hardship to switch out the DLSS version :cry:
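
For anyone who hasn't done a DLL swap before, it really is just dropping a newer nvngx_dlss.dll over the one the game ships with. Rough sketch below; the paths are placeholders for illustration only, and some launchers verify files and will undo or block the swap, so keep the backup:

Code:
# Rough sketch only: back up and swap a game's DLSS DLL (nvngx_dlss.dll) for a newer one.
# The two paths below are placeholders - point them at your own game folder and download.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\Redfall")              # placeholder game install folder
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")    # placeholder newer DLSS DLL

matches = list(game_dir.rglob("nvngx_dlss.dll"))  # find the DLL the game shipped with
if not matches:
    raise SystemExit(f"No nvngx_dlss.dll found under {game_dir}")

old_dll = matches[0]
shutil.copy2(old_dll, old_dll.with_name(old_dll.name + ".bak"))  # keep a backup
shutil.copy2(new_dll, old_dll)                                   # drop the newer DLL in place
print(f"Replaced {old_dll} (backup saved alongside it)")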
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338

In Cyberpunk 2077, the DLSS, DLAA, XeSS and FSR 2.1 implementations all use a sharpening filter in the render path, and the game gives the user the ability to tweak the sharpening values through separate sliders. Each upscaling solution is using different sharpening values set by the developers, but in the FSR 2.1 implementation, even when sharpening is set to the 0 value, some level of sharpening is still applied in the FSR 2.1 render path. During this round of testing, we used the zero value for the DLSS, DLAA, XeSS and FSR 2.1 sharpening filters. What's also important to note is that before the RT Overdrive patch, sharpening filters in the DLSS render path caused negative side effects in this game, such as excessive shimmering in motion, but with the latest version of DLSS now implemented, the developers fixed all issues with the sharpening filters in the DLSS render path and implemented their own sharpening solution for DLSS and DLAA, which works very well.

XeSS comes with three upscaling kernels that are optimized for various architectures. The first is the kernel that gets used on Intel Arc GPUs with XMX engines. This is the most advanced model too, that not only performs better in terms of FPS, but also offers the best upscaling quality, Intel calls this the "Advanced XeSS upscaling model." Intel also provides an optimized kernel for Intel Integrated Graphics, and another compatibility kernel, used for all other architectures that support Shader Model 6.4, e.g. all recent AMD and NVIDIA cards. These use the "Standard XeSS upscaling model," which is somewhat simpler, with lower performance and quality compared to what you get on Arc GPUs (we use the compatibility model on our RTX 4080). If DP4a instructions aren't available, as on the Radeon RX 5700 XT, slower INT24 instructions are used instead.

The in-game TAA solution has very poor rendering of small object detail—thin steel objects and power lines, tree leaves, and vegetation in general. The in-game TAA solution also has shimmering issues on the whole image, even when standing still, and it is especially visible at lower resolutions such as 1080p, for example. All of these issues with the in-game TAA solution are resolved when DLAA, DLSS or XeSS are enabled, due to the better quality of their built-in anti-aliasing solution. Also, the sharpening filters in the DLAA, DLSS and XeSS render path can help to improve overall image quality. With DLSS and XeSS you can expect an improved level of detail rendered in vegetation and tree leaves in comparison to the in-game TAA solution. Small details in the distance, such as wires or thin steel objects, are rendered more correctly and completely in all Quality modes. With DLAA enabled, the overall image quality improvement goes even higher, rendering additional details, such as higher fidelity hair for example, compared to the in-game TAA solution, DLSS and XeSS. Also, both DLSS 3.1 and XeSS 1.1 handle ghosting quite well, even at extreme angles.

The FSR 2.1 implementation comes with noticeable compromises in image quality—in favor of performance in most sequences of the game. We spotted excessive shimmering and flickering on vegetation, tree leaves and thin steel objects; they are shimmering even when standing still and it is visible even at 4K FSR 2.1 Quality mode, which might be quite distracting for some people. Once you're switching from FSR 2.1 Quality mode to Balanced or Performance, the whole image will start to shimmer even more. The anti-aliasing quality is also inferior, as the overall image has more jagged lines in motion, especially visible behind cars while driving through the world and in vegetation. Also, in the current FSR 2.1 implementation ghosting issues are worse than both DLSS and XeSS at day time, and it is even more pronounced when there is a lack of lighting in the scene, as the FSR 2.1 image may have some black smearing behind moving objects at extreme angles.

The NVIDIA DLSS Frame Generation implementation is excellent in this game, the overall image quality is mostly impressive. However, there's also a few important issues of note. We spotted excessive ghosting on shadows in motion, specifically behind cars when driving through the world and on main character shadows. The visibility of these ghosting issues will vary depending on the scene, time of day and your framerate. For example, these ghosting issues are more visible at 60 FPS, whereas at 120 FPS they are way less noticeable. The DLSS Frame Generation implementation in the current state also has some issues with the in-game on-screen UI, specifically the mini-map, which had a very jittery look in motion. Interestingly, you can fix these jittering issues by manually downgrading the DLSS Frame Generation version from 3.1.1 to 1.0.7.

Speaking of performance, the XeSS implementations usually have around 10-13% lower performance gain while using the compatibility kernel instruction set that works with all GPU architectures, when compared to competing upscaling solutions from NVIDIA and AMD. However, in Cyberpunk 2077, the DLSS, FSR 2.1 and XeSS 1.1 upscalers are practically identical in terms of performance gain over native TAA across all resolutions. For XeSS 1.1 especially it is a quite welcome upgrade to receive improved performance gains while using the DP4a instruction set, while also producing image quality better than FSR 2.1 with essentially the same performance. Overall, the DLSS, XeSS and FSR 2.1 performance uplift at 4K and 1440p is a great improvement to the game, you can expect almost doubled performance in "Quality" mode, with all graphics settings maxed out, helping to cushion the performance penalty of enabling path tracing. Interestingly, when using the DLAA solution, the performance across all resolutions is identical to the TAA solution.
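
To put those performance-gain numbers in context, the quality modes are just different internal render resolutions. Quick sketch below; the per-axis scale factors are the commonly documented DLSS/FSR 2 defaults, so treat them as assumptions rather than anything pulled from the review:

Code:
# Rough sketch: internal render resolution per upscaler quality mode at 4K output.
# Per-axis scale factors are the commonly documented DLSS/FSR 2 defaults (assumed here,
# not taken from the review); exact per-game values can differ by a pixel or two.
MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, scale):
    return round(out_w * scale), round(out_h * scale)

for mode, scale in MODES.items():
    w, h = internal_resolution(3840, 2160, scale)
    # Pixel count relative to native roughly tracks the GPU-side cost saving.
    print(f"4K {mode:<17} -> {w}x{h} (~{100 * scale * scale:.0f}% of native pixels)")

Quality mode at 4K works out to roughly 2560x1440 internally, i.e. ~44% of the native pixel count, which is why the "almost doubled performance" figure is plausible.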
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
Good examples of DLSS with DLDSR offering noticeably better IQ than native whilst also delivering more performance:

DLSS Quality is actually more detailed than native, even at 2160p - Check it out, especially on the screens on the walls:


Very impressive. Now this is how DLSS and optimisations etc are done
:cool:
Just posting this here as it's probably the best place, but people should use DLDSR more:


The DLSS Performance preset with a higher resolution can often look better than DLSS Quality; obviously fps takes a bit more of a hit, but if you've got performance to spare, it's well worth it.

Would be good to see more examples of this. Personally, I've found myself using DLSS Performance with DLDSR a bit more than just DLSS Quality now (if I have performance headroom). If anyone has Spider-Man, I'm curious to see how it looks; I'd do it myself but it's no longer installed.
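
For anyone wondering why DLDSR + DLSS Performance can beat plain DLSS Quality, the internal render resolution arithmetic makes it obvious. Rough sketch, assuming a 1440p display, the usual DLDSR 2.25x factor and the standard per-axis upscaler ratios:

Code:
# Rough sketch: internal render resolution on a 2560x1440 display for two setups.
# Assumed factors: DLDSR 2.25x (1.5x per axis), DLSS Quality 0.667/axis, Performance 0.5/axis.

def internal_res(display_w, display_h, dldsr_per_axis, dlss_per_axis):
    # DLDSR raises the output target; DLSS then renders a fraction of that target.
    target_w = display_w * dldsr_per_axis
    target_h = display_h * dldsr_per_axis
    return round(target_w * dlss_per_axis), round(target_h * dlss_per_axis)

print("DLSS Quality only       ->", internal_res(2560, 1440, 1.0, 0.667))  # ~1708x960
print("DLDSR 2.25x + DLSS Perf ->", internal_res(2560, 1440, 1.5, 0.5))    # 1920x1080

So the DLDSR route actually feeds the upscaler more pixels and then downsamples back to the display, which is why it can look better despite "Performance" in the name.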
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
Tlou uses TAA I imagine? Native looks blurry. Not terribly blurry, but it's still there.

But yes, if you don't care about boosting framerate but image quality is what you are after, DLDSR + DLSS Q is unbeatable. It feels like cheating honestly :D

Yup, it uses TAA; native looks decent tbf, definitely one of the better TAA implementations. Unfortunately, the DLSS implementation was botched: AA worked fine with DLSS enabled at launch, but the sharpness setting was always on, so you got all those lovely over-sharpening artefacts. Then the devs released an update that fixed that, i.e. the sharpness slider now works, but in the process it somehow disabled AA when DLSS is enabled, so you get some shimmering and aliasing now. Still looks better than native and FSR, but it could be better.

Another game where DLDSR + DLSS looks great is RDR 2. You need to make sure you're using a >2.5.1 DLSS version though; sadly Rockstar locked down file modification with their Social Club launcher, so you won't be able to swap out the DLSS DLL there, but if you have it on Steam, no issues:

 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338

Frame generation also leaves a very positive impression when used on the RTX 4060 Ti/8G. It is remarkable how much DLSS 3.0 can increase frame rates; with correctly selected settings and, ideally, a suitable frame limit, the result is extremely pleasing. In our case - from the position of long-term tester/critic - it is also clearly visible that the AI is actually capable of learning. Distracting DLSS 3.0 artifacts are significantly reduced over time, and we were able to see this in Cyberpunk 2077 and Forza Horizon. The HUD in both games when using DLSS 3.0 is now much less error-prone than at the time of earlier tests. We recently criticised annoying artifacts in Forza when testing the DLSS 3.0 capabilities of the RTX 4070; testing the RTX 4060 Ti now, this artifact formation is already significantly reduced. Bravo! In the image-quality test video, you can still see some of these errors in Cyberpunk 2077, where the quest icon is particularly noticeable, but this too is much less conspicuous today.


In Diablo IV, both DLSS and FSR really shine, because both techniques avoid the usual problems while delivering the expected advantages in full. As a result, DLSS and FSR in the "Quality" setting can render details in some objects a little more finely than native resolution, and otherwise show identical image sharpness at higher FPS. Sharpness drops only minimally to slightly with the Performance preset, but remains at a high level.


Image stability is also consistently better with DLSS and FSR than at native resolution. It doesn't matter whether Ultra HD or Full HD is the target resolution, or whether you are using Quality or Performance mode: the temporal upsampling from AMD and Nvidia delivers a calmer picture than the game's own TAA. In UHD, the result can be described as virtually flicker-free in both cases.

DLSS and FSR already perform very well compared to native resolution, and when comparing at the same render resolution both technologies are far superior: Ultra HD with FSR/DLSS set to "Performance" (i.e. rendered in Full HD before upsampling) looks far better than native Full HD, which is fuzzy and flickers on top of that. Even compared to native WQHD, the upsampling is better.
DLSS 2 and FSR 2 are in a close duel in Diablo IV. In Quality mode, the two technologies can be regarded as equivalent. DLSS performs a little better in Performance mode, but that is nitpicking at a high level. In this hack 'n' slash, too, the fewer pixels there are, the better DLSS does relative to FSR, but the differences in Diablo IV are surprisingly small.

Diablo IV also features Nvidia DLAA, which sets DLSS to native resolution without upsampling. AMD theoretically allows the same with FSR 2, but does not name the mode separately and Diablo IV does not offer this mode anyway.

As a result, DLAA has the best image quality. The image does not flicker at all, and the image sharpness is very good. Nevertheless, the difference compared to DLSS on "Quality" is minimal, especially in Ultra HD.
AMD FSR 2 and Nvidia DLSS 2 should definitely be enabled in Diablo IV if the graphics card supports the technology. The quality setting of both methods offers a better picture than the native resolution, regardless of the resolution in which upsampling is used. With the same render resolution, the advantage is very large, but at the same time there are no significant disadvantages.


The Performance mode also works very well and can sometimes be used in Ultra HD without any loss of quality, even though internal rendering is only in Full HD. At the same render resolution, FSR and DLSS produce a significantly better image. So in Diablo IV, temporal upsampling puts on a real showcase performance - which is probably helped by the game's isometric perspective.

Nvidia DLSS 3 (Frame Generation) in analysis

Aside from DLSS Super Resolution (aka DLSS 2), Diablo IV also offers DLSS Frame Generation (aka DLSS 3) on GeForce RTX 4000 graphics cards. ComputerBase also took a look at this technology, but it turned out to be shorter than planned.

There were no problems with image quality; the AI-generated frames could not be distinguished from the rendered ones. However, there was a problem with the FPS measurements with DLSS 3 on the test system, across different GeForce RTX 4000 cards.

If Frame Generation was activated without Super Resolution, everything was fine. The GeForce RTX 4070 used for test purposes rendered 22 percent more FPS through the generated images - at least a small increase.

But when Frame Generation was combined with DLSS Super Resolution, FPS fell instead of rising. DLSS on "Quality" became 3 percent slower with FG, and DLSS on "Performance" lost 6 percent. That shouldn't be possible, and on the test system it affected not only the GeForce RTX 4070 but also the GeForce RTX 4090 - so an initially suspected lack of memory can be ruled out.
It is currently completely unclear why Frame Generation works on its own on the test system but suddenly no longer works in combination with DLSS Super Resolution. It is probably a bug in the game which, as has now been established, does not only affect the editorial team's test system.



Don't know about anyone else, but I'm having a blast with Diablo 4; it plays so well and I'm using DLAA, basically 120+ fps most of the time. Puts every other triple-A title released this year to shame!
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
Nice. How has your vram been holding up? Did you run out at any point? :D

Still smooth sailing here :D :cry: Diablo 4 allocates 9-10GB but only 5GB is actually in use; that's at 3440x1440 maxed with DLAA, and the game looks great.

"Looks like it is still paying of being a RTX gamer" seems bit misleading given than 2xxx and 3xxx cards cant use DLSS3.

The curious part of me wonders if these games get their non-DLSS mode nerfed with updates; notably, day-1 Cyberpunk 2077 runs faster than the newest version without DLSS.

I think they are referring to the DLSS quality it provides as well as DLAA, which in itself is definitely a very nice pro. It certainly would be nice to get frame generation too, given how highly sites are now rating it.

Not sure what other games you are referring to, but CP 2077 has had graphical improvements, especially with ray tracing, e.g. adding locally cast RT shadows to objects.
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
Played it on a mate's 4090 rig using his Alienware OLED, and at 3440x1440 max settings it was using 15.1GB out of a total of 15.80GB allocated.

The devs definitely put some good work into memory management and texture streaming.

Defo not seeing anything like that here:

[screenshots]


Shock horror that kind of thing can be "optimised" to work on a variety of hardware :p :D

The game was made for OLED; the dark areas look soooo good. May fire it up on the 4K 55" OLED later on, but I don't want to use a controller for it! Will have to give DLDSR a go too; no doubt it will really shine in this game, what with plenty of perf. left on the table.
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
I wonder what the devs have done, maybe something in the background that detects GPU VRAM, as I barely see mine go above 7GB allocated, let alone 5GB dedicated usage (and that's with the high-res pack installed too).
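
If anyone wants to sanity-check the allocated vs actually-used numbers themselves, a quick way is to log what the card reports while playing. Rough sketch below using nvidia-smi polling; note this shows board-wide usage, not a single game's allocation, so treat it as a rough guide:

Code:
# Rough sketch: log what the card reports once a second while playing.
# This is board-wide VRAM in use, not a single game's allocation.
import subprocess
import time

def vram_used_total_mib():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip().splitlines()[0]
    used, total = (int(x) for x in out.split(","))
    return used, total

while True:
    used, total = vram_used_total_mib()
    print(f"{used} MiB used of {total} MiB")
    time.sleep(1)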
 
Caporegime · OP · Joined 4 Jun 2009 · Posts 31,338
That's why I'm holding out for the 5090 Super Duper Ti 48GB model; 24GB cards just ain't gonna be enough soon.

You joke, but unless PC starts getting proper DirectStorage utilisation, and if games keep coming out broken, 24GB won't be enough for launch-day gaming :)
 