
DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now

@Calin Banc

Great info there! It would explain why my dedicated VRAM usage is hardly ever going above 6-7 GB at most in Diablo, even when using native 3440x1440 with DLAA, and I still haven't seen any issues with textures not rendering/loading either, at least nothing that jumps out. I have switched to DLSS Quality now though, since I want to get as close to 175 fps as possible; I'm now getting 150-160 instead of 120 with DLAA and tbh I don't notice a huge difference in IQ, which shows how good DLSS is :cool:

BTW, Alex's post below that too:

[screenshot: HceGrDW.png]



Basically what some of us knew all along. Hence this meme, which I love, as I get this lovely experience working with software developers every day :cry:

[meme: Ugjujvc.png]





Also, BF 2042 got its DLSS updated to 2.5.0 with the patch earlier in the week. Shame they didn't use the 2.5.1 version, but it's still nice to have it now, as you couldn't swap the file out yourself without the DLSS option being disabled since the game uses EAC.
 
Or maybe it would be more cost effective to add a few more GB of memory rather than spending hundreds or thousands of software developer man hours trying to squeeze a quart into a pint pot.
 

And why should customers have to pay to avoid/brute-force their way past issues that could be fixed, as shown? Especially when the vast majority of customers have 8 GB or less...

[image: ZCBXJRs.png]


The only GPUs with 16 GB or more of VRAM are:

6800/xt
6900xt/6950xt
7900xt(x)
3090
4080
4090

With the exception of the RDNA 2 GPUs, all of those cost £800+... I don't think that is appropriate pricing just to get more VRAM to avoid issues that could be fixed.

Having seen Calin Banc's post, it just shows that software/game developers, as per usual, don't want to optimise (hence why they now also rely on upscaling tech to avoid optimising), although this most likely isn't their fault directly; it's more project management/stakeholders pushing for features to get the MVP build out rather than optimisation.

In my workplace, one of our main roles is to scale hardware appropriately, but also to optimise the app to work as efficiently as possible on appropriately specced infra; otherwise projects get scrapped because the cloud costs of the infra hosting a highly unoptimised app are extortionate.
 
Or maybe it would be more cost effective to add a few more GB of memory rather than spending hundreds or thousands of software developer man hours trying to squeeze a quart into a pint pot.
This right here: complaining about higher VRAM usage / insisting on keeping VRAM usage low in a thread dedicated to DLSS 3 and RT, which themselves increase VRAM usage...
 

No one is complaining, simply pointing out that, as usual, it's silly to point the finger at only one company to fit the narrative rather than at the real culprit. But hey, people obviously love to complain about the issue yet seem happy to pay ££££ to companies worth billions to avoid/brute-force their way around it...

PS. DLSS (Deep Learning Super Sampling) in itself does not increase VRAM usage... it's the frame generation aspect that does.

The day we start getting textures that actually look good/4K and a variety of unique assets/textures rather than the same old copy and paste is the day these higher-VRAM GPUs will show an actual benefit; until then, high-VRAM GPUs are nothing but a band-aid.
 
Last edited:
Depending on the area of the game, DLSS provides a huge uplift in fps with basically no hit to IQ :cool: native DLAA vs DLSS


Tried out some DLDSR with DLSS and it works well, but there's not a huge difference in IQ since the DLSS implementation in this game is very good, so I can't be assed with the faff of changing the resolution in Windows first.
 
This right here: complaining about higher VRAM usage / insisting on keeping VRAM usage low in a thread dedicated to DLSS 3 and RT, which themselves increase VRAM usage...
It's not so much that for me; it's that we have hard evidence that adding an extra 4 GB can cost as little as $30 (see the RX 5500 XT 4 GB vs 8 GB), and we also have hard evidence that profit margins are at record highs. But apparently adding an extra $50-60 of memory and maybe eating into your profit margin by a tiny amount is unacceptable, so they charge an extra $100 and/or pass the costs onto developers.
 
[quote of the earlier post, including the Alex screenshot HceGrDW.png]

Apparently the man knows what he's talking about:
Yeah I'd also love to know how so, as my experience writing graphics drivers for a major IHV for the past nine years has had me dealing with this issue all the time...
coinciding perhaps with D3D12 being used in ports? D3D11 automated residency management and thus didn't run into this.
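
For anyone curious what that D3D11 vs D3D12 difference looks like in practice, here's a rough sketch (my own illustration, not from the tweets; the helper name is made up). D3D11 paged resources in and out of VRAM for you, whereas a D3D12 app manages residency of its heaps itself:

```cpp
// Minimal sketch of explicit residency management in D3D12. D3D11 did this
// automatically; in D3D12 the app calls ID3D12Device::MakeResident / Evict
// (or ID3D12Device3::EnqueueMakeResident to avoid blocking) on its heaps.
#include <d3d12.h>
#include <vector>

// Hypothetical streaming helper: page hot texture heaps into VRAM and let the
// OS demote cold ones back to system memory.
void UpdateResidency(ID3D12Device* device,
                     const std::vector<ID3D12Pageable*>& hotHeaps,
                     const std::vector<ID3D12Pageable*>& coldHeaps)
{
    if (!hotHeaps.empty())
    {
        // Can fail with E_OUTOFMEMORY if the process is over its VRAM budget,
        // so real engines track the budget and trim before calling this.
        device->MakeResident(static_cast<UINT>(hotHeaps.size()), hotHeaps.data());
    }
    if (!coldHeaps.empty())
    {
        // Eviction lets the OS move these allocations out of local VRAM.
        device->Evict(static_cast<UINT>(coldHeaps.size()), coldHeaps.data());
    }
}
```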

And also apparently he's schooling software engineers too

Anyway, some stuff


He: And now it's the false understanding that the CPU and GPU are running in sync, while in reality GPU frames are queued ahead of time and the texture data copy starts as soon as the frame is queued, ahead of its execution.
Other guy: Cool, we target 30 fps, no problem adding another 0.5 ms. We target 160 (maybe 320) fps and barely fit in a 6.25 (3.125) ms budget with all needed textures in VRAM already.
He: The time between initiating the copy from the UMD and the hardware executing it is _under five hundred microseconds_


I'm not telling you "let's do X", I'm telling you "X is the way things are being done and have been done" - PC wouldn't work otherwise as apps can and commonly do overcommit vidmem.

Reminds me of this dev and his "don't worry about it, put all into memory"...
And if more VRAM is helping devs, how long will 16GB last? Not least, DLSS and RT/PT should help devs even more, as they shorten development time, but I don't see that plastered all over.
:D
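
Going back to the "apps can and commonly do overcommit vidmem" line quoted above, here's a rough sketch (again my own illustration, helper name made up) of how a D3D12 app can see the local VRAM budget Windows has granted it via DXGI, which is how an engine knows it is overcommitting and should trim:

```cpp
// Minimal sketch: query the OS-provided local VRAM budget with DXGI.
#include <dxgi1_4.h>
#include <cstdio>

// Hypothetical helper: returns true if this process is using more local
// (on-card) memory than Windows has budgeted for it.
bool IsOvercommittingVram(IDXGIAdapter3* adapter)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter->QueryVideoMemoryInfo(
            0,                                // node index (single-GPU case)
            DXGI_MEMORY_SEGMENT_GROUP_LOCAL,  // dedicated VRAM, not shared system RAM
            &info)))
    {
        return false; // query failed/unsupported; assume we're fine
    }

    std::printf("VRAM budget: %llu MB, current usage: %llu MB\n",
                static_cast<unsigned long long>(info.Budget / (1024ull * 1024ull)),
                static_cast<unsigned long long>(info.CurrentUsage / (1024ull * 1024ull)));

    return info.CurrentUsage > info.Budget;
}
```

Going over the budget still "works" because the OS pages allocations out to system memory, but that's exactly when the texture streaming / stutter pain people describe tends to show up.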
 
I really like the idea of DLAA being a preset option in games that have DLSS support. It's the best of both worlds, so I'm looking forward to seeing it selectable using this new mode. It seems to be different from just DLAA inside DLSS; rather, you get all the benefits of DLAA whilst still getting some of the benefits of DLSS too, thanks to all the AI stuff.

Yesterday's TPU link: https://www.techpowerup.com/309779/...aa-adoption-to-turn-it-into-a-dlss-preset?amp

Personally I'll still use DLDSR where possible for that sweet 5120x2160, but not all games, including many new ones, support DLDSR, as they don't have fullscreen presentation, or, if they do, they don't list a DSR resolution in the selectable options.
 
The funniest part is that gamers with low-VRAM cards can't play so many games day one, so they moan about bad optimization for several months until eventually the dev gives in and lowers the graphics quality for them, and then they can finally play the game. This then rinses and repeats, while gamers who don't have VRAM problems have already finished the game and are playing a new one, having fun.



 

Well, to be fair, VRAM isn't the only thing that may be causing problems with new games. There's been plenty of stutter-struggle stuff lately, and the mouse movement in TLOU was botched at launch, for example. Not to mention waiting often means cheaper games too. Currently waiting on a sale for Diablo 4 :p
 

Exactly, an utterly stupid comment to make when sites even point out issues on the very best PC hardware there is... And/or botched settings, such as in Hogwarts, where ray tracing was, and to my knowledge still is, broken, and/or where using one preset lower actually provides better visuals... Not to mention that, what is it, 80+% of Steam gamers have 8 GB or less of VRAM? Also, lower graphical quality? Alex has already proven that is not the case; in fact, it's the complete opposite.

Diablo 4 is a breath of fresh air, runs absolutely fantastic compared to the **** show of games released recently.
 
DLSS 2 is brill. I keep it on in any game that supports it, set to Quality mode, and I play with v-sync at 60 fps, so I've noticed lower power consumption and fan noise too.
I can't tell the difference in image quality with DLSS on vs off, but I don't go lower than the Quality preset.
I'm sure if I took screenshots and zoomed in I might notice a difference, but during gameplay I just enjoy the extra frames.
It's a shame old cards can't use it and FSR isn't up to scratch yet, but maybe in time it will improve and even a GT 730 will run Cyberpunk maxed out with FSR all the way to giga chad performance?
 
The only place I really notice it is wires/fences, which can look shimmery, but it varies depending on the game, and it's nothing I'm not used to seeing after being a gamer for many years, so it doesn't take me out of the experience as much as low/choppy fps does.
 

In Atomic Heart, the TAA, DLSS and FSR implementations all use a sharpening filter in the render path without the ability to tweak the sharpening values through separate sliders. Each upscaling and anti-aliasing solution is using high sharpening values, set by the developers, which might look oversharpened in most sequences of the game, especially at lower resolutions, however, the sharpening filter does not cause any negative side effects or artifacts during gameplay.

Atomic Heart is a fast paced first person shooter, so when using any temporal upscaling solutions, the temporal stability of the image is key to enjoyable gameplay. When using DLSS, the image was stable in motion in Quality modes and the level of detail rendered in vegetation and tree leaves is improved in comparison to the in-game TAA solution, however, the DLSS implementation struggles to render the power lines correctly, resulting in quite noticeable shimmering across all resolutions and quality modes, and shimmering is visible even when standing still. Only the FSR 2.2 image manages to render the power lines correctly in this game.

Speaking of FSR 2.2 image quality, the FSR 2.2 implementation comes with noticeable compromises in image quality, in favor of performance, in most sequences of the game. We spotted excessive shimmering and flickering in motion on vegetation, tree leaves and thin steel objects, which might be quite distracting for some people. While the amount of shimmering is less pronounced in comparison to the average FSR 2.1 implementation, shimmering is clearly more visible than in either the in-game native TAA or DLSS image output. Also, there are quite noticeable shimmering issues on weapon scopes, which glow brightly and blink in motion, especially at lower resolutions. The second-most-noticeable difference in the FSR 2.2 implementation compared to the in-game TAA or DLSS solution is a softer and less detailed overall image quality, which is especially visible with grass and vegetation in general.

The NVIDIA DLSS Frame Generation implementation is excellent in this game, and the overall image quality is mostly impressive. Even small particle effects, such as different varieties of the enemy's laser abilities, are rendered correctly, even during fast movement. However, as with some other DLSS Frame Generation games that we've tested, which had issues with a very jittery-looking in-game on-screen UI, the DLSS Frame Generation implementation in Atomic Heart also suffers from this issue. Interestingly, you can reduce these jittering problems by manually upgrading the DLSS Frame Generation version from 1.0.3 to 1.0.7. On a positive note, the DLSS Frame Generation implementation in Atomic Heart does not force you to enable DLSS Super Resolution, as some other Unreal Engine 4 games do.

Speaking of performance, compared to DLSS, FSR 2.2 has slightly smaller performance gains across all resolutions, while also producing more image quality issues. Most of the recent Unreal Engine 4 games have poor CPU multi-threaded performance, as the CPU usage is mostly single-threaded, and high-powered GPUs such as the GeForce RTX 4080 can end up CPU bottlenecked in some sequences of the game, even at 4K. In such a CPU-limited scenario, very welcome help comes from the DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect almost doubled performance at 1440p and 1080p, and during our testing, overall gameplay felt very smooth and responsive; we didn't spot any issues with input latency.
 

They still haven't even released RT, have they? What a joke. The game was used to showcase RT when the 2080 Ti came out.
 