DLSS Momentum Continues: 50 Released and Upcoming DLSS 3 Games, Over 250 DLSS Games and Creative Apps Available Now

@Calin Banc

Great info there! Would explain why my dedicated VRAM usage is hardly ever going above 6-7GB at most in Diablo, even when using native 3440x1440 with DLAA. Still haven't seen any issues with textures not rendering/loading either, at least nothing is jumping out. I have switched to DLSS Quality now though, since I want to get as close to 175 fps as possible; currently getting 150-160 instead of 120 with DLAA and tbh I don't notice a huge difference in IQ, shows how good DLSS is :cool:
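
If anyone wants to log their own dedicated VRAM usage while playing, a minimal sketch using the NVML Python bindings (pip install nvidia-ml-py) would look something like this; the 5-second polling interval is just an arbitrary choice:

```python
# Minimal VRAM logger using the NVML bindings (pip install nvidia-ml-py).
# The 5-second poll interval is an arbitrary choice.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"dedicated VRAM: {used_gb:.1f} / {total_gb:.1f} GB")
        time.sleep(5)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```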

BTW, Alex's post below that too:

[screenshot of Alex's post]

Basically what some of us knew all along. Hence this meme, which I love, as I get this lovely experience working with software developers every day :cry:

[meme image]

Also, BF 2042 got its DLSS updated to 2.5.0 with the patch earlier in the week. Shame they didn't use the 2.5.1 version, but still nice to have it now, as you couldn't swap the file out yourself without the DLSS option disabling, since the game uses EAC.
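
For games without anti-cheat, the usual manual swap is just replacing nvngx_dlss.dll in the game folder with a newer build. A rough sketch of that (the paths here are hypothetical examples, not real install locations):

```python
# Rough sketch of the manual DLSS DLL swap people do on games *without*
# anti-cheat (EAC-protected games like BF 2042 will flag or block this).
# Paths below are hypothetical examples, not real install locations.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install dir
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # e.g. a newer DLSS build

target = game_dir / "nvngx_dlss.dll"
backup = target.with_name(target.name + ".bak")

if not backup.exists():
    shutil.copy2(target, backup)   # keep the shipped version the first time
shutil.copy2(new_dll, target)      # drop in the newer DLSS DLL
print(f"swapped {target.name}; original kept as {backup.name}")
```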
 
Or maybe it would be more cost-effective to add a few more GB of memory rather than spending hundreds or thousands of software developer man-hours trying to squeeze a quart into a pint pot.

And why should customers have to pay to avoid/brute-force through issues that could be fixed, as shown? Especially when the vast majority of the "customers" have 8GB or less...

[chart: VRAM distribution among gamers]

The only GPUs with 16GB or more of VRAM are:

6800/xt
6900xt/6950xt
7900xt(x)
3090
4080
4090

With the exception of the RDNA 2 GPUs, all the others cost £800+... I don't think that's appropriate pricing just to get more VRAM to avoid issues that could be fixed.

Having seen Calin Banc's post, it just shows that software/game developers, as per usual, don't want to optimise (hence why they now also rely on upscaling tech to avoid optimising), although this most likely is not their fault directly but more project management/stakeholders pushing for features to get the MVP build out, over optimisation.

In my workplace, one of our main roles is to scale hardware appropriately, but also to optimise the app to work as efficiently as possible on appropriately specced infra, otherwise projects get scrapped because the cloud costs of the infra hosting a highly unoptimised app are extortionate.
 
This right here... complaining about higher VRAM usage/insisting on maintaining low VRAM usage in a thread dedicated to DLSS 3 and RT, which in themselves increase VRAM usage...

No one is complaining; simply pointing out that, as usual, it's silly to only point the finger at one company to fit the narrative rather than at the real culprit. But hey, people obviously love to complain about the issue, yet they seem to enjoy paying ££££ to these companies worth billions to avoid/brute-force the issues...

PS: DLSS (deep learning super sampling) in itself does not increase VRAM usage... it's the frame generation aspect which does.
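
For a rough sense of why FG costs VRAM: it has to keep extra full-resolution buffers around (generated frame, optical-flow data, history). Back-of-envelope numbers below, with the buffer count and pixel format being assumptions for illustration rather than NVIDIA's documented layout:

```python
# Why FG costs VRAM where plain upscaling doesn't: it keeps extra
# full-resolution buffers (generated frame, optical-flow data, history).
# Buffer count and pixel format are assumptions for illustration only.
width, height = 3840, 2160   # output resolution (4K)
bytes_per_pixel = 8          # e.g. an RGBA16F buffer
extra_buffers = 4            # assumed, not NVIDIA's documented layout

extra_mib = width * height * bytes_per_pixel * extra_buffers / 2**20
print(f"~{extra_mib:.0f} MiB of extra VRAM at 4K under these assumptions")
```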

The day we start getting textures which actually look good/4K, and a variety of unique assets/textures rather than the same old copy and paste, is the day these higher-VRAM GPUs will show actual benefit. Until then, high-VRAM GPUs are nothing but a band-aid.
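
To put some rough numbers on what unique, high-quality textures would actually cost (illustrative back-of-envelope maths, not figures from any particular game):

```python
# Back-of-envelope texture memory maths (illustrative numbers only).
def texture_mib(width, height, bytes_per_pixel, mip_factor=4/3):
    """Approximate VRAM for one texture including its full mip chain."""
    return width * height * bytes_per_pixel * mip_factor / 2**20

rgba8 = texture_mib(4096, 4096, 4)   # uncompressed RGBA8 4K texture
bc7 = texture_mib(4096, 4096, 1)     # BC7 block compression, 1 byte/pixel

print(f"RGBA8 4K texture + mips: ~{rgba8:.0f} MiB")
print(f"BC7-compressed:          ~{bc7:.0f} MiB")
print(f"100 unique BC7 4K textures: ~{100 * bc7 / 1024:.1f} GiB")
```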
 
Depending on the area of the game, DLSS provides a huge uplift in fps with basically no hit to IQ :cool: native DLAA vs DLSS


Tried out some DLDSR with DLSS and it works well, but there's not a huge difference in IQ since the DLSS implementation in this game is very good, so I can't be assed with the faff of changing the resolution in Windows first.
 
Well, to be fair, VRAM isn't the only thing that may be causing problems with new games. Plenty of stutter-struggle stuff lately, and the mouse movement in TLOU was botched at launch, for example. Not to mention waiting often means cheaper games too. Currently waiting on a sale on Diablo 4 :p

Exactly, utterly stupid comment to make when sites even point out issues on the very best PC hardware there is... and/or botched settings, such as in Hogwarts, where ray tracing was, and to my knowledge still is, broken, and/or where using one preset lower actually provides better visuals... Not to mention that, what is it, 80+% of Steam gamers have 8GB or less of VRAM? Also, lower graphical quality? Alex has already proven that is not the case, and in fact the complete opposite.

Diablo 4 is a breath of fresh air; it runs absolutely fantastic compared to the **** show of games released recently.
 

In Atomic Heart, the TAA, DLSS and FSR implementations all use a sharpening filter in the render path without the ability to tweak the sharpening values through separate sliders. Each upscaling and anti-aliasing solution is using high sharpening values, set by the developers, which might look oversharpened in most sequences of the game, especially at lower resolutions. However, the sharpening filter does not cause any negative side effects or artifacts during gameplay.

Atomic Heart is a fast paced first person shooter, so when using any temporal upscaling solution, the temporal stability of the image is key to enjoyable gameplay. When using DLSS, the image was stable in motion in Quality modes, and the level of detail rendered in vegetation and tree leaves is improved in comparison to the in-game TAA solution. However, the DLSS implementation struggles to render the power lines correctly, resulting in quite noticeable shimmering across all resolutions and quality modes, and the shimmering is visible even when standing still. Only the FSR 2.2 image manages to render the power lines correctly in this game.

Speaking of FSR 2.2 image quality, the FSR 2.2 implementation comes with noticeable compromises in image quality in favor of performance in most sequences of the game. We spotted excessive shimmering and flickering in motion on vegetation, tree leaves and thin steel objects, which might be quite distracting for some people. While the amount of shimmering is less pronounced in comparison to the average FSR 2.1 implementation, shimmering is clearly more visible than in either the in-game native TAA or DLSS image output. Also, there are quite noticeable shimmering issues on weapon scopes, which glow brightly and blink in motion, especially at lower resolutions. The second-most-noticeable difference in the FSR 2.2 implementation compared to the in-game TAA or DLSS solution is a softer and less detailed overall image quality, which is especially visible with grass and vegetation in general.

The NVIDIA DLSS Frame Generation implementation is excellent in this game, and the overall image quality is mostly impressive. Even small particle effects, such as different varieties of the enemy's laser abilities, are rendered correctly, even during fast movement. However, as with some other DLSS Frame Generation games that we've tested, which had issues with a very jittery-looking in-game on-screen UI, the DLSS Frame Generation implementation in Atomic Heart also suffers from this problem. Interestingly, you can reduce these jittering problems by manually upgrading the DLSS Frame Generation version from 1.0.3 to 1.0.7. On a positive note, the DLSS Frame Generation implementation in Atomic Heart does not force you to enable DLSS Super Resolution, as some other Unreal Engine 4 games do.

Speaking of performance, compared to DLSS, FSR 2.2 has slightly smaller performance gains across all resolutions, while also producing more image quality issues. Most of the recent Unreal Engine 4 games have poor CPU multi-threaded performance, as the CPU usage is mostly single-threaded, and high-powered GPUs such as the GeForce RTX 4080 can end up CPU bottlenecked in some sequences of the game, even at 4K. In such a CPU limited scenario, very welcome help comes from the DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect almost doubled performance at 1440p and 1080p, and during our testing, overall gameplay felt very smooth and responsive; we haven't spotted any issues with input latency.
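
To put rough numbers on that CPU-limited case, here's a toy model; the framerate and overhead factor are assumed values, not measurements:

```python
# Toy model of frame generation in a CPU-limited scene: the presented
# framerate roughly doubles, but input latency still tracks the rendered
# (CPU-limited) framerate. Both numbers below are assumptions.
cpu_limited_fps = 70      # assumed CPU-limited render rate
fg_efficiency = 0.9       # assumed overhead factor, not a measured value

presented_fps = 2 * cpu_limited_fps * fg_efficiency
render_frame_ms = 1000 / cpu_limited_fps
print(f"presented: ~{presented_fps:.0f} fps, "
      f"input latency still tied to the ~{render_frame_ms:.1f} ms render cadence")
```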
 
AMD, the majority of the time, blocks competitor upscaling tech, citing the BS excuse of "we want to be open"... there's nothing open about blocking other upscaling solutions from also being implemented... AMD is not so much "the good guy" after all.


Yup, the fact they refused to support an "open source solution" that would have benefited not just gamers and developers (something they like to keep reiterating) but also themselves (given how poor FSR 2.2+ adoption still is...) says it all. But sadly some people just can't see through AMD's "good guy" PR image.
 

Shame FSR is still dog ****; if only AMD would improve it, a lot of Nvidia gamers would probably switch over to them.
 
Yup, it's gotta be borderline embarrassing for AMD now to find out XeSS is overtaking them in the software-upscaling stakes. An improved version is desperately needed ASAP! Hoping something drops at Gamescom in a few weeks :)

Also FSR 3....



Not to mention it will more than likely release in a shocking state too, if history is anything to go by.

Sadly it seems AMD fans/customers are perfectly happy to wait months/years for advancements though :o

The only saving grace for those who are forced to use FSR due to AMD blocking DLSS is that Nvidia's DLDSR is so good it makes FSR somewhat more usable, especially when using the Balanced and Performance presets.
 
It'll go one of two ways... either it will be very well implemented and a true game changer up there with Nvidia's FG... or it will be a mess and we may eventually see one game with an ok-ish implementation.

No doubt the first game to have it will look great, but the question will be: can it be a consistently good implementation/experience, or will it be a complete mess for like 90% of games? And most importantly, will the adoption rate be good, or also a bit of a mess for the first 1-2 years, as has been the case with FSR so far?
 

Quite interesting, I thought. Obviously DLSS reigning supreme here, but notice the VRAM usage difference, and more often than not textures are loading in slower or not at all on the AMD side... Maybe a game issue though?
 
I'm not a developer by any stretch, but after experimenting with enabling ReBar in the Nvidia driver through Nvidia Inspector, I noticed textures in Spider-Man Remastered would take much longer to load in than when it was not enabled... Maybe (again, not a developer) FSR2 in this title is also enabling ReBar?

ReBar is a completely separate thing from upscaling tech, but yeah, it could be down to ReBar; not sure if bang4buck has it enabled on the Nvidia side for this. I was wondering if it might be an FSR upscaling thing, but there is no way it would render textures like that.
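
If anyone wants to sanity-check whether ReBar is actually active, nvidia-smi reports the BAR1 aperture size, and with ReBar on it should roughly match total VRAM rather than the legacy 256 MiB window. A quick sketch:

```python
# Quick sanity check for ReBar: nvidia-smi reports the BAR1 aperture,
# which with ReBar active should roughly match total VRAM instead of
# the legacy 256 MiB window. Requires nvidia-smi on the PATH.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "-q", "-d", "MEMORY"],
    capture_output=True, text=True, check=True,
).stdout

in_bar1 = False
for line in out.splitlines():
    if "BAR1 Memory Usage" in line:
        in_bar1 = True
    elif in_bar1 and "Total" in line:
        print("BAR1 aperture:", line.split(":")[1].strip())
        break
```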
 
If I was cynical, I would think: Bethesda is a studio wholly owned by Microsoft, and Nvidia have an awful relationship with the parent company due to past collaborations. So I don't suppose there is much of an impetus to put any of their technologies into a single game engine, especially as Microsoft has a very good relationship with their main competitor, whose tech is also in Microsoft's console.

Microsoft and Nvidia are working extremely closely together on various things, not to mention their partnership on OpenAI.

It's pretty obvious who is the culprit here in the absence of Nvidia tech ;)
 
There were also instances in that video where textures on the NV side seem blurry. It looks like the game has bugs and randomly picks the wrong texture detail level. I think that is what happened to that page in the book, rather than it being FSR-related.
See the table texture here.
That said, the NV one was better. Check the texture on the arm in the opening cut-scene, for example. Much better detail on DLSS.
[comparison screenshot]

Obviously 24GB of VRAM is not enough! :D
 