
Nvidia announces DLSS 3.5, with AI-enhanced Ray-Tracing available on all RTX GPUs.

I typically scoff at upscaling but this seems to actually increase image quality. Interesting. Thoughts and opinions?

From what I've gathered, though, frame generation isn't included in this and will still be for the 4000 series only.


 
DLSS3 is only for 4000 series is it not?

Here I believe they're referring to the Ray Reconstruction element of it. Currently, if a game has DLSS 3, you can still run it on an older Nvidia card, but the frame generation toggle (which is usually separate, from what I've seen) is disabled. I'm assuming this will still remain the case?
 
*glances at his 7900xtx*

Need... to fight... FOMO...

It's easy to fight right now. The tech isn't out yet and you don't know how many games will even support it past Cyberpunk (which is becoming a tech demo now).

Obviously if it comes out swinging and there's a lot of support for it - give in to the FOMO :p
 
 
Not generally a fan of these AI implementations - often it produces something which superficially looks like the thing it is supposed to be but the more you see it the more you realise it is just an imposter of the real thing. Hope it won't be the case here as I'm a big fan of ray tracing when done properly.
 
Ah, when I said that brute-force RT will never work long-term*, I expected something clever to reduce the work, or even more dedicated hardware to offload some of it (think of the difference T&L made back in the day). I was not thinking of this kind of thing.

Instead of better hardware or a rendering stage which culls unneeded things, we get... well, more proprietary upscaling/reconstruction stuff. Still, it looks like Nvidia have yet again hijacked Cyberpunk for their PR purposes. You'd think a game with a budget in the hundreds of millions of dollars wouldn't be that easy to subvert into a tech demo, but what do I know?

* The power required for brute-force RT compared to what is state-of-the-art now probably needs a transistor leap similar to the jump from 28nm to 3nm - and that took 10-13 years, before nodes slowed down as they have recently.
 
Nvidia really should have given Frame Generation a separate name instead of bundling it into DLSS 3, because it just ends up confusing people.

Anyone looking at the new DLSS 3.5 would assume that it only works on the new 40 series cards when in actual fact the Ray Reconstruction works on all RTX cards including the 20 & 30 series.
 
I've been out of the loop on this stuff for a long while now. Does this mean the RTX 2xxx cards have got an improvement with these features, or is it just a nice-to-have?

EDIT: Doesn't look like much new has come to older cards :(
 
I've been out of the loop on this stuff for a long while now. Does this mean the RTX 2xxx cards have got an improvement with these features, or is it just a nice-to-have?

EDIT: Doesn't look like much new has come to older cards :(
Yeah, you'll still get the new ray tracing improvements on the 20xx cards minus the frame generation.
 
Nvidia really should have given Frame Generation a separate name instead of bundling it into DLSS 3, because it just ends up confusing people.

Anyone looking at the new DLSS 3.5 would assume that it only works on the new 40 series cards when in actual fact the Ray Reconstruction works on all RTX cards including the 20 & 30 series.
Really it's DLSS 2.5 but that wouldn't sound great when 3 is out.
 
Really it's DLSS 2.5 but that wouldn't sound great when 3 is out.
Lol, it's like Nvidia are going out of their way to confuse people by naming it DLSS 3.5.

I'm actually surprised they have allowed the new ray reconstruction to be used on older cards and not made it exclusive to the upcoming 50 series.
 
Lol, it's like Nvidia are going out of their way to confuse people by naming it DLSS 3.5.

I'm actually surprised they have allowed the new ray reconstruction to be used on older cards and not made it exclusive to the upcoming 50 series.
Aren't those cards scheduled for 2025 or something? RR seems to be coming out in the next month or two, and it's something that runs on existing tensor cores - a smart move to allow all of the RTX series to use it, as it adds longevity.
 
Lol, it's like Nvidia are going out of their way to confuse people by naming it DLSS 3.5.

I'm actually surprised they have allowed the new ray reconstruction to be used on older cards and not made it exclusive to the upcoming 50 series.
Maybe they have tapered the performance. If it becomes another talking point that everyone must have (don't know why, but Nvidia are experts at marketing), then if Turing gains 10%, Ampere gains 20%, Ada gains 30%, and their next gen gains 60%, it will have done its job without alienating current owners too much.

If it somehow uses more VRAM, then even better - give with one hand and take with the other for anyone not on an x90-tier card, as Nvidia are VRAM-stingy as usual.
 
The demos shown looked really good, but then they would, as they’re from Nvidia.

I’d like to see some 3rd party reviews of its effectiveness.
Additionally, it “appeared” that this was a result of better metadata. Does that mean that developers won’t need to do anything, or will there be a need to commit a whole bunch of time to implement it?
 
The demos shown looked really good, but then they would, as they’re from Nvidia.

I’d like to see some 3rd party reviews of its effectiveness.
Additionally, it “appeared” that this was a result of better metadata. Does that mean that developers won’t need to do anything, or will there be a need to commit a whole bunch of time to implement it?

Basically, all real-time RT uses some kind of denoising, because you can't trace every individual ray in real time, so renderers take a low sample count per pixel. On its own, that low sample count looks grainy, as you can easily make out the individual points of light that were calculated.
So a denoising filter is always used in combination with RT; this just replaces that denoiser with an AI-based one. It shouldn't be difficult to implement, and from what's been said, the performance with it turned on is almost the same as with it turned off - it's just meant to increase fidelity.
To be honest, this has zero reason to be part of DLSS, as technically it's completely independent of upscaling.
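To make the idea above concrete, here's a toy sketch (my own illustration, nothing to do with Nvidia's actual denoiser): a few noisy ray samples per pixel give a grainy estimate of the true radiance, and a spatial denoising filter trades a little blur for a lot less noise. A simple box filter stands in here for whatever hand-tuned or AI denoiser a real engine uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Ground truth" radiance for a 1-D strip of 64 pixels: a smooth gradient.
truth = np.linspace(0.2, 0.8, 64)

def render(samples_per_pixel):
    # Each ray sample is an unbiased but noisy estimate of pixel radiance;
    # averaging only a few samples leaves visible grain.
    noise = rng.normal(0.0, 0.3, size=(truth.size, samples_per_pixel))
    return (truth[:, None] + noise).mean(axis=1)

def box_denoise(img, radius=2):
    # Crude spatial denoiser: average each pixel with its neighbours.
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(img, kernel, mode="same")

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

noisy = render(samples_per_pixel=4)   # grainy low-sample "image"
denoised = box_denoise(noisy)         # smoother, at the cost of some blur

print(f"noisy RMSE:    {rmse(noisy, truth):.3f}")
print(f"denoised RMSE: {rmse(denoised, truth):.3f}")
```

The denoised error ends up lower than the raw low-sample error, which is the whole bargain: a learned denoiser like Ray Reconstruction aims to make that trade with far less blurring than a dumb filter.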
 