Now that DLSS is dead, can that HW be re-purposed?

Status
Not open for further replies.
Associate
Joined
28 Sep 2018
Posts
2,259
DLSS is beyond a dud but supposedly cores were dedicated for it. For those more technical, can that die space be used in other areas or are they so specialized for DLSS that they’re useless going forward?

It’d be great to get more performance out of the card and not waste it.
 
Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
DLSS is beyond a dud but supposedly cores were dedicated for it. For those more technical, can that die space be used in other areas or are they so specialized for DLSS that they’re useless going forward?

It’d be great to get more performance out of the card and not waste it.

It's funny how AMD has just created a driver toggle lol
And it's supposed to be better than DLSS
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
DLSS is not dead, it is just getting started. The Tensor cores are already proving useful for denoising as well.

But to answer your other question, the Tensor cores are used to provide 4x FP16 performance. The GeForce GTX 1660 card had to have dedicated FP16 units added back in.
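As a rough illustration of what "FP16" means numerically, here is a Python sketch using the standard library's IEEE 754 half-precision format code; this only demonstrates the number format itself, not actual Tensor-core code:

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a value through IEEE 754 half precision (FP16)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# FP16 has only a 10-bit mantissa, so precision runs out fast:
print(to_fp16(1.0))     # small powers of two round-trip exactly
print(to_fp16(0.1))     # 0.1 is rounded to the nearest FP16 value
print(to_fp16(2049.0))  # above 2048, consecutive integers aren't representable
```

The performance claim is about throughput of many such low-precision operations in parallel; the sketch only shows the precision trade-off that makes FP16 cheap in silicon.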
 
Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
It really is dead, if AMD's showcasing is to be believed.

No need to wait for devs, no performance reduction and still giving a better image quality?

And best of all, Nvidia can use it lol, it's on GPU Open
 
Associate
OP
Joined
28 Sep 2018
Posts
2,259
So far the issue seems to be that running at native res and then putting the resolution down to, say, 85% is better than DLSS

Exactly. For DLSS to work, the developers have to send their game to Nvidia for them to process it on their render farms to build a "DLSS profile." Even then a simple resolution slider gives you better image quality.

As a 2080 Ti owner, I'd rather those resources be accessible for more general usage.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
So far the issue seems to be that running at native res and then putting the resolution down to, say, 85% is better than DLSS

But that has been shown not to be true. Lowering the resolution does increase the FPS, but DLSS provides better performance and better visuals. You don't get full 4K quality, but you certainly get something closer to the original 4K image.
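For context, here is the arithmetic behind an 85% resolution slider (a hypothetical sketch; exact slider behaviour varies by game, and it is assumed here that the scale applies per axis):

```python
# Pixel counts at native 4K vs an 85% resolution scale
# (assumption: the slider scales width and height independently).
native_w, native_h = 3840, 2160
scale = 0.85

scaled_w = round(native_w * scale)  # 3264
scaled_h = round(native_h * scale)  # 1836

native_px = native_w * native_h     # 8,294,400 pixels
scaled_px = scaled_w * scaled_h     # 5,992,704 pixels

# Shading cost scales roughly with pixel count: ~72% of native work.
print(f"Shading work: {scaled_px / native_px:.0%} of native")
```

So an 85% slider already cuts the shaded pixel count by more than a quarter, which is why the fps gain from a small scale reduction is bigger than it looks.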
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
It really is dead, if AMD's showcasing is to be believed.

No need to wait for devs, no performance reduction and still giving a better image quality?

And best of all, Nvidia can use it lol, it's on GPU Open


AMD don't have anything comparable; they just have a run-of-the-mill upscaler like any TV has.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
I can't agree or disagree without trying it first, but all I can say is: so what if it achieves better? Does it matter how it's done if it's reached a similar goal?


Well, I can categorically say it won't achieve better results. Look up any of the numerous research papers on deep-learning-based upscaling. There is no debate to be had here.
 
Soldato
Joined
22 Nov 2006
Posts
23,362
So far the issue seems to be that running at native res and then putting the resolution down to, say, 85% is better than DLSS

Yea, you get more fps and it looks better, as you don't get the horrible blurring and other distortions. It made the whole feature a waste of time.

It's basically a really long-winded way of lowering the rendering res and adding loads of anti-aliasing. After the initial list of games which agreed to support it (most being pretty crap anyway), I've not seen any added since.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,053
I'm surprised at the direction they've gone with DLSS. With the hardware they are using, it should be possible to use AI to recreate MSAA at native res without the normal hit of MSAA, retaining image quality and increasing performance, rather than trying to recreate a higher-resolution image, which is what older information suggested they were attempting.
 
Soldato
Joined
24 Aug 2013
Posts
4,549
Location
Lincolnshire
I wouldn’t say it was dead. I’ve used DLSS and RTX on my 4K TV for the extra performance benefit; at a distance you can’t tell any difference and it still looks amazing.

It still has a long way to go, however. RTX and DLSS are early tech which will probably improve tenfold over the next few generations.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
Hang on a minute, nowhere have I seen any information that AMD are even upscaling the image. I assumed they were doing a basic bicubic or sinc-based transform and then using an unsharp mask to sharpen the final image. But AMD never mention image scaling, simply sharpening, which is why they call it sharpening.

In which case Nvidia have been doing this in their drivers for nearly 2 years now, on Pascal, Maxwell, Kepler as well as Turing.


Does anyone have any information that AMD are doing anything other than sharpening?
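For reference, an unsharp mask of the kind described above can be sketched in a few lines of pure Python (a toy version using a box blur on a flat grayscale array; illustrative only, not AMD's or Nvidia's actual filter):

```python
def box_blur(img, w, h):
    """3x3 box blur with edge clamping; img is a flat row-major list."""
    out = [0.0] * (w * h)
    for y in range(h):
        for x in range(w):
            total = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    total += img[yy * w + xx]
            out[y * w + x] = total / 9.0
    return out

def unsharp_mask(img, w, h, amount=0.6):
    """sharpened = original + amount * (original - blurred)."""
    blurred = box_blur(img, w, h)
    return [min(max(p + amount * (p - b), 0.0), 255.0)
            for p, b in zip(img, blurred)]

# A vertical edge: sharpening pushes the dark side darker and the
# bright side brighter, increasing local contrast across the edge.
w, h = 4, 4
edge = [50.0, 50.0, 200.0, 200.0] * 4
sharp = unsharp_mask(edge, w, h)
```

A real implementation would use a Gaussian blur and per-pixel adaptivity, but the `original + amount * (original - blurred)` structure is the core idea, and note that nothing here changes the resolution: it is sharpening, not upscaling.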
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
I'm surprised at the direction they've gone with DLSS. With the hardware they are using, it should be possible to use AI to recreate MSAA at native res without the normal hit of MSAA, retaining image quality and increasing performance, rather than trying to recreate a higher-resolution image, which is what older information suggested they were attempting.


In theory DLSS can do that; that is DLSS mode 2, where you can take, say, a native 4K image and apply DLSS to increase IQ beyond MSAA at native 4K. The problem is, I think, that people don't want a performance hit for improved image quality.

The standard mode of DLSS allows people with a 2060 or 2070 to run at close to 4K image quality without lowering too many important settings like draw distance.
 