
Anybody else resenting AMD because of DLSS?

I think it's laughable that the once great PC gaming "master race" is fighting over adaptive image quality. The whole premise of PC gaming was simple: options to dial in your settings for your hardware, or at the high end, no compromise at all.

When did people start lapping up compromise? People have no right to say they are running a game at 4K and then switch on a setting like DLSS; you are no longer playing at 4K, and that bragging right is gone.

I don't upgrade to the really high end anymore because, quite frankly, most console games look better than PC games. Please, no one mention Cyberpunk, that game is trash.

GOW with its latest patch and Spider-Man on PS5 look way better than anything out on PC right now as an overall presentation, plus they are good games, whereas Control I don't find interesting at all.

So far, ray tracing in RTX games has been used on underwhelming titles and no true big release. What has people here peeved is that all of the good games are on consoles, or are likely going AMD for ray tracing, and they just can't deal with it.

Ratchet & Clank is an amazing game and I'm sure the new one will be too; from the previews it looks amazing. I can't think of one RTX game on PC that will be as good or actually look as good, and DLSS will just make it worse.

Is the only thing the so-called PC master race clamours for just FPS and FPS alone, which DLSS delivers? Is that it?

PC gaming used to be about a balanced delivery of the two. The days where most console games look way better than PC games are upon us, and not to mention at 60 FPS, which is the standard...

I will let the DLSS fanboys fight amongst themselves in this thread.

Never gave a stuff about bragging rights, tbh. As long as it looked great, I didn't care how it got there. It should be about the gaming, not splashing cash on expensive hardware just to brag to your mates.
 
So I just spent some time flicking between DLSS on and off in Control. Not sure if it's broken or what, as there is so little difference, though you can see some difference on the edges. I also made sure to take plenty of screenshots of the photos/paintings on the wall with DLSS on and off, and some of the blurry paintings show no difference at all: even with DLSS off they are still blurry (I couldn't get back to the very start of the game). So it looks like this is perhaps more of a game issue than a DLSS issue?

But I'm still wondering if my DLSS toggle is broken... then again, people rate Control as the best showcase for DLSS 2.0.
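
If anyone else wants to sanity-check whether the toggle is actually doing anything, a quick way is to diff two screenshots taken from the exact same spot, one with DLSS on and one with it off. Rough Python sketch below (the filenames are just placeholders for your own captures, and both shots need to be the same output resolution):

```python
# Rough sketch: diff two screenshots (same camera position, DLSS off vs on)
# to check whether the toggle actually changes the rendered image.
# "dlss_off.png" and "dlss_on.png" are placeholder filenames.
import numpy as np
from PIL import Image

off = np.asarray(Image.open("dlss_off.png").convert("RGB"), dtype=np.int16)
on = np.asarray(Image.open("dlss_on.png").convert("RGB"), dtype=np.int16)

diff = np.abs(off - on)                     # per-pixel absolute difference
print("mean abs difference:", diff.mean())  # ~0 means the images are essentially identical
print("pixels changed:", np.count_nonzero(diff.max(axis=2) > 8))

# Save an exaggerated difference map so edge/detail changes are easy to spot.
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("dlss_diff.png")
```

If the mean difference comes out at essentially zero across a few different spots, the toggle probably isn't doing anything.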

[Screenshot attachments: Control comparison shots with DLSS on and off]
 
But as shown in the Gamers Nexus video, DLSS is actually better than some forms of AA because it doesn't introduce the problems of poor AA methods, i.e. jaggies and so on. As noted in the Cyberpunk video, you need to make sure you aren't using other post-processing effects like film grain, motion blur and chromatic aberration (which should always be turned off anyway), as these in combination with DLSS can cause terrible artifacting.

Yes, in movement/motion you can see a ghosting/trailing-like effect. I imagine this will improve as the tech matures, but ultimately it's down to how the final image is presented/rendered, and I fully expect the open source method from Microsoft/AMD/Nvidia to have the same issues.

madVR/MPC-HC for media playback has similar issues depending on which scaling algorithm you use: some have more haloing than others, others have more jaggies, and so on.
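
If you want to see that trade-off outside of a game or madVR, here's a minimal sketch that upscales the same image with a few of Pillow's resampling filters (these are only stand-ins for the algorithms madVR actually offers, and the source filename is a placeholder):

```python
# Minimal sketch: upscale one image with different resampling filters to compare
# the trade-offs (jaggies vs ringing/halos vs blur). Pillow's filters are only
# stand-ins here, not the algorithms madVR itself implements.
# "source_1080p.png" is a placeholder filename.
from PIL import Image

src = Image.open("source_1080p.png")
target = (src.width * 2, src.height * 2)      # 2x upscale, e.g. 1080p -> ~4K

filters = {
    "nearest": Image.NEAREST,    # hard edges, visible jaggies
    "bilinear": Image.BILINEAR,  # soft, tends to blur fine detail
    "lanczos": Image.LANCZOS,    # sharper, but can ring/halo around edges
}
for name, resample in filters.items():
    src.resize(target, resample=resample).save(f"upscaled_{name}.png")
```

Flicking between the outputs at 100% zoom makes the "no free lunch" point pretty obvious.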

So, to summarise: in its current state it's not the perfect, flawless solution so many believe it to be. It has issues, though so does, for example, TAA (in some games it causes horrible blur across the whole image). As things stand, whichever method we choose is full of compromises, and each person will have different preferences. Personally I prefer real resolution to upscaling on static images, but having checked DLSS, CAS and TAA in actual gameplay, I can't really tell the difference, even though on static images it's clear. Unless, as I said, there are some huge and obvious artifacts, but those usually happen at 1440p and lower resolutions, not at 4K (there are enough pixels there for DLSS to generate a very nice image).
 
You can, because it's not lower res; DLSS is trained at 16K.

That's not how it works. The per-game training at 16K was DLSS 1.0, and we all know how that looked (a blur-fest). DLSS 2 is universal and no longer trained for specific games. It still has to "think" and "imagine" (in very human terms) what pixels should be in place of the missing ones. Often it gets them right, but there are times when it's more or less wrong. Then when you compare native 4K versus DLSS at 4K and look closely, you can often see that both might look good, but in many places the two images are simply different. Not better or worse, just different, as DLSS filled in pixels that wouldn't otherwise be there but that fit the algorithm.
 
I don’t understand why people get their knickers in a twist about DLSS. You don’t HAVE to use it. It’s an option, a choice. You know that thing we used to have before 2020....

Not exactly. The more popular it gets and the more people have GPUs that support it, the less devs will care about optimising their products; instead they'll just say "turn on DLSS for higher FPS". There's a risk that soon enough we'll need DLSS on just to get 60 FPS again, as devs won't even try anymore if DLSS can save them time and money.
 
Not exactly. The more popular it gets and the more people have GPUs that support it, the less devs will care about optimising their products; instead they'll just say "turn on DLSS for higher FPS". There's a risk that soon enough we'll need DLSS on just to get 60 FPS again, as devs won't even try anymore if DLSS can save them time and money.

Things like this already happen.

You don't need to control file sizes like you used to, and faster hardware can brute-force garbage code.
 
Not exactly. The more popular it gets and the more people have GPUs that support it, the less devs will care about optimising their products; instead they'll just say "turn on DLSS for higher FPS". There's a risk that soon enough we'll need DLSS on just to get 60 FPS again, as devs won't even try anymore if DLSS can save them time and money.

I think this would be a real danger if both AMD and Nvidia had similar enough technology that it was easy to flip the setting on and it covered all modern titles (or some soon-to-be-released ones too). As it stands, AMD won't be ready until at least the summer, and Nvidia has to work on the game and then process it for training so it can be put into a Game Ready driver. Currently there aren't enough supported games out for this to be a lazy developer's dream, IMO, but I understand where you are coming from.
 
I think this would be a real danger if both AMD and Nvidia had similar enough technology that it was easy to flip the setting on and it covered all modern titles (or some soon-to-be-released ones too). As it stands, AMD won't be ready until at least the summer, and Nvidia has to work on the game and then process it for training so it can be put into a Game Ready driver. Currently there aren't enough supported games out for this to be a lazy developer's dream, IMO, but I understand where you are coming from.

DLSS 2.0 doesn’t require training per game. A developer can implement it on a new game and it’ll just work, but it still requires testing to make sure it’s working well enough to include.
 
Things like this already happen.

You don't need to control file sizes like you used to, and faster hardware can brute-force garbage code.

Agreed. DLSS helps the consumer make poorly optimised and rushed games more playable, something which isn't going to change anytime soon. Companies will keep setting ridiculous deadlines at the expense of quality.
 
Agreed. DLSS helps the consumer make poorly optimised and rushed games more playable, something which isn't going to change anytime soon. Companies will keep setting ridiculous deadlines at the expense of quality.
Why would a poorly optimised, rushed game have a good implementation of DLSS? Surely it will have a poor implementation of DLSS?
 
DLSS 2.0 doesn’t require training per game. A developer can implement it on a new game and it’ll just work, but it still requires testing to make sure it’s working well enough to include.

You are correct; I carried that over from 1.0. I was sure I had read that it still isn't so easily integrated, though. Here is a snippet of a summary showing that integration is not as simple as you make it sound:

The catch to DLSS 2.0, however, is that this still requires game developer integration, and in a much different fashion. Because DLSS 2.0 relies on motion vectors to re-project the prior frame and best compute what the output image should look like, developers now need to provide those vectors to DLSS. As many developers are already doing some form of temporal AA in their games, this information is often available within the engine, and merely needs to be exposed to DLSS. None the less, it means that DLSS 2.0 still needs to be integrated on a per-game basis, even if the per-game training is gone. It is not a pure, end-of-chain post-processing solution like FXAA or combining image sharpening with upscaling.
Source: AnandTech
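
To make the motion-vector bit concrete: the engine hands the upscaler, for each pixel of the new frame, a vector pointing at where that pixel was in the previous frame; the previous output gets warped ("re-projected") along those vectors and blended with the new low-resolution frame. Very loosely, and ignoring everything the neural network adds on top, the plumbing looks something like this (toy numpy sketch, nothing to do with NVIDIA's actual implementation):

```python
# Toy sketch of temporal re-projection: warp the previous output frame along
# per-pixel motion vectors, then blend it with the current (upscaled) frame.
# Only meant to illustrate why the engine must expose motion vectors; this is
# nothing like NVIDIA's real DLSS network.
import numpy as np

def reproject(prev_frame: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """prev_frame: (H, W, 3) previous output. motion: (H, W, 2) offsets saying,
    for each pixel of the current frame, where it sat in the previous frame."""
    h, w, _ = prev_frame.shape
    ys, xs = np.indices((h, w))
    src_y = np.clip(ys + motion[..., 1].round().astype(int), 0, h - 1)
    src_x = np.clip(xs + motion[..., 0].round().astype(int), 0, w - 1)
    return prev_frame[src_y, src_x]

def temporal_blend(current: np.ndarray, prev_frame: np.ndarray,
                   motion: np.ndarray, history_weight: float = 0.9) -> np.ndarray:
    """Blend re-projected history with the current frame. DLSS 2.0 effectively
    learns how to do this accumulation (and when to reject stale history)."""
    history = reproject(prev_frame, motion)
    return history_weight * history + (1.0 - history_weight) * current
```

The ghosting/trailing mentioned above is roughly what you get when that history is trusted too much for pixels that have actually changed.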
 
Anybody else resenting AMD because of DLSS?

No, but it was one of the reasons I stuck with NVIDIA. :(
That and the lower overall RT performance.

It's a real shame because I was more than ready to jump ship, but AMD just isn't there yet.
I just hope the stars align for RDNA3, as I'd really like to see NVIDIA smacked around and brought down a peg or two just like Intel has been tbh.
 
TAA itself already destroys image quality... I don't know what to say on the matter. I'll keep gaming at 1080p and avoid DLSS for now, but I really look forward to seeing how DLSS Quality mode works at such a low resolution.
 
Why would a poorly optimised, rushed game have a good implementation of DLSS? Surely it will have a poor implementation of DLSS?

I wouldn’t call it a poor implementation of DLSS in that situation as it’s only as good as the game it’s attached to.
 
Developers won't care about poor implementations, because the majority of gamers don't actually care about the image quality they get.

Shadow of the Tomb Raider and Metro Exodus have superb TAA implementations, and both games look sharp and clean even at 1080p and 1440p.

RDR 2, however? It's the worst TAA implementation ever.

But both TAA implementations do their job and eliminate jaggies...

Be it bad, poor, good or superb, DLSS will always increase the FPS a lot... and that's what matters to a lot of "gamers".
 
Developers won't care about poor implementations, because the majority of gamers don't actually care about the image quality they get.

Shadow of the Tomb Raider and Metro Exodus have superb TAA implementations, and both games look sharp and clean even at 1080p and 1440p.

RDR 2, however? It's the worst TAA implementation ever.

But both TAA implementations do their job and eliminate jaggies...

Be it bad, poor, good or superb, DLSS will always increase the FPS a lot... and that's what matters to a lot of "gamers".

I largely agree with you here. The bulk of 'gamers' out there are not on bleeding-edge hardware, and most will be running a mixture of kit that can't max out their settings at all.
 
After owning AMD cards for the last 15 years, I got my RTX 3080 FE yesterday and tested DLSS in a few games. It's a great feature if you need more FPS, especially in RT-enabled games, but I won't be using it in games where I can get over 70 FPS. It has a slight blur which I can notice even in Quality mode. The performance modes are not worth using at all unless you have seriously low FPS or a low-end system.

It may be better when upscaling from 1440p to 4K, but on a 1440p screen it is upscaling from a lower resolution still, which is probably why I notice the softer image. When not moving the camera the image quality is very good, but when you move the mouse slowly you can notice a distinct blur effect (a bit like ghosting on a low refresh rate screen) which is not noticeable with DLSS off. I have high hopes for AMD's FidelityFX Super Resolution, since I liked using FidelityFX CAS in the games that supported it, so better upscaling would seriously improve that feature.
 