
Anybody else resenting AMD because of DLSS?

DLSS is a must-have for me along with RTX. I finally got around to downloading Watch Dogs: Legion the other day, and the game looked absolutely stunning on my 4K LG CX OLED at 55-65 FPS on tweaked settings with max RTX and DLSS Performance. Out of curiosity I turned off DLSS and couldn't see a difference from about four feet from the TV, so that makes it a win in my book.

AMD should come out with their "AMDLSS" ASAP, as ray tracing is useless without it. The consoles are running Watch Dogs: Legion at a locked 30 FPS with RTX on because they have no DLSS. Not only that, it looks worse than Nvidia's RTX because of the low-resolution ray tracing.

I think it's laughable that the once-great PC gaming master race is fighting over adaptive image quality tricks. The whole premise of PC gaming was a few things: options to dial in your settings for your hardware, or at the high end, no compromise.

When did people start lapping up compromise? People have no right to say they are running the game at 4K and then switch on a setting like DLSS. You are not playing at 4K anymore; that bragging right is gone.

I don't upgrade to the really high end anymore because, quite frankly, most console games look better than PC games. Please, no one mention Cyberpunk; that game is trash.

GOW with its latest patch and Spider-Man on PS5 look way better than what's out on PC right now as an overall presentation, plus they are good games, whereas Control I don't find interesting at all.

All of the ray-tracing RTX games so far have been underwhelming, with no true big title. What has people here peeved is that all of the good games are on consoles, or likely going AMD for ray tracing, and they just can't deal with it.

Ratchet & Clank is an amazing game, and I'm sure the new one will be too; from the previews so far it looks amazing. I can't think of one RTX game on PC that will be as good or actually look as good, and DLSS will just make it worse.

Is all the so-called PC master race clamour for PC gaming just FPS and FPS only, which DLSS delivers? Is that it?

PC gaming used to be about a balanced delivery of the two. The days when most console games look way better than PC games are upon us, not to mention at 60 FPS, which is the standard...

I will let the DLSS fanboys fight amongst themselves in this segment.

I have both a PS5 and a PC with a 3080, and I disagree. Firstly, Miles Morales on the PS5 has lower-resolution textures, which I easily noticed at 4K compared to PC games like Valhalla and Watch Dogs: Legion. It also does not run at native 4K. If you are playing in performance mode, it constantly upscales from 1440p, while in performance RT mode the game runs at roughly 1080p upscaled to 4K and manages to look worse than DLSS 1.0. If you want native 4K, the FPS is locked to 30 and it feels horrible to play. Watch Dogs: Legion on PS5 runs at a locked 30 FPS with no option to turn off RTX.

I don't know why people keep saying the consoles have better optimisation. They don't. There is no sharpening filter or high-resolution texture assets, and the game will always upscale from 1440p; it will never run at native 4K, unlike on PC.

BTW, DLSS is image reconstruction, not upscaling. There is a difference, easily visible if you compare the performance RT mode in Spider-Man vs 1080p DLSS Performance.
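To illustrate that distinction with a toy numpy sketch (this is nothing like DLSS's actual internals, which use motion vectors and a neural network; it only shows why accumulating jittered samples counts as reconstruction rather than upscaling — all names here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.random((8, 8))  # pretend native-resolution frame

# Four jittered low-res renders, each sampling a different sub-pixel offset
offsets = [(0, 0), (0, 1), (1, 0), (1, 1)]
frames = [(off, truth[off[0]::2, off[1]::2]) for off in offsets]

def upscale_nearest(frame, factor=2):
    # Spatial upscaling: duplicates pixels, adds no new information
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

def reconstruct_temporal(frames, shape, factor=2):
    # Temporal reconstruction: slot each jittered sample into its true
    # sub-pixel position, accumulated across several frames
    out = np.zeros(shape)
    for (dy, dx), frame in frames:
        out[dy::factor, dx::factor] = frame
    return out

upscaled = upscale_nearest(frames[0][1])
rebuilt = reconstruct_temporal(frames, truth.shape)

print(np.abs(upscaled - truth).max() > 0)  # single-frame upscale loses detail
print(np.allclose(rebuilt, truth))         # accumulation recovers the detail
```

A single upscaled frame can only smear the pixels it has, while the accumulated result contains genuine detail no individual low-res frame holds, which is the sense in which "reconstruction" differs from "upscaling".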
 
Because you are making up this assumption based on already-released games. Games that were not designed around AMD's RT design, which is why Nvidia outperforms AMD in them.

So again, you are making up rubbish without even giving AMD a chance to showcase the RT performance of upcoming titles.
I agree with Shanks on this. In that recent Metro Exodus developer interview, he talked about AMD's RT approach and capability being quite different from Nvidia's: "AMD’s hybrid raytracing approach is inherently different in capability, particularly for divergent rays. On the plus side, it is more flexible, and there are myriad (probably not discovered yet) approaches to tailor it to specific needs"

That's plenty of reason to believe AMD can achieve an equally good or better-looking implementation of ray tracing done in their own way, without the same performance penalties as Nvidia's implementation. We'll find out soon enough.

 
I don't believe it's at all possible for AMD to do ray tracing better than Nvidia with current cards. It needs one or two more generations, and stability in the RT standards used in games.

For a game they are involved in, their goal should be to make sure that appropriate cards can run some level of ray tracing at a playable FPS, to prove they can.
 
DLSS is a must have for me along with RTX. [...]

I have both a PS5 and a PC with a 3080 and I disagree. [...]


Playing a game at 4K DLSS is not playing the game at 4K...

But I guess this is where we disagree. Plus, those are not PC games; perhaps you should compare the two games you mentioned to their PS5 versions, because they exist there to play, at least Valhalla does (I would say the PS5 version actually looks better than the DLSS version on PC).
 
RedGamingTech says AMD's DLSS competitor, called Super Resolution, requires per-game programming and integration by the developer.

https://www.reddit.com/r/Amd/comments/lzvmqd/redgamingtech_amd_super_resolution_doubles/


And? DLSS requires Nvidia's involvement to get a game working with it. They tried to say it was a quick procedure back with Turing, but it still took months to appear in their RTX showcase game Battlefield V, and even then it was a pathetic implementation that they never improved on.

As for RedGamingTech, he's no more clued in than anyone else; the feature is not out yet, so it's speculation at best.
 
Resgamingtech says AMDs DLSS competitor called SuperResolution requires per game programming and integration by the developer [...]

Of course it will need code inside the game, just like DLSS needs code inside the game. The good news is that once they bring it in, it will also work on Xbox, so it will be easier to get implemented in games without AMD having to pay devs to put the code in. :D
Nvidia's DLSS should be safe simply by having the bigger market share, but I don't know how long it will take before devs start doing the work for free, after others got paid to put DLSS inside their games.
Anyway, let's hope AMD SR is good and coming this year, because that will force all companies to adopt a standard. And let's hope the next gen will be strong enough that it will not need upsampling in the first few months after release. :D
 
And? DLSS requires nvidia to get the game to get it working [...]

DLSS doesn’t require training per game; it just needs to be implemented/switched on, which will differ depending on the engine.
 
And? DLSS requires nvidia to get the game to get it working [...]

And it's news worth mentioning because some people here thought it would just be a dynamic driver feature that worked automatically in every game and would therefore wipe out DLSS. Instead, it seems to require quite a bit of work in each game, with no game-engine integration, essentially putting it directly up against DLSS in a battle it can't hope to win. AMD people who keep posting that AMD is going to destroy Nvidia's features with its own industry-adopted feature continue to take L's.
 
Apparently DLSS 3.0 will be much better. Given DLSS 2.0 is shaping up to be better than the AMD version, it seems like a further lead for Nvidia.
 
And it's news worth mentioning because some people here thought it would just be a dynamic driver feature [...]

No, from what people have said it's going to be an open feature, unlike DLSS, so developers will have the option to use it without a third party, just like every other piece of AMD-derived software. Let's hope it's good so everyone can benefit.
 
Apparently DLSS 3.0 will be much better [...]

Where are you even getting this from? There has been bugger all about the AMD version, yet you're talking like it's out and all.
 
instead it seems to require quite a bit of work in each game with no game engine integration [...]

I love the absolute ignorance of this post. "Taking L's"? WTF, is that what 12-year-olds are saying these days?

As for the rest of the post: it puts it up against DLSS in a battle it can't hope to win? Says who, exactly? A shill on a forum? You would think by now that the people who used to underestimate AMD would finally have their heads pulled out of their asses, given the thunderous boot in the balls that Intel has received. Before RDNA 2 came out, people reckoned it would be a bit faster than the 2080 Ti and that was it; instead they utterly leapfrogged that performance, and for the most part it trades blows with the Nvidia cards in a single jump from a position they were far behind in. RT performance will improve with future iterations as well; you'd be a total fool to think they can't compete with the upscaling tech.
 
My understanding is that there are going to be three main options for developers to use for upscaling: Nvidia (DLSS), Microsoft (DirectML), and AMD (FFX). The positive thing is that the Nvidia 3000 series and AMD 6000 series are both DirectX 12 Ultimate compatible, so I wouldn't be surprised if DirectX 12 raytracing and DirectML become the main standard, as they also work on the new Xbox. I suspect that in the future most PC games that support raytracing and upscaling will do so on both Nvidia and AMD cards, so it's not something I am concerned about.

Possible AMD upscaling performance: https://youtu.be/gcagGbi1FcY
 
My understanding is that there are going to be 3 main options for developers to use for upscaling [...]
No, there are only two: AMD's, based on DirectML, and Nvidia's DLSS (which can most likely also be made DirectML-based).
If AMD's is open source, good enough, and works on Nvidia cards, then it can become the new standard. Or, if Nvidia makes DLSS open source and DirectML compatible and it is better than AMD's version, that can become the new standard.

Anyway, the future of upscaling is open source and based on DirectML, which will also work on Xbox consoles.
 
Beating tech which isn't out yet :eek:

It's that good.

Well, first off, Nvidia has dedicated hardware for this and AMD doesn't, and we all know dedicated hardware is faster than doing it in software.

There will be an overhead on AMD cards that the Nvidia cards don't have, thanks to that extra hardware, so any performance improvement will be smaller because of the overhead.
 
Well first off, Nvidia has physical hardware, AMD doesn't. [...]

Give over. You have no idea how or what AMD is doing with theirs, or how much real, tangible difference in any direction there is going to be.

This is all because... it hasn't seen the light of day... so making claims is a joke.
 