
Anybody else resenting AMD because of DLSS?

RDR 2 isn't too bad when you use the extra sharpening. I could run MSAA on it, but MSAA doesn't get rid of all the jaggies :( Jaggies are the most distracting thing for me in games.

Can I just say that Control doesn't even look like a next-gen game, or even particularly good? Why do people care about RT and DLSS in it?

Depends what we're talking about. Textures and looks on the whole are pretty bland, but the character models, lighting, reflections and particle effects, plus the destruction in a lot of scenes, are fantastic.

Ray tracing does add a lot to the atmosphere in this game, so naturally DLSS is required if you want to keep your fps high. As per the screenshots above, you can see there is virtually no difference at all unless you're pixel peeping at the picture frames etc. In motion the ghosting is still there, but in my experience it's much less noticeable than in Cyberpunk; the only time I noticed it was with shattered windows/glass.
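
For anyone wondering why DLSS buys back so much performance with RT on, it comes down to how many pixels the GPU actually shades before the upscaler reconstructs the output. A rough Python sketch, using the commonly cited per-axis scale factors for DLSS 2.x (my assumption, not figures from this thread):

```python
# Internal render resolution per DLSS mode at a 4K output target.
# Per-axis scale factors are the commonly cited DLSS 2.x values;
# treat them as illustrative, not official.
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

target_w, target_h = 3840, 2160  # 4K output

for mode, s in MODES.items():
    w, h = round(target_w * s), round(target_h * s)
    saved = 1 - (w * h) / (target_w * target_h)
    print(f"{mode}: renders {w}x{h} internally, ~{saved:.0%} fewer pixels shaded")
```

Since RT cost scales roughly with shaded pixels, Balanced mode shading only about a third of them is why the fps holds up.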
 
Control is another example where TAA/DLSS and other such techniques destroy the image quality.

The game looks gorgeous at native 4K. Really gorgeous. But at 1440p and 1080p it looks like a hideous, blurry, vaseline-smeared, myopic beast.

The majority of games are like that. Same for Gears 5. I saw it running on an Xbox One X at native 4K and I was like ":O". When I played it on my 1080p monitor at 1080p rendering with max settings I was like ";(". Then I increased the resolution scale and voila: the game looked like it did on the One X.
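
To put numbers on the resolution-scale trick (illustrative arithmetic only, not from the thread): the slider scales each axis, so the pixel cost grows with the square.

```python
# What a resolution-scale slider does: scale each axis, so the shaded
# pixel count grows quadratically with the percentage.
def render_res(display_w: int, display_h: int, scale_pct: float):
    return round(display_w * scale_pct / 100), round(display_h * scale_pct / 100)

for pct in (100, 150, 200):
    w, h = render_res(1920, 1080, pct)
    print(f"{pct}% scale: {w}x{h} ({w * h / 1e6:.1f} MP)")
```

At 200% a 1080p monitor is shading the same ~8.3 MP as native 4K, which is why the supersampled image starts to resemble the One X's near-4K output.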

1080p/1440p + TAA is ruining the base level of graphics, because at 1440p games look like they're running at 720p, and at 1080p they look like 540p. Only at 2160p do they resemble the clarity of actual 1080p games pre-2016...

And now, with consoles running at 1200-1300p and Nvidia and AMD gunning for low-res + upscale combos, I've completely lost faith in sharp, clean images in games... Every reviewer out there is blabbering "native 4K is useless! upscaling is the future! dynamic resolution is a godsend! native is unnecessary!" They can't even accept the fact that games look hideous at anything below 1800p with TAA.

I only hope Nvidia and AMD can manage to make their upscaling tech a much better alternative to TAA. But I have low hopes...

I guess people really forgot what games actually looked like before TAA...
 

See, this is the thing: no matter what screenshot or footage of Control I look at, GOW on the PS5 utterly destroys it on all of these fronts. It looks better by every possible metric, and to me it's a better game.

What I am trying to say is: no matter what the 'best implementation' is, what good is that if the showcase is a game that doesn't actually look good compared to the rest of the market?

Let's see RTX + DLSS used on a game where the base fidelity or ultra settings actually warrant next-gen visuals. So far the games I have seen on the PS5 look far better than anything on PC right now, and no amount of bickering over whether DLSS is great changes that: PC visuals have just been trounced.
 
No, but it was one of the reasons I stuck with NVIDIA. :(
That and the lower overall RT performance.

It's a real shame because I was more than ready to jump ship, but AMD just isn't there yet.
I just hope the stars align for RDNA3, as I'd really like to see NVIDIA smacked around and brought down a peg or two, just like Intel has been, tbh.

I actually would have bought a 6800 XT if there had been stock, but I managed to get a 3080 FE this week through a Discord alert. After using the 3080 FE I would still go for a 6800 XT, since it gets more fps, which is more important to me than RT. The 3080 FE experience is not as great as many people like to claim; AIB cards would be better. It outputs a lot of heat, has coil whine, and pushes that heat directly into my CPU intake fan. The driver suite is also quite poor compared to AMD's Adrenalin.
 


God of War on PS4 looks a million times better than a lot of "RTX + DLSS" titles.

Even at 1080p it looks clean, sharp and well defined.

Sony exclusive games are simply superior in terms of optimization, graphics and overall level of quality. I was mesmerized by the graphics on that 2013 device. People mock it by saying "lol, 750 Ti GPU; lol, tablet CPU", yet it runs games such as HZD, GoW and TLoU Part 2 gorgeously.

And unlike 3rd-party games, where 2020 titles mostly look blurry at 1080p (Control, Fallen Order, and any other game that utilizes TAA), PS4 exclusives looked clean, not blurry, at 1080p.

I'm pretty sure those exquisite developers will keep providing good image quality even at 1300p, while 3rd-party games targeting multiple platforms will look blurry and vaselined at 1440p/1300p, and only okay-ish at 4K.

Sony's gaming environment is a really different beast...
 

That's what happens when developers can focus purely on one platform. A better comparison is the likes of Death Stranding and Horizon Zero Dawn (and Days Gone when it comes out) against what the PS versions look like and how they perform.

Also, just because a game features RTX/ray tracing doesn't automatically mean it will improve graphics across the board... Digital Foundry's video on Spider-Man shows where ray tracing/RTX comes into its own, i.e. certain games show ray tracing off better than others.



Likewise, and only if there was some chance of getting a 6800 XT for MSRP. No way would I pay £700+ for a 6800 XT when you can get a 3080 for £650, so that alone ruled out AMD cards, as there was zero chance of that happening.

However, I have been super impressed with the 3080 FE: it runs extremely quietly and cool in my PC, and temps on the whole are lower, even for the CPU, which dropped by 10 degrees when I switched out my Sapphire Vega 56 Pulse. Throw in an undervolt and you get even better results on the temp/fan speed/noise front, with 100W less power draw. Thankfully no coil whine on my model (paired with an EVGA Platinum SuperNOVA 850W PSU).
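
For what it's worth, here's a scriptable approximation of that tweak. A sketch only: NVML's power limit is a blunter tool than a proper undervolt (which means editing the voltage/frequency curve in something like MSI Afterburner); it needs the nvidia-ml-py package and admin rights, and the 100 W figure just mirrors the post above.

```python
# Cap the GPU's board power via NVML. Not a true undervolt, but it
# trades a little clock speed for a big cut in heat and fan noise.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)  # milliwatts
print(f"Default limit: {default_mw / 1000:.0f} W")

# Illustrative: drop the limit by 100 W, echoing the saving described above.
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, default_mw - 100_000)

pynvml.nvmlShutdown()
```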

The drivers do suck big time compared to AMD's, though.
 

The card is quiet, but I was used to my trusty Sapphire 5700 XT Nitro+ (sold it for £540 :p), which was incredibly well cooled and had top-notch components. I guess an FE is not going to have the best components compared to the higher-priced AIBs. It is a very nice card indeed, but I was expecting more for £650 (which, ignoring the current insanity, is still a very high price for most people).
 
My 3080 FE has bad coil whine too; maybe we are just unlucky. I too would probably have got a 6800 XT at MSRP if it had been available, despite using a 1440p G-Sync monitor. I do like playing Cyberpunk with max ray tracing (DLSS on Balanced) and Quake II RTX at decent framerates though ;).
 
Resident Evil is an AMD title, meaning the RT effects will be very minimal so that AMD cards don't tank in performance compared to Nvidia cards.
Due to this, I highly doubt DLSS will be needed.
 
Nope, not unless the developers wanted to add DLSS 2.0 and AMD said "no".....



SAM is not something that AMD came up with; it has been around for a long time, and it's only now that hardware manufacturers are getting around to implementing it. Also, SAM is AMD's marketing term; Resizable BAR is the correct term.
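
If you want to see Resizable BAR in the raw, here's a sketch assuming Linux sysfs (the PCI address below is hypothetical; find yours with lspci). Without it, the CPU sees VRAM through the classic 256 MiB window; with it, one BAR spans the whole VRAM, e.g. 16 GiB on a 6800 XT.

```python
# Print the size of each PCI BAR for a GPU. /sys/.../resource holds one
# "start end flags" hex triple per BAR; size = end - start + 1.
from pathlib import Path

GPU = "0000:0b:00.0"  # hypothetical PCI address

for i, line in enumerate(Path(f"/sys/bus/pci/devices/{GPU}/resource").read_text().splitlines()):
    start, end, _flags = (int(x, 16) for x in line.split())
    if end:  # unused BARs are all zeros
        print(f"BAR {i}: {(end - start + 1) / 2**20:.0f} MiB")
```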

In the same way, AMD did not create "FreeSync", or rather the correct term, "adaptive sync". They took something that was already there in the DisplayPort and HDMI specifications, enabled it, and called it FreeSync. Of course, Nvidia took a closed-source approach, adding extra hardware to enable their version of adaptive sync, i.e. G-Sync (also because their cards at the time did not support the required HDMI/DP specifications for adaptive sync).

Actually, the HDMI implementation of FreeSync is entirely on AMD (not the 2.1 version, but the ones prior). It was not using a standard to make it work, but some hacked-together weird sauce. And while Nvidia was certainly the first of the two to release an adaptive-sync technology to the market, there were earlier players using VRR, like Apple's ProMotion displays. Not that it really matters: in the end it's an arms race, nothing more, nothing less. Feature sets win over customers.
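
Whatever the branding, all of these VRR schemes do the same basic thing: the panel holds its refresh until the next frame arrives, and below the panel's minimum rate the driver repeats frames (low framerate compensation). A toy Python sketch of that mapping, with a hypothetical 48-144 Hz window:

```python
# Map a frame time onto the refresh rate a VRR panel would actually run at.
VRR_MIN_HZ, VRR_MAX_HZ = 48, 144  # hypothetical panel VRR window

def effective_refresh(frame_ms: float) -> float:
    fps = 1000 / frame_ms
    if fps > VRR_MAX_HZ:
        return VRR_MAX_HZ              # capped at the panel maximum
    n = 1
    while fps * n < VRR_MIN_HZ:        # LFC: show each frame n times
        n += 1
    return fps * n

for ms in (5, 10, 16.7, 30):
    print(f"{ms} ms frame -> panel refreshes at ~{effective_refresh(ms):.0f} Hz")
```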

Typical ignorant responses from people who I'd wager have never actually used it.

And if you have actually used it, maybe you've only seen 1.0? DLSS 2.0 at the Quality setting is incredibly good. I was skeptical myself, until I actually tried it.
DLSS 3.0 will be ridiculously good, I've no doubt.

As an RTX 3070 owner, I must say that the one time I was hoping for some extra performance from DLSS, I was very disappointed with the image quality. The game was Cyberpunk 2077, and the artifacts were plentiful and in my face.
 
It's not common sense though. You're just making up rubbish!

Explain why it is rubbish..?

It's an AMD sponsored game
AMD do not perform as well as Nvidia with heavy amounts of RT
AMD are advertising RT

Common sense would suggest the RT will be minimal, otherwise their AMD-sponsored title would perform better on Nvidia cards :rolleyes:
 
That's what happens when developers can focus purely on one platform. Better comparison is to compare the likes of death stranding and horizon zero dawn then days gone when it comes out to what the PS versions look like and their performance.

Also, just because a game has RTX/ray tracing featured in a game doesn't automatically mean it will improve graphics across the board.... Digital foundrys video on spiderman shows where ray tracing/rtx comes into its own i.e. certain games will show ray tracing of better

Yep, exactly, which really makes me question so many people over this notion of having DLSS. They should all know it's going to be drip-fed into only a few titles, given the way it needs to be added to each game as its own thing and, I'm pretty sure, requires Nvidia validation.

Capcom knows damn well this game will sell more on consoles, and the current console boom makes it evident there are more gamers hungry for hardcore titles like RE there than on PC, where people only seem to bicker about DLSS in Control and barely touch the other games where it's not implemented well. Cyberpunk is a title that needs to be thrown out of the window until it's actually a fully functioning game; then we can use it in comparisons.

I can't think of another big title coming to PC that is getting RT and has been previewed with amazing visuals. There isn't one, and I think people need to look into this more before clamouring over features in games that don't look as great as what's on the market right now.

I don't rate Control, and it isn't a game changer for the market.

Death Stranding probably had more hype leading up to its release than the actual game itself...

Cyberpunk, well, lulz.

Metro looks alright, but it's very grey and brown, and I never rated the series as a good game anyway. I would rate Gears of War as looking better than it, and that doesn't have ray tracing; plus, to me, it's a better game.

Minecraft

Quake....

Some indie titles?

Can people see the issue here?

WoW has RT, but only a really small amount of it, and to say that game has next-gen fidelity is a stretch.
 

Just a case of history repeating: all AMD-sponsored games with ray tracing so far have had heavily reduced and muted ray tracing effects. I'm actually surprised they're doing RT reflections for RE8, since reflections tank performance on AMD (more than they do on Nvidia). That said, I couldn't actually see any reflections in the video they put up, so it's highly likely the reflections only apply to an absolutely tiny number of assets, like the one or two mirrors in the mansion.

AMD seems to be decent at RT shadows, but not at reflections, depth of field or global illumination.
 
Each to their own and all that, but Cyberpunk had very few bugs/issues for me and is easily the best game I've played in the last few years (not hard, since the gaming industry in general has been ****). Just waiting for the next patch, in case it adds more content etc., before I finish my second playthrough.

As for upcoming games with DLSS/RTX, personally I'm looking at:

- Metro
- Dying Light 2 (pretty sure this has been confirmed to get RTX/DLSS?)
- maybe Ark 2? Think the first one was an Nvidia-sponsored title, and this one might be too
- On Air looks interesting
- maybe Outriders... the demo left a lot to be desired though

And with DLSS now being added as an actual feature in Unreal Engine 4, there could be more titles.

With consoles adding ray tracing, I'm expecting this to become a lot more common going forward too.
 

Because you are basing this assumption on already-released games: games that were not designed around AMD's RT hardware, which is why Nvidia outperforms AMD in them.

So again, you're making up rubbish without even giving AMD a chance to showcase RT performance in upcoming titles.
 

This is just LOL but carry on believing that if you want to :D
 