
Nvidia Ampere vs AMD RDNA2, Who Won The GPU Generation (So Far)?

I find it odd how upscaling is being touted as a feature when it's really a bodge to keep frame rates up because the cards aren't fast enough. Clever marketing. To me the answer is to use a resolution that doesn't need very expensive GPUs that still struggle at native. Seems like a way to milk the whales ;) Max out 1440p for a reasonable amount, or make lots of compromises at 4K for a lot more? Still seems too early to move to 4K to me.
Before the RTX cards you would have said real-time RT was impossible. I am not sure about your mindset; time to get with the times. :)

I also had your sentiments, which is why I went with the RX 5700 XT when I had to decide between GeForce 2000 or Radeon 5000. Sadly AMD were terrible with drivers at that time, so I ended up on a 2060 amid the stupid market price increases, then managed to move to a 3070.

The reasons I am with Nvidia are OpenGL performance and driver stability, but now I also have access to DLSS and RT. It's a game changer in the way we see visuals for some games; for early ones especially it's not worthwhile. But time has moved on, mate.
 
I've got more faith in the trajectory of AMD. As far as I can see they are more innovative and prefer open standards. Considering they are competing on two fronts and giving both Intel and Nvidia a good run, I'd be very worried if I were the competition as they grow. It explains why the competition like locked-in proprietary technology; they struggle to compete otherwise. Given the increasing resources available to AMD engineers and the internal collaboration, I'm expecting good things over the next few years.
 
I wouldn't say it's all that now; it's still not that sophisticated, and if you require DLSS for decent frame rates it's still early days. I'm happy to wait and let the early adopters fund it. I think we'll need MCM GPUs before it's ubiquitous.

I remember when frame rates were everything on here, and as soon as ray tracing appeared that went out of the window. People would rubbish a card that was only 5% slower; now they'll accept a massive drop in frame rate or in IQ. I'm happy to give it a pass until it becomes the standard at native resolution.
 
With RT, Doom Eternal was playable at 2560x1440, all thanks to DLSS, whilst looking indistinguishable from native with their bottom-of-the-barrel card.

 
I don't doubt it's clever stuff. I don't like that it's proprietary and it's game specific. If it was able to be vendor agnostic and applied to every game without having to have some tie in to Nvidia I'd be more receptive. Anything that fragments the PC landscape is a bad thing in my book. I'm sure that will be seen as controversial by some ;)
 
It fully depends on whose hands it ends up in to develop it. Nvidia making it open source was likely them trying to avoid it becoming a lesser feature. It is a gamble, but one which keeps GeForce owners happy, which for a company is much more important than what AMD owners want since, you know, you don't own a GeForce GPU. I am sure that is how Nvidia is working it. Nvidia are also a face, so if anything goes wrong it makes them look bad, not some third party.
There is real passion behind what they do; it's just that in arguments people use haphazard sentiments without really understanding the situation, which I don't fully either, as I am not an employee.

Open source anything is more of a cop-out, as we have no reliable way of knowing the quality of such a thing. AMD have done great things like FreeSync, but their upscaling technique is still quite a bit behind Nvidia's, though the future looks ripe for AMD on that front.

FSR is also only available in select games, and some titles have outright stopped DLSS from being implemented. Far Cry 6, an AMD-sponsored title? This is a rabbit hole though, and a time sink as it gets pettier and pettier. I hate even bringing up the faults of each company, as people are so stuck on arguing their side that they blind themselves to the larger picture.

In fact it is fault finding which blinds us with products.
 
Coming across as a bit biased there, considering I own a GeForce GPU. I wonder if anyone thinks differently since they were breached/hacked; there will be more proprietary information (and other stuff) out there, now in the hands of the competitors.
 
My subjective experience is different to yours. Implying there is a bias? Well yes, there is: I spent money, which doesn't grow on trees, mate. A shill is different from someone with a bias. My subjective experience: the 5700 XT left a sour taste and it cost money; I had to get a lower-performance GPU as prices went sky high, then I was able to get a 3070. The reason I stuck with GeForce here is driver related, but also OpenGL. You can't even run the PCSX2 emulator with OpenGL on a Radeon GPU; the FPS is terrible.

https://www.reddit.com/r/PCSX2/comments/im2ho3/opengl_on_amd_still_an_issue/

Wise up, or forever be stuck pointing fingers.
 
I just downloaded Doom Eternal this morning @Troezar as I wanted to capture some gameplay footage to test a hitching issue I saw reported on these forums.

Here is a side by side example of what you are missing out on in Doom Eternal without Ray Tracing (RT) effects enabled. Imgsli (RT on vs off)
 
And your point is Matt?

You are actually a shill so there is no winning with you.

Re-read what I typed and understand the context it was written in: the person before mentioned the performance cost of RT and DLSS bringing down image quality. It was not a showcase of how good Doom looks with RT on or off.

I expected more from you, don't do this.

I can excuse you if you have not had your morning brew yet though.
 

I expected less from you and you delivered, well done. Now wind your neck in please.

Troezar was clearly talking about RT not being that sophisticated and you provided Doom Eternal as an example. I provided an example showing RT on vs off. Nothing more to it.

I am sure you can find some better examples if you need it, this was just one I have to hand.

If you don't like reading my posts, I recommend putting me on ignore. :)
 
OK, let's get back on topic here...

@Tired9 in the case of AMD v Nvidia, who do you feel won, and why?

But please can you summarise briefly in bullet points? I lose the thrust of the points amongst the back and forth.
 

In total units sold, Nvidia won.
In terms of friction between the fans, well, that's even.
In terms of driver stability, Ampere has done well, as has RDNA2; AMD have not had a widespread issue like they did on the 5000 series, and the 2000 series from Nvidia also had a brief stint of issues. There was one game where Nvidia cards were dying (New World?). It seems a fixed issue, but we don't know where the real fault lies, as it was not all NV cards dying, just certain AIBs'.
RT performance is in favour of Nvidia this round, alongside upscaling.
There is one game where VRAM may cause issues, Far Cry 6; AMD take the crown for having the better VRAM options.

To me it is a close draw.
 
I think I'd agree with that summary, though it depends on how you'd judge "winning". I'd consider it a draw, but some of the comparisons seem a touch strange. As an example, price-wise I would expect the 3080 to be compared more to the 6800 XT, but the 6900 XT seems to be brought into it more, and that was priced higher (by RRP, not necessarily actual price).

I'd say that the 3080 is a better proposition than a 6800 XT, but the 6900 XT is a better proposition than a 3090. Use-case dependent, admittedly.

However, in terms of generational leap, AMD have absolutely won in my opinion. If they can carry on this trajectory, I think that next gen they could end up on top. As mentioned, upscaling quality could end up being the differentiating factor, and I'm converted to the idea. Originally I'd have been a native-or-nothing guy, but the gains I've seen with DLSS and FSR? The trade-off (and the option to use it, don't forget, it's not forced) is worth it to me.
 
Guessing by the framerate counter the bottom image is the DLSS one and indeed does look better/sharper than the top one.

It is indeed :)

Both like for like in terms of movement speed too, i.e. using a 360 controller to pan the camera with the joystick all the way to the left.

There are plenty of good examples out there showing motion issues with TAA and where DLSS improves on this, which is why I always find it amusing that people still go on about DLSS having motion issues when it is usually the opposite, especially if you get much higher fps (which ties in with pixel response).

DF recently did a very good in-depth video showing TSR (which is better than TAA and TAAU) vs DLSS in Ghostwire: Tokyo:

https://youtu.be/FbFMafKzJyY?t=357

Let's hope AMD are basing FSR 2 on TSR and have added improvements on top, otherwise it won't impress AMD fans given how vocal they have been about DLSS and all of its "issues".....

There is only one game for me where DLSS had a negative effect, and only slightly: Horizon Zero Dawn. All other games I tried it with are mostly fine, though trailing artifacts are still present in some titles. It is still a far better solution than AMD's FSR; in HZD, FSR looks like someone applied massive amounts of sharpening, and grass etc. shimmers like crazy.

I found HZD to be noticeably better with DLSS, although IIRC it did have some issues on release, which were fixed either with an Nvidia driver update or a game patch. If I recall, it was to do with the LOD bias setting not being properly applied?

But yeah, it's a good example of why spatial upscalers are crap in comparison to temporal ones. Most people like over-sharpening though, given that quite a few have AMD's image sharpening set to 80+%.
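For anyone wondering what that spatial-vs-temporal distinction actually means mechanically, here's a toy sketch (hypothetical helper names, nothing like the real shader implementations): a spatial upscaler only ever sees the current low-res frame, so all it can do is resample and sharpen it, while a temporal upscaler blends each new jittered frame into a history buffer and so accumulates extra samples per pixel over time.

```python
import numpy as np

def spatial_upscale(frame, scale=2):
    """Spatial upscaler sketch (FSR 1.x style in spirit): upsample a single
    low-res frame, then apply a simple sharpen. It only has the current
    frame, so no detail beyond the input resolution can be recovered."""
    up = frame.repeat(scale, axis=0).repeat(scale, axis=1)
    # crude cross-shaped blur over the interior, used for an unsharp mask
    blurred = np.copy(up)
    blurred[1:-1, 1:-1] = (
        up[:-2, 1:-1] + up[2:, 1:-1] + up[1:-1, :-2] + up[1:-1, 2:] + up[1:-1, 1:-1]
    ) / 5.0
    return np.clip(up + 0.5 * (up - blurred), 0.0, 1.0)

def temporal_accumulate(history, current, alpha=0.1):
    """Temporal upscaler sketch: exponentially blend the jittered current
    frame into an accumulated history buffer. Over many frames the buffer
    converges on a higher effective sample count per pixel."""
    return (1.0 - alpha) * history + alpha * current
```

The history buffer converging over several frames is what lets temporal methods recover detail a single frame doesn't contain, and also why they can ghost or shimmer when the history is stale or rejected.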

I think RT was always going to be AMD's downfall this gen; it's the way forward for graphical fidelity now that we have the power and technology. We're also seeing a more realistic pricing structure than MSRPs. The 6700 XT was priced to compete with and outpace the 3070, which it does not. Granted, the scalping made it a moot point up until recently, but we're now seeing the 6700 XT compete with the 3060 Ti; hopefully both will be down to around £400 for base models in a couple of months.

I can see RDNA2 owners missing out on fidelity in games in a couple of years where RTX 3000 owners will be able to scrape by (within reason; below a 3060 may not cut it). I imagine RDNA3 being the same: the RT capabilities of RTX 4000 will give developers the ability to push beyond RDNA3's capabilities, and they will.

That has been my point all along with people getting their panties in a twist over the VRAM fiasco (10GB specifically). Since we're on the topic of Doom Eternal, which uses only RT reflections, is regarded as being extremely well optimised and uses Vulkan....



Then look at several other titles with RT, especially ones with more effects, and you'll see a similar trend (especially if the game doesn't have FSR). Factor in that every developer is desperate to use RT where they can (more for their own interests than for consumers tbh, hence why even the consoles are trying to use it where possible). It's not hard to see how things are going to go unless AMD can somehow come up with a magical driver for RDNA2 that brings them up to Ampere levels.


Come on Matt, it's things like this which are why people can't take you seriously at times :p

Same as with the DLSS screenshot for DL2 you posted a while back: find select areas where it looks worse, then post only that and go "yup, DLSS is nowhere near as good as native", when it has been proven time and time again by various sources and end users that DLSS "can" look better a lot of the time and in a good chunk of areas. Not always, but the majority of the time it is better. Not to mention the screenshots I took of CP2077 a while back, where most people guessed wrongly which ones were DLSS and native :cry: In fact, I might have to do a few new comparisons/tests since DLSS has come a bit further since then ;)

Unfortunately, Doom Eternal only has RT reflections, so it is only going to show differences where "reflections" are a thing, go figure :p

 
Tbh at first glance I thought the left-hand one was with RT. It's a good game to use as it's so polished and optimised; good frame rates are achievable on a potato ;) Don't get me wrong, I'm sure I'll love RT eventually, but I'm prepared to wait until the performance, price and support are there. I'd like to see it be like rasterisation: just there doing its thing, rather than a marketing tool.
 
Opinions and bottom holes, we all have them.

Some people can articulate them better than others.

You mean this Dying Light 2 screenshot Imgsli ?

I have more, it's the same everywhere. It's nothing to do with finding one spot where it's really blurry, it's just what happens when you use image reconstruction (DLSS/FSR) in Dying Light 2.

The reason I used Dying Light 2 was because that was touted as better than native, so I took some screenshots and quickly discovered it was not better than native.

That is unless you like blurry textures and overall IQ reduction. Not to mention the issues with artifacts in trees that appear if you have the sharpness slider higher than 51%. See here. When I last played the game near the end of Feb, that issue was still present. Not sure if it is still present now, as the game was uninstalled before I completed it because I got bored.

That said, if you never saw it running at native, would you notice the downgrade in image quality? Probably not. You can be blissfully ignorant and that's fine IMO. Nothing wrong with that at all, but then don't claim it's better than native.

As you can see, you don't need to zoom into 400% to see the difference between native vs quality mode, it is clear as day on the speaker for example. Also, that screenshot there hides a lot of the lost detail as it is so dark so you can't see some of the textures due to the RT effects, which I have always said were good in DL2 despite the lack of effort on the rasterized only parts of the game.

My opinion is simply this despite what is said about me. I think the technology is great, DLSS/FSR/XESS, I just don't agree that it is better than native for the most part.
  • It introduces blurring of textures and reduces IQ overall. I have seen this on every game I have tested, whether it's DLSS or FSR: run at 4K native, then enable DLSS/FSR. It is very noticeable on my 48" OLED TV. I do not have the luxury of sitting 6+ feet away, where the better-than-native claim may hold more merit in the sense that you can't actually see the reduction in overall IQ because you sit so far from the display. I sit 3 feet from the display.
  • It can have issues with movement, game dependent. I will say I did not notice ghosting in Dying Light 2 so as you will notice, I never mentioned that. However, that does not apply to all games as has been documented. This is less of an issue for me now tbh. Point 1 is the bigger issue because I am used to a clear, sharp 4K image.
  • I acknowledge that in some cases it can improve parts of the image like distant fences, wires etc. That's good, but it's not that exciting as when you play a game you don't tend to spend much time looking at these things. I do not believe that this improves the overall IQ though to make it better than a 4K native image, unless the native 4K image has massive issues of its own to begin with. Then maybe you have a point there.
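The viewing-distance point in the first bullet can be put in rough numbers. Visual acuity is often quoted as roughly 60 pixels per degree for 20/20 vision (a rule of thumb, not a hard limit), and pixels per degree scales linearly with distance, so a 48in 4K panel at 3 feet sits right around that threshold while 6 feet is comfortably past it. A back-of-the-envelope sketch (the function name and the 16:9 assumption are mine):

```python
import math

def pixels_per_degree(diagonal_in, horiz_px, vert_px, distance_in):
    """Approximate horizontal pixels per degree of visual angle for a
    flat display viewed head-on at its centre."""
    # physical width from the diagonal and the pixel aspect of the panel
    width_in = diagonal_in * horiz_px / math.hypot(horiz_px, vert_px)
    px_per_inch = horiz_px / width_in
    # one degree of visual angle spans this many inches at the screen
    inch_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inch_per_degree
```

Running this for a 48in 3840x2160 panel gives roughly 58 ppd at 36in and double that at 72in, which lines up with reconstruction artefacts being resolvable up close but largely not from the sofa.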
You can see a reflection in that Doom image. I just don't think it's that good or worth the hit to the frame rate, and I think that screenshot amplifies that point nicely.

I am sure there are areas where the effect is more pronounced though, but that does not take away from that example or its validity.

I tried a few areas with it on and off and there really was very little difference for the most part. As we know though, Doom Eternal does not have much RT in it.

What I will say is that the game runs smooth as butter; it's really well optimised.

It is not great in Doom Eternal, which is why I added that comparison for you; it tied in with your point about it not being very sophisticated at the moment.

It is better than that in some other games though, tbf, so it's definitely not reflective of everything (pun intended).

As you can see though, as soon as you provide an alternative opinion it does not go down well with people that are heavily invested into it.
 

BS, or you would not have selected Doom as your example, since there are other games where RT is not that great too; your intention is to throw off the discussion through deception.
You saw my post and it hit something in you: that the bottom-level 2000 series card delivered a very good experience at 1440p with RT and DLSS on, showing no real visual deficit, especially in motion, and people who play Doom mostly do not sit around looking at things.

The original point was aimed at DLSS having such an IQ loss and penalty. I just showed Doom as it is the only RT upload I have on YouTube, with a card that is not all that capable compared to Ampere.

You twisted it into RT itself not looking that good.

I don't buy any other reasoning on your part, due to you being a part of the Radeon team, or having been; your stance will forever follow you, as we have no clarity on where you stand with AMD professionally.

The self-admittance at the end though is hilarious.
 