Nvidia Ampere vs AMD RDNA2, Who Won The GPU Generation (So Far)?

That said, if you never saw it running at native, would you notice the downgrade in image quality? Probably not. You can be blissfully ignorant and that's fine IMO. Nothing wrong with that at all, but then don't claim it's better than native.

Pretty much this, but the drop in image quality, in my experience at least, isn't huge at all, and you need to be looking for it if there is a drop at all.

Warzone on my computer works fine without DLSS at 3840 x 1600, but I turn on DLSS to give the frame rate a boost, and I couldn't tell you, in motion as opposed to screenshots, where the differences were.

I don't understand the claim that native can be improved on at the same resolution. Can you upscale from, say, 2560 x 1440 to 4K and get better performance at the higher output resolution? This is the interesting point for me.
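
For what it's worth, that is essentially what DLSS's quality modes do at a 4K output: the commonly quoted per-axis render scales are roughly 66.7% for Quality, 58% for Balanced, 50% for Performance and 33.3% for Ultra Performance, so 4K Quality mode renders internally at about 2560 x 1440 and reconstructs to 3840 x 2160. A minimal sketch of that arithmetic, using those commonly reported percentages rather than anything pulled from an SDK (the function name is mine):

```python
# Commonly reported DLSS per-axis render scales (approximate figures,
# not taken from any SDK; mode names as usually shown in game menus).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders at before DLSS reconstructs to the output size."""
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_SCALES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode}: renders {w} x {h}, outputs 3840 x 2160")
# 4K Quality prints "renders 2560 x 1440", which is why it is faster than
# native 4K: the expensive shading happens at 1440p.
```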

Edited to clarify first paragraph.
 
Opinions and bottom holes, we all have them.

Some people can articulate them better than others.

You mean this Dying Light 2 screenshot (Imgsli)?

I have more; it's the same everywhere. It's nothing to do with finding one spot where it's really blurry; it's just what happens when you use image reconstruction (DLSS/FSR) in Dying Light 2.

The reason I used Dying Light 2 was that it was touted as better than native, so I took some screenshots and quickly discovered it was not better than native.

That is, unless you like blurry textures and an overall IQ reduction. Not to mention the issues with artifacts in trees that appear if you have the sharpness slider higher than 51%. See here. When I last played the game near the end of Feb, that issue was still present. Not sure if it is still present now, as I uninstalled the game before completing it because I got bored.

That said, if you never saw it running at native, would you notice the downgrade in image quality? Probably not. You can be blissfully ignorant and that's fine IMO. Nothing wrong with that at all, but then don't claim it's better than native.

As you can see, you don't need to zoom in to 400% to see the difference between native and Quality mode; it is clear as day on the speaker, for example. Also, that screenshot hides a lot of the lost detail as it is so dark that you can't see some of the textures due to the RT effects, which I have always said were good in DL2 despite the lack of effort on the rasterised-only parts of the game.

My opinion is simply this, despite what is said about me: I think the technology, DLSS/FSR/XeSS, is great; I just don't agree that it is better than native for the most part.
  • It introduces blurring of textures and reduces IQ overall. I have seen this on every game I have tested, whether it's DLSS or FSR: run at 4K native, then enable DLSS/FSR. It is very noticeable on my 48" OLED TV. I do not have the luxury of sitting 6+ feet away, where the better-than-native claim may hold more merit in the sense that you can't actually see the reduction in overall IQ because you sit so far from the display. I sit 3 feet from the display (a rough viewing-distance calculation is sketched after this list).
  • It can have issues with movement, game dependent. I will say I did not notice ghosting in Dying Light 2, so, as you will notice, I never mentioned it. However, that does not apply to all games, as has been documented. This is less of an issue for me now tbh. Point 1 is the bigger issue because I am used to a clear, sharp 4K image.
  • I acknowledge that in some cases it can improve parts of the image, like distant fences, wires etc. That's good, but it's not that exciting, as when you play a game you don't tend to spend much time looking at these things. I do not believe this improves the overall IQ enough to make it better than a 4K native image, unless the native 4K image has massive issues of its own to begin with. Then maybe you have a point there.
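
On the viewing-distance point above, here is a rough sketch of the geometry, assuming a 16:9 48-inch panel and taking the commonly quoted ~60 pixels-per-degree figure as the approximate limit of 20/20 acuity. These are back-of-envelope numbers, not measurements:

```python
import math

def pixels_per_degree(h_pixels: int, screen_width_in: float, distance_in: float) -> float:
    """Horizontal pixel count divided by the horizontal viewing angle in degrees."""
    fov_deg = 2 * math.degrees(math.atan((screen_width_in / 2) / distance_in))
    return h_pixels / fov_deg

WIDTH_48 = 48 * 16 / math.hypot(16, 9)   # width of a 48" 16:9 panel, ~41.8"
for feet in (3, 6):
    ppd = pixels_per_degree(3840, WIDTH_48, feet * 12)
    print(f"{feet} ft: ~{ppd:.0f} pixels per degree")
# ~64 ppd at 3 ft (near the ~60 ppd acuity limit, so pixel-level softening
# is resolvable) vs ~118 ppd at 6 ft (finer than the eye can resolve).
```
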
You can see a reflection in that Doom image. I just don't think it's that good or worth the hit to the frame rate, and I think that screenshot illustrates the point nicely.

I am sure there are areas where the effect is more pronounced, but that does not take away from that example or its validity.

I tried a few areas with it on and off and there really was very little difference for the most part. As we know though, Doom Eternal does not have much RT in it.

What I will say is that the game runs smooth as butter; it's really well optimised.


It is not great in Doom Eternal, which is why I added that comparison for you, as it tied in with your point about it not being very sophisticated at the moment.

It is better than that in some other games though, tbf, so it's definitely not reflective of everything (pun intended).

As you can see though, as soon as you provide an alternative opinion, it does not go down well with people who are heavily invested in it.

Here are some comparisons from my DL2, DLSS Quality vs native (AA set to high and sharpness for DLSS set to 49, although I usually set it to 0 as I don't like over-sharpening):

https://imgsli.com/MTAzMTk3

https://imgsli.com/MTAzMTk4

https://imgsli.com/MTAzMTk5

https://imgsli.com/MTAzMjAw

https://imgsli.com/MTAzMjAx

I'm pretty sure a lot of people will have a hard time telling which is which or rather, which one is better looking ;)

There are quite a few areas where shimmering is very obvious in motion too, but with DLSS the shimmering is eliminated. Obviously everyone is sensitive to different things; for me, any kind of shimmering, aliasing or even over-sharpening is far more annoying than a slight loss of detail/blur, and IMO worsens IQ more than the blur does.

That is using DLSS 2.3.7; apparently the latest version has improved slightly.

Here is TechPowerUp's take on DLSS in DL2 (since they are apparently more reputable now, given W1zzard's latest comments ;)):

https://www.techpowerup.com/review/dying-light-2-dlss-vs-fsr-comparison/

Speaking of image quality and performance, compared to native resolution, the DLSS performance uplift at 4K is an impressive improvement to the game. It almost doubles performance in Quality mode, and image quality is more detailed and stable than the TAA/FSR solution. DLSS deals with vegetation details even better than the native resolutions, including 1080p, and small details in the distance are rendered more correctly and complete. Speaking of FSR, with such a disappointing implementation (no Ultra Quality mode and incorrect sharpening filter values), this upscaling solution falls behind in quality more drastically than usual—only at 4K would it perhaps make sense for some users.

DF comparison - https://youtu.be/HnAn5TRz_2g?t=1302

That post of yours is much better, as at least you list out all the different bits, which shows the big picture, rather than your previous posts, which simply focus on one thing, base an opinion purely on that one area and then apply it across the board for all cases.

There is nothing wrong with an alternative opinion; the problem is when people find a situation where a given feature is not as good, then use that as their battering ram to push their narrative/opinion of "this is why X is not good" whilst ignoring everything else that shows the opposite, or ignoring a more well-rounded viewpoint. Which is why your DE RT comparison is also misleading: there are most definitely a lot of areas where the RT reflections are a lot more pronounced, as per the video above. You might as well have just looked at the sky and gone "look, no difference" :p
 
An Nvidia vs AMD thread is never going to go well no matter what the opinion, and reading some of the posts it's quite obvious it's been reduced to tit for tat. :p;):):cry::p:mad::(:cry:;):):p
Personally, for me AMD won: of the 3 Nvidia 3xxx cards and 1 AMD card owned this generation, the AMD card was better for my personal requirements and gaming needs.
On a spin I bought 3060 Tis (so that's 5 x Nvidia 3xxx cards owned) for both my boys, as I think on the whole they are the best cards for the 1080p/1440p gaming that my sons do.
 
I think I'd agree with that summary, though it depends on how you'd judge "winning". I'd consider it a draw, but it seems like some comparisons are a touch strange. As an example, price-wise I would expect the 3080 to be compared more to the 6800 XT, but the 6900 XT seems to be brought into it more, and that was priced higher (by RRP, not necessarily actual price).

I'd say that the 3080 is a better proposition than a 6800 XT, but the 6900 XT is a better proposition than a 3090. Use-case dependent, admittedly.

However, in terms of generational leap, AMD have absolutely won in my opinion. If they can carry on this trajectory, I think that next gen they could end up on top. As mentioned, upscaling quality could end up being the differentiating factor, and I'm converted to the idea. Originally I'd have been a native-or-nothing guy, but with the gains I've seen from DLSS and FSR? The trade-off (and the option to use it, don't forget, it's not forced) is worth it to me.

Generally yes, but some seem to think the 3080 is a mid-range card, or not in the same bracket for 4K.
 
I don't know if I'd class it as mid-range, on the grounds that it just about handles 4K 60 fps in nearly everything, but I'll let other people argue that back and forth as I don't really care. Also, if it is considered mid-range, it shouldn't be compared against a 6900 XT, as that is definitely high end.

If that is mid-range but the 3090 is high end... these ranges are pretty small. It would make a 3060 subterranean. The 6800 XT, 6900 XT, 3080 and 3090 are all damn good cards. If the nit-picking is about the exception that breaks the rule, I couldn't care less. If it were a consistent issue across a wide variety of scenarios, then that would be a problem.
 
I don't know if I'd class it as mid-range, on the grounds that it just about handles 4K 60 fps in nearly everything, but I'll let other people argue that back and forth as I don't really care. Also, if it is considered mid-range, it shouldn't be compared against a 6900 XT, as that is definitely high end.

If that is mid-range but the 3090 is high end... these ranges are pretty small. It would make a 3060 subterranean. The 6800 XT, 6900 XT, 3080 and 3090 are all damn good cards. If the nit-picking is about the exception that breaks the rule, I couldn't care less. If it were a consistent issue across a wide variety of scenarios, then that would be a problem.
1080p and the 3060 are subterranean, as is the 2060.

1080p is low end, 1440p is mid-range, 4K is high end.

The 3080 only just manages 4K 60, so it's a low-end 4K card; the 3080 Ti, 3090 and 3090 Ti are comfortably 4K cards alongside the 6900 XT.
 
I don't know if I'd class it as mid-range, on the grounds that it just about handles 4K 60 fps in nearly everything, but I'll let other people argue that back and forth as I don't really care. Also, if it is considered mid-range, it shouldn't be compared against a 6900 XT, as that is definitely high end.

If that is mid-range but the 3090 is high end... these ranges are pretty small. It would make a 3060 subterranean. The 6800 XT, 6900 XT, 3080 and 3090 are all damn good cards. If the nit-picking is about the exception that breaks the rule, I couldn't care less. If it were a consistent issue across a wide variety of scenarios, then that would be a problem.

Pretty much this.

It's a bit silly to say a 3080 is a mid-range GPU if the same people consider a 6900 XT to be a high-end/flagship device, given a 3080 can sometimes match or even beat a 6900 XT at 4K depending on the title...
 
There were no Ti cards at release; it was just the 3070, 3080 and 3090.

Likewise on the AMD side: just the 6800, 6800 XT and 6900 XT.

Remember, the 3090 was "only 10% better"? At least I do.
 
The 3080 was the flagship and was later replaced by the 3080 Ti as the new flagship. Nvidia even said so on their website. Therefore, the 3080 is high end. Flagship doesn't mean fastest: the dictionary says "the finest, largest, or most important one of a group of things".

The 3090 was basically just for people who needed more VRAM, because the speed improvement is negligible and, given its price, certainly not something I would be most proud of if I ran Nvidia. The 3080, and later the Ti, would be the GPU I was most proud of if I were CEO, so that would be my baby. My flagship.
 
Exactly. So when Ampere released, the 3080 was the flagship and not sold as a mid-range card whatsoever. This is cemented if you work backwards and say 3090 = high end; 10% worse than high end is still high end...

The mid-range cards came later in the cycle. You could argue the 3070 is mid-range, but AMD followed with the 6700 XT. These were targeted at your 1440p users.
 
Exactly. So when Ampere released, the 3080 was the flagship and not sold as a mid-range card whatsoever. This is cemented if you work backwards and say 3090 = high end; 10% worse than high end is still high end...

The mid-range cards came later in the cycle. You could argue the 3070 is mid-range, but AMD followed with the 6700 XT. These were targeted at your 1440p users.

At the price AIBs were selling 3070s for (until now), it was an upper-mid-range card with a flagship price; some are still more than the 3080 MSRP.
 
Well, the good news is you don't have to convince me. :cry: I will leave it to the critical thinkers; just one of those things again, I guess.

I don't know if I'd class it as mid-range, on the grounds that it just about handles 4K 60 fps in nearly everything

It's even showcased on the Nvidia site.
 
It's even showcased on the Nvidia site.

And I'd believe it for those games. I'm pretty sure AMD will show a similar slide of games they get 4K 60 fps in. I haven't seen Nvidia pushing reBAR slides like AMD did with SAM, come to think of it. Also, I recall something about Rage Mode when the 6000 series came out, although there has been almost no mention of it since. What happened with that? It seemed like a good (and frankly logical) idea.
 
And I'd believe it for those games. I'm pretty sure AMD will show a similar slide of games they get 4K 60 fps in. I haven't seen Nvidia pushing reBAR slides like AMD did with SAM, come to think of it. Also, I recall something about Rage Mode when the 6000 series came out, although there has been almost no mention of it since. What happened with that? It seemed like a good (and frankly logical) idea.
Rage Mode was just a slight increase to the default power limit slider. Sounded good though, hey. :D
 
The whole thing is genius, really: give 80% of the performance for the generation at a reasonable price, then segment the remaining 20% across a few more SKUs with terrible price/performance, and let the whales and fans argue over which is worth it. All whilst pocketing increased profits.

Using Nvidia's models, only the 60, 70 and 80 make sense.
 
Not hard to see the gap once again.

  • Legacy rasterisation has been a draw between RDNA2 and Ampere.
  • RT tipped the scales heavily in Ampere's favour with up to twice the performance of RDNA2.
  • Hardware AI support was MIA on RDNA2, allowing Nvidia to provide superior image quality with DLSS while AMD floundered with FSR, which is based on a 30-year-old algorithm (sketched below).
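
The "30-year-old algorithm" jab presumably refers to Lanczos resampling: as I understand it, FSR 1.0's spatial upscaling pass is derived from a modified Lanczos(2) filter, with edge-adaptive weighting and a sharpening pass layered on top. Purely as an illustration of what a windowed-sinc spatial filter looks like, here is a naive 1D sketch (function names are mine; the real shader is 2D, heavily optimised, and this ignores the half-pixel centre offset):

```python
import numpy as np

def lanczos_kernel(x, a: int = 2):
    """Windowed-sinc (Lanczos) kernel with support (-a, a); np.sinc is sin(pi*x)/(pi*x)."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

def lanczos_upscale_1d(samples: np.ndarray, scale: float, a: int = 2) -> np.ndarray:
    """Naive 1D Lanczos resample: each output sample is a kernel-weighted
    sum of the 2*a nearest input samples (edges clamped)."""
    n_in = len(samples)
    n_out = int(round(n_in * scale))
    out = np.empty(n_out)
    for i in range(n_out):
        src = i / scale                           # output position in input coordinates
        taps = np.arange(int(np.floor(src)) - a + 1, int(np.floor(src)) + a + 1)
        weights = lanczos_kernel(src - taps, a)
        taps = np.clip(taps, 0, n_in - 1)         # clamp at the borders
        out[i] = np.dot(weights, samples[taps]) / weights.sum()
    return out

# e.g. stretch a 1440-sample scanline to 2160 samples (the 1440p -> 4K ratio)
row = np.linspace(0.0, 1.0, 1440)
print(lanczos_upscale_1d(row, 2160 / 1440).shape)   # (2160,)
```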

If they were athletes then Nvidia took gold at the Olympics, while AMD turned up a day late for the Special Olympics.

With Intel already backing RT and AI, while also working with Nvidia on Streamline, AMD really need to decide if they wish to be in the desktop market.

How Nvidia Streamline integrates


Nvidia, Intel and Hardware Vendor #3 :eek:
 