Nvidia Ampere vs AMD RDNA2, Who Won The GPU Generation (So Far)?

Caporegime
Joined
4 Jun 2009
Posts
31,017
Interesting, or rather hilarious, reading back in this thread and seeing certain members' viewpoints on HUB's stance back then compared to their recent posts in other threads now.... certain opinions have gone from "Nvidia shills/not trustworthy" to "trustworthy and the best" :cry: Wonder when they will become Nvidia shills again? ;) :D


Would be interesting for them to do another video on this now that prices are getting back to normal, with a larger selection of games as well as the changes to FSR etc.
 
Soldato
Joined
3 Aug 2010
Posts
3,037
I'm going to give this round to AMD. They've done an amazing job closing the performance gap while consuming less power with more VRAM, all in one gen.

Their work on drivers is not bad either, with FSR rapidly gaining adoption. All they need to do is sort out their RT performance for next gen.

Competition is great for us so I hope Intel can also be competitive.
 
Permabanned
Joined
7 Oct 2018
Posts
2,170
Location
Behind Pluto
I'm going to give this round to AMD. They've done an amazing job closing the performance gap while consuming less power with more VRAM, all in one gen.

Their work on drivers is not bad either, with FSR rapidly gaining adoption. All they need to do is sort out their RT performance for next gen.

Competition is great for us so I hope Intel can also be competitive.
I give it to Nvidia, RT performance is much better.

Will clap for AMD making a good jump in performance where it now matters a bit less: raster.

Don't worry, Radeon might destroy Nvidia next round but those are high hopes.
 
Soldato
Joined
25 Sep 2009
Posts
9,627
Location
Billericay, UK
Initial victory for AMD, but now that every new game has some element of ray tracing, Ampere is really showing its muscles.

That only really applies to the high end though; mid-range cards don't really have the grunt for ray tracing effects, so it comes down to pure price/performance.
 
Permabanned
Joined
7 Oct 2018
Posts
2,170
Location
Behind Pluto
Initial victory for AMD, but now that every new game has some element of ray tracing, Ampere is really showing its muscles.

That only really applies to the high end though; mid-range cards don't really have the grunt for ray tracing effects, so it comes down to pure price/performance.

Blanket statement, please back this up with some critical thinking.

I am assuming you are at 4K to be using a high-end GPU? And what games are you basing your claims on?

Are we still comparing with AMD here? Well, Ampere's mid-tier cards beat even the 6900 XT with RT enabled, so not sure about your comparison here.
 
Associate
Joined
4 Oct 2017
Posts
1,216
AMD have gotten better, but Nvidia for me are far stronger in certain aspects.

If you mine, stream or want the best ray tracing performance at high resolutions it's Nvidia.

If you're not bothered about ray tracing and only care about gaming, then AMD are worth going for; they're cheaper and you get similar rasterisation performance.

The next gen of cards is going to be interesting. Let's face it, all AMD need to do is increase their ray tracing performance and it'll be close. If they provide similar differences in power draw, it might even make people look towards AMD regardless.
 
Associate
Joined
4 Oct 2017
Posts
1,216
Don't worry, Radeon might destroy Nvidia next round but those are high hopes.

I think the difference between Nvidia and Intel is that Intel really had run out of ideas.

Nvidia seem to have so much up their sleeve. It won't be as easy.

Still, no one is untouchable; I'm old enough to remember when Nokia were the kings of mobile....
 
Soldato
Joined
6 Aug 2009
Posts
7,071
I'm quite sure some will disagree but I prefer the image output of AMD's cards. I don't do mining and I can wait until ray tracing isn't a big hit to performance.
 
Permabanned
Joined
7 Oct 2018
Posts
2,170
Location
Behind Pluto
I think the difference between Nvidia and Intel is that Intel really had run out of ideas.

Nvidia seem to have so much up their sleeve. It won't be as easy.

Still, no one is untouchable; I'm old enough to remember when Nokia were the kings of mobile....

I still had my N95 8GB edition until 2017, mate. Fully agree with your point of view!
 
Caporegime
Joined
17 Mar 2012
Posts
47,579
Location
ARC-L1, Stanton System
For AMD to take any meaningful market share from Nvidia they need:

Significantly better raster
Significantly better RT
Significantly lower power, as RDNA2 currently has
Temporal up-scaling tech that cannot be faulted by Nvidia themselves
Better software, which they currently have, much better
Meaningfully cheaper prices

Nvidia enjoy 80%+ market share because the default in most people's minds is: just get Nvidia. So they need to be wowed, hard.
 
Caporegime
Joined
4 Jun 2009
Posts
31,017
I'm going to give this round to AMD. They've done amazing closing the performance gap while consuming less power with more VRAM all in one gen.

Their work on drivers is not bad either with FSR rapidly gaining adoption. All they need is to sort out their RT performance for next gen.

Competition is great for us so I hope Intel can also be competitive.

Yup, they did very well this time round. I likely would have gone for a 6800 or 6800 XT if it had been possible to get them in the UK for MSRP, but alas, it wasn't. Looking back though, I'm happy I didn't, as RT and DLSS have proved invaluable over the last year and 4/5 months; it's quite frankly embarrassing having a mid-tier card, i.e. the 3070, match/beat the top-end flagship 6900 XT in RT titles imo.

Just a note on the "consuming less power" bit too: wrinkly posted some interesting insights with regards to RT efficiency:

Sure -
3080 ~340 W @ 47 FPS
6900 XT ~300 W @ 29 FPS
3060 Ti ~200 W @ 27 FPS
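For what it's worth, those quoted figures work out to the perf-per-watt below. A quick sketch in Python; the card names and wattage/FPS numbers are taken from the quote above, and "FPS per 100 W" is just my choice of metric, not anything from the original post:

```python
# RT efficiency from the figures quoted above: FPS per 100 W of board power.
cards = {
    "3080":    {"watts": 340, "fps": 47},
    "6900 XT": {"watts": 300, "fps": 29},
    "3060 Ti": {"watts": 200, "fps": 27},
}

for name, d in cards.items():
    fps_per_100w = d["fps"] / d["watts"] * 100
    print(f"{name}: {fps_per_100w:.1f} FPS per 100 W")

# → 3080: 13.8 FPS per 100 W
# → 6900 XT: 9.7 FPS per 100 W
# → 3060 Ti: 13.5 FPS per 100 W
```

On these numbers the two Ampere cards land within ~2% of each other on RT efficiency, while the 6900 XT trails both by roughly 30%, which is the point being made about RDNA2's RT efficiency.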

I'm not expecting much from RDNA 3 on the RT front; I suspect they will still lag behind Nvidia's latest by quite a bit.

That, and the majority of Ampere cards use way more power than they really need to. I managed to reduce power by 100 W on my 3080, and it's 100% stable in RT.

For AMD to take any meaningful market share from Nvidia they need:

Significantly better raster
Significantly better RT
Significantly lower power, as RDNA2 currently has
Temporal up-scaling tech that cannot be faulted by Nvidia themselves
Better software, which they currently have, much better
Meaningfully cheaper prices

Nvidia enjoy 80%+ market share because the default in most people's minds is: just get Nvidia. So they need to be wowed, hard.

The only thing AMD need to focus on from that list is better RT, and that's it. Well actually, the jury is still out on their upscaling tech, as chances are FSR 2 isn't going to be as good as DLSS, especially if it is based on TAAU and not TSR.....

The above, and possibly getting more sponsored games behind them; Nvidia are drowning them on that front, especially with the quality/status of certain titles, i.e. CP 2077, Metro, Atomic Heart, The Ascent, Control, to name a few.

It doesn't help when AMD's live shows always focus mostly on their partnerships with various companies and they spend maybe 2 minutes on the PC gaming industry, whereas Nvidia spend an entire 30-minute show purely on PC gamers.

I'm quite sure some will disagree but I prefer the image output of AMD's cards. I don't do mining and I can wait until ray tracing isn't a big hit to performance.

Can't say I noticed any difference coming from 4 AMD cards before going to the 3080 (you have to make sure Nvidia control panel is set to use full RGB etc. though).
 
Soldato
Joined
21 Jul 2005
Posts
20,020
Location
Officially least sunny location -Ronskistats
For AMD to take any meaningful market share from Nvidia they need:

Significantly better raster
Significantly better RT
Significantly lower power, as RDNA2 currently has
Temporal up-scaling tech that cannot be faulted by Nvidia themselves
Better software, which they currently have, much better
Meaningfully cheaper prices

:cry::cry:

I think even if they had three out of the six there, they would still be outsold and deemed the inferior product. Critical thinking and all that!
 