The thread which sometimes talks about RDNA2

Status: Not open for further replies.
Soldato · Joined: 6 Feb 2019 · Posts: 17,565
Except in that game, with the settings used, the 3090 is slower than the 2080 Ti, so clearly there is a driver issue.

Yeah, there is something wrong with that guy's system or the driver, because the 2080 Ti is beating the 3090 even at 4K. In his bench the 3090 has the same performance at 1080p/1440p/2160p, which is weird. Maybe run the game at 8K. Nothing would surprise me though; WoW has always run like ****, it's a super old game with a bad engine. Compounding the issue, Nvidia haven't released a driver for it; you'll only see a new driver for this game when the new expansion officially launches.
 
Soldato · Joined: 8 Jun 2018 · Posts: 2,827
And if he thinks AMD won't eke more performance out of the core, he's sadly mistaken. A few months down the line, in all likelihood the 6800 XT will be faster at 4K as well.

It's very probable that the 6900 XT won't launch on the same press driver as the 6800 XT, as AMD have usually delivered their big performance drivers in December over the last couple of years. Very likely there will be performance uplifts for the entire 6000 series.
 
Soldato · Joined: 8 Jun 2018 · Posts: 2,827
Under the hood, Rage Mode is a profile stored in the BIOS. It has four vendor-defined values that Rage Mode overrides: power limit, temperature target, RPM target and acoustic limit. Note: no change in clocks or voltage.
That's from W1zzard. Interesting stuff. BIOS mods are going to be very interesting.
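
As a sketch of what that description amounts to: this is purely illustrative Python based on W1zzard's summary; the field names and the stock/Rage values are my guesses, not AMD's actual BIOS layout.

Code:
from dataclasses import dataclass

# Hypothetical model of the four vendor-defined values W1zzard describes.
# Names and numbers are invented for illustration only.
@dataclass
class PowerProfile:
    power_limit_w: int        # board power limit (watts)
    temp_target_c: int        # temperature target (deg C)
    fan_rpm_target: int       # fan RPM target
    acoustic_limit_rpm: int   # acoustic limit (max fan RPM)

stock = PowerProfile(power_limit_w=255, temp_target_c=90,
                     fan_rpm_target=1500, acoustic_limit_rpm=2150)

# Rage Mode would simply swap in a second BIOS-stored profile like this.
# Note there is no clock or voltage field to touch.
rage = PowerProfile(power_limit_w=293, temp_target_c=95,
                    fan_rpm_target=1800, acoustic_limit_rpm=2450)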

Update:
1210 pages for the RDNA thread vs 1190 for the Ampere one :D
 
Soldato · Joined: 8 Nov 2006 · Posts: 22,979 · Location: London
I am not a fan of Nvidia, but I see a huge drop on both Radeon cards at 4K. There is no reason not to keep the same lead at 4K as they have at 1080p and 1440p; they have faster frequencies than Nvidia and more VRAM. But if you check the benchmarks you will see them both dropping a lot more FPS at 4K than their Nvidia opponents.
Even when the reviews are favourable to them, like the HU review, you can notice the huge drop every time. Look at AC Valhalla: that's a huge drop in performance. At 1440p both cards are ahead of the 3080 and even the 3090.
Again, I am not an Nvidia fan. I wish AMD could do better, but I am afraid they missed a huge opportunity here.

Ampere is underperforming at lower resolutions; RDNA2 is scaling fine.

Ampere's poor scaling could be seen even before RDNA2 came out, because its gains over the 2080 Ti weren't what they should have been at 1440p.

The question is whether it's an architecture issue, where Ampere is simply best suited to 4K, or a software issue. AMD had better hope it's the former, because if Nvidia find a way of getting their 4K performance scaling (over Turing) at 1440p as well, it will put some pressure on AMD.

That isn't to say AMD aren't hampered by the lower bandwidth at 4K, but it's not the main reason why Ampere catches up and overtakes at 4K.
 
Soldato · Joined: 6 Feb 2019 · Posts: 17,565
The question is whether it's an architecture issue, where Ampere is simply best suited to 4K, or a software issue. AMD had better hope it's the former, because if Nvidia find a way of getting their 4K performance scaling (over Turing) at 1440p as well, it will put some pressure on AMD.

It doesn't even seem to be the number of pixels, but simply the load on the GPU. The previous World of Warcraft benchmark showed the 3090 underperforming at 4K too, probably because it's an old game and doesn't place enough load on the GPU. Essentially Ampere wants to be GPU-bottlenecked and running flat out; when it's not being taxed it runs like ass.
 
Associate · Joined: 7 Apr 2006 · Posts: 940 · Location: Oblivion aka East Anglia
Ampere is underperforming at lower resolutions; RDNA2 is scaling fine.

Ampere's poor scaling could be seen even before RDNA2 came out, because its gains over the 2080 Ti weren't what they should have been at 1440p.

The question is whether it's an architecture issue, where Ampere is simply best suited to 4K, or a software issue. AMD had better hope it's the former, because if Nvidia find a way of getting their 4K performance scaling (over Turing) at 1440p as well, it will put some pressure on AMD.

That isn't to say AMD aren't hampered by the lower bandwidth at 4K, but it's not the main reason why Ampere catches up and overtakes at 4K.

I don't think the 6800s are bandwidth limited; they seem to scale well, and overclocking the memory doesn't offer much extra performance from what I've seen. Ampere performing better, slightly better, at 4K looks architectural, like the shaders don't have as much to do at lower resolutions, so the performance uplift drops off, relatively speaking.

But the difference at 4K is hardly substantial, is it? If you put them side by side no one would tell the difference... The way some people are talking, you'd think the Radeons were unusable at 4K.
 
Soldato · Joined: 8 Jun 2018 · Posts: 2,827
I don't think the 6800s are bandwidth limited; they seem to scale well, and overclocking the memory doesn't offer much extra performance from what I've seen. Ampere performing better, slightly better, at 4K looks architectural, like the shaders don't have as much to do at lower resolutions, so the performance uplift drops off, relatively speaking.

But the difference at 4K is hardly substantial, is it? If you put them side by side no one would tell the difference... The way some people are talking, you'd think the Radeons were unusable at 4K.
This is all purely an Nvidia marketing ploy.

That's why it's being propagated through averages from reviewers who leaned heavily on Nvidia-sponsored titles, to suggest the 3080 is roughly 8% faster. Because taken individually, the difference at 4K in those games is hardly substantial; at best the 3080 wins by a few FPS, minuscule imo.

And to further add, a lot of those reviewers didn't include Dirt 5, Godfall or Valhalla (or some combination of them), which we know perform well on Radeon.

Look at the Guru3D results (no Dirt 5/Godfall, lots of Nvidia-sponsored titles): https://www.guru3d.com/articles-pages/amd-radeon-rx-6800-xt-review,1.html
6% for the 3080

Look at the ComputerBase results (no Dirt 5/Godfall, all Nvidia-sponsored titles): https://www.computerbase.de/2020-11/amd-radeon-rx-6800-xt-test/3/#abschnitt_benchmarks_in_3840__2160
6% for the 3080

Look at the TechPowerUp results (no Dirt 5/Godfall, lots of Nvidia-sponsored titles): https://www.techpowerup.com/review/amd-radeon-rx-6800-xt/
6% for the 3080

You start to see a pattern. Now, not all of those games are Nvidia-sponsored, but it's easy to lie using percentages. I left out the outliers showing well over 6% for the 3080 and used the websites most commonly referred to here.
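
To show what I mean about percentages, here's a toy example (every FPS number invented, purely illustrative): the headline figure swings entirely on which titles go into the average.

Code:
# Invented 4K FPS numbers: (6800 XT, 3080) per game.
games = {
    "NV-sponsored A": (97, 105),
    "NV-sponsored B": (88, 96),
    "Neutral C":      (104, 106),
    "Dirt 5":         (112, 99),
    "Valhalla":       (78, 70),
}

def lead_3080(selection):
    """Average per-game 3080/6800XT ratio, as a % lead for the 3080."""
    ratios = [nv / amd for amd, nv in (games[g] for g in selection)]
    return (sum(ratios) / len(ratios) - 1) * 100

print(lead_3080(["NV-sponsored A", "NV-sponsored B", "Neutral C"]))  # ~ +6%
print(lead_3080(list(games)))                                        # ~ 0%

Same cards, same runs; only the game list changed.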

Looking back at TPU, they never updated their Godfall benchmark with 6000-series results: https://www.techpowerup.com/review/godfall-benchmark-test-performance-analysis/4.html
Yet they didn't include the game in the 6000-series review only four days later either.

Here is a review that does include Godfall and Dirt 5; as you can see, Radeon does well:
https://www.techspot.com/review/2144-amd-radeon-6800-xt/

Cliffs:

The point is that with newer console-ported games that properly use parallel asynchronous compute and DXR 1.1 (which AMD helped create with MS), the RX 6000 series shows better results, and those results should mirror the improvements the 5700 XT saw as drivers matured. That's the reason you are seeing this "3080 is better by 7-8%" campaign.

This marketing campaign is used to suppress the fact that, as newer console-ported titles come to PC, Radeon is the better/smarter buy: benchmark older titles along with Nvidia-sponsored games, aggregate them into a single percentage to hide where that percentage comes from, and leave out the newer console-ported games that do better on Radeon.
:D

No wonder this pic was used earlier in this thread

Nvidia marketing is attempting to keep buyers from realising that newer console-ported games suit the RX 6000 series. Kinda devious if you ask me. Like some Illuminati symbolism.
:p;)
 
Soldato · Joined: 17 Aug 2009 · Posts: 10,719
An averaged number created from a filtered selection of games has issues.

These sample sizes are not large, and multi-game averages are easily distorted by adding and removing titles as you please.

Averaging also obscures instances of especially good or especially bad performance, which is exactly what matters when you want to play specific games.

The less multi-game averages are waved around, the better.
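
To illustrate that last point with made-up numbers: two cards can post an identical multi-game average while behaving completely differently in the games you actually play.

Code:
# Invented FPS results for two hypothetical cards across five games.
card_a = [100, 100, 100, 100, 100]
card_b = [140, 130, 100, 70, 60]    # same mean, wild per-game swings

mean = lambda fps: sum(fps) / len(fps)
print(mean(card_a), mean(card_b))   # 100.0 100.0 -- identical on a chart
print(min(card_b), max(card_b))     # 60 140 -- what you actually feel in-game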
 
Soldato · Joined: 8 Jun 2018 · Posts: 2,827
An averaged number created from a filtered selection of games has issues.

These sample sizes are not large, and multi-game averages are easily distorted by adding and removing titles as you please.

Averaging also obscures instances of especially good or especially bad performance, which is exactly what matters when you want to play specific games.

The less multi-game averages are waved around, the better.
It's why this is so heavily used in marketing. :D
 
Soldato · Joined: 6 Feb 2019 · Posts: 17,565
An averaged number created from a filtered selection of games has issues.

These sample sizes are not large, and multi-game averages are easily distorted by adding and removing titles as you please.

Averaging also obscures instances of especially good or especially bad performance, which is exactly what matters when you want to play specific games.

The less multi-game averages are waved around, the better.

I'd suggest jumping on YouTube and finding benches for the games you actually play.

Aggregated average performance numbers are meant to represent how a cross-section of titles performs, so they're useful if you play a range of games.

But if you really only care about one or two games, then go and find benchmarks for those games; don't use an aggregate. Aggregates should only be used by gamers (like me) who play a wide range of games (I typically go through 30 new games a year).
 
Associate · Joined: 25 Apr 2017 · Posts: 1,118
This is all purely an Nvidia marketing ploy.

That's why it's being propagated through averages from reviewers who leaned heavily on Nvidia-sponsored titles, to suggest the 3080 is roughly 8% faster. Because taken individually, the difference at 4K in those games is hardly substantial; at best the 3080 wins by a few FPS, minuscule imo.

And to further add, a lot of those reviewers didn't include Dirt 5, Godfall or Valhalla (or some combination of them), which we know perform well on Radeon.

Look at the Guru3D results (no Dirt 5/Godfall, lots of Nvidia-sponsored titles): https://www.guru3d.com/articles-pages/amd-radeon-rx-6800-xt-review,1.html
6% for the 3080

Look at the ComputerBase results (no Dirt 5/Godfall, all Nvidia-sponsored titles): https://www.computerbase.de/2020-11/amd-radeon-rx-6800-xt-test/3/#abschnitt_benchmarks_in_3840__2160
6% for the 3080

Look at the TechPowerUp results (no Dirt 5/Godfall, lots of Nvidia-sponsored titles): https://www.techpowerup.com/review/amd-radeon-rx-6800-xt/
6% for the 3080

You start to see a pattern. Now, not all of those games are Nvidia-sponsored, but it's easy to lie using percentages. I left out the outliers showing well over 6% for the 3080 and used the websites most commonly referred to here.

Looking back at TPU, they never updated their Godfall benchmark with 6000-series results: https://www.techpowerup.com/review/godfall-benchmark-test-performance-analysis/4.html
Yet they didn't include the game in the 6000-series review only four days later either.

Here is a review that does include Godfall and Dirt 5; as you can see, Radeon does well:
https://www.techspot.com/review/2144-amd-radeon-6800-xt/

Cliffs:

The point is that with newer console-ported games that properly use parallel asynchronous compute and DXR 1.1 (which AMD helped create with MS), the RX 6000 series shows better results, and those results should mirror the improvements the 5700 XT saw as drivers matured. That's the reason you are seeing this "3080 is better by 7-8%" campaign.

This marketing campaign is used to suppress the fact that, as newer console-ported titles come to PC, Radeon is the better/smarter buy: benchmark older titles along with Nvidia-sponsored games, aggregate them into a single percentage to hide where that percentage comes from, and leave out the newer console-ported games that do better on Radeon.
:D

No wonder this pic was used earlier in this thread

Nvidia marketing is attempting to keep buyers from realising that newer console-ported games suit the RX 6000 series. Kinda devious if you ask me. Like some Illuminati symbolism.
:p;)
Valhalla should not be used for benchmarking. It's fine that you want it included, as it would skew results in AMD's favour, but there is something wrong with Nvidia's cards in that game. Since when was the 2080 Ti only 6 FPS faster than the 2080 (non-Super)? Because that's what this game shows.
 
Soldato · Joined: 6 Feb 2019 · Posts: 17,565
What's up with the performance in Cold War? 100 FPS with no RT down to 15 FPS with RT.

Now this is interesting: the 6800 XT pulls 250 W with no RT and 150 W with RT. That suggests to me the shader cores are getting massively bottlenecked by RT cores that can't keep up with the rasterization.
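
Quick back-of-envelope on those figures (just the numbers quoted above, nothing measured by me):

Code:
# Cold War figures quoted above: no-RT vs RT.
fps_no_rt, fps_rt = 100, 15
watts_no_rt, watts_rt = 250, 150

print(fps_no_rt / fps_rt)        # ~6.7x fewer frames with RT on
print(watts_no_rt / watts_rt)    # but only ~1.7x less power drawn
print(fps_no_rt / watts_no_rt,   # 0.4 FPS per watt without RT...
      fps_rt / watts_rt)         # ...0.1 FPS per watt with RT

Frames per watt drop 4x, which fits the idea of the shader array sitting mostly idle waiting on the RT units.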


Minecraft runs pretty well though

 