Graphics cards 2021 Benchmark

Caporegime
Joined
12 Jul 2007
Posts
40,511
Location
United Kingdom
Pretty comprehensive list of benchmarks run by ComputerBase on the fastest CPU (Ryzen 5000 series) and the PCI-E 4.0 motherboard platform (AM4).

Plenty of rasterization and memory utilisation tests, and some RT benchmarks too. Something for everyone.

I'll link to a few results, more in the link below for you to peruse at your convenience.

1080P
[benchmark charts]

1440P
[benchmark charts]

2160P
[benchmark charts]


Couple of things that stand out to me. Does anyone remember when it was proclaimed that the 3080 was faster than the 6900 XT at 4K? That aged badly.

Also worth noting that the benchmark clearly shows, and comments on the fact, that it isn't that Big Navi scales unusually well at 4K; rather, it's the other GPUs that scale poorly below 4K.

Two common myths busted and put to bed.

Link: AMD Radeon & Nvidia GeForce im Benchmark-Vergleich 2021: Testergebnisse im Detail - ComputerBase
 
Associate
Joined
16 Oct 2012
Posts
218
I'd have to say the testing method seems to be flawed somehow: the 3090 gains performance as the resolution increases.
If it merely remained static then it would clearly be CPU-bound, but what's happening in those results doesn't make sense, as it gains nearly 7% going from 1080p to 4K.
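The CPU-bound reasoning here can be sketched with a toy frame-time model (all per-frame costs below are invented for illustration): the slower of the CPU and GPU cost per frame sets the frame rate, so a CPU limit flattens FPS at low resolutions, but it can never make a higher resolution faster.

```python
# Toy frame-time model of the CPU-vs-GPU bottleneck argument.
# All per-frame costs are invented for illustration.

def fps(cpu_ms, gpu_ms):
    # Each frame needs both CPU and GPU work; the slower one sets the pace.
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 6.0                                          # fixed CPU cost per frame
gpu_ms = {"1080p": 4.0, "1440p": 7.0, "2160p": 12.5}  # GPU cost grows with pixels

for res, g in gpu_ms.items():
    limit = "CPU" if cpu_ms >= g else "GPU"
    print(f"{res}: {fps(cpu_ms, g):.0f} fps ({limit}-bound)")
```

Under this model the 1080p result is capped by the CPU (167 fps instead of the 250 fps the GPU alone could deliver), yet FPS still falls monotonically as resolution rises; a CPU limit alone can't produce a genuine inversion.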
 
Caporegime
OP
Joined
12 Jul 2007
Posts
40,511
Location
United Kingdom
I'd have to say the testing method seems to be flawed somehow: the 3090 gains performance as the resolution increases.
If it merely remained static then it would clearly be CPU-bound, but what's happening in those results doesn't make sense, as it gains nearly 7% going from 1080p to 4K.
Taken from the review (Google Translate):

A weakness at high or at low resolutions?
The question now is whether AMD's RDNA 2 pulls ahead somewhat in Ultra HD, or whether Nvidia's Ampere simply can't properly utilise its massively increased number of execution units at low resolutions, given a barely changed front end compared to its predecessor. The new benchmarks point to the latter, as they have done before.


The Radeon RX 6900 XT maintains its lead over the GeForce RTX 2080 Ti across all three resolutions (40 percent in Full HD, 42 percent in WQHD and 41 percent in Ultra HD). The GeForce RTX 3090, on the other hand, significantly increases its gap to the GeForce RTX 2080 Ti at high resolutions (35 percent in Full HD, 41 percent in WQHD, 47 percent in Ultra HD), suggesting that the large Ampere GPUs need many pixels to properly utilise their ALUs. With the Ryzen 9 5950X used, a CPU limit no longer exists from WQHD at the latest.
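To make the quoted leads concrete: each "percent faster" figure is just an FPS ratio minus one. The FPS numbers in this sketch are invented purely so the computed leads reproduce the quoted 35/41/47 percent figures; they are not ComputerBase's measurements.

```python
# Hypothetical FPS values chosen so the computed leads match the quoted
# 35/41/47 percent figures; not ComputerBase's raw data.
fps = {
    "RTX 3090":    {"Full HD": 160.0, "WQHD": 130.0, "Ultra HD": 80.0},
    "RTX 2080 Ti": {"Full HD": 118.5, "WQHD":  92.0, "Ultra HD": 54.5},
}

for res in ("Full HD", "WQHD", "Ultra HD"):
    lead = fps["RTX 3090"][res] / fps["RTX 2080 Ti"][res] - 1.0
    print(f"{res}: 3090 leads the 2080 Ti by {lead * 100:.0f}%")
```

Note that this is a relative lead: it can grow with resolution even though the 3090's absolute FPS still falls at every step.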
 
Associate
Joined
16 Oct 2012
Posts
218
That doesn't make my statement wrong. There is a flaw in their testing and it needs to be addressed. The high-end Nvidia cards are uniformly faster at higher resolutions than they are at lower resolutions.
That's completely counterintuitive and indicates a flaw in methodology.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
The big issue RDNA2 still has is the lack of next gen performance. Poor ray tracing and no DLSS option.

Benchmarks mit Raytracing in Full HD, WQHD und Ultra HD
Ray tracing is still Nvidia land. The GeForce RTX 3080 is on average 33 percent faster at 2,560 × 1,440 than the Radeon RX 6800 XT, which in turn is on par with the GeForce RTX 3070 and GeForce RTX 2080 Ti. The higher the resolution, the further Nvidia is ahead: in 1,920 × 1,080 the RTX 3080 is "only" 29 percent ahead of the RX 6800 XT, in 3,840 × 2,160 it is 40 percent.

The individual games then show that AMD's deficit fluctuates massively depending on the title. Control doesn't like RDNA 2 at all with ray tracing enabled: there the GeForce RTX 3080 in WQHD delivers 61 and 36 percent more FPS than the Radeon RX 6800 XT. Call of Duty: Black Ops Cold War is pretty average here, with Nvidia doing 33 and 34 percent better in the game. Watch Dogs: Legion, on the other hand, runs much better on RDNA 2 with ray tracing; here Nvidia's lead in 2,560 × 1,440 is a significantly lower 10 and 26 percent. Both graphics cards do almost equally well in Full HD.

Benchmarks mit Nvidia DLSS inklusive Raytracing
GeForce RTX graphics cards see a massive increase in performance with DLSS, which is particularly useful in combination with ray tracing, and since DLSS 2.0 the result often looks good. With the target resolution Ultra HD and the DLSS setting "Quality" (rendering resolution: WQHD), the GeForce RTX 3080 is on average 60 percent faster than at native resolution, and the GeForce RTX 3070 even gains 89 percent.

Control is the model game for DLSS performance: it gains a whopping 90 percent on the GeForce RTX 3080 thanks to DLSS. CoD: Cold War shows a smaller plus at 47 percent, and the same applies to Watch Dogs: Legion, also at 47 percent. Cyberpunk 2077 is in the middle; here performance increases by 72 percent thanks to DLSS.
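The "rendering resolution: WQHD" detail follows from DLSS's per-axis scale factor: Quality mode renders each axis at 2/3 of the target. A small sketch; the Balanced and Performance factors here are the commonly cited ones, not taken from the review.

```python
# DLSS input (render) resolution derived from the output (target) resolution.
# Quality = 2/3 per axis matches the review's 4K-from-WQHD example; the
# other factors are commonly cited values, included only for illustration.
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def render_resolution(target_w, target_h, mode):
    s = DLSS_SCALE[mode]
    return round(target_w * s), round(target_h * s)

print(render_resolution(3840, 2160, "Quality"))      # (2560, 1440): WQHD input for a 4K target
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```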

Looking at the ray tracing benchmarks here, we can see RDNA2 has ~50% of the performance of Ampere. Not good when buying what is supposed to be a next-gen card. It will be interesting to bench Metro Exodus Enhanced Edition.
 
Permabanned
Joined
22 Jul 2020
Posts
2,898
I am normal, I just on occasion checked the Nvidia store at random times. I don't know if this has changed since then (mid January), but I ordered a 3090 FE at around 12:50am on Friday and had it on Sunday afternoon.
 

HRL


Soldato
Joined
22 Nov 2005
Posts
3,026
Location
Devon
I'd have to say the testing method seems to be flawed somehow: the 3090 gains performance as the resolution increases.
If it merely remained static then it would clearly be CPU-bound, but what's happening in those results doesn't make sense, as it gains nearly 7% going from 1080p to 4K.

That was picked up at launch with the 3090.

Makes me a little relieved as I bought one and game exclusively at 4K, so it turned out to be the right choice for me at least.
 
Associate
Joined
16 Oct 2012
Posts
218
Picked up by who?
I've never before seen a benchmark result where 4K is 7% faster than 1080p.
 

HRL


Soldato
Joined
22 Nov 2005
Posts
3,026
Location
Devon
Picked up by who?
I've never before seen a benchmark result where 4K is 7% faster than 1080p.

Definitely remember reading it in more than one review after the cards launched.

Afraid I can’t be bothered to go scouring the Internet for the reviews though.
 
Associate
Joined
24 Mar 2011
Posts
632
Location
Cambridgeshire
I certainly don't remember seeing anything like that.

It's well known the 3090 scales nicely as the resolution increases (or is slow at 1080p if you work for AMD :p), but it should never get faster. Staying the same if CPU-limited, sure, but that's quite a notable difference.

I own a 3090 too and it certainly doesn't get faster when I go from 1080p to 4K!
 
Associate
Joined
16 Oct 2012
Posts
218
Afraid I can’t be bothered to go scouring the Internet for the reviews though.

You know, just this once I have the day off, and in the spirit of trying to increase my own knowledge I have looked at this in detail. The reason it stuck out to me is that 7% is a huge performance margin; that's way outside the deviation expected between test runs. For reference, it's about the same as the expected difference between a 3080 and a 3090 in most circumstances.

It turns out that most reviewers simply aren't testing 1080p for these cards any more, on the reasonable assumption that if you are spending this much on a GPU you are probably spending at least that much on your monitor too. However, TechPowerUp, Guru3D, Hardware Canucks, and Hexus all continue to test at 1080p.

Reading through each of their reviews, I've found only 2 tests which even come close: TechPowerUp's Anno 1800 test and Hexus's Civ 6 test. Every single other test shows an increase in FPS as resolution is decreased, and so the average of results should also reflect this.

https://www.techpowerup.com/review/msi-geforce-rtx-3090-suprim-x/5.html
https://hexus.net/tech/reviews/graphics/146083-msi-geforce-rtx-3090-gaming-x-trio/?page=5

In both cases you can see that the tests are clearly CPU-limited, so 1080p performs essentially identically to 4K. It does show that a performance inversion is possible, albeit within 2%, which I feel is about what you might expect.

I've not found a single example elsewhere of a performance inversion as large as the one shown by ComputerBase.
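One way an overall rating can invert without any individual game inverting: if the summary figure is a mean of per-game ratios and some games are CPU-capped at 1080p, the faster card's averaged 1080p lead gets compressed. A hypothetical illustration (invented numbers, geometric-mean averaging assumed), not a claim about ComputerBase's actual method.

```python
from statistics import geometric_mean

# Invented FPS pairs (card_a, card_b) per game; geometric-mean averaging
# is assumed here, as it is common for benchmark summary ratings.
games_1080p = {"Game X": (240, 240),   # CPU-capped: both cards hit the same limit
               "Game Y": (200, 150)}   # GPU-bound: card A is 33% ahead
games_4k    = {"Game X": (90, 60),
               "Game Y": (75, 55)}

def avg_lead(games):
    # Geometric mean of per-game FPS ratios of card A over card B.
    return geometric_mean(a / b for a, b in games.values()) - 1.0

print(f"1080p average lead: {avg_lead(games_1080p) * 100:.0f}%")
print(f"4K average lead:    {avg_lead(games_4k) * 100:.0f}%")
```

Here card A is never slower at 1080p in any single game, yet its averaged lead jumps from 15% at 1080p to 43% at 4K purely because of the CPU cap in one title.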
 
Associate
Joined
16 Oct 2012
Posts
218
It's not a translation issue, it's a methodology issue which unfortunately undermines all of the results they have generated.
 
Associate
Joined
29 Dec 2003
Posts
1,933
Looking at the comparative performance, I’m pleased with how the £480 (delivered) 3070 I bought a couple of weeks ago does. I was getting close to buying a 6800 for £720+ or a 6800XT for £850+. Neither would have come close to providing similar value.
 
Associate
Joined
4 Oct 2017
Posts
1,216
I'd have to say the testing method seems to be flawed somehow: the 3090 gains performance as the resolution increases.
If it merely remained static then it would clearly be CPU-bound, but what's happening in those results doesn't make sense, as it gains nearly 7% going from 1080p to 4K.

You're right, in theory it doesn't make sense. Performance should scale: if card X is more powerful than card Y, card X should offer faster performance at every resolution.

However it's been said that Ampere just tends to work better at higher resolutions due to the architecture.

I have a 3090 and I've tried gaming at 1440p, but it actually looks really low-res to me now lol. Granted, it's on a native 4K OLED TV, so of course non-native resolutions look softer. I'm happy just gaming at a locked 60fps with max settings on most games.

Here's just one of many videos on the subject (yes, I know it's focused on the 3080, but it applies to all the cards)

 