Radeon RX Vega 64 vs. GeForce RTX 2080, Adrenalin 2019 Edition Driver Update Benchmark Test

Soldato · Joined 25 Nov 2011 · 20,639 posts · The KOP
Thought this video was worth a thread. Like the guy uploading the video, I was also quite surprised by some of the results. As expected, the 2080 wins in most of the games, but not by as much as I was expecting.


From the video description:

Radeon RX Vega 64 vs. GeForce RTX 2080, 2019 update. A like or share (e.g. Reddit) helps a lot. Thank you. All tests at 1440p.

00:01 - Assassin's Creed Origins | very high
02:11 - Battlefield 1 | high
02:57 - Battlefield V | high
04:06 - Civilization 6 | max
04:36 - Call of Duty: Black Ops 4 | max
05:26 - Deus Ex: Mankind Divided | very high
07:04 - Far Cry 5 | high (HD textures)
08:00 - Fortnite: Battle Royale | high
09:21 - Forza Horizon 4 | ultra
10:50 - Monster Hunter: World | high
12:20 - PUBG | high
13:13 - Shadow of the Tomb Raider | highest
14:41 - Tom Clancy's Rainbow Six: Siege | very high
15:29 - Total War: Warhammer II | high
16:31 - Wolfenstein 2 | max

General Notes: Thought it was time to compare the two post-Adrenalin and pre-Radeon VII launch, and see where they stand and what the performance delta now is in certain games.
The colors are different because of Nvidia's default HDMI color range settings (this can be changed to full, but I forgot).

I tend to set presets to max only in well-optimized games, where the visual benefits outweigh the loss in performance (which is rarely the case).

Since the RTX card has a factory OC, I gave the RX Vega 64 an easy OC that everybody should be able to achieve with a few clicks: undervolted the P6 state by 50mV, increased the power target by 25% and overclocked the HBM by 100MHz. I expect many Vega buyers do that anyway.
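As a rough sanity check on what that tweak changes on paper, here is a back-of-the-envelope sketch. The stock figures used (945MHz HBM2 clock, 2048-bit bus, 220W GPU power limit) are commonly quoted Vega 64 air-cooled numbers, not values stated in the video:

```python
# Rough numbers for the Vega 64 tweak described above.
# Assumed stock values (not from the video): 945 MHz HBM2 clock,
# 2048-bit memory bus, 220 W default GPU power limit.
STOCK_HBM_MHZ = 945
BUS_WIDTH_BITS = 2048
STOCK_POWER_W = 220

def hbm2_bandwidth_gbs(clock_mhz: float) -> float:
    """HBM2 is double data rate: bytes/s = clock * 2 * bus_width_bits / 8."""
    return clock_mhz * 1e6 * 2 * BUS_WIDTH_BITS / 8 / 1e9

stock_bw = hbm2_bandwidth_gbs(STOCK_HBM_MHZ)        # ~483.8 GB/s
oc_bw = hbm2_bandwidth_gbs(STOCK_HBM_MHZ + 100)     # ~535.0 GB/s
power_limit = STOCK_POWER_W * 1.25                  # 275 W

print(f"HBM bandwidth: {stock_bw:.1f} -> {oc_bw:.1f} GB/s "
      f"(+{(oc_bw / stock_bw - 1) * 100:.1f}%)")
print(f"GPU power limit at +25% PT: {power_limit:.0f} W")
```

So the HBM bump alone is worth roughly 10% more memory bandwidth on paper, while the undervolt and power target mainly stop the core throttling.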

Game Notes:
Despite BFV being an "Nvidia game", Vega 64 comes much closer to the 2080 than it does in BF1. Surprising.

Surprise win for Vega in Civilization 6.

Far Cry 5: As expected, the two cards are not that far apart.

Fortnite: Something is wrong here. AMD GPUs do not do well in this UE game in general, but it seems there is a terrible software bottleneck now. I will have a closer look again (it could be the latest patch).

Forza Horizon 4: Just like with prior Forza games, Nvidia has improved performance significantly post launch. Almost all online tests comparing Nvidia and AMD were done with pre-game-ready Nvidia drivers and early game builds.

Wolfenstein II implements Nvidia's Content Adaptive Shading which helps the 2080 to pull ahead further. The RX still performs great here nonetheless.

Hope you liked that video.
Intel Core i9-9900K at 4.0GHz
2x8GB DDR4-3400
Palit GeForce RTX 2080 Gaming Pro OC
Sapphire Radeon RX Vega 64 Limited Edition


Let's look at the RE2 demo ("yeah, I know it's a demo").

Vega shows good gains here, and it's defo not running an OC!!


And more:
Look at the Battlefield 5 performance! It's clear to see that Vega can do really well when either the game is optimised for Vega or AMD is simply getting more out of Vega lately. I tell you what, it's interesting to keep an eye on Vega and see how well it ages.
A Ryzen 5 was used this time:

AMD Ryzen 5 2600X
X470 ASRock Master SLI
GSkill F4-3200C15D-16GTZKO

Sapphire NITRO Radeon RX Vega 64
MSI RTX 2080 TRIO
Corsair AX 860


1440p high

This guy understands

MiauX, 2 months ago:
Factory OC 2080 vs full stock Vega 64, and Vega is still competitive with a 900€ GPU. Not a bad buy in my book.

Edit:
And point proven again, this time with an HBM OC and BF5 at Ultra, 1440p.
Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
The guy should not have given the Vega card an overclock, as it is too big; it totally trashed the comparison, making the video pointless.

The 2080 card used in the video has an 1815MHz boost clock, compared to a Founders Edition card's 1800MHz: less than a 1% difference.

The Vega card was hardly overclocked, kaap. All he did was fix the throttling that would happen out of the box.
1500MHz is nothing compared to the Liquid version, which can hit 1600MHz out of the box and be pushed to 1700MHz with tweaking.
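For perspective, here's a quick percentage check on the clock figures being thrown around in these two posts (numbers taken from the posts above; actual in-game boost behaviour will vary):

```python
def pct_diff(new: float, base: float) -> float:
    """Relative difference of new vs base, in percent."""
    return (new - base) / base * 100

# Factory-OC 2080 boost vs Founders Edition boost
print(f"2080 factory OC: +{pct_diff(1815, 1800):.2f}%")        # +0.83%

# Air-cooled Vega 64 at ~1500 MHz vs the Liquid edition figures quoted
print(f"Liquid stock vs air: +{pct_diff(1600, 1500):.2f}%")    # +6.67%
print(f"Liquid tweaked vs air: +{pct_diff(1700, 1500):.2f}%")  # +13.33%
```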
Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
That is wrong to presume imo; you would be lucky if 40-50% get tweaked at all (and Vega is pretty much an enthusiast-only card).

That should have said would not happen.

I would expect the percentage of people buying Vega who know about tweaking and undervolting to be much higher, considering AMD also puts Wattman in their drivers for people to find and tweak.
Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
Correction:
AMD uses YCbCr 4:4:4, which is still full PC range, 0-255.
Nvidia by default uses limited-range RGB 4:4:4, 16-235.

Nvidia-signal-table.png
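For anyone curious what those ranges mean in practice, this is the standard expansion from limited-range (16-235) levels back to full-range (0-255) PC levels; a minimal 8-bit sketch, not actual driver code:

```python
def limited_to_full(y: int) -> int:
    """Expand an 8-bit limited-range value (16-235) to full range (0-255).

    Values outside 16-235 are clipped first. If a display instead treats
    a limited-range signal as full range, black sits at level 16 and
    white at 235, which gives the washed-out look discussed in this thread.
    """
    y = min(max(y, 16), 235)
    return round((y - 16) * 255 / 219)

assert limited_to_full(16) == 0      # video black -> PC black
assert limited_to_full(235) == 255   # video white -> PC white
assert limited_to_full(126) == 128   # mid grey stays roughly mid
```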
Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
The point I am making is that he is no longer comparing like with like as standard cards.

+100MHz on the HBM and +25% PT is a noticeable OC too.

I am aware that the guys on the forum have been quite successful at overclocking Vega, but Turing can also overclock a long way. This morning my RTX Titans were running at 2100MHz on air with stock volts and BIOS; if I changed those, they would go even higher.

https://www.3dmark.com/pr/22830

Stock HBM is 950MHz on my 64, so that's only a 50MHz increase; from my testing on my GPU, that is very little difference. Even 1050MHz is not worth it, which is why I keep my Vega at 950MHz.
On the balanced preset my Vega runs between 1200 and 1400MHz, up and down based on temperature, because of the aggressive voltage out of the box.
Simply lowering the voltage makes my Vega run at 1500MHz all the time.
Since the 2080 is tweaked out of the box with a factory OC, it's only fair, like he did, to give the Vega a little tweak to stop it throttling.
Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
I found on Vega 64 that running the memory at 1050MHz gave a very noticeable performance bump.

Getting the HBM2 on my other cards up to 1050MHz also gave good results.

In games or 3DMark? Because in games I've seen no difference; I can even run a test to show this.
It's when you have Tony's HBM speed that you see the changes.
Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
So why did the guy in the video even overclock the HBM2?

I find I get a performance bump in both by overclocking the HBM2, but for normal gaming I don't overclock anything.

Because he thinks most users will bump this up a touch, like he said. Although it really doesn't make a difference when the increase is this small; there's very little in it, and that has always been the case for me while testing games. So I just leave it at the 945MHz stock.

Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
I disagree, though; I definitely see scaling from HBM overclocking. Here's a crude collection of HBM scaling results on Fire Strike that I carried out.
I made sure that at each HBM clock change, the core clock still reached 1550MHz in each test.

hbm-scaling.png

Again, like I said, 3DMark does show changes with HBM. Games, on the other hand, do not. There is nothing in it for games; I have tested this a lot!!
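One way to square those two observations (3DMark scales, games barely move) is a simple bound-fraction model: only the memory-bandwidth-limited share of each frame speeds up with the HBM clock. A hedged sketch, with the bound fractions purely illustrative:

```python
def fps_after_hbm_oc(fps: float, mem_bound_frac: float,
                     clk_old: float, clk_new: float) -> float:
    """Estimate fps after an HBM overclock.

    Frame time is split into a memory-bandwidth-bound part, which
    shrinks with the clock ratio, and everything else, which doesn't.
    """
    frame_time = 1 / fps
    mem_time = frame_time * mem_bound_frac * (clk_old / clk_new)
    other_time = frame_time * (1 - mem_bound_frac)
    return 1 / (mem_time + other_time)

# Illustrative only: a heavily bandwidth-bound synthetic test vs a
# typical game, both going from 945 to 1050 MHz HBM2.
print(f"synthetic (80% bound): {fps_after_hbm_oc(100, 0.8, 945, 1050):.1f} fps")
print(f"game (20% bound):      {fps_after_hbm_oc(100, 0.2, 945, 1050):.1f} fps")
```

With those made-up fractions the synthetic test gains about 9% while the game gains about 2%, which is roughly the 2-4fps pattern described above.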
Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
Been through the video comparing the images, and TBH there were wins for both vendors, but that is just my opinion.

Having said that, I preferred the AMD shots 60% of the time and the Nvidia ones 40% of the time.

Here is one where I preferred Nvidia.

vwfJRsO.jpg

As for this image, what I am looking for is whether anything is not rendered on one vs the other.

1. They both look washed out.
2. They both suffer from YouTube compression.
3. The AMD colours look different: the water is more blue on Nvidia, while it looks dirtier on AMD.
4. The sky looks better on AMD; you can clearly see the clouds etc. On Nvidia are they missing??
Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
I'd say it's more dependent on the game. Some game engines respond well to HBM overclocking, some don't, and some are more CPU bound.
For example, I have just recorded a little video of Prey, and it shows quite clear scaling, and Kingdom Come shows a good 2-4fps gain. Assassin's Creed, however, didn't respond at all to the HBM overclock.

Prey https://www.youtube.com/watch?v=QuYk3Q74sk8
Assassin's Creed https://www.youtube.com/watch?v=WuSiOnxK0RI
Kingdom Come https://www.youtube.com/watch?v=yIhRGE-4OcU

The 2-4fps is basically what I see in every game I have tested. I don't have Prey installed at the moment. It's not a game changer.
I think when you start going 1100MHz+ is when you start seeing some worthy gains, tbh.

Edit:
Thanks for taking the time to test, though.
Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
I've seen a few people state this over the years. It's interesting; has anybody done a video covering this?

There really isn't anything in it, tbh.

This all comes from Nvidia's own doing. They never used to support full-range RGB, while at that time AMD did. There was a clear difference, and there still is, because Nvidia defaults to limited range.

So users are not wrong when they say AMD looks better, but that is carried over from the past and Nvidia's default settings.
Soldato (OP) · Joined 25 Nov 2011 · 20,639 posts · The KOP
You see the difference on both.
I can't remember what 16-235 RGB means... Does it mean both limited 8-bit and limited 6-bit colours? And limited 10-bit colours?
How many colours is 16-235 RGB?

It's not about colour depth; my two monitors always remain 8-bit.

Limited vs full-range RGB is just about the range: at what point white is white and black is black.

That is why the screen always looks washed out when a limited-range signal is viewed on a display expecting full range.
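To answer the colour-count question directly, the arithmetic is simple (assuming 8 bits per channel, as above):

```python
# Levels available per channel in each range (8-bit).
full_levels = 256                # 0-255
limited_levels = 235 - 16 + 1    # 16-235 -> 220 levels

print(f"full range:    {full_levels ** 3:,} colours")     # 16,777,216
print(f"limited range: {limited_levels ** 3:,} colours")  # 10,648,000
```

So limited range still uses 8-bit values; it just maps black and white to 16 and 235, leaving 220 usable levels per channel.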