
Radeon RX Vega 64 vs. GeForce RTX 2080, Adrenalin 2019 Edition Driver Update Benchmark Test

Soldato
OP
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
Been through the video comparing the images and TBH there were wins for both vendors, but that is just my opinion.

Having said that, I preferred the AMD ones 60% of the time and the Nvidia ones 40%.

Here is one where I preferred Nvidia.

[Image: vwfJRsO.jpg]

As for this image, what I'm looking for is whether there's anything not rendered on one vs the other.

1. They both look washed out
2. They both suffer from YouTube compression
3. The AMD colours look different: the water is bluer on Nvidia, while on AMD it looks dirtier.
4. The sky looks better on AMD; it's clear and you can see the clouds etc. On Nvidia they're missing??
 
Soldato
OP
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
I'd say it's more dependent on the game. Some game engines respond well to HBM overclocking, some don't, and some are more CPU bound.
For example, I have just recorded a lil video of Prey and it shows quite clear scaling, and Kingdom Come shows a good 2-4 fps. Assassin's Creed, however, didn't respond at all to the HBM overclock.

Prey https://www.youtube.com/watch?v=QuYk3Q74sk8
Ass Creed https://www.youtube.com/watch?v=WuSiOnxK0RI
Kingdom Come https://www.youtube.com/watch?v=yIhRGE-4OcU

The 2-4 fps is basically every game I have tested. I don't have Prey installed at the moment. It's not a game changer.
I think it's when you start going 1100+ that you start seeing some worthy gains tbh.

edit
Thanks for taking the time to test though.
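
To put those 2-4 fps in perspective, here's a rough back-of-the-envelope sketch in Python. The 945 MHz stock HBM2 clock is Vega 64's real spec, but the FPS figures are made up for illustration: the point is that if the frame rate rises far less than the memory clock did, the game probably isn't bandwidth-bound.

    # Compare the size of an HBM overclock to the FPS gain it yields.
    stock_hbm_mhz = 945     # Vega 64 stock HBM2 clock
    oc_hbm_mhz = 1100       # overclocked HBM2 clock

    fps_before = 90.0       # hypothetical baseline
    fps_after = 93.0        # hypothetical result, a ~3 fps gain as above

    bandwidth_gain = (oc_hbm_mhz / stock_hbm_mhz - 1) * 100
    fps_gain = (fps_after / fps_before - 1) * 100

    print(f"HBM bandwidth up {bandwidth_gain:.1f}%, FPS up only {fps_gain:.1f}%")
    # Bandwidth up 16.4%, FPS up 3.3%: nowhere near 1:1 scaling, so the
    # bottleneck in these games is mostly elsewhere (core clock or CPU).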
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
3. The AMD colours look different: the water is bluer on Nvidia, while on AMD it looks dirtier.
4. The sky looks better on AMD; it's clear and you can see the clouds etc. On Nvidia they're missing??

Yeah, the Radeon image shows a lake full of mud. But that grey colour may also come from clouds reflecting in the water surface.
 
Soldato
Joined
27 Mar 2010
Posts
3,069
The 2-4 fps is basically every game I have tested. I don't have Prey installed at the moment. It's not a game changer.
I think it's when you start going 1100+ that you start seeing some worthy gains tbh.

edit
Thanks for taking the time to test though.

Despite the Wattman settings shown in the vids, that isn't my gaming profile; I have a couple of more refined profiles, but I've only just upgraded to 19.1.1 so my usual OverdriveNTool doesn't work properly.
But I totally agree that 1100+ helps, though in some cases it doesn't. I have both a reference V64 and an LC 64, which are happy to run most games with the HBM at 1100-1175; a couple of games get artifacts at 1175.
I lock my FRTC to 74 fps as my 1440p FreeSync monitor is only 75 Hz. In the less demanding games I play, I like to run my LC 64 at a held 1500 MHz at 0.95 V; the rad fan is practically inaudible and temps are in the low 40s °C.
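
For anyone wondering why 74 and not 75: the idea is to hold the frame rate just inside the FreeSync window, so frames never hit the refresh ceiling and fall back to tearing or V-sync latency. A trivial sketch of that reasoning (hypothetical helper, not an AMD API):

    # Pick a frame-rate cap just under the monitor's max refresh so frames
    # always arrive inside the FreeSync/VRR window.
    def frtc_cap(max_refresh_hz: int, margin_hz: int = 1) -> int:
        return max_refresh_hz - margin_hz

    print(frtc_cap(75))  # 74, matching a 74 fps cap on a 75 Hz panel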
 
Soldato
Joined
22 Apr 2016
Posts
3,425
Quickly skimmed through the thread and couldn't see it.

Why is the CPU at only 4.0 GHz?

To introduce a bottleneck, in the same way that ultra presets weren't always used for graphics in order to increase the frame rates.

This is of course the problem with end user benchmarks: they either don't know what they are doing or are deliberately trying to skew them one way or another. Meaningless guff.

The Vega 64 does very well, so much so that I'm very tempted to get the Asus one and punt on my 56, but professional benchmarks show it some way off a 2070, let alone a 2080.
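
For what it's worth, the bottleneck point can be sketched with a crude model (illustrative numbers only, not benchmark data): the frame rate you actually see is roughly the lower of what the CPU and the GPU can each deliver, so a deliberately capped CPU can make two very different cards look alike.

    # Crude model: delivered FPS is limited by the slower of CPU and GPU.
    def delivered_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
        return min(cpu_fps_limit, gpu_fps_limit)

    # Made-up limits: a CPU held at 4.0 GHz caps both cards the same way,
    # hiding the gap between a faster and a slower GPU.
    print(delivered_fps(cpu_fps_limit=110, gpu_fps_limit=150))  # 110
    print(delivered_fps(cpu_fps_limit=110, gpu_fps_limit=120))  # 110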
 
Soldato
Joined
27 Mar 2010
Posts
3,069
To introduce a bottleneck, in the same way that ultra presets weren't always used for graphics in order to increase the frame rates.

This is of course the problem with end user benchmarks: they either don't know what they are doing or are deliberately trying to skew them one way or another. Meaningless guff.

The Vega 64 does very well, so much so that I'm very tempted to get the Asus one and punt on my 56, but professional benchmarks show it some way off a 2070, let alone a 2080.

At this late stage in Vega's lifetime, and given you already have a V56, I'd have to say it's not worth it unless it barely costs you anything. I'd stay away from the Asus, whether it's the v2 or not. I'd go for the PowerColor or Sapphire, or a reference V64 on a block.
 
Associate
Joined
9 Feb 2018
Posts
37
This proves how good Vega has become over time. The mining craze prices just killed it as a gaming card until now, with prices dropping back.
However, it's still going to be a hard sell to most users now, with the 2060 released and Nvidia now supporting FreeSync. The Vega 64 or 2060 question, I think, is the more important one.

I have an almost 3 year old 1080 and at current prices it's looking like the best piece of hardware I ever bought. However, I would go Vega 64 or Radeon VII if my card died. I just can't stand what Nvidia has done to price/performance with Turing.
 
Associate
Joined
25 Dec 2014
Posts
621
Location
Charlotte, NC
The 2-4 fps is basically every game I have tested. I don't have Prey installed at the moment. It's not a game changer.
I think it's when you start going 1100+ that you start seeing some worthy gains tbh.

edit
Thanks for taking the time to test though.


Overall I agree, mine does 1150 - 1180 hbm in games so I do use that, but if I couldn't get it to at least 1150 I would just leave it at 945 and focus on the core clocks.
 
Associate
Joined
29 Jun 2016
Posts
2,149
Location
Up Norf
It looks like Nvidia uses some kind of colour compression which reduces the overall image quality and gains some performance.
The Radeon image quality looks better almost everywhere.

I've seen a few people state this over the years. It's interesting; has anybody done a video covering this?
 
Soldato
OP
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
I've seen a few people state this over the years. It's interesting; has anybody done a video covering this?

There really isn't anything in it tbh.

This all comes from Nvidia's own doing. They never used to support full range RGB, and at that time AMD did. There was a clear difference, and there still is, because Nvidia defaults to limited range.

So users are not wrong when they say AMD looks better, but that is carried over from the past and Nvidia's default settings.
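
For reference, "limited range" maps 8-bit black to 16 and white to 235 instead of 0 and 255; this is the standard Rec. 601/709 range scaling, not vendor code. A minimal sketch of the mapping:

    # Full range (0-255) vs limited/"TV" range (16-235) for 8-bit RGB.
    # If the GPU sends limited range to a monitor expecting full range,
    # blacks are lifted to grey and whites are dimmed: the washed-out look.
    def full_to_limited(v: int) -> int:
        return 16 + round(v * 219 / 255)

    print(full_to_limited(0))    # 16  -> black becomes dark grey
    print(full_to_limited(255))  # 235 -> white is dimmed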
 
Associate
Joined
29 Jun 2016
Posts
2,149
Location
Up Norf
There really isn't anything in it tbh.

This all comes from Nvidia's own doing. They never used to support full range RGB, and at that time AMD did. There was a clear difference, and there still is, because Nvidia defaults to limited range.

So users are not wrong when they say AMD looks better, but that is carried over from the past and Nvidia's default settings.

So by doing that, in essence, Nvidia performs better?
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
So by doing that, in essence, Nvidia performs better?

I wonder when Nvidia will finally decide to send the appropriate colour signals to monitors.
Yeah, they can make the excuse that the majority of the user base is OK with 16-235 RGB instead of 0-255 RGB, but it still leaves a bad taste in anyone's mouth, given that this is a graphics company and top notch graphics should be their number one priority.
 
Soldato
Joined
4 Jan 2009
Posts
2,682
Location
Derby
I wonder when Nvidia will finally decide to send the appropriate colour signals to monitors.
Yeah, they can make the excuse that the majority of the user base is OK with 16-235 RGB instead of 0-255 RGB, but it still leaves a bad taste in anyone's mouth, given that this is a graphics company and top notch graphics should be their number one priority.
I just changed the setting in nvcp and it’s now always set to 0-255.
 