
Radeon RX Vega 64 vs. GeForce RTX 2080, Adrenalin 2019 Edition Driver Update Benchmark Test

Soldato
Joined
27 Mar 2010
Posts
3,069
Anyway, pretty good performance from the V64. Yes, it can clock a bit more, but generally that is an average undervolt and underclock for the typical Vega user; most will just run it out of the box and it will sit in the 1400s or worse.
As to the V64's performance in that video, I think it's pretty solid. I'm still happy with mine; I wonder if that's because my refresh rate is only 75Hz at 1440p?
Another thing to bear in mind: if Nvidia released an RTX 2085, would 5-10% make much difference?
As to the RTX 2080, I wonder how well it scales with memory overclocking?

 
Soldato
Joined
28 May 2007
Posts
10,071
You are talking about lots of different variables which are not relevant.

Using the FE cooler, my RTX Titans average 2050MHz when overclocked, however long I run them, and sometimes hit 2130MHz, which is not bad on air.

If only the guy in the video had used an FE 2080 and left the overclocking alone, the results would have been far more meaningful.

If he had done the above, I still think the Vega 64 would have performed very well indeed; unfortunately he messed up before he even started.

I think the results are fine. It's a very average Vega vs a very average 2080. The OCs you are doing on your cards are pushing to the extreme, which is entirely different. I have barely tweaked my Vega and it runs over 1600MHz with 1100MHz HBM for everyday gaming while staying under 70°C on the stock fan profile.
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
In games or 3DMark? Because in games I see no difference; I can even run a test to show this.
It's only when you up the HBM speed in 3DMark that you see the changes.

So why did the guy in the video even overclock the HBM2?

I find I get a performance bump in both by overclocking HBM2, but for normal gaming I don't overclock anything.
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
I think the results are fine. It's a very average Vega vs a very average 2080. The OCs you are doing on your cards are pushing to the extreme, which is entirely different. I have barely tweaked my Vega and it runs over 1600MHz with 1100MHz HBM for everyday gaming while staying under 70°C on the stock fan profile.

The only reason I mentioned them is to point out that what the guy is claiming as an overclock over an FE card is just 15MHz, less than 1%, practically non-existent. What he did with the Vega 64 card is far more than 1%.
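To put rough numbers on that, here's a quick sanity check. The 1800MHz figure is the FE RTX 2080's rated boost clock; the Vega clocks are illustrative guesses for a stock vs lightly tuned card, not numbers taken from the video:

```python
# Compare the two overclocks as percentages of their base clocks.
fe_2080_boost = 1800   # MHz, FE RTX 2080 rated boost clock
claimed_oc = 15        # MHz bump claimed in the video

vega_stock = 1400      # MHz, typical out-of-the-box Vega 64 (illustrative)
vega_tuned = 1550      # MHz, modestly tuned Vega 64 (illustrative)

pct_2080 = 100 * claimed_oc / fe_2080_boost
pct_vega = 100 * (vega_tuned - vega_stock) / vega_stock

print(f"2080 bump: {pct_2080:.2f}%")   # 0.83% - under 1%, as stated
print(f"Vega bump: {pct_vega:.2f}%")   # 10.71% - an order of magnitude more
```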
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
Another thing to bear in mind: if Nvidia released an RTX 2085, would 5-10% make much difference?

And another 15% on the price.:eek:

We don't need any more Turing cards until NVidia get the pricing right on the existing ones.

As to the RTX 2080, I wonder how well it scales with memory overclocking?

You get decent gains on any of the Turing cards when overclocking the memory; 3500MHz default may sound fast, but getting it close to 4000MHz or better is worth doing for benching.
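For what that buys in raw bandwidth, a back-of-the-envelope sketch, assuming the 2080's 256-bit bus and that the quoted 3500MHz is a quarter of the effective GDDR6 data rate (3500MHz = 14Gbps per pin, matching the card's stock spec):

```python
# Peak GDDR6 bandwidth from the tool-reported memory clock.
BUS_WIDTH_BITS = 256   # RTX 2080 memory bus

def bandwidth_gb_s(mem_clock_mhz):
    gbps_per_pin = mem_clock_mhz * 4 / 1000    # effective data rate per pin
    return gbps_per_pin * BUS_WIDTH_BITS / 8   # GB/s across the whole bus

for clk in (3500, 4000):
    print(f"{clk}MHz -> {bandwidth_gb_s(clk):.0f} GB/s")
# 3500MHz -> 448 GB/s (stock), 4000MHz -> 512 GB/s, about 14% more
```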
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
The colours on the Radeons are not brighter; actually the ones on the GeForces are brighter, washed out and with a slight fog effect over the image.
The Radeon images are darker, with higher contrast and more natural-looking colours.

Can you elaborate on the missing details? Do you mean pop-in, texture mapping or draw distance?
I'm not interested in the washed-out look; it's not an issue. It's clear why it happened and there's no need to talk about it.
But if you can highlight the missing details, then great.

Dave, what 4K8KW10 wrote is similar to what I wrote when I moved from a GTX 1080 Ti to a Vega 64 in the summer. FPS was lower, but everything looked better.
And I pointed out the differences in the game (WoT) that I mostly play, because the changes were obvious.

On the 1080 Ti and on the GTX 1060 6GB (laptop), with the same maxed-out settings, everything looks foggy and less detailed compared to the Vega 64.
Even the foliage on the bushes and trees is significantly sparser on the Nvidia cards compared to the Vega 64, impairing my ability to aim and shoot successfully from behind double bushes, because I cannot see through them any more.

Also, on three maps there are more eye-catching changes.
Paris: on the GTX cards the reflections in the thousands of windows are non-existent, while on the Vega 64 you can see reflections of the Eiffel Tower in all the windows as you move around (nothing to do with ray tracing, as these are rasterised reflections).
Overlord: if you spawn on the south side, with the Vega you can see on the north side a big, thick column of smoke (with a red, heated centre) from the burning ships. On the GTX cards, the smoke looks like someone lit a small bonfire in November, not burning ships during the Normandy landings.
Lakeville: on the Vega you can see the details of the houses on the lake, with glorious (rasterised) reflections. On the GTX cards it looks plain, with some very low-quality reflections in a single corner.

Imho, in the age of adaptive sync, more FPS doesn't mean much if you are already hitting 100+.
 
Soldato
Joined
27 Mar 2010
Posts
3,069
And another 15% on the price.:eek:

We don't need any more Turing cards until NVidia get the pricing right on the existing ones.
You get decent gains on any of the Turing cards when overclocking the memory; 3500MHz default may sound fast, but getting it close to 4000MHz or better is worth doing for benching.

Well, a real RTX 2080 would be nice, lol.
For example with my old GTX 970, with the core at 1600MHz and the memory overclocked to 8000/8100, at 1080p I'd see around 3-4fps more than if I left it at 7000. That to me was worth it.
But with the 1060 it doesn't scale as well, as it doesn't have enough CUDA cores, so I don't bother overclocking it.
Can you do a graph of some memory scaling to see?
 
Soldato
OP
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
So why did the guy in the video even overclock the HBM2?

I find I get a performance bump in both by overclocking HBM2, but for normal gaming I don't overclock anything.

Because he thinks most users will bump this up a touch, like he said. Although it really doesn't make a difference when the increase is this small. There is very little in it, and this has always been the case for me while testing games. So I just leave it at 945 stock.

 
Soldato
Joined
27 Mar 2010
Posts
3,069
Dave, what 4K8KW10 wrote is similar to what I wrote when I moved from a GTX 1080 Ti to a Vega 64 in the summer. FPS was lower, but everything looked better.
And I pointed out the differences in the game (WoT) that I mostly play, because the changes were obvious.

On the 1080 Ti and on the GTX 1060 6GB (laptop), with the same maxed-out settings, everything looks foggy and less detailed compared to the Vega 64.
Even the foliage on the bushes and trees is significantly sparser on the Nvidia cards compared to the Vega 64, impairing my ability to aim and shoot successfully from behind double bushes, because I cannot see through them any more.

Also, on three maps there are more eye-catching changes.
Paris: on the GTX cards the reflections in the thousands of windows are non-existent, while on the Vega 64 you can see reflections of the Eiffel Tower in all the windows as you move around (nothing to do with ray tracing, as these are rasterised reflections).
Overlord: if you spawn on the south side, with the Vega you can see on the north side a big, thick column of smoke (with a red, heated centre) from the burning ships. On the GTX cards, the smoke looks like someone lit a small bonfire in November, not burning ships during the Normandy landings.
Lakeville: on the Vega you can see the details of the houses on the lake, with glorious (rasterised) reflections. On the GTX cards it looks plain, with some very low-quality reflections in a single corner.

Imho, in the age of adaptive sync, more FPS doesn't mean much if you are already hitting 100+.

I get the cloudiness/fogginess, but that to me is just the dynamic range of the colour output. I remember one of your posts from way back, I think it was WoT? There were some trees missing, or some foliage on branches missing?
This is what I want to see: whether AMD or Nvidia render differently to hit different performance targets.

In your description of the three maps and the eye-catching changes (Paris, Overlord and Lakeville), is that from the video in this thread or from personal experience? Not that I'd disbelieve you, but I would like to see the render differences.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
I get the cloudiness/fogginess, but that to me is just the dynamic range of the colour output. I remember one of your posts from way back, I think it was WoT? There were some trees missing, or some foliage on branches missing?
This is what I want to see: whether AMD or Nvidia render differently to hit different performance targets.

In your description of the three maps and the eye-catching changes (Paris, Overlord and Lakeville), is that from the video in this thread or from personal experience? Not that I'd disbelieve you, but I would like to see the render differences.

I need to find the time to make a video on those maps, plus Westfield (which is full of bushes), on the laptop (GTX 1060 6GB) and on the Vega (PC), as I do not have the 1080 Ti.
 
Caporegime
Joined
4 Jun 2009
Posts
31,046
Definitely a win for Vega on the price/performance front.

IQ is the same for both vendors, even though at default AMD have brighter, punchier colours.

Having brighter, punchier colours at default is not actually better, as it can sometimes be over the top, in the same way that ray tracing on Turing cards in BFV can look unrealistic.

Having said that, I do prefer the default colours on AMD cards.

Coming from a calibration POV, AMD use the correct settings by default for "PC monitor" usage; Nvidia do not.
 
Soldato
Joined
27 Mar 2010
Posts
3,069
I need to find the time to make a video on those maps plus Westfield (which is full of bushes) on the laptop (GTX1060 6GB) and on the Vega (PC) as I do not have the 1080Ti.
OK, anything is good, as I have a 1060 in an HTPC so I can test with the same hardware.
 
Soldato
Joined
27 Mar 2010
Posts
3,069
Because he thinks most users will bump this up a touch, like he said. Although it really doesn't make a difference when the increase is this small. There is very little in it, and this has always been the case for me while testing games. So I just leave it at 945 stock.

I disagree, though; I definitely see scaling with HBM overclocking. Here's a crude collection of HBM scaling results in Fire Strike that I carried out.
I made sure that at each HBM clock change, the core still clocked to 1550MHz in each test.

hbm-scaling.png
https://ibb.co/6BZnNmD
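The bandwidth side of that HBM overclock is straightforward spec arithmetic: Vega 64 has a 2048-bit bus and HBM2 moves two bits per pin per clock. A small sketch using the stock and overclocked HBM clocks mentioned in this thread:

```python
# Peak HBM2 bandwidth on Vega 64 at stock and overclocked HBM clocks.
BUS_WIDTH_BITS = 2048   # Vega 64 HBM2 bus

def hbm2_bandwidth_gb_s(clock_mhz):
    # Double data rate: 2 transfers per pin per clock, converted to GB/s.
    return clock_mhz * 2 * BUS_WIDTH_BITS / 8 / 1000

stock, oc = 945, 1100
print(f"stock {stock}MHz: {hbm2_bandwidth_gb_s(stock):.0f} GB/s")  # 484 GB/s
print(f"oc    {oc}MHz: {hbm2_bandwidth_gb_s(oc):.0f} GB/s")        # 563 GB/s
print(f"gain: {100 * (oc / stock - 1):.1f}%")                      # 16.4%
```

Whether that extra bandwidth shows up as FPS depends on how bandwidth-bound a given test is, which is consistent with it appearing clearly in Fire Strike but not in every game.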
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
I've been through the video comparing the images and, TBH, there were wins for both vendors, but that is just my opinion.

Having said that, I preferred the AMD ones 60% of the time and the NVidia ones 40% of the time.

Here is one where I preferred NVidia.

vwfJRsO.jpg
 
Soldato
OP
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
I disagree, though; I definitely see scaling with HBM overclocking. Here's a crude collection of HBM scaling results in Fire Strike that I carried out.
I made sure that at each HBM clock change, the core still clocked to 1550MHz in each test.

hbm-scaling.png

Again, like I said, 3DMark does show changes with HBM. Games, on the other hand, do not. There is nothing in it for games; I have tested this a lot!!
 
Soldato
Joined
27 Mar 2010
Posts
3,069
I've been through the video comparing the images and, TBH, there were wins for both vendors, but that is just my opinion.

Having said that, I preferred the AMD ones 60% of the time and the NVidia ones 40% of the time.

Here is one where I preferred NVidia.
That is interesting, although there may be an FOV situation giving a draw-distance effect. However, I can see that the plane detail on the Nvidia is clearer (a difference in actual render times, maybe?). There is a better-detailed reflection of the cliff face in the water on the Nvidia, and tree reflections too.
 
Soldato
Joined
27 Mar 2010
Posts
3,069
Again, like I said, 3DMark does show changes with HBM. Games, on the other hand, do not. There is nothing in it for games; I have tested this a lot!!
I'd say it's more dependent on the game: some game engines respond well to HBM overclocking, some don't, and some are more CPU-bound.
For example, I have just recorded a little video of Prey and it shows quite clear scaling, and Kingdom Come shows a good 2-4fps. Assassin's Creed, however, didn't respond at all to the HBM overclock.

Prey https://www.youtube.com/watch?v=QuYk3Q74sk8
Ass Creed https://www.youtube.com/watch?v=WuSiOnxK0RI
Kingdom Come https://www.youtube.com/watch?v=yIhRGE-4OcU
 