The RX Vega 56 Owners Thread

Associate
Joined
17 Sep 2018
Posts
1,431
Glad of the “to and fro” we are having on this subject mate, a good debate results in good decisions. The fact, however, that I will have to update the B-350 motherboard BIOS with either a throwaway CPU or pay a local shop a similar cost to enable support for the latest gen Ryzen CPUs makes it a non-starter to be honest. I played Kingdom Come Deliverance for a few hours this afternoon, and to be honest I think I might be jumping the gun somewhat. My CPU maxes out at circa 90% load only in the most heavily populated and involved battle areas, with my GPU hovering close to 100% whilst mostly only clocking at circa 1500 MHz. Obviously I have no in-game benchmark results to offer, but at 1920 x 1200, High settings with CPU physics set to medium and motion blur off, I get close to the steady 60 FPS I want.


Footnote: anyone playing Kingdom Come who is not already aware, open the console and input “r_BatchType = 1” (minus quotes). CryEngine holds AMD GPU core clock frequencies down for reasons beyond my understanding. This command raises the GPU core clock in game, netting increased frame rates.
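If you don't fancy retyping that every session, CryEngine titles will generally also pick up cvars from a user.cfg in the game's install folder at startup; a minimal sketch (I haven't verified the exact file name and location for KCD, so treat it as an assumption):

```
r_BatchType = 1
```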

Fair enough mate, if you're happy enough that's all that matters. Interestingly though, Kingdom Come Deliverance is one that is CPU bound, so at medium you'll still be getting lows in the 30s with the 4-threaded part, whereas it's around high 50s with 12 threads (and maybe fewer threads). So if the performance on that isn't irritating you, then you don't need to upgrade until you do start to get performance you're not happy with. By which time prices will have dropped or something better will have come out.

Terribly optimised game that it is.

Thanks for the KCD tip also, it's not on my playlist right now but I'll try and bookmark this one.
 

Kei

Soldato
Joined
24 Oct 2008
Posts
2,750
Location
South Wales
See, that's something that creates a worthwhile debate. :) Care to run a Shadow of the Tomb Raider benchmark at 1920 x 1080 high settings with your best overclock?

Picked this up this morning so gave it a go using your preferred settings. Wattman config set to my day to day settings of 1677/1100 +55% PL @ 1175mV. It seems my 1920X is quite the bottleneck at these settings.
 
Associate
Joined
23 Sep 2019
Posts
1
Hi guys, I just got the Acer Predator laptop that has a Vega 56 in it. I'm interested in possibly modifying the soft power tables on it. Any idea where I can get a custom vBIOS for the card? It's somewhat like a custom card; it only has around a 120 watt limit. Planning to bump it up to maybe 150 watts. Other overclocking tools do not work, so I need to edit it at the registry level. Thanks.
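For anyone else eyeing the same registry-level route, on desktop Vega the soft PowerPlay table normally lives as a REG_BINARY value named PP_PhmSoftPowerPlayTable under the display adapter's class key. A rough sketch in Python is below; the "0000" adapter index and the .bin file name are placeholders, the table bytes themselves need to come from a proper PowerPlay table editor, and whether the mobile driver honours it at all is an assumption.

```python
# Rough sketch only: writing a soft PowerPlay table into the registry.
# Needs an elevated (admin) prompt; a driver restart or reboot applies the new limits.
import winreg

ADAPTER_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
               r"\{4d36e968-e325-11ce-bfc1-08002be10318}\0000")   # "0000" = first adapter, may differ

with open("modified_ppt.bin", "rb") as f:          # blob exported from a PPT editor (placeholder name)
    table_bytes = f.read()

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ADAPTER_KEY, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "PP_PhmSoftPowerPlayTable", 0, winreg.REG_BINARY, table_bytes)
```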
 
Associate
Joined
17 Sep 2018
Posts
1,431
Picked this up this morning so gave it a go using your preferred settings. Wattman config set to my day to day settings of 1677/1100 +55% PL @ 1175mV. It seems my 1920X is quite the bottleneck at these settings.

I ran the Heaven benchmark numerous times with different overclocks and undervolts. I got my best result at 1620 MHz/950 MHz with a 1020 mV undervolt. If I pushed the core too high it ran slower, because I'm told the card throttles and can't maintain the 1600 MHz, so a lower core clock and an undervolt may give you improved performance.
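That tallies with the usual back-of-the-envelope maths: dynamic power goes roughly as frequency times voltage squared, so chasing the last few percent of clock at a much higher voltage blows through the power limit and the card throttles anyway. A quick sketch below; the f·V² scaling is a textbook approximation rather than a measurement of this card, and the 1700 MHz / 1200 mV figures are purely illustrative.

```python
# Rough illustration of why a lower clock plus an undervolt can end up faster
# under a fixed power limit. Dynamic power ~ f * V^2 (textbook approximation).
def relative_power(freq_mhz, volts):
    return freq_mhz * volts ** 2

undervolted = relative_power(1620, 1.020)   # the 1620 MHz @ 1020 mV profile above
pushed = relative_power(1700, 1.200)        # chasing clocks at a much higher voltage (illustrative)
print(f"~{pushed / undervolted:.2f}x the power for ~{1700 / 1620:.2f}x the clock")
# If that extra power exceeds the card's limit, it throttles below 1620 MHz anyway.
```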
 

Kei

Soldato
Joined
24 Oct 2008
Posts
2,750
Location
South Wales
I ran the Heaven benchmark numerous times with different overclocks and undervolts. I got my best result at 1620 MHz/950 MHz with a 1020 mV undervolt. If I pushed the core too high it ran slower, because I'm told the card throttles and can't maintain the 1600 MHz, so a lower core clock and an undervolt may give you improved performance.
If I pull the voltage down to 1150, the card will intermittently TDR with the clock set at 1677 (rare, but it'll black screen and drop back to Windows every now and then with a DX device removed error in some games). 1100 seems to work OK at 1612 IIRC. If I try to drop below 1050 mV, my HBM clock also drops of its own accord, which really hurts the performance. My card seems to like the volts, which isn't the end of the world as it is watercooled and doesn't get much above 50 degrees when hammering it with the full 1200 mV and +150% power limit doing benching runs at 1727/1190.
 
Associate
Joined
16 Jul 2019
Posts
102
Picked this up this morning so gave it a go using your preferred settings. Wattman config set to my day to day settings of 1677/1100 +55% PL @ 1175mV. It seems my 1920X is quite the bottleneck at these settings.

Interesting results Kei, I appreciate you taking the time to provide a comparative. I wrote my results down on paper so I do not have any screenshots, but as I previously posted, I ran the benchmark at high settings, 1920x1080. I ran my daily 1637/945 +15% PL @ 1010 mV, and (from memory, and after several drinks) 1707/945 with +50% PL and 1200 mV to get it through the benchmark. My 2500K is at 4.9 GHz, water cooled. Two things I can see... One, your CPU definitely outshines mine (but then I paid just £90 for mine second hand 2.5 years ago). Two, your GPU overclock provides a tangible performance increase, subject to a willingness to flash a 64 BIOS on a 56 card to attain higher HBM clocks. I have read previously and frequently that higher HBM clocks result in much better performance increases than core clock increases; your data compared to mine backs that up, I would say?

My results...

@1637/945:

Average FPS: 84

FPS        CPU Game   CPU Render   GPU
Min            92          59        79
Max           228         305       218
Average       137          98       107
95%           101          62        82


@1707/945:

Average FPS: 83

FPS        CPU Game   CPU Render   GPU
Min            85          57        77
Max           212         231       174
Average       134          95       107
95%            98          61        80
 
Associate
Joined
16 Jul 2019
Posts
102
Fair enough mate, if you're happy enough that's all that matters. Interestingly though, Kingdom Come Deliverance is one that is CPU bound, so at medium you'll still be getting lows in the 30s with the 4-threaded part, whereas it's around high 50s with 12 threads (and maybe fewer threads). So if the performance on that isn't irritating you, then you don't need to upgrade until you do start to get performance you're not happy with. By which time prices will have dropped or something better will have come out.

Terribly optimised game that it is.

Thanks for the KCD tip also, it's not on my playlist right now but I'll try and bookmark this one.

I'm getting closer to biting the bullet on upgrading my system mate, but I have so many other things going on right now I am struggling to convince myself to spend the money. Regarding Kingdom Come Deliverance, I think you may be surprised somewhat (whilst I agree with your assessment of my 2500K, even at 4.9 GHz). I have uploaded 4 screenshots below: sitting still at a fire, standing still looking out from Rattay, mid gallop through the woods and mid sprint through Rattay. Resolution is 1920 x 1200, settings set to high, then object quality, particles, physics and shadows set to medium; motion blur off. I also run a custom ReShade preset which I found reduces frame rates by circa 10 to 20%.





 
Associate
Joined
12 Sep 2019
Posts
216
Location
Auld Reekie
Just think how quiet and cool your GPU would be clocked at circa 1620 MHz core, low mV, quiet fan. It might cost 2 frames per second.

OK, so I have to confess that I wasn't at all convinced by this but decided to put it to the test using the Gears of War 5 benchmark and I have to say that the results surprised me. Tests were carried out with Ultra video quality, 2560x1440 resolution, vsync disabled and no frame rate limits. GPU settings were made in Wattman and power, temp & fan readings taken using HWINFO64. I started the test with 1602 MHz target GPU clock and 900 mV P7 voltage, and increased by 10 MHz and 25 mV increments up to 1702 MHz and 1150 mV. Power limit in all custom tests was 50% but the fan PWM setting for 72C was set at 60% up to 1652 MHz and increased to 80% for higher clocks to avoid excessive temperatures. HBM2 frequency was 980 MHz for all custom tests. I also tested the 3 built-in Wattman profiles for comparison.
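To spell out the test grid, that works out to eleven custom runs; here's a quick sketch of it, assuming the MHz and mV steps move up together (the exact pairing at each step is an assumption on my part):

```python
# Sketch of the sweep described above: 1602-1702 MHz and 900-1150 mV raised
# together in 10 MHz / 25 mV increments, HBM2 at 980 MHz and +50% PL throughout.
clocks_mhz = range(1602, 1703, 10)      # 1602 .. 1702 MHz
voltages_mv = range(900, 1151, 25)      # 900 .. 1150 mV

for core_mhz, p7_mv in zip(clocks_mhz, voltages_mv):
    fan_pwm_at_72c = 60 if core_mhz <= 1652 else 80   # fan curve bumped for the hotter runs
    print(f"{core_mhz} MHz @ {p7_mv} mV | HBM2 980 MHz | PL +50% | fan {fan_pwm_at_72c}% @ 72C")
```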

I've embedded the graphs and table of results below which can be summarised as follows:
  • All 3 built-in profiles produce woeful performance. In all my earlier testing I found that high core voltage with low power limit combinations produced poor results compared to low core voltage and high PL - hence why I am only using 50% PL here.
  • Increases in FPS gradually decrease with increasing MHz/mV up to a peak at 1692 MHz / 1100 mV, after which the FPS reduces again, presumably due to throttling. From 1602 to 1692 MHz, the FPS range is a mere 6 FPS or 9.3% with this title & settings.
  • The power consumption delta from lowest to highest is 110 Watts or 68%! Interestingly power does not increase linearly with the mV which I'm guessing will be due to internal power limit and temperature management algorithms.
  • Watts per frame neatly shows a much more favourable efficiency at lower clock speeds and significantly diminishing returns at higher clocks.
  • Fan speed increases in 4 distinct ranges: at 1602-1622 MHz, fan speed peaks around a comfortable 1670 RPM; at 1632-1652 MHz, this increases to a bearable ~1920 RPM; at 1662-1682 MHz, rotation notches up another 200-300 RPM and fan noise becomes intrusive; and finally at the 1692-1702 MHz range, the screaming fan is just offensive.
  • GPU temperature rose from 58C to 68C between 1602 and 1662 MHz and plateaued at 70/71C from 1672 MHz. At this max temperature, VRM and hotspot temps were in the 85-92C range (too hot).
So the upshot of all this is that, for me, there is no clear winner setting as all options from 1602 to 1662 MHz are acceptable - it would depend on the priority between FPS vs Power/Temp/Noise which could vary according to the titles being played. If I had to pick a sweet spot setting it would be 1622 MHz / 950 mV since this achieves lowest fan noise, excellent power efficiency, and acceptable performance with 6 FPS above stock and only 3 FPS behind the 1662 MHz / 1050 mV profile. In reality I will probably keep 2 custom profiles: 1662 MHz / 1050 mV for the games that need the extra horsepower, and 1622 MHz / 950 mV for the games that don't. I haven't fully game tested these overclocks yet so further tweaking may still be required :D

Thanks for your help Dan

Vega56-Gears5-Graphs.jpg


Vega56-Gears5-Table.jpg
 
Associate
Joined
16 Jul 2019
Posts
102
OK, so I have to confess that I wasn't at all convinced by this but decided to put it to the test using the Gears of War 5 benchmark and I have to say that the results surprised me. *snip*

As nice as I can be... I ******* said so, didn't I? My 1637/945 +15% PL and 1010 max mV wins unless one has water cooling.
 
Soldato
Joined
22 Nov 2010
Posts
5,713
Are the blower style cards any good? Can these be undervolted as well or is it pot luck?

Edit: Just found the other thread specifically talking about the Air Boost cards.
 
Associate
Joined
12 Sep 2019
Posts
216
Location
Auld Reekie
As nice as I can be... I ******* said so, didn't I? My 1637/945 +15% PL and 1010 max mV wins unless one has water cooling.

Hahaha, yes you did mate although I wasn't aware this was a competition ;)

So I decided to give your settings a try and... **** the bed! FPS performance came out on par with my profile of 1652/980/1025mV/50%PL, which was 3C hotter and consumed +15W. I was able to better it even further by upping my HBM2 to its max stable 980MHz setting, which put it between my 1662 and 1702 MHz profiles, both of which are more power hungry and hotter still. This surprised me as my previous testing with lower power limits showed big sacrifices in FPS with PLs below 50%.

I've added the 2 new tests to my table (blue fill) and ranked the 4 metrics: FPS, Power, Temp & Fan Speed. In order to combine the ranks into a single score and find a clear winner, I've weighted the FPS rank by 75% to reflect the higher priority (to me) of this metric - the remaining metrics are summed, multiplied by 25% and added to the FPS score to give an overall score. When the overall score is ranked, the best custom profile to balance performance against power/temp/noise is... wait for it... 1637/980/1010mV/15%PL. No need for any more "I told you so" now please ;)
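For anyone who wants to reuse the scoring, here's a minimal sketch of that weighting; the ranks below are made-up placeholders rather than the actual figures from my table.

```python
# Sketch of the weighted ranking described above: lower rank = better, the FPS rank
# carries 75% of the weight, the other three ranks are summed and weighted at 25%.
def overall_score(fps_rank, power_rank, temp_rank, fan_rank):
    return 0.75 * fps_rank + 0.25 * (power_rank + temp_rank + fan_rank)

# Placeholder ranks (fps, power, temp, fan) purely for illustration.
profiles = {
    "1637/980/1010mV/15%PL": (2, 1, 1, 1),
    "1662/980/1050mV/50%PL": (1, 3, 3, 3),
    "1622/980/950mV/50%PL":  (3, 2, 2, 2),
}

for name, ranks in sorted(profiles.items(), key=lambda kv: overall_score(*kv[1])):
    print(f"{name}: overall score {overall_score(*ranks):.2f}")
```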

What's great about this card and AMD driver combo is that there's a variety of ways to achieve a particular performance goal, which makes it a lot of fun to own. I am a bit concerned though that it's nearing its EOL at 1440p Ultra, as both DX12 titles I've been playing (Division 2 and Gears 5) only average 55-65 FPS. Let's wait and see what AMD have in store for us next year....

Vega56-Gears5-Table2.jpg


Edit: Further comparative testing between the 1637 and 1662 MHz profiles shows no frame rate loss in World War Z (Vulkan) or Gears of War 4 (DX12) and only a 0.5 FPS (1%) drop in the Superposition OpenGL test. The Division 2 (DX12), however, drops 5 FPS to 54 FPS average (8.5%) and is stuttering noticeably more, so I'm still going to need to keep the big MHz for the games that need it, or start accepting some dialling down of the graphics quality.

Edit2: Further testing on Division 2 last night revealed better performance and lower power/temp with the 1622/980/950mV/50%PL profile than with 1637/980/1010mV/15%PL: 1637 = 64 FPS, 1622 = 68 FPS, 1662 = 69 FPS. So it would appear that there is no single profile that optimises performance/power for all titles and that profiles will need to be tested against each new title to find the best one.
 
Associate
Joined
12 Sep 2019
Posts
216
Location
Auld Reekie
Ahh I see, well I can type it in manually in Wattman or using OverdriveNTool, so I was wondering what value you need for most of those tests?

I can type in other values too but it doesn't actually change anything on the card, which is annoying. I was considering flashing to the 64 BIOS to get the higher VDDC and so get past the 980MHz ceiling, but have so far decided against it as the benefits don't really outweigh the risks and issues, from what I've read.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,566
Location
Greater London
Got one of these to play about with while waiting for Nvidia's 7nm offerings. Got to say I am really impressed with how much can be done by tuning it.

I dropped from the default 1.2V to 0.95V and upped the Power Limit to 50%, and I get a lot more performance and it runs a lot cooler and quieter!

Is this normal for all Vega 56 cards? Has anyone gone even lower and maintained better-than-Balanced performance?
 
Associate
Joined
31 Oct 2009
Posts
854
Location
in the tower
Is it worth upgrading my PowerColor reference Vega 56 to a 5700 XT, as the 56 is now 2 years old? I know it's only about 23% better, but hey, maybe wait for something better? Hmm.
 
Associate
Joined
12 Sep 2019
Posts
216
Location
Auld Reekie
Got one of these to play about with while waiting for Nvidia's 7nm offerings. Got to say I am really impressed with how much can be done by tuning it.

I dropped from the default 1.2V to 0.95V and upped the Power Limit to 50%, and I get a lot more performance and it runs a lot cooler and quieter!

Is this normal for all Vega 56 cards? Has anyone gone even lower and maintained better-than-Balanced performance?

These cards rock for tweaking, other than the HBM voltage, which is a placebo and does nothing. At that core voltage and power limit, my card runs best with the core frequency set to 1622 MHz. In Division 2, it is only 1-2 FPS slower than running at 1662 MHz with 1050 mV and 50% PL, but uses 45 W less power and runs 7C cooler with 400 fewer RPM on the fans. Compared to Balanced, 1622/950mV/50%PL is 4 FPS better for a similar wattage and temp. Check out my post above for differences in Gears 5 performance. I've found that less demanding games scale better on higher clocks and voltages before throttling, but the extra power draw and heat is hardly worth it. Enjoy tuning!
 