
The RX Vega 64 Owners Thread

Matt, does DX12 give you very high CPU usage and lag spikes? I've had this for ages now with BF1. I thought it was my R9 290, but I also get it with Vega. DX11 seems to work fine, though.
Not noticed high CPU usage, but DX12 performance in BF1 is poor imo. DX11 works really well on Vega, so no reason not to use that.

Changing subject slightly...

Recently had a smart meter installed for electricity, gave me a chance to put 'power usage' in perspective.

The smart meter has three ratings for current electricity usage.
1) green = low usage
2) amber = medium usage
3) red = high usage

Playing Plants vs. Zombies at 3840x2160, maximum settings, on a 1950X (16-core) with a Radeon Pro Duo @ 1000/500 MHz driving a 4K display, while at the same time playing Plants vs. Zombies at 3440x1440 on an 1800X @ 4.075 GHz with an RX Vega 64 @ 1802/1100 MHz driving a 3440x1440 display, over local LAN = green usage.

Taking a hot shower = red usage
Using the dishwasher = amber usage
Using the tumble dryer = red usage

I was surprised to see that running two systems full whack at the same time counted as low power usage, in comparison to just taking a hot shower. :o
 
Resident Evil 7, a known video memory hog: 3440x1440, maximum settings, Shadow Cache on, resolution scaling off (video memory usage would climb even higher with scaling enabled, and you have the FPS to drive it). Alt-tabbing in and out of the game and opening a browser with loads of tabs and videos playing, video memory usage shoots up to 11GB+. No stuttering, no performance drop, smooth as a baby's bottom. With High Bandwidth Cache enabled in Radeon Settings it has serious potential moving forward for games that breach the maximum available memory limit. Remember this post in the years to come. :)
 
Recently had a smart meter installed for electricity, gave me a chance to put 'power usage' in perspective. [...] I was surprised to see that running two systems full whack at the same time counted as low power usage, in comparison to just taking a hot shower. :o

Does it not tell you your watts or pence per hour?

Mine just tells me how many watts I'm using and also how much that is in pence. My PC adds 0.5p per hour at idle and 2p per hour at full pelt.
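As a rough sanity check on those pence-per-hour figures, here's a quick sketch of the arithmetic the meter is doing. The 14p/kWh tariff and the wattages are assumptions for illustration, not figures from the thread:

```python
def cost_pence_per_hour(watts, tariff_p_per_kwh=14.0):
    """Convert a power draw in watts to running cost in pence per hour.

    cost/hour = (watts / 1000) kWh per hour * tariff in p/kWh.
    The 14p/kWh default tariff is an assumed figure.
    """
    return watts / 1000.0 * tariff_p_per_kwh

# At an assumed 14p/kWh, a ~36 W idle draw works out to roughly
# 0.5p/hour, and ~143 W sustained to roughly 2p/hour -- in the
# right ballpark for the idle/full-pelt figures quoted above.
print(round(cost_pence_per_hour(36), 2))
print(round(cost_pence_per_hour(143), 2))
```

Working backwards the same way, 2p/hour at 14p/kWh implies a sustained draw of only around 140 W, so the quoted figure presumably covers the GPU alone rather than the whole system at the wall.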
 
Power usage is way lower on the Vega 64 than I previously experienced with the 295X2. I now idle at 100-130W (I haven't got adaptive voltage on my overclock setup, which could help lower it a bit more) rather than the 185W from before with the 295X2. On full GPU load (e.g. the Superposition benchmark) I see some 550W of power usage at the wall; with the 295X2 I've seen power usage on full load (both GPU and CPU) as high as 850W.
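That ~70 W drop in idle draw (roughly 185 W down to 115 W) adds up over a year. A small sketch of the saving, where both the 14p/kWh tariff and the 8 idle hours per day are assumed figures, not from the thread:

```python
def annual_cost_pounds(watts, hours_per_day, tariff_p_per_kwh=14.0):
    """Yearly electricity cost in pounds for a given constant draw.

    kWh/year = (watts / 1000) * hours per day * 365 days,
    then converted from pence to pounds. Tariff is an assumption.
    """
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * tariff_p_per_kwh / 100.0

# Idle saving of ~115 W (Vega 64) vs 185 W (295X2), assuming the
# machine idles 8 hours a day at an assumed 14p/kWh tariff.
saving = annual_cost_pounds(185, 8) - annual_cost_pounds(115, 8)
print(f"~£{saving:.2f} per year")
```

Under those assumptions the idle difference alone is worth roughly £25-30 a year; the full-load gap (850 W vs 550 W) would matter more for heavy gaming hours.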
 
That's it, I'm at a happy point now. All stress testing and benching done; I've found my best clocks for my best 3DMark scores: 1700/1100. Now to just game and enjoy this great card. Better for me to go higher on the HBM and lower on the core than vice versa.
 
Gaming with a smile on my face is just how it should be; it's been a while. :)

Yup, I'm glad for ya man. I know you had nothing but trouble at times with your CrossFire setup!

I personally am never going mGPU again, it's all (mostly!) a bit of a ****-show now :(

Single powerful card + adaptive sync tech for me now, a perfect combo :cool:
 
Has anybody suffered from games artifacting (if that's the word)? I've been playing PUBG for a couple of hours and got a warning saying my card was near its temperature limit (85°C). Not long after, I started to notice black squares appearing briefly. I whacked the fan up to combat this and managed to get it to sit around the mid 70s. Not sure I want the fan speed that high, though.
 
Better for me to go higher on the HBM and lower on the core than vice versa.
Nice one, Tony. I hadn't considered lowering the core clock to increase HBM. Previously I couldn't run Time Spy at 1750/1100, but I just tried it at 1700/1100, and whilst it ran OK for me, the graphics score was 7866, which isn't my highest. The best for me on the 17.8.2 drivers was 1700 and 1750 on states 6 & 7 respectively, with the memory at 1080; this gave me a graphics score of 8098. I think I'm done with the benching and all that too - at least until the next set of drivers...
 