
The RX Vega 64 Owners Thread

I actually sold my Vega recently and bought an RTX. I won't be needing FreeSync anymore, since the 2080 (unlike Vega) should be powerful enough to max out the 100 Hz of my 3440x1440 FreeSync ultrawide.

My decision was based on wanting more FPS, and on no Vega 64 successor likely being released until 2020.

I owned Vega for a total of 8 months. It was a fun ride.

Are RTXs available now?

I've heard that AMD will start releasing the next step up from Vega late this year, with gaming cards coming next year, but it'll probably be 2020 till we see them.


Reading this page about FreeSync: I have a FreeSync monitor, but I've gone from an old AMD 7870 or something to a GTX 970 @ 144 Hz.

If I swapped the GPU for an AMD one (Vega or other), would I actually see/notice a difference because I'd then be running games with FreeSync?
 
Hey guys,

just set up CS:GO and had a quick sesh on deathmatch to check my system out... my FPS is pretty poor, around 150, but my Red Devil load meter is around the middle (four LEDs)...

Does it really matter? Or should I be getting better performance out of it?
 
Are RTXs available now?

I've heard that AMD will start releasing the next step up from Vega late this year, with gaming cards coming next year, but it'll probably be 2020 till we see them.


Reading this page about FreeSync: I have a FreeSync monitor, but I've gone from an old AMD 7870 or something to a GTX 970 @ 144 Hz.

If I swapped the GPU for an AMD one (Vega or other), would I actually see/notice a difference because I'd then be running games with FreeSync?

FreeSync has certainly given my R9 290X an extended life, and the on/off difference is very noticeable.
 
Hey guys,

just set up CS:GO and had a quick sesh on deathmatch to check my system out... my FPS is pretty poor, around 150, but my Red Devil load meter is around the middle (four LEDs)...

Does it really matter? Or should I be getting better performance out of it?
Try creating an application profile for CS:GO, enable the histogram in Wattman inside that profile, then set P7 as the minimum state by right-clicking the orange dot.
[attached screenshot: Wattman application-profile settings]
 
Hey guys,

just set up CS:GO and had a quick sesh on deathmatch to check my system out... my FPS is pretty poor, around 150, but my Red Devil load meter is around the middle (four LEDs)...

Does it really matter? Or should I be getting better performance out of it?

At 4K max settings with 8x MSAA I get around 220 average. That's with the system in my sig and the Devil on the balanced setting.
 
What core and HBM settings in Wattman, and what actual core clock do you see in game?

In game the core sits at around 1500 MHz, with ups and downs of 30 MHz either side. Memory was a flat 945 MHz. As I have it set to balanced, I can't see the actual core and memory clocks set in this mode, as there is no control.
 
Probably not a huge difference. I think people have problems getting 3000 MHz+ RAM working at full speed on Ryzen as well; not sure if that got fixed.
 
Are RTXs available now?

I've heard that AMD will start releasing the next step up from Vega late this year, with gaming cards coming next year, but it'll probably be 2020 till we see them.


Reading this page about FreeSync: I have a FreeSync monitor, but I've gone from an old AMD 7870 or something to a GTX 970 @ 144 Hz.

If I swapped the GPU for an AMD one (Vega or other), would I actually see/notice a difference because I'd then be running games with FreeSync?
Even without FreeSync, the Vega 64 is hugely faster than a 970.
 
Probably not a huge difference. I think people have problems getting 3000 MHz+ RAM working at full speed on Ryzen as well; not sure if that got fixed.

Fast memory gives a good boost on Ryzen. It's one of the reasons I went for 3200 RAM. Here is a video that shows the difference well.

https://www.youtube.com/watch?v=uMgF1TWhhs8&ab_channel=vmreviews

Also, at 4K my GPU was getting 100% usage, so I would say that's your problem; four bars means you're only using about 50% of your GPU. I was getting the full eight bars on mine. What resolution are you running?
 
I actually sold my Vega recently and bought an RTX. I won't be needing Freesync anymore since the 2080 (unlike Vega) should be powerful enough max the 100hz of my 3440x1440p Freesync Ultrawide.

I think you're being extremely optimistic that it'll be a good experience.

Frame rate is not just the GPU; it's also the CPU. Anyway, what makes VRR solutions so great is that they can cover for drops in frame rate. Without VRR, any drop from 100 fps will be very noticeable: there are no intermediate frame rates, because the GPU has to wait for the next refresh (I'm not sure exactly where that lands at 100 Hz, but at 60 Hz the next step down is effectively 30 Hz, and it's these jumps between frame rates that are noticeable). I really doubt even a 2080 Ti could deliver a reliable 3440x1440 at 100 fps with basically no frame drops.
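To make the vsync arithmetic concrete, here's a quick sketch (mine, just illustrative, not from the thread): with vsync on a fixed-refresh panel, a missed frame has to wait for the next refresh, so the achievable rates snap to refresh/1, refresh/2, refresh/3, and so on, with nothing in between.

```python
# With vsync on a fixed-refresh display, a frame that misses a refresh
# waits for the next one, so the effective frame rate can only be
# refresh_hz / n for some integer n - there are no intermediate rates.

def vsync_steps(refresh_hz, count=4):
    """First few effective frame rates available under vsync."""
    return [refresh_hz / n for n in range(1, count + 1)]

print(vsync_steps(60))   # [60.0, 30.0, 20.0, 15.0]
print(vsync_steps(100))  # 100, 50, ~33.3, 25 - the jump below 100 is large
```

That sudden drop from 100 to 50 (rather than, say, 95) is exactly the jump VRR smooths over.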

But please do let us know your experience; I'd love to know if I'm utterly wrong on this.
 
Probably not a huge difference. I think people have problems getting 3000 MHz+ RAM working at full speed on Ryzen as well; not sure if that got fixed.
Low latency at 3200 MHz is where Ryzen performs best, hence the current premium on G.Skill C14D 8 GB/16 GB pairs. As long as the memory is Samsung B-die you won't have issues; with lesser grades of memory there's still a bit of spotty support, as the DIMMs aren't binned as tightly. High-latency RAM (C18/C19) will highlight the latency issues present in the IMC.

There's quite a bit of performance to be found if you tweak memory timings and overclock properly. Out of the box there's not a lot of difference between C16D and C14D memory at 3200 MHz.
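As a rough illustration of the numbers being discussed (my own back-of-envelope sketch, not from the thread): "3200 MHz" DDR4 is really 3200 MT/s, dual-channel peak bandwidth works out to about 51 GB/s, and on first/second-gen Ryzen the Infinity Fabric clock runs 1:1 with the memory clock (half the transfer rate), which is part of why faster RAM helps beyond raw bandwidth.

```python
# Back-of-envelope numbers for the Ryzen RAM discussion.
# DDR4 is double data rate, so "3200 MHz" RAM transfers at 3200 MT/s
# over a 64-bit (8-byte) bus per channel.

def peak_bandwidth_gbs(mts, channels=2, bus_bytes=8):
    """Theoretical peak bandwidth in GB/s at the given MT/s."""
    return mts * channels * bus_bytes / 1000

def fabric_clock_mhz(mts):
    """Infinity Fabric clock on Zen/Zen+ runs 1:1 with the memory clock,
    i.e. half the transfer rate."""
    return mts / 2

print(peak_bandwidth_gbs(3200))  # ~51.2 GB/s dual channel
print(fabric_clock_mhz(3200))    # 1600 MHz fabric clock
```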
 
This may be a stupid question, but is the PowerColor V64 fan setting for silent done through software, or is there a DIP switch?
 
Can someone explain to me why the clock speeds drop at 4K virtual scaling compared to 1080p and 1440p? As expected I'm getting 1510-1550 MHz playing Doom, which I'm perfectly happy with, but I noticed it drops to 1350 MHz at 4K using VSR. It's not really that important, as FPS is still well over 60, but I thought it was odd. If anything, I'd have thought the lower resolution with higher FPS would be the one that clocks lower due to heat.
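One plausible explanation for the VSR question, sketched with my own back-of-envelope numbers (not the poster's): 4K pushes four times the pixels of 1080p per frame, so the card sits at a much higher sustained utilisation and is more likely to hit its power/thermal limit and shave boost clocks, even though the frame rate is lower.

```python
# Rough pixel-throughput comparison: 4K VSR renders 4x the pixels of
# 1080p per frame, which keeps the GPU at a higher sustained load and
# can push it into its power limit, lowering the boost clock.

def pixels(width, height):
    return width * height

p_1080 = pixels(1920, 1080)  # 2,073,600 pixels per frame
p_4k   = pixels(3840, 2160)  # 8,294,400 pixels per frame
print(p_4k / p_1080)         # 4.0 - four times the work per frame
```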
 
Hi, I have an i7 8700K with 16 GB of Trident Z 3200 MHz, and my monitor is a Samsung 49" 3840x1080 FreeSync 2 HDR quantum dot. I sold my GTX 1080 Ti and want to buy an RTX 2080 or Ti, or a Vega 64 (Asus Strix for 600 euros, or 700 euros for the Sapphire Nitro) because I have FreeSync 2. An RTX here in Croatia is 900 euros and the Ti is 1350 euros. But I only play PUBG at that resolution, with shadows and effects very low and everything else ultra. I don't know how good the Vega 64 is for that; I want 100+ FPS.
 