The RX Vega 64 Owners Thread

List your settings. What temps are you getting? Keep an eye on the Hotspot, which, from what I've been reading at Overclock.net, is suggested to be located on the interposer between the HBM and the core.

OK, this is what I was running:

P7: 1752MHz/1200mV
HBM: 1100MHz/1050mV
GPU max temp: 44°C
Hot Spot max temp: 63°C
HBM max temp: 45°C

I got a lower score with this in Superposition than I did with stock clocks. Makes no sense to me. Any help would be appreciated.
 
Didn't know you had a channel, mate. Subbed and keeping a close eye :)
Thanks mate, I do intend to test the latest games, so I'll update the channel a few times a month at least.

I have a Threadripper 1950X and 2x Vega 64 system as well, but I find myself spending most of my time on the 'well oiled and tuned' Ryzen 1800X 4.075GHz and Vega 64 1802/1100MHz system @ 3440x1440.

I would class Vega 64 as the perfect GPU from AMD at this resolution for maintaining 60-100Hz FreeSync.
 
;)
Waiting on my 1800X & CH6 arriving between Tuesday and Friday next week, while holding off atm to see the custom Gigabyte Vega 64 offering at the end of the month. (The 1080 Ti goes to my brother as a gift for his 40th bday.)
 
Going by those temps it's not thermal throttling, so I'd guess power throttling. What's the actual MHz during Superposition, and does it fluctuate a lot? The current actual core MHz shows up in the top right.

At 1752MHz/1200mV you'll be pulling tons of power, so +50% power may not be enough. You may need to test with the modified registry PowerPlay tables that allow up to a +142% power target.
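
As a rough illustration of what those power targets work out to (assuming they simply scale the 220W GPU power limit of the air-cooled BIOS mentioned further down, which may not be exactly how the driver applies them):

```python
# Sketch only: assumes the power-target slider scales the 220 W GPU power
# limit of the air-cooled Vega 64 BIOS (the LC BIOS reportedly allows 260 W).
BASE_LIMIT_W = 220  # assumed air-cooled BIOS GPU power limit

for target_pct in (0, 50, 142):
    ceiling_w = BASE_LIMIT_W * (1 + target_pct / 100)
    print(f"+{target_pct:>3}% power target -> ~{ceiling_w:.0f} W ceiling")
```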
 
GPU-Z was reporting max draw as 282W. Would flashing the LC BIOS be my next step? I think it allows 1250mV.

I'll check Superposition later to see if the core fluctuates.
 
It's a possible step. The LC BIOS allows up to 260W too, instead of 220W, though those wattage "allowances" do seem mostly irrelevant to the power actually drawn :D

Give it a test run, though first I'd try the registry hack to increase the available power target from +50% to +142%.
http://www.overclock.net/t/1633446/preliminary-view-of-amd-vega-bios The SoftPP.zip there has the required power table increase.

If that doesn't get the desired results, try the LC BIOS.
https://www.techpowerup.com/vgabios/195001/msi-rxvega64-8176-170811 This is the MSI 8774 LC BIOS I use on my air V64. Works fine for me, but YMMV as always.
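
If you want to sanity-check that a soft PowerPlay table is actually in place after importing the .reg file, a minimal sketch along these lines may help. Note that the display-class subkey index ("0000") and the value name used here are assumptions on my part and can differ per system, so treat it as illustrative rather than definitive:

```python
# Hypothetical check for a soft PowerPlay table set via the registry.
# Assumptions: the GPU sits under subkey "0000" of the display class GUID,
# and the value is named "PP_PhmSoftPowerPlayTable"; both may differ per system.
import winreg

KEY_PATH = (r"SYSTEM\CurrentControlSet\Control\Class"
            r"\{4d36e968-e325-11ce-bfc1-08002be10318}\0000")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    try:
        data, reg_type = winreg.QueryValueEx(key, "PP_PhmSoftPowerPlayTable")
        print(f"Soft PowerPlay table present: {len(data)} bytes (type {reg_type})")
    except FileNotFoundError:
        print("No soft PowerPlay table found - card is using the BIOS defaults.")
```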
 
Update/comments on my install:

0. Can others with the AIO comment on how loud it is? I popped it in today to replace the AC one for a quiet life, and to my horror it is less quiet than the AC: the fan buzzes and the pump grinds/whines, louder than the AC at idle on the desktop and a more annoying noise under load. :( Surely this can't be right?

1. CF/mGPU in 17.10.1 seems very, very broken; I see around 8100 in Timespy versus ~11200 with 17.9.3 WHQL. Anyone else seeing this? Also, no HBCC option is available in mGPU (yet?) in 17.9.3 or 17.10.1?

2. Eiswolf loop expanded, Eisbaer and reservoir all installed (fairly neatly IMO). It seems to perform better than the Kraken X62 for the CPU, with lower-speed/quieter fans as well, and the GPU temperature did drop a bit too.

3. I also installed a fan cable to the card under the Eiswolf so the GPU can now control it. If I was doing it again (hello Alphacool), it actually needs a notch/hole in the alu block where the GPU fan cable is; it only just squeezes in but bends the PCB. I put a plastic sheet between the pins and the block to make sure it didn't short - too much hassle to pull it apart again and risk all the thermal pads falling out/getting damaged. Likewise the BIOS switch is now inaccessible, as are the LED switches, should you wish to disable the GPU tach or connect an external LED to the LED header - doh!

Alphacool should include a fan cable in the pack and some more cut-outs to fix the above, plus better instructions on positioning the backplate, as it is critical. Pre-cut pads would be fantastic too, as that's the hardest and most time-consuming part, but if they made a tool at the factory it would be easy for them to do.

Not meant to be a review, but heading that way! <G>

4. My 1300W SF PSU's fan is coming on very loudly at ~800W (i.e. full load on the CPU + the two cards), which is also annoying, so I may have to see if I can find a quieter one. :(

Below around 650W the fan is off, so another option when gaming is the frame limiter; with good mGPU support the power draw might be less. It should really ramp up more slowly, based on the internal temperature, but it seems to bang on suddenly at a high speed (though not full power).

And people say houses/cars/bikes are money pits. ;)

It's not really loud in my "eyes", even at 2500rpm. But I also have a double fan mounting, push/pull, so things stay a bit cooler already, so it doesn't have to spin so high, and the pump seems fairly quiet.
As far as CF is concerned, after three drivers in a row I'm still unable to get it working in W10. During the driver install the monitor loses signal/goes black, and the drivers and CF cannot be initialised.
I also reinstalled my OS, but it makes no difference. I'm completely lost as to what's happening here.
W7 works, but I only use that for one game, as W10 is the primary gaming system.

The bad thing here is that you're on your own, getting no/minimal support and hoping that one day it will work. 1.5 grand of GPUs waiting on the desk to be used. It would lessen the frustration if I knew what the issue is, for example MST compatibility for 5K screens or something.
But I can't get into deep problem-solving contact with tech support, and of course I know they can't do that with every customer, but still.

If it weren't for these very promising GPUs I would let it go, but I've seen the potential of Vega in CF on my W7 install too (nearing Titan/Ti in 5K in a few game scenes!) and I really want it to work out.

@Matt, did you happen to hear anything about the issue I'm encountering? If not, thanks anyway for taking your time on it.
 
The AIO I got is definitely whining; at idle on the desktop it is louder than the AC card, and it makes a more annoying noise (fan and pump) than the AC under load!

RMAing it. :(

3rd time lucky I hope!

Under 17.9.3 Crossfire works for me in 3DMark; I'm not in a position to test it in games properly at the moment. I might try putting RoTR back on - it has a built-in benchmark, right? So I could test that.

Edit: I would think driving 5k would be an edge case they would be interested in.
 
I'm currently testing some undervolting/Mem OC.

With both mem and core undervolted, GPU clock at 1400MHz (for testing) and HBM2 at 1100MHz on both cards, I lost ~1000 points in CF Timespy, but the power draw is looking good: around 500W total, and that's for two cards plus the 1800X at stock with 3,200MHz RAM, which I think is reasonable. I forgot to check it before the undervolt though. Doh!

I'll work the core clock back up now and see what happens. It's nice seeing the HBM seemingly stable at 1100MHz on both cards at least.

Regarding the AIO BIOS, none of the ones I have tried seem to work at all; I get a Windows green screen at boot, black screens, corruption, etc. I've tried the ones from TPU and also the "full" ones from OCnet.

Can someone with one that is working on their AC card maybe post a link so I can try a "known good" one to see if I just have bad files or if it's the card?

Thanks!

Re: Prey, it would be nice to see a graph over time including the minimums; if the Vegas are (as seems to be the case a lot of the time) producing better mins, it could actually provide a better gaming experience than the Ti. ;) It certainly shows the potential, even if it is an AMD-partnered title (I assume).

Edit 2: Is it just me, or does the detail on the Vega (look at the last scene in the video with the lion statues) look much better, with better contrast? If all the settings were the same and the footage is captured internally (rather than a video of the game using a camera), this would presumably be down to how the card is rendering. Maybe someone could do some high-res side-by-side/zoomed pics.
 
After reading what Videocardz were told by a Gigabyte rep, I think it's pretty obvious that AMD tried to fob off their board partners with the C0 revision chips that weren't suitable for the air-cooled reference Vega.

Contrary to previous reports (which by the way were true, but Gigabyte changed their story after they received a new batch of Vega chips) the company is indeed making a custom Radeon RX Vega 64.

https://videocardz.com/newz/gigabyte-radeon-rx-vega-64-gaming-oc-pictured

From that it seems that AMD tried to pass the C0 chips on to the board partners in the hope that they would make a beefy enough air cooler for them. The only way the C0s were usable was with water cooling, and the fact that they tried to sell the lesser chip in the flagship card is disgusting; I imagine they've burnt any trust they'd built up with the board partners. I just hope it isn't only Gigabyte who complained and got a new batch of chips. If I can get a Sapphire Tri-X with a C1 chip for around 5 to 6 hundred, that'll do me.
 
Proof? I don't see any evidence of what you are saying in that link. I tried looking up something to show that you are right, but I can't find anything. Tom's Hardware says "source" but provides no real details; other sites say it's because of the technical difficulties in making a cooler, since the Vega variants have different-height memory. The only consistent information on C0 faults is people linking back to your posts on this site.
 
It's not something someone like me could prove; it's a presumption based on events we've watched unfolding over the last few months. First we had the new stepping rumour before release. Then on release there were two different revisions of Vega 64: a C0 chip for the AIO and a C1 chip for the air-cooled version. That made sense when you consider the new stepping rumour. Then we had a few sites reviewing the Vega Strix and it didn't seem to be doing that well. I asked on the J2C Strix review video and the other reviewer's site (I forget which) whether it was a C0 or a C1, but got no reply, as expected, with my post being just one among hundreds.

Then there were reports about board makers having problems getting stable overclocks for their non-reference models, and then, according to Videocardz, a Gigabyte rep said Gigabyte wasn't going to make a non-reference model. Then today we get another article on Videocardz saying that Gigabyte will now be doing a non-reference model, as they were given a new batch of Vega chips from AMD. That's what I quoted from the article I linked.

Yes, it's a lot of presumptions, but to me it all adds up. I've never tried to say it's insider info or anything other than me having a tinfoil hat moment, but the more info we get, the more it stacks up as a likely scenario. I found it weird that the two Strix reviewers didn't mention the revision used on the Strix, as the fact that there are two revisions was documented and hypothesised on at release.
 