The RX Vega 64 Owners Thread

That was not the whole conversation, though; it was only one part of an ongoing series of posts between you, me and a couple of others, where I had previously said things like "I think it's pretty obvious that" and "From that it seems that". Those are statements of opinion, not of fact. I then continued with "It's not something someone like me could prove; it's a presumption based on events we've watched unfolding over the last few months." That's not me claiming it as a fact.

Please stop twisting things. There was no ongoing conversation. Your post, the one that I quoted above, was the one that started the conversation, because I wanted you to show some proof of those "facts" that you mentioned. You even said the words "and the fact that" and "I think it's pretty obvious". The whole tone of your post is written as statements of fact. It sounded so much like you were stating facts that GoogalyMoogaly used the info in his post.

Other people chipped in after my post. One posted a link that showed that Gigabyte always intended to make custom cards. Others posted that the stepping theory you are keen on isn't so clear cut.
 
My new PC is an mATX build and the 1080 did not fit, plus my monitor is a FreeSync ultrawide.

I haven't convinced myself to get it yet, but the price isn't bad at £455.

Still too expensive imo, as they're only ref cards. Custom OC'd 1080s are around that price, the 1070 Ti will be around the same performance imo and is even cheaper, and then you have to factor in the possibility of a new PSU as well.

I'm surprised the 1080 isn't around the same size as the ref 64 tbh.
 
There's obviously something iffy with the AIB models or they'd be out, regardless. I mean, I've never seen an AIB model reviewed and then disappear before.
In fact, Vega has had the least partner support overall that I've seen of any brand-new GPU launch.
 
IMO out-of-box Vega performance was poor in what I tried, as is the ref cooler. The basis of this was my purchase price vs a similarly priced competitor product (not the EK X).

My experience on the ref cooler lasted ~30min. The moment I did a SuperPosition 4K Optimized preset run with monitoring, I was frankly as distraught as I believe the silicon would have been :o . For example, on the ref cooler: MAX GPU 74C, Hotspot 105C, HBM 81C; with WC on the same setup: MAX GPU 36C, Hotspot 55C, HBM 41C.

What we see in WattMan when we move to manual voltage is not what the silicon uses in normal operation, regardless of stock/tweaked. Vega uses AVFS, Adaptive Voltage and Frequency Scaling. I believe this is an evolution of the Adaptive Clocking done on Steamroller, link. AVFS AFAIK came about with Carrizo, link. Then we have ACG, link, which again has similarities to AC/AVFS.

At stock, say when running F@H, my VEGA was getting to ~1650-1700MHz with a max of 1.131V in HWiNFO. Once I tweaked DPM 7 to 1100mV, that became the ceiling voltage. Even if I currently use DPM 7 as 1652MHz, in compute tasks the GPU MHz can be above that. I believe this is all due to the "tech" employed. The HBM voltage in WattMan is not the HBM voltage, that is for sure: V56 is 1.25V and V64 is 1.35V. When viewing the voltages associated with the HBM clock states, it is as if they are linked to the GPU voltage/DPMs somewhat. This was the case on Hawaii, for example: you had the same number of DPM states as the GPU in the RAM freq table, and both tables had to match for VID, manual or EVV (Electronic Variable Voltage).
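To picture the "ceiling" behaviour I mean, here is a tiny conceptual sketch (my own illustration with made-up per-state VIDs, not AMD's actual SMU logic): whatever the table VID of a lower DPM state is, the applied voltage never exceeds the DPM 7 value.

Code:
#include <stdio.h>

/* Conceptual sketch only -- NOT AMD's SMU code. Illustrates the observed
   behaviour: once DPM 7 is tweaked to 1100mV, no state gets more than that.
   The per-state VIDs below are made up for the example. */
#define NUM_DPM_STATES 8

int main(void)
{
    unsigned vid_mv[NUM_DPM_STATES] = { 800, 900, 950, 1000, 1050, 1100, 1150, 1100 };
    unsigned ceiling = vid_mv[NUM_DPM_STATES - 1];   /* DPM 7 acts as the cap */

    for (int i = 0; i < NUM_DPM_STATES; i++) {
        unsigned applied = (vid_mv[i] > ceiling) ? ceiling : vid_mv[i];
        printf("DPM %d: table %4umV -> applied %4umV\n", i, vid_mv[i], applied);
    }
    return 0;
}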

The GTX 1080 I didn't spend time tweaking, TBH. I have some results, IIRC of 2.0GHz+, will have to check. GDDR5X, from what I had read from owners' shares on OCN (where I practically live :p ), doesn't OC much, so I never tried it. My card didn't like undervolting at all when I tried it, as it was boosting to ~1975MHz due to low temps (as it's on WC), how nVidia Boost 3.0 works, and the MSI EK X having an increased PowerLimit compared with other Pascal cards. See the FE BIOS PL vs the EK X.

I believe nVidia Boost 3.0 is very polished vs VEGA boost. VEGA clock/voltage tech is vastly different from any past AMD GPU I have had my hands on. For example, Hawaii was ever-so-slight changes vs Evergreen; Fiji again was mild changes vs Hawaii in this context. I do like how VEGA is in this context vs past AMD GPUs, it just takes some getting used to IMO.

You are correct on AVFS, but it doesn't have the capability to tune to silicon variance; maybe a future AVFS for both Raven Ridge and Vega 2 will improve the register and provide a shadow p-state / adapt to the silicon. I appreciate your post as you make good points and you like to tweak, but you are deliberately avoiding my 2 questions.
You can't compare a tweaked Vega 64 on a custom block vs GP104 on air out of the box and then claim victory by a 10fps difference in Firestrike. I appreciate you are bringing valuable information to the forum, and from your own testing and results shared you are showing scaling (of which there is more headroom to come from tweaking), but you have to tweak GP104 for the other half of the story.

Firestrike on a well-tweaked 1080 ranges from a 23k-25k graphics score, and GDDR5X does respond very well to clocking depending on the manufacturer BIOS/FW; also the latest version of GDDR5X is clocked at 11Gbps, not 10Gbps, but most models don't have this update.
Please don't be offended by my post, I like to see people tweaking and showing the results from before and after, but I must leave with this viewpoint.
If you factor in the price of the block on the V64 you purchased, how is it good value for performance any more, when a cheaper 1080 can run on air and get close results, or compared to a 1080 Ti AIB air card?

If you could, what would be brilliant is to do a thread where you show stock out-of-the-box settings of a V64 on your water setup, then show the fps increase at frequency points with maxed-out power limit, say 1100-1200-1300 etc, to show the scaling. It would be interesting to then compare the fps gain and expense vs just buying a reference 64, undervolting it and running it at its sweetspot.

edit: I acknowledge your 1080 was on a block, not air, didn't see it at first.

Here is Icdp's thread: https://forums.overclockers.co.uk/t...ing-information-guide.18793012/#post-31149524
 
@Davedree

I am not offended at all :) . Your dialogue causes no offence; it brings further discussion to the table :) .

My view on purchased VEGA / GTX 1080 is pretty much as stated in this post, link.

Then yesterday I basically set out the cusp of my change from GTX 1080 to VEGA, here (as well as in other linked posts).

Why I see my bench comparison as valid for SuperPosition 4K and 3DM is this:-

i) MSI increased the Base/Boost clocks and PowerLimit in the VBIOS for the GTX 1080 vs the GTX 1080 FE; coupled with the WB cooling solution = lower temps, and with how nVidia Boost 3.0 works = higher clocks than FE.

ii) VEGA boost tech does not take advantage of lower temps as much as nVidia Boost 3.0 IMO; also the MSI GTX 1080 EK X has a higher base/boost vs the GTX 1080 FE, check the link to TPU in the quote (added below now).

TPU VBIOS GTX 1080 FE
TPU VBIOS GTX 1080 EK X

So as you can see, the EK X should boost to 1822MHz, but it did not; it went to ~1975MHz (nVidia Boost 3.0 at play).

I do believe VEGA is accounting for "silicon variance". If the VBIOS/driver did not do that, each card would need a tailored VBIOS for the silicon soldered to the PCB.
 

It's so refreshing to see someone testing with first-hand experience and giving viewpoints from their own experience.
All your points are spot on in that #35 post, and yes, FreeSync was the dealbreaker for you; I get this can be the major dilemma for anyone.
 
Gupsterg
I thought that all Vega runs at 1.25V and it's up to the DPM 5-6-7 states and current regulation to determine if it can hold it?
Bristol Ridge uses a shadow p-state which offsets the VDDC to the silicon at POST and during the ageing process (well, so the documentation mentions in the ISSCC article I read).
 
I think the ref Vega 64 air-cooled needs to be around £380 now, what with the custom 1070 Tis starting at £420, as that will allow the custom 64s, if any ever appear, to come in around the same price. It won't happen though, as they'd probably make a loss at that, which is what they should just do imo, the way things are going, as they just aren't moving at all. Retailers must be stacked floor to ceiling with them, and the 1070 Ti will just make what's already very bad for them even worse.
 
Honestly the 1070 Ti is a nonsense card. For the money you may as well spend the extra and get a 1080, which isn't a crippled 1080. One of the worst decisions they made was only cutting it down by 1 CU, to 19 vs the 1080's 20. It's like the 1080 Ti losing 1GB of VRAM because "reasons". While some compliment Nvidia on the wide range, I think they are just tripping over themselves with too many cards at too many price points, a few of which only compete with other Nvidia cards, like the 1070 Ti vs the 1080.

A stock V64 with good silicon will kick the crap out of a 1080. A stock V56 with good silicon and a V64 BIOS flash will kick the crap out of a 1080, as it's not the core CUs holding the card back. So really the V56 is the much better buy compared with the three Nvidia cards. The only reason the 1070 Ti exists is because it costs them nothing to release a slightly castrated 1080 with a 1070 Ti badge and gain a lot of media noise to draw attention away from the Vega 56. When people get some of the 1070 Tis, I bet it's wearing 1080 clothes with a new badge.


Just to add to this, the delay in V56/V64 AIB cards could actually work in AMD's favour. With all the noise about the 1070 Ti now, they could sweep in just before the holiday buying season and create noise at the right time for more sales...
 

If my prediction is correct then Nvidia will reduce the 1070 below the current £370-380 V56 (how much I dunno, say £320-350), and then regain the margins of the 1070's higher pricing via the introduction of the 1070 Ti at £420-480, leaving the 1080 unaffected price-wise. This way Nvidia still gains prospective buyers on either side of the V56.

As to Vega kicking the crap out of GP104, I'd say that's a little far-fetched, considering you are overlooking the price to manufacture Vega's massive die, HBM2 costs etc, and that Vega can only cater for desktop, whereas GP104 caters for mobile and desktop (and in mobile there is even more profit).
Sure, Vega is competitive/ahead/behind depending on the game etc, and if you like to tweak then it's fun.
Out of the box though, Vega is not great, and at the current pricing it's still not economical unless you have FreeSync or really insist on buying only AMD.
 
£380 would be ridiculous compared to NVidia prices. My 64 (tweaked, admittedly) is performing better than my 1080 G1 Gaming (also tweaked) did in every game I play. £450 is fair IMO.

Frankly it needs to be £400, because it does have specific drawbacks versus an equally priced 1080. You can only price on par if the card's all-around on par, which it isn't.
 
Same here. I grabbed an RX 470 off the MM while waiting for the 1080 to sell, and I'm glad to be back with the AMD software rather than GeForce Experience.

From tests and results I've seen online, the Vega 64 does a decent job overall, and for my monitor, which is 3440x1440 with a FreeSync range of 30-75Hz, I'm sure it'll provide a better experience than a 1080 without any adaptive sync.
It definitely does. I have the same monitor and kept my 1080 for about 2 weeks; not worth losing FreeSync on that screen IMO. Performance will be better than a 1080 and it's £50 cheaper. Win win.
 
It's so refreshing to see someone testing with first-hand experience and giving viewpoints from their own experience.
All your points are spot on in that #35 post, and yes, FreeSync was the dealbreaker for you; I get this can be the major dilemma for anyone.

NP :) , nice to make your acquaintance as well :) . Yes, FreeSync was the biggest factor; then I hated the nVidia driver panel as well.

I hope you can see the MSI GTX 1080 EK X was in no way stock concerning clocks/VBIOS, besides the cooling. So I have merely made VEGA similar with tweaks :) .

Gupsterg
I thought that all Vega runs at 1.25V and it's up to the DPM 5-6-7 states and current regulation to determine if it can hold it?
Bristol Ridge uses a shadow p-state which offsets the VDDC to the silicon at POST and during the ageing process (well, so the documentation mentions in the ISSCC article I read).

I have no idea fully yet how VEGA works concerning VID/VDDC, but it is definitely not fixed, nor what we see when we change to manual voltage in WattMan. I believe VEGA is employing the most advanced "tuning" techniques AMD have at their disposal.

AFAIK on V56/V64, when we change to manual mode, the DPM7 1200mV is pulled from PowerPlay (ATOM_Vega10_Voltage_Lookup_Table).

Code:
7A 00 (0x7Ah)       USHORT usVddcLookupTableOffset;            /* points to ATOM_Vega10_Voltage_Lookup_Table */

typedef struct _ATOM_Vega10_Voltage_Lookup_Record {
   USHORT usVdd;                                               /* Base voltage */
} ATOM_Vega10_Voltage_Lookup_Record;

typedef struct _ATOM_Vega10_Voltage_Lookup_Table {
01   UCHAR ucRevId;
08   UCHAR ucNumEntries;                                       /* Number of entries */
   ATOM_Vega10_Voltage_Lookup_Record entries[1];               /* Dynamically allocate entries */
} ATOM_Vega10_Voltage_Lookup_Table;

/* The 8 usVdd entries in this dump, raw little-endian USHORT bytes: */
20 03 (800mV)
84 03 (900mV)
B6 03 (950mV)
E8 03 (1000mV)
1A 04 (1050mV)
4C 04 (1100mV)
7E 04 (1150mV)
B0 04 (1200mV)
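If anyone wants to decode the raw entries themselves, here's a minimal sketch (my own, not driver code) that converts the little-endian USHORT bytes from a dump like the above into mV:

Code:
#include <stdio.h>
#include <stdint.h>

/* Minimal sketch (mine, not AMD driver code): decode the raw usVdd bytes
   from the PowerPlay dump above. Each entry is a little-endian USHORT in
   mV, so e.g. B0 04 -> 0x04B0 -> 1200mV. */
int main(void)
{
    const uint8_t raw[][2] = {
        { 0x20, 0x03 }, { 0x84, 0x03 }, { 0xB6, 0x03 }, { 0xE8, 0x03 },
        { 0x1A, 0x04 }, { 0x4C, 0x04 }, { 0x7E, 0x04 }, { 0xB0, 0x04 },
    };

    for (size_t i = 0; i < sizeof raw / sizeof raw[0]; i++) {
        uint16_t mv = (uint16_t)(raw[i][0] | (raw[i][1] << 8)); /* low byte first */
        printf("entry %zu: %umV\n", i, mv);
    }
    return 0;
}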


Asder00 of Guru3D, famed for providing AMD drivers before release in the past and for helping with Hawaii BIOS modding (has some tools which he can't share ;) ), posted this from VEGA, link.

So I believe AVFS is active. Besides the Platform_Caps in PowerPlay, marketing slides, etc, I see the case from monitoring data. I do believe there are "states" of GPU clock which we are unaware of and which SW is not showing.

Let's roll back to Hawaii for a moment. Hawaii used EVV; in VBIOS there were 8 DPM states, but did PowerTune have only 8?

PowerTune is also highly granular in terms of its ability to manage clocks. While previous GPUs had only 3 or 4 power states (idle/low, medium, and peak), a GPU with PowerTune contains hundreds of intermediate states in between the primary power states to maximize performance within the TDP constraint as outlined above in Figure 4.

From page 8 of this basic whitepaper.

Even when we set a manual VID for a DPM, the voltage was never fixed. The GPU would have clocks/voltages between each DPM, etc. This perplexed some owners.
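As a rough picture of how "hundreds of intermediate states" between DPMs could arise, here is a sketch of my own, assuming simple linear interpolation between two hypothetical anchor states (the real PowerTune/AVFS arbitration is not public and certainly more sophisticated):

Code:
#include <stdio.h>

/* Rough illustration only -- not AMD's algorithm. Linearly interpolate
   clock/voltage between two hypothetical DPM anchor states to show how
   fine-grained intermediate states could sit between the 8 table DPMs. */
int main(void)
{
    unsigned f0 = 1348, v0 = 1050;   /* hypothetical DPM 6: MHz, mV */
    unsigned f1 = 1632, v1 = 1200;   /* hypothetical DPM 7: MHz, mV */
    unsigned steps = 10;             /* real hardware would be far finer */

    for (unsigned i = 0; i <= steps; i++) {
        unsigned f = f0 + (f1 - f0) * i / steps;
        unsigned v = v0 + (v1 - v0) * i / steps;
        printf("intermediate state %2u: %4uMHz @ %4umV\n", i, f, v);
    }
    return 0;
}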

Now how did we know what EVV/ASIC profiling determined as the VID per DPM? The first tool we had was shared by The Stilt, his own tool made for his own "work". This only showed DPM 7; you can find the posts in a thread on Guru3D from 2015. Next we found that AIDA64 allowed dumping of GPU registers and translated the VID per DPM; this also worked on GPUs up to Polaris. It does not for VEGA yet, and I believe it may never. Why I say this is that AMD have locked down VEGA hard compared with past cards. The AIDA64 register dump is so small on VEGA vs past cards.

I2C comms to the voltage control chip have been disabled. Originally the VEGA FE VBIOS showed The Stilt it was disabled via registers set. RX VEGA does not have those registers set, so it is believed now that the driver is blocking access or another strategy has been employed. TBH even the voltage control chip does not have full control; it is a slave to the SMU. Scanning I2C on Fiji was slow, as the SMU kept the line busy. This is why GPU-Z didn't have as many monitoring values for it as for Hawaii. The HWiNFO author also had issues using I2C and reverted to AMD ADL. MSI AB employs "messaging" the SMU via the driver to do changes on Fiji onwards; previously, voltage control on Hawaii in MSI AB was I2C via the voltage control chip. I believe MSI AB on VEGA will be using "messaging"; it has command line messaging capability as well (some messages can be found in the Linux driver).

I believe VEGA is using the SMU far more than Fiji, and it is implemented more heavily. Also I believe VEGA has/is an SOC GPU (see the VEGA whitepaper linked before).

VEGA also has on-die hardware to check that the VBIOS is correct for the ASIC. You can crossflash V56/V64 as they use the same device ID, but you cannot crossflash VEGA FE. Last I read, even VEGA FE owners cannot use the later RX VEGA drivers/path. I was going to suggest to a member on OCN to edit the inf, but I believe it may pick up the "fused ASIC ID". An example of this was Hawaii flashed to Grenada still showing as Hawaii, as the driver picked up the "fused ASIC ID".

Sorry for mega post :o .
 
If my prediction is correct then Nvidia will reduce the 1070 below the current £370-380 V56 (how much I dunno, say £320-350), and then regain the margins of the 1070's higher pricing via the introduction of the 1070 Ti at £420-480, leaving the 1080 unaffected price-wise. This way Nvidia still gains prospective buyers on either side of the V56.
Indeed they will, though my point on the card's performance still holds. The issue is that Nvidia has a drop-in solution vs the tweaking required to get the best out of Vega. In that way Nvidia have managed to corner the market nicely, as it's easy to get 100% of the performance out of their cards due to their boost implementation. Something Vega has tried to copy and gotten backwards.

As to Vega kicking the crap out of GP104, I'd say that's a little far-fetched, considering you are overlooking the price to manufacture Vega's massive die, HBM2 costs etc, and that Vega can only cater for desktop, whereas GP104 caters for mobile and desktop (and in mobile there is even more profit).
Sure, Vega is competitive/ahead/behind depending on the game etc, and if you like to tweak then it's fun.
Out of the box though, Vega is not great, and at the current pricing it's still not economical unless you have FreeSync or really insist on buying only AMD.
I guess my card is one of the luckier cards about: massive OC headroom and great undervolting (down to 900mV), though I've seen a few of the guys mining hitting 800mV with some tweaking. Though I wouldn't say it's far-fetched for a V56/64 to beat 1080s quite handily in games that support DX12's feature sets. In fact, the more DX12 features used/supported seems to correlate with how much a Vega beats the 1080; see the recent releases of Destiny 2, Forza 7 and TW Warhammer 2. Admittedly DX11 is not Vega's strong suit, but it's definitely no slouch there either.
 
Still too expensive imo, as they're only ref cards. Custom OC'd 1080s are around that price, the 1070 Ti will be around the same performance imo and is even cheaper, and then you have to factor in the possibility of a new PSU as well.

I'm surprised the 1080 isn't around the same size as the ref 64 tbh.

The cheapest 1080 is the one I had, which is the MSI 1080 Armor; that was £500, and because it was a dual-fan design rather than a tri-fan model like your Tri-X's, the fans are a bigger size, making the card too tall for my new case (yes, I should have been on that, but I missed it). The last few game reviews have shown the Vega 64 to be faster than the GTX 1080 at my resolution, and it's a trend I expect will continue, so I don't think £456 is that bad.
 
Please stop twisting things. There was no ongoing conversation. Your post, the one that I quoted above, was the one that started the conversation, because I wanted you to show some proof of those "facts" that you mentioned. You even said the words "and the fact that" and "I think it's pretty obvious". The whole tone of your post is written as statements of fact. It sounded so much like you were stating facts that GoogalyMoogaly used the info in his post.

Other people chipped in after my post. One posted a link that showed that Gigabyte always intended to make custom cards. Others posted that the stepping theory you are keen on isn't so clear cut.

I'm sorry, but I quoted from the same post. You posted the middle of a paragraph that started with me saying
From that it seems that AMD tried to pass the C0 chips on to the board partners
At the time you quoted it asking for proof, and I further clarified that it was only my opinion by replying to you with
It's not something someone like me could prove; it's a presumption based on events we've watched unfolding over the last few months.
That should have been clear-cut enough, but here you are days later inaccurately quoting it and now accusing me of twisting things.

You're right about one thing though, and that is that it wasn't part of an ongoing conversation. That was a different conversation on the same topic, as I'd posted about it before, and when the Videocardz Gigabyte article landed I made a point of pointing it out, because it added credence to what I'd previously said. But even then I was quite clear it was my opinion and not me stating a fact, by saying
I think it's pretty obvious that AMD tried to fob off their board partners with the C0 revision chips that weren't suitable for the air-cooled reference Vega.
 
Does anyone know what's happening with VEGA custom cards and why there's still no sign of them?
Yeah, I've been keeping an eye out for them but I haven't seen any news. You'd think AIB cards would have been out much earlier, considering the biggest downfall of Vega is that reference cooler. Volta will soon be here to wipe the floor, but AMD still haven't cashed in!
 