Poll: The Vega Review Thread.

What do we think about Vega?

  • What has AMD been doing for the past 1-2 years?

  • It consumes how many watts and is how loud!!!

  • It is not that bad.

  • Want to buy but put off by pricing and warranty.

  • I will be buying one for sure (I own a Freesync monitor so have little choice).

  • Better red than dead.


The issue is either that the option does nothing, or that the reviewers all have an earlier driver where it is inactive. Either way, whether it does anything is one question, but it's another AMD launch.
HBCC is only going to do something if the GPU tries to address more memory than it has available in VRAM. Given Vega has 8 GB, it is unlikely to be short of VRAM before other limitations restrict performance.
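A toy way to picture that claim (the function and the working-set sizes below are illustrative assumptions, not anything AMD has published):

```python
# Toy model of when HBCC paging could even come into play.
# The working-set sizes are illustrative assumptions, not measured data.
VRAM_GB = 8  # Vega 56/64 local HBM2

def hbcc_relevant(working_set_gb: float, vram_gb: float = VRAM_GB) -> bool:
    """Paging out to system memory only matters once the data the GPU
    actually touches exceeds local VRAM; below that, everything fits anyway."""
    return working_set_gb > vram_gb

for scene_gb in (4, 8, 12, 50):
    print(f"{scene_gb:>2} GB working set -> HBCC relevant: {hbcc_relevant(scene_gb)}")
```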
 
I did wonder where that was. I just presumed it was working but to not much effect, so it's going to be more interesting if it could still make a stand. My take was that HBCC had a use in any open-world game, anything which has over 8 GB of textures in that 'level'.
AMD seem to refer a lot to massive data sets [SSG], and maybe there is some relevance between the professional side and games with 50 GB of textures, even though a game isn't normally trying to access them all; but if we talk of multi-threading, perhaps it's a help.

Believe to the end and beyond.
Yeah, a performance lift of 30% incoming in a couple of months :D
The point is we have 1070 and 1080 competition at the same price, and maybe with better value when you add DX12/Vulkan and FreeSync; that's going to be enough for now, but there's no Ti competitor yet.

[image attachment: PvWOtj8.png]

It doesn't appear to be a simple story either way. Hopefully they will refine it and make things clearer to both sides of the battlefield :p
 
I meant in comparison to the 64's TDP :)

The £450 price was for a limited number of cards subsidised by AMD.

subsidised?

The retailers get it for less than that, so I guess you mean AMD may have discounted it to allow retailers to keep a handy profit margin, or perhaps even paid some kind of rebate for cards sold at that price.

I have no doubt the reason we have what we've got now is that retailers have decided maximising profit margins is the way they want to go, and it's paying off due to undersupply.

If a retailer popped up tomorrow selling these for £450, what do you think would happen to the rest of the UK market?

--edit--

I thought the RRP was £450; if it's £550 then yeah, that's AMD's fault.

Interestingly, I only just noticed Pascal prices have finally dropped since launch; "some" 1080s are now only a bit more than I paid for my 1070, although I assume that's because the 1080 RRP was lowered when the 1080 Ti launched. My card on OCUK is now more expensive than when I got it, so the 1070's value versus the 1080 has dropped since launch, given it has not had the same price drops the 1080s had. I probably wouldn't suggest a 1070 to any friends now if they asked my opinion; 1080 pricing is too close to it, and I think the 1070 got overgimped by Nvidia.

I think for AMD cards to be good value the 64s should be about £400-460 and the 56s about £300-360, based on current Pascal prices.
 
subsidised?

The retailers get it for less than that, so I guess you mean AMD may have discounted it to allow retailers to keep a handy profit margin, or perhaps even paid some kind of rebate for cards sold at that price.

I have no doubt the reason we have what we've got now is that retailers have decided maximising profit margins is the way they want to go, and it's paying off due to undersupply.

If a retailer popped up tomorrow selling these for £450, what do you think would happen to the rest of the UK market?
Everything above £550 is supply/demand affecting prices.

But the current UK RRP is £550.

Gibbo said they could only sell at £450 with "support from AMD". This implies some kind of subsidy; whether it's a rebate or some other arrangement matters little. That support is no longer being offered.

You can't know for sure that all retailers are buying new stock for less than £450. Can you prove this?
 
I cannot. See, I have now edited my post; the comment was made when I thought the RRP was £450.

Of course, the question is why AMD have set a much higher RRP in the UK than in America; that to me is really bizarre. The American RRP is $499, which is £388 rounded up. Don't think I have ever seen a gap like that on RRP. It's as if they raised the RRP to what retailers requested. :p
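Rough arithmetic behind that comparison, as a sketch; the exchange rate is an assumption (roughly the rate at the time) and tax differences are ignored:

```python
# Back-of-envelope RRP comparison (exchange rate is an assumed ~1.29 USD per GBP)
us_rrp_usd = 499
uk_rrp_gbp = 550
usd_per_gbp = 1.286  # assumption

us_rrp_in_gbp = us_rrp_usd / usd_per_gbp
print(f"US RRP of ${us_rrp_usd} is roughly £{us_rrp_in_gbp:.0f}")
print(f"UK RRP of £{uk_rrp_gbp} is about {uk_rrp_gbp / us_rrp_in_gbp - 1:.0%} above the converted US price")
```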

So I cannot prove it, but you have to admit it would be weird if the wholesale price in the UK were higher than the "retail" price in America.

What usually happens is the RRP is maybe a bit higher in the UK than in America, but then the retailers ignore it, which is how retailers end up getting caught out when people talk about price gouging and the like. So, in short, I find this high RRP in the UK really suspicious.

If I remember right, the 1080 launched with a $600 and £600 RRP, still unfavourable to the UK but not completely crazy like this RRP we have for RX Vega.

Also, I find it odd when OcUK say they have deals with AMD for limited prices etc. I don't think OcUK would even buy direct from AMD; there are distributors in the middle, and for branded cards the original company selling them is not AMD, it's ASUS, EVGA, etc.
 
I think for AMD cards to be good value the 64s should be about £400-460 and the 56s about £300-360, based on current Pascal prices.

I think if those were the prices then this thread would have a very different tone. AMD has zero control over pricing in a market where literally anything that resembles a GPU is being snapped up blindly for currency mining. I have no doubt miners snapped up the initial Vega release due to the leaked reports of mining performance and the promise of things to come.
 
I think if those were the prices then this thread would have a very different tone. AMD has zero control over pricing in a market where literally anything that resembles a GPU is being snapped up blindly for currency mining. I have no doubt miners snapped up the initial Vega release due to the leaked reports of mining performance and the promise of things to come.

The RRP of the 64 is £550, and AMD do control that. £550 is way overpriced.
 
Also, I find it odd when OcUK say they have deals with AMD for limited prices etc. I don't think OcUK would even buy direct from AMD; there are distributors in the middle, and for branded cards the original company selling them is not AMD, it's ASUS, EVGA, etc.
IIRC retailers buy from AIB partners like ASUS, EVGA, etc., but can be given rebates by either the AIB partners or AMD/Nvidia.
 
The RRP of the 64 is £550, and AMD do control that. £550 is way overpriced.
They set the RRP but they have to set it realistically based on market conditions. If we weren't in the middle of a mining epidemic then I should imagine the cards would have been significantly cheaper to offset their lacklustre performance/noise/heat/power issues. I have no doubt miners will buy as many as they can make.
 
Says the self-confessed miner ;)

Sure, sure, mining hasn't pushed prices up or anything. No wai, guyz! Miners are just like gamers! They just like games so much they've bought 20 GPUs each! And don't game with them!

LOL.

Yes, I'm a miner, so clearly I'm the boogeyman and anything I say can't be trusted.
Yes, mining has pushed up the prices of some cards - how on earth you conflate this with EVERY GPU issue brought up on the forum is beyond me. It's not ALL the miners' fault - just the price hikes on the RX 580/570 and 1060/1070 (although 1070 prices are back to normal now; I've not looked at 1060s so I can't say on that).
 
They set the RRP but they have to set it realistically based on market conditions. If we weren't in the middle of a mining epidemic then I should imagine the cards would have been significantly cheaper to offset their lacklustre performance/noise/heat/power issues. I have no doubt miners will buy as many as they can make.
So miners don't exist in America then?

Also, guys, any links to info on the 56 BIOS TDP cap?
 
What I assume is a reference Vega 56 is up for pre-order somewhere else for £400. The custom cards would have to be that much to be worth a buy, IMO. Especially if no-one finds a way around this 300 W hard power limit.
 
The 64 TDPs are 300 W & 350 W. The Liquid can hit nearly 600 W using certain settings.
TDP =/= power usage.

GamersNexus' tests showed their Vega 56 using 180 W at stock, 270 W with a +50% power limit (obviously), and 210 W with an undervolt and +50% power limit. Undervolting gave them stable clocks of ~1550 MHz instead of the ~1300 MHz at stock. 30 W more for a ~20% clock boost with only a modest increase in heat output is pretty nice; however, each chip will vary in how much it can be undervolted, and it's not guaranteed. You're also still looking at 100+ W more than a stock GTX 1070.

So basically a stock Vega 56 trades blows with a stock GTX 1070 while using ~100 W more power. The former can be undervolted to run at higher stable clocks, but the latter can be overclocked similarly too.
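For what it's worth, the arithmetic behind the "30 W more for a ~20% clock boost" line, using the GamersNexus figures exactly as quoted above (a sanity-check sketch, not new measurements):

```python
# Sanity check of the quoted GamersNexus numbers (figures as reported above)
stock_w, undervolt_w = 180, 210        # board power: stock vs undervolt + 50% power limit
stock_mhz, undervolt_mhz = 1300, 1550  # approximate stable clocks

extra_watts = undervolt_w - stock_w
clock_gain = undervolt_mhz / stock_mhz - 1
clock_per_watt_ratio = (undervolt_mhz / undervolt_w) / (stock_mhz / stock_w)

print(f"+{extra_watts} W for a {clock_gain:.0%} clock gain")               # +30 W for a 19% gain
print(f"Clock-per-watt changes by a factor of {clock_per_watt_ratio:.2f}")  # ~1.02
```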
 
I did wonder where that was. I just presumed it was working but to not much effect, so it's going to be more interesting if it could still make a stand. My take was that HBCC had a use in any open-world game, anything which has over 8 GB of textures in that 'level'.
AMD seem to refer a lot to massive data sets [SSG], and maybe there is some relevance between the professional side and games with 50 GB of textures, even though a game isn't normally trying to access them all; but if we talk of multi-threading, perhaps it's a help.
I think in that SIGGRAPH presentation they demoed a huge dataset on a WX 9100 with 16 GB of memory and tried to open the same scene on a P6000 with 24 GB of memory, and it wouldn't open; the program crashed due to not enough memory on the Nvidia card.

https://hothardware.com/news/amd-radeon-rx-vega-mining-block-chain-ethereum
As you can see, we’re getting some pretty significant gains already (at stock speeds) with this beta driver. We wouldn’t be surprised if there are even further optimizations to be found, once AMD is ready to go with a production driver, but we’ll take what we can get right now. We did have one performance anomaly that we ran into, however. When cranking up the memory speeds, the Vega 56 actually vaulted past the Vega 64, cranking out 36.48 MH/s. That’s not bad for a card that's supposed to retail for $399.

Undervolting, overclocking memory and the latest beta drivers. It probably won't be that far off if miners are willing to pay ~€400 for RX 580s.
 
TDP =/= power usage.

In the case of the Vega FE and Vega 64, power usage pretty much matches their TDP ratings (within a few watts).

The question I was answering was about the difference between the hard limit of the 56 and the 64's power usage.

GamersNexus' tests showed their Vega 56 using 180 W at stock, 270 W with a +50% power limit (obviously), and 210 W with an undervolt and +50% power limit. Undervolting gave them stable clocks of ~1550 MHz instead of the ~1300 MHz at stock. 30 W more for a ~20% clock boost with only a modest increase in heat output is pretty nice; however, each chip will vary in how much it can be undervolted, and it's not guaranteed. You're also still looking at 100+ W more than a stock GTX 1070.

So basically a stock Vega 56 trades blows with a stock GTX 1070 while using ~100 W more power. The former can be undervolted to run at higher stable clocks, but the latter can be overclocked similarly too.

I'm not sure if you've noticed, but GamersNexus ONLY measures the power draw at the PCIe cables.
This measurement isn't the total power draw of the card, as the slot isn't included, so their results are flawed. I'm not sure why they do this, as it makes it very difficult to compare cards due to the differences in PCIe slot power usage between various cards, even from the same manufacturer (e.g. the RX 480 power debacle).

PCPer use a much better system to measure actual GPU power draw. The 56, like its bigger brothers, virtually matches its TDP rating.

https://www.pcper.com/reviews/Graph...4-Vega-64-Liquid-Vega-56-Tested/Clocks-Power-
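To illustrate the measurement-coverage point, here's a minimal sketch; the per-rail readings below are made-up placeholders, not figures from GamersNexus or PCPer:

```python
# Hypothetical per-rail readings in watts, purely to show what a cable-only
# measurement misses compared with measuring slot + cables together.
readings_w = {
    "pcie_slot_12v": 45.0,     # delivered through the motherboard slot (invisible to cable-only probing)
    "pcie_slot_3v3": 5.0,
    "pcie_cable_8pin": 130.0,  # what a cable-only setup would capture
}

cable_only = sum(w for rail, w in readings_w.items() if "cable" in rail)
total_board = sum(readings_w.values())
print(f"Cable-only measurement: {cable_only:.0f} W")
print(f"Slot + cables (total board power): {total_board:.0f} W (slot contributes {total_board - cable_only:.0f} W)")
```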
 