
Poll: The Vega Review Thread.

What do we think about Vega?

  • What has AMD been doing for the past 1-2 years?

  • It consumes how many watts and is how loud!!!

  • It is not that bad.

  • Want to buy but put off by pricing and warranty.

  • I will be buying one for sure (I own a Freesync monitor so have little choice).

  • Better red than dead.



Sorry to snip your post, but this last part needs to be qualified a bit IMHO. At stock settings AMD have pushed Vega well beyond its peak efficiency. The fact that lowering the TDP in Wattman gives 95% of the performance for a significant reduction in watts proves this.

Going from 214 W average gaming power in power-saving mode to 292 W at stock is a massive efficiency hit for only ~5% more performance. It looks even worse when comparing the lowest TDP setting at 200 W against 316 W at turbo settings, all for less than 10% extra performance. Using the lower TDP settings also improves heat and noise levels.

https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/29.html
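
As a rough sanity check of that efficiency argument, here's a minimal sketch using the power figures above; the relative performance values are just the approximate percentages quoted in this post, not exact benchmark data:

```python
# Rough perf-per-watt comparison using the TechPowerUp power figures quoted
# above. Relative performance values are the approximate percentage deltas
# mentioned in this post, not exact benchmark numbers.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance divided by average gaming power draw."""
    return relative_perf / watts

stock      = perf_per_watt(1.00, 292)  # default balanced setting
power_save = perf_per_watt(0.95, 214)  # ~5% slower in power-saving mode
turbo      = perf_per_watt(1.05, 316)  # turbo setting
lowest_tdp = perf_per_watt(0.97, 200)  # lowest TDP, <10% slower than turbo

print(f"Power saving vs stock: {power_save / stock:.2f}x the perf/W")
print(f"Lowest TDP vs turbo:   {lowest_tdp / turbo:.2f}x the perf/W")
```

Even with those rough numbers, the power-saving preset comes out roughly 30% better on perf/W than stock, which is the point being made here.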

I feel AMD missed a trick and should have made power-saving mode the default setting in Wattman. Then anyone clicking on higher TDP settings is choosing to kill efficiency in the name of performance. The reviews would IMHO have been far more forgiving of a GPU that was 5% slower but consumed ~90 W less power.

Typical AMD to be honest.

If this is true, it's remarkable that they can make mistakes like this considering how late the card is and how many extra months they had to tweak.

AMD is not a company that instills confidence compared to Nvidia, whose products are akin to well-oiled machines.
 
Some of us genuinely wanted to see a bit better from AMD, as it helps us all in the end. You could argue, though, that they're selling well, so it doesn't really matter what we think as long as the sales figures give them something to build on.

Without mining and the sync technologies sort of tying people in, this may have been judged as a poor launch; then again, if that were the case, we'd likely have seen lower pricing.

Yeah :(


I was under the impression you don't need a FreeSync monitor to be able to use adaptive sync, is that right?
Adaptive Sync = G-sync or Freesync.

You are thinking of a technology Nvidia uses in their drivers called Adaptive V-Sync, which is something else.
 
If this is true, it's remarkable that they can make mistakes like this considering how late the card is and how many extra months they had to tweak.

In fairness it's understandable given the mixed signals they keep getting.

AMD: Here's the new Fury we worked hard on!
Masses: It's too hot, it uses too much power, You're not Nvidia!
AMD: Here's the new more efficient Fury Nano we worked hard on!
Masses: It's too slow, Nvidia sell something for the same price that uses more power to not be slow, You're not Nvidia!
AMD: Here's the new RX we worked hard on!
Masses: It's even slower, we've told you we don't care about efficiency, just make it FAST, You're not Nvidia!
AMD: Here's the new Vega we worked hard on, beats the 1080 for the same launch price!
Masses: It's too hot, it uses too much power, have you even heard of efficiency? You're not Nvidia!



That could be it, a sort of poor man's freesync.
Ahh, yeah it looked pretty lame to me TBH.

I get the marketing idea that it has less overhead than V-Sync so less of a performance hit, but it's not as good quality wise. So in a competitive game I'm still not going to enable it because I want performance, and in a solo game I'm still going to use V-sync instead because I want quality.
 
In fairness it's understandable given the mixed signals they keep getting.

AMD: Here's the new Fury we worked hard on!
Masses: It's too hot, it uses too much power, You're not Nvidia!
AMD: Here's the new more efficient Fury Nano we worked hard on!
Masses: It's too slow, Nvidia sell something for the same price that uses more power to not be slow, You're not Nvidia!
AMD: Here's the new RX we worked hard on!
Masses: It's even slower, we've told you we don't care about efficiency, just make it FAST, You're not Nvidia!
AMD: Here's the new Vega we worked hard on, beats the 1080 for the same launch price!
Masses: It's too hot, it uses too much power, have you even heard of efficiency? You're not Nvidia!
Yes, the bottom line is they can't compete with Nvidia when it comes to performance per watt. The bigger issue is that this disparity has barely fallen with AMD's last couple of releases. In fact one reviewer noted how this has increased, not decreased.

People love to cast Nvidia as villains, but their GPUs are objectively superior to the competition, and their generational improvements have been consistently remarkable. AMD must compete on price.
 
Higher frame rates still tear. More so in my experience. Freesync is about getting rid of tearing.

I don't follow your point...

Guess that's if you are only fussed about tearing and not stutter. Personally I don't notice any tearing without FreeSync or G-Sync, so I would only use them to smooth a game out. But playing FPS games I would rather have the higher frame rate. I'm in the market for a new monitor and graphics card, but there's no use avoiding the G-Sync tax on a monitor when I then have to put that money back into a graphics card for less performance. May as well pony up for G-Sync now.
 

Sorry to snip your post, but this last part needs to be qualified a bit IMHO. At stock settings AMD have pushed Vega well beyond its peak efficiency. The fact that lowering the TDP in Wattman gives 95% of the performance for a significant reduction in watts proves this.

Going from 214 W average gaming power in power-saving mode to 292 W at stock is a massive efficiency hit for only ~5% more performance. It looks even worse when comparing the lowest TDP setting at 200 W against 316 W at turbo settings, all for less than 10% extra performance. Using the lower TDP settings also improves heat and noise levels.

https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/29.html

I feel AMD missed a trick and should have made power-saving mode the default setting in Wattman. Then anyone clicking on higher TDP settings is choosing to kill efficiency in the name of performance. The reviews would IMHO have been far more forgiving of a GPU that was 5% slower but consumed ~90 W less power.

Typical AMD to be honest.


The problem is the voltage and power settings were carefully selected by AMD to ensure sufficient yields at specified clocks. So while it is possible many Vega chips could be powered down and maintain similar performance, not every chip can.


You have the same with Pascal cards. With most of them you can undervolt, drop power, increase clocks and still run lower overall power at equal or higher performance.

You can hand-test individual chips and bin the best and worst. AMD did this with the Nano, selecting the best chips that would run at the lowest voltage. But it is expensive and takes time to stockpile the better-performing chips.
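
For a rough sense of why undervolting pays off so well, dynamic power scales roughly with voltage squared times clock speed; here's a minimal sketch with purely illustrative voltages (not measured Vega or Pascal values):

```python
# Back-of-envelope illustration: dynamic power scales roughly as
# P ~ C * V^2 * f. The voltages and clock deltas below are illustrative
# examples only, not measured Vega or Pascal figures.

def relative_dynamic_power(v_new: float, v_ref: float,
                           f_new: float, f_ref: float) -> float:
    """Dynamic power at the new operating point relative to the reference."""
    return (v_new / v_ref) ** 2 * (f_new / f_ref)

# Dropping from an illustrative 1.20 V to 1.05 V at the same clock:
print(f"Same clock, lower voltage: "
      f"{relative_dynamic_power(1.05, 1.20, 1.00, 1.00):.0%} of original power")

# Undervolting while also raising the clock ~5%, as described above:
print(f"+5% clock, lower voltage:  "
      f"{relative_dynamic_power(1.05, 1.20, 1.05, 1.00):.0%} of original power")
```

Which is why a chip that can hold its clocks at a lower voltage looks dramatically more efficient, and also why AMD can't assume every chip will.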
 
Correct me if I am wrong, but surely AMD have not set the prices that are currently advertised; market forces are driving them up. Not sure how this is AMD's fault. It will fall with the supplier (retailer) and the users (buying everything in sight!!) :)
 
A Vega blower isn't going to be noticeably noisier than a 1080 blower.

Not doubting that, but a quick flick through the 1080 products page is sobering; a 64 blower card is a like-for-like purchase against a 1080 MSI Gaming X.

34.5 dB for the 1080 card vs. 45 dB (balanced) or 50 dB+ (turbo) on the Vega 64 blower, depending on setting and ambient temp.
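
Worth bearing in mind that dB readings are logarithmic, so that gap is larger than the raw numbers suggest; a quick sketch using the figures above:

```python
# dB figures are logarithmic: every +10 dB is roughly 10x the sound power
# (and is often described as about twice the perceived loudness).

def sound_power_ratio(quiet_db: float, loud_db: float) -> float:
    """How many times more acoustic power the louder reading represents."""
    return 10 ** ((loud_db - quiet_db) / 10)

print(f"45 dB vs 34.5 dB: ~{sound_power_ratio(34.5, 45.0):.0f}x the sound power")
print(f"50 dB vs 34.5 dB: ~{sound_power_ratio(34.5, 50.0):.0f}x the sound power")
```

That works out to roughly 11x and 35x the acoustic power respectively, which is why the blower stands out so much.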

The AIBs will fix that gap in short order. I am just arguing that for air-only users it is hard to justify a 64-based card against an AIB 1080 when the price is the same, and the use of HBM has given them little wiggle room, which may mean the AIB Vega 64s come in at £650.

It is a much easier decision to go for a 56 vs a 1070 as the 56 is a much stronger card.
 
In fairness it's understandable given the mixed signals they keep getting.

AMD: Here's the new Fury we worked hard on!
Masses: It's too hot, it uses too much power, You're not Nvidia!
AMD: Here's the new more efficient Fury Nano we worked hard on!
Masses: It's too slow, Nvidia sell something for the same price that uses more power to not be slow, You're not Nvidia!
AMD: Here's the new RX we worked hard on!
Masses: It's even slower, we've told you we don't care about efficiency, just make it FAST, You're not Nvidia!
AMD: Here's the new Vega we worked hard on, beats the 1080 for the same launch price!
Masses: It's too hot, it uses too much power, have you even heard of efficiency? You're not Nvidia

Well yeah, nothing surprising there; people want at least the same perf/watt as the competition and AMD can't deliver on that front. If AMD had the same perf/watt as Nvidia but used more power to make the GPU more performant than the Nvidia counterpart, that wouldn't be a problem at all, but unfortunately that isn't the case.
 
No you see the 1080Ti is clearly mid range because the Titan Xp is the only top end card up in that exclusive bracket. :rolleyes:

The 1080 is high end. The 1070 is borderline.
Nice of you to make a statement rather than an opinion, which is what I did :p. Who cares anyway?
Maybe I should say "top end" rather than high end. In which case IMO you're right, only the TXP fits that category.
Personally I did consider the 980 Ti as upper-mid but the 1080 Ti as high.
When the 1180 comes out, that'll be the new top-end card for a while. IMO a year-old card (14 months for the 1080) cannot fit into high end when it's been superseded by two faster cards :). People who own a 1080 will no doubt disagree...
All a bit of fun BTW :).
 
Yes, the bottom line is they can't compete with Nvidia when it comes to performance per watt. The bigger issue is that this disparity has barely fallen with AMD's last couple of releases. In fact one reviewer noted how this has increased, not decreased.

People love to cast Nvidia as villains, but their GPUs are objectively superior to the competition, and their generational improvements have been consistently remarkable. AMD must compete on price.

No one can deny that Nvidia have been involved in dubious behaviour regarding the selling of their GPUs in the past. They falsely marketed the whole Maxwell range of GPUs as having async compute, and when this turned out to be false, they said they would enable it in a driver update, which still has not materialised two years later. Not to mention the whole GTX 970 3.5 GB fiasco, for which buyers outside the USA have still not been compensated. If people see Nvidia as villains, it's most likely not because they are the market leader; it's because of their direct actions in the recent past.
 
Yes, the bottom line is they can't compete with Nvidia when it comes to performance per watt.
They could when they tried; people said they didn't care and were more interested in performance per £. Which is pretty funny, as recently, whenever AMD have been matching/beating Nvidia on performance/£, people have complained about power. It's a vicious circle.
 
In fairness it's understandable given the mixed signals they keep getting.

AMD: Here's the new Fury we worked hard on!
Masses: It's too hot, it uses too much power, You're not Nvidia!
AMD: Here's the new more efficient Fury Nano we worked hard on!
Masses: It's too slow, Nvidia sell something for the same price that uses more power to not be slow, You're not Nvidia!
AMD: Here's the new RX we worked hard on!
Masses: It's even slower, we've told you we don't care about efficiency, just make it FAST, It's 18 months LATE!!!!!!!!
AMD: Here's the new Vega we worked hard on, beats the 1080 in some but not many games for the same launch price! (but we will be adding £100 after)
Masses: It's too hot, it uses too much power, have you even heard of efficiency? It's 18 months too late!





Oh well... another year to wait for AMD's next fail and NV charging what they like :(
 
Guess that's if you are only fussed about tearing and not stutter. Personally I don't notice any tearing without FreeSync or G-Sync, so I would only use them to smooth a game out. But playing FPS games I would rather have the higher frame rate. I'm in the market for a new monitor and graphics card, but there's no use avoiding the G-Sync tax on a monitor when I then have to put that money back into a graphics card for less performance. May as well pony up for G-Sync now.

Fair enough, it's true that tearing is only in some games; not all suffer from it. However I have noticed it, and it's not nice when you have the scan lines coming down your screen. It seems to be more noticeable in racing sims. But yeah, FreeSync was a nice improvement for me; I wouldn't want to go back. And I'm not buying 3 G-Sync monitors! lol
 
Not doubting that, but a quick flick through the 1080 products page is sobering; a 64 blower card is a like-for-like purchase against a 1080 MSI Gaming X.
Oh of course, it needs to go back to £450 and the MSI Gaming 64 needs to match (or better yet, beat) the MSI Gaming 1080 on price; that would be great.
 