Poll: The Vega Review Thread.

What do we think about Vega?

  • What has AMD been doing for the past 1-2 years?

  • It consumes how many watts and is how loud!!!

  • It is not that bad.

  • Want to buy but put off by pricing and warranty.

  • I will be buying one for sure (I own a Freesync monitor so have little choice).

  • Better red than dead.


Really REALLY wanted to get a Vega 64 with AIO to complement my 1800X based system, but for £50 less I can get an aftermarket 1080 Ti which is a huge chunk faster than Vega 64 while using much less power. From everything I've seen, Vega 64 is a higher-clocked Fury X which just about keeps up with a GTX 1080 *non-Ti* while drawing a huge amount more wattage. Very disappointing :(
 
Another bad review, although this one (quite properly) takes price into account and is all the worse for it. In most of those benchmarks the overclocked 64 sits between a stock 1080 FE and a 1070 in performance.
 
Old tech? Vega is top-of-the-line new technology right from the hardware. It's just that its performance only matches last year's Nvidia, but I honestly believe Vega needs new titles to really showcase its full potential. That means waiting, sadly.
Maybe; we'll just have to see. AMD had better hope Vega does well in upcoming AAA titles, but even then it only really helps Navi, not Vega.

£450 for a reference card is way overpriced, and I would never buy a reference cooler from either company unless I was maybe water cooling it.
I agree, but that's the price it was, so saying you'd be happy with a custom-cooled card at £600 is insanity. £150 extra for a decent cooler??
 
So their tests suggest an undervolted Vega 56 beats a stock 1080 whilst using less power. That sounds really nice but what about when the 1080 is undervolted and/or overclocked?
 
Yes, we played for years and years without it, because we never had it. But having seen and felt how good it is, why would you now go without adaptive sync... that's just daft, it's a game changer.

I think either different people perceive it differently, or some people value the extra smoothness more than others. For me, 60fps with v-sync is smooth enough to play a single-player game offline. Having G-Sync or Freesync on will not suddenly increase how much enjoyment I get at this point. So no, I do not think it is daft from my point of view.


Yer, same for me. I am a big proponent of Adaptive Sync tech, albeit I run a G-Sync monitor (PG348Q) and playing without it feels horrible. It is a game changer for me and the smoothness is something to be seen. My mate has a Freesync screen and plays just as well. I would happily buy a Freesync screen if AMD had a GPU that was worth it for me.
I am too; my argument was not that it is not needed. My argument was: why do people feel they are stuck when they buy a Freesync monitor? I have had both, and I never felt stuck having to buy from AMD or Nvidia only. If one can afford expensive graphics cards, it is not too hard to sell what you've got and pay a few hundred quid extra if adaptive sync tech means so much to the said individual, no? :)

^^

By TNA's argument, we should all go back to playing at 1024x768 on CRT screens, because we played just fine for years on them before higher resolutions ;D

Not uncommon here: people take one part of a post which was not even the primary argument/point and just reply to that, making it seem like I am saying adaptive sync is pointless or something.

Not one person even replied to my main point, which was: why feel stuck with AMD just because you purchased a Freesync monitor? ;)


I wouldn't say it's a game changer. I've got 3 PCs connected to my Freesync monitor but only one has Freesync, and I can't say I notice a huge difference. I can often notice the one running the screen at 60Hz compared to the 144Hz ones (unless I use the 60Hz one for a while and then get used to it until I go back to 144Hz). But once your game is bouncing off the 144Hz/fps cap I can't tell much difference between Freesync on and off.
At least I am not the only one that sees this. It must be a perception thing, or people placing different value on different things. Like how I can see a clear difference between 1440p and 4K while others cannot.

It is nice to have adaptive sync; I would rather have it than not. But, for example, I would rather buy a 1080 Ti and play with V-Sync or adaptive V-Sync on than play on a Vega 64 just to keep Freesync. That is what I would do if I did not have the money to change from Freesync to G-Sync, or did not think it was worth the extra outlay. But my point is: why feel stuck? Just sell the Freesync monitor and buy a G-Sync one. It is not like most of those complaining here cannot afford it ;)
 
If Nvidia did that, AMD would be dead; Freesync is the only reason left for ever making a case for buying an AMD GPU. I didn't want to buy Vega, but felt I had to due to being tied into Freesync. If it ever happened (Nvidia supporting Freesync, I mean), then I'd gladly drop AMD GPUs once and for all.
Yep. If nVidia added FreeSync support in a driver I would probably buy a GTX 1070 immediately.

I don't get why people feel the need to buy AMD because of their monitor. Just sell the monitor and buy a G-Sync one, or just go without adaptive sync. It is not as big a deal as some make it out to be. We happily gamed all these years before adaptive sync. Unless one plays competitively, it is still, in my opinion, perfectly fine to game with v-sync on at a locked 60fps. Plenty smooth for a single-player offline game. Before people say anything, I have had both G-Sync and currently have Freesync. It is nice, but not so nice that it will force me to buy from one company.
Huh? I never used to need adaptive sync because I had a 1200p 60Hz monitor and could play games locked at 60 FPS. Now I have a 1440p 144Hz monitor - why would I spend nearly a grand on a card and drop settings to low just to get smooth gameplay at 144 FPS, when I can take advantage of adaptive sync and get a £250 card that can run games at 100 FPS?
 
So their tests suggest an undervolted Vega 56 beats a stock 1080 whilst using less power. That sounds really nice but what about when the 1080 is undervolted and/or overclocked?

As a GTX 1080 user I don't really care about that (my GTX 1080 with an overclock on core and memory gains about 10%, since you obviously know I have been using one since late last year), but this is not the first time this has happened with AMD products. 70W off the core from simple undervolting - is AMD just trying to pass every raggedy chip by setting a high vcore? If that is the case, RTG is even more of a fail than I thought.

They have forgotten the joke that was the FX-9590 - performance per watt is the buzzword now, and it seems they have forgotten that, along with any common sense when it comes to marketing.

The thing is, the Vega 56 in its stock state is only slightly better than first-generation Polaris in performance per watt - they really needed to stop worrying about 5% extra performance and try to get past the whole hot-and-loud meme. They can't compete with the GTX 1080 Ti, so what is the point??

I mean, the AMD CPU division has obviously got the message - the AMD GPU division's last half-dozen proper launches have had issue after issue, either with the way the products are launched or the way they are implemented. Even ATI could launch its worst products in a better way.


The trouble is that games are less demanding than stuff like compute, so undervolting might not affect stability in games, but if people run compute workloads with too low a voltage you can get maths errors. There's no real way to tell. Chip makers will generally pick a voltage that leaves some room for tolerance, but if you remove that tolerance and get a minor dip in voltage you'll see errors happen.

These are gaming cards, so AMD should make sure the different product lines are tuned differently.
 
The thing is, the Vega 56 in its stock state is only slightly better than first-generation Polaris in performance per watt - they really needed to stop worrying about 5% extra performance and try to get past the whole hot-and-loud meme. They can't compete with the GTX 1080 Ti, so what is the point??
Maybe they thought being competitive with nVidia in terms of performance at a similar price point (well, with the initial launch prices anyway) was more important than dialling back the clocks to get into more power-efficient territory. If they were clocked lower, gamers could've overclocked them anyway to match the 1070/1080, and the default clocks would've looked much more sensible in terms of power draw and heat output, but the price would then have looked even worse.
 
Maybe they thought being competitive with nVidia in terms of performance at a similar price point (well, with the initial launch prices anyway) was more important than dialling back the clocks to get into more power-efficient territory. If they were clocked lower, gamers could've overclocked them anyway to match the 1070/1080, and the default clocks would've looked much more sensible in terms of power draw and heat output, but the price would then have looked even worse.

Performance is one thing, but look at most of the moaning - it's price and power consumption. They need to realise that the high-end products affect the whole range - they function as sellers of their cheaper products too. If they lost a bit of performance but dropped power, it would at least look like a step forward over Polaris. The Vega 64 has worse performance per watt than many Polaris cards, and it makes AMD look even more stagnant technology-wise for gaming.
 
The Vega 64 has worse performance per watt than many Polaris cards, and it makes AMD look even more stagnant technology-wise for gaming.
That's because when Polaris came out, everybody moaned that AMD was wasting their time focusing on efficiency (nobody cared about power/heat, just make the best card you can, etc), so with Vega they prioritised performance over efficiency. Just like Polaris prioritised efficiency over performance, because when Hawaii came out people moaned that it used too much power and that efficiency was just as important to them as performance.

Basically people just like to moan and tech companies really shouldn't listen to most of it.
 
Working fine in my Ubisoft games, but I know FreeSync is disabled in The Division due to flickering that cannot be overcome. The application causes an erratic flip rate, making the monitor jump between its minimum and maximum refresh rates, which unfortunately results in display flicker.

Well, it's nice to know the reason :)
 
That's because when Polaris came out, everybody moaned that AMD was wasting their time focusing on efficiency (nobody cared about power/heat, just make the best card you can, etc), so with Vega they prioritised performance over efficiency. Just like Polaris prioritised efficiency over performance, because when Hawaii came out people moaned that it used too much power and that efficiency was just as important to them as performance.

Basically people just like to moan and tech companies really shouldn't listen to most of it.

Well, I wasn't moaning about it AFAIK - it makes sense for AMD to concentrate on it, since they have a hot-and-loud meme which is not good for their brand perception. What I was moaning about was them over-engineering the VRM, despite the fact the cooler and power connector were clearly not up to the job. If they had used a more appropriate VRM design and put more money into a better cooler, half the moaning would not be there.

You do realise Powercolor made an aftermarket RX 480 which drew the same amount of power as the reference card, but was faster and had a better cooler?
 
I think either different people perceive it differently, or some people value the extra smoothness more than others. For me, 60fps with v-sync is smooth enough to play a single-player game offline. Having G-Sync or Freesync on will not suddenly increase how much enjoyment I get at this point. So no, I do not think it is daft from my point of view.
"I could never go back to a 60Hz, TN, non-curved, <10-bit colour, >1ms refresh with less than 10 USB ports and a cup holder for my coffee. Literally, never."
 
"I could never go back to a 60Hz, TN, non-curved, <10-bit colour, >1ms refresh with less than 10 USB ports and a cup holder for my coffee. Literally, never."

I straight up can't go back to 16:9 screens anymore... it'd be like one of you 16:9 peasants going back to 4:3 - it's just WRONG ;D
 
"I could never go back to a 60Hz, TN, non-curved, <10-bit colour, >1ms refresh with less than 10 USB ports and a cup holder for my coffee. Literally, never."

It's funny, isn't it. I was in that camp not so long back and thought gaming without G-Sync, or at least high refresh, would be awful. I moved from an ultrawide ROG Swift to gaming on an OLED TV and I don't miss it at all; if anything the OLED has added more to my experience, as the colours, contrast and blacks have improved the way some titles look immensely. Granted, I can't do any serious competitive FPS on the OLED, but for pretty much everything else it's been more enjoyable.
 