• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 39.2%
  • (on) Overcrowding, standing room only

    Votes: 100 18.9%
  • (never ever got on) Chinese escalator

    Votes: 221 41.9%

  • Total voters
    528
Status
Not open for further replies.
So much for the banana yellow, it looks fantastic.
 
Wow, Vega and Fiji have much worse power efficiency than Pascal. I feel sorry for Fury X owners who played 4K UHD games burning over 300W at full load.

The power usage of Vega should improve greatly in gaming workloads compared to what we see now with Vega FE, if tiled rasterisation really isn't enabled yet. But the Gamers Nexus clock-for-clock review points to those features not being there yet.
 
Really short summary:

Nobody can tell you if any of the Vega RX cards will be comparable to a 1080Ti because nobody knows. It's probably worth waiting because information should be available late July or very early August.

Short summary:

AMD released the Vega Frontier Edition, which is meant for low-level development and research work. It's not pro kit, because it doesn't have the certified drivers that market requires, but it's not home kit either. Bit of a niche market, but it is a real market. It has a gaming mode, and AMD lightly mentioned it mainly in the context of someone who wants one machine to both develop games on and game on.

People bought them and reviewed them for gaming. They were mediocre: between a 1070 and a 1080, but using far more power and generating far more heat, so much noisier and with little or no scope for overclocking. Also, they're over £1000.

AMD have mostly ignored that, but have replied saying (truthfully) that Vega FE isn't really meant for gaming and a different version - Vega RX - will be released soon that is meant for gaming.

There is a lot of speculation about how much difference will exist between Vega FE and Vega RX. A lot of speculation...and almost no information. The GPU will be the same, but it is possible that the FE drivers don't efficiently use it for gaming. Maybe RX will be far better for gaming than FE. Maybe not. Maybe it will be significantly cheaper than nVidia cards with similar performance. Maybe not. Maybe it will use much less power than FE. Maybe not.

Just a fantastic summary, thanks so much for taking the time.

From my point of view it needs to at least be close to a 1080Ti, simply because that's what I'm set on. If it's not quite there but considerably cheaper, then I'll also consider it!

Thanks :)
 
Are AMD setting a voltage to make sure every FE card is 100% stable at stock speeds and maybe erring on the side of caution?

Well, the FE runs around 1440MHz stable at 1.2V. People are getting it stable at 1600MHz at 1.075V; that's a massive difference in temps and power.

At 1600MHz and 1.075V the card is using 330W total, whereas stock FE hits the 300W wall at 1440MHz.

So the cards can run faster and cooler, which is good, but it looks like AMD aren't doing a lot of binning, which, considering the price of the Frontier Editions, is disheartening.

I expect the Radeon Pro Vega cards going to Apple to be exceptionally binned again, as AMD did with the Polaris cards for them.
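Those stock-vs-undervolt figures can be sanity-checked against the first-order CMOS dynamic-power model, P_dyn ∝ f·V². This is only a rough sketch: it ignores static leakage, HBM2, and VRM losses, so it won't reproduce the measured 300W/330W board figures exactly, but it shows why dropping from 1.2V to 1.075V buys so much even at a higher clock.

```python
# First-order CMOS dynamic-power model: P_dyn scales with f * V^2.
# Operating points are the ones reported in the thread; everything
# else (leakage, memory, VRM efficiency) is deliberately ignored.

def dynamic_power_ratio(f_new, v_new, f_old, v_old):
    """Ratio of dynamic power between two clock/voltage operating points."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Stock FE: ~1440 MHz at 1.2 V; reported undervolt: 1600 MHz at 1.075 V.
ratio = dynamic_power_ratio(1600, 1.075, 1440, 1.2)
print(f"dynamic power ratio: {ratio:.3f}")  # ~0.89 -> ~11% less dynamic power
print(f"clock gain: {1600 / 1440 - 1:.1%}")  # ~11% higher clock
```

In other words, to a first approximation the undervolted card does ~11% more work for ~11% less dynamic power, which is roughly what the temperature and power deltas people are reporting would suggest.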
 
Are AMD setting a voltage to make sure every FE card is 100% stable at stock speeds and maybe erring on the side of caution?
It seems like pro software use needs higher voltage to be stable; maybe Vega just needs more juice when running FP16? Looking at undervolted scores, Vega RX XT could easily be in the 1.1V range. Wasn't there already news of XTX 300W, XT 225W and Pro 225W?
 
Well, the FE runs around 1440MHz stable at 1.2V. People are getting it stable at 1600MHz at 1.075V; that's a massive difference in temps and power.

At 1600MHz and 1.075V the card is using 330W total, whereas stock FE hits the 300W wall at 1440MHz.

So the cards can run faster and cooler, which is good, but it looks like AMD aren't doing a lot of binning, which, considering the price of the Frontier Editions, is disheartening.

I expect the Radeon Pro Vega cards going to Apple to be exceptionally binned again, as AMD did with the Polaris cards for them.
A guy in the German forum undervolted and had it stable in games, but it crashed running Pro applications, so there probably isn't an across-the-board undervolt that works for all cards, just like overclocking.

It'll be interesting to see clocks, temps, power and voltage of the FE with the AIO. Reviews will be out soon.
 
It seems like pro software use needs higher voltage to be stable; maybe Vega just needs more juice when running FP16? Looking at undervolted scores, Vega RX XT could easily be in the 1.1V range. Wasn't there already news of XTX 300W, XT 225W and Pro 225W?
Yeah, the Pro applications may need the higher voltage. I thought it might be the games that needed it.
 
It seems like pro software use needs higher voltage to be stable; maybe Vega just needs more juice when running FP16? Looking at undervolted scores, Vega RX XT could easily be in the 1.1V range. Wasn't there already news of XTX 300W, XT 225W and Pro 225W?
A guy in the German forum undervolted and had it stable in games, but it crashed running Pro applications, so there probably isn't an across-the-board undervolt that works for all cards, just like overclocking.

It'll be interesting to see clocks, temps, power and voltage of the FE with the AIO. Reviews will be out soon.

Yeah, that was for ASIC power though, with total board powers being 375W, 285W, and 250W.

If it means they can be binned correctly, or undervolted for purely 3D loads, that could be good. Saving a little power by using 8GB of HBM2 instead of 16GB would help there too, considering how much the chip can be undervolted.

1.075V was stable for gaming, not for compute, but 1.1V seems a good middle ground while still giving great power savings.

Vega is certainly interesting; I'll give it that.
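The ASIC-vs-board distinction is worth spelling out. Pairing the rumoured ASIC figures (XTX 300W, XT 225W, Pro 225W) with the board powers above gives a rough non-ASIC budget per card; the pairing order is an assumption here, and the "overhead" lump simply bundles HBM2, VRM losses, fan, and everything else on the board.

```python
# Rough split between ASIC power and total board power, using the
# figures quoted in the thread. Which board power goes with which
# ASIC figure is an assumption; the overhead is everything non-ASIC
# (HBM2, VRM conversion losses, fan, etc.) lumped together.

cards = {
    # name: (asic_w, board_w)
    "XTX": (300, 375),
    "XT":  (225, 285),
    "Pro": (225, 250),
}

for name, (asic, board) in cards.items():
    overhead = board - asic
    print(f"{name}: ASIC {asic} W, board {board} W, non-ASIC ~{overhead} W")
```

On those numbers the XT and Pro would share an ASIC budget and differ mainly in board-level overhead, which is consistent with the idea that memory capacity and cooling, not the chip itself, separate the two.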
 
Yeah, that was for ASIC power though, with total board powers being 375W, 285W, and 250W.

If it means they can be binned correctly, or undervolted for purely 3D loads, that could be good. Saving a little power by using 8GB of HBM2 instead of 16GB would help there too, considering how much the chip can be undervolted.

1.075V was stable for gaming, not for compute, but 1.1V seems a good middle ground while still giving great power savings.

Vega is certainly interesting; I'll give it that.

If he really did get 1600MHz at 1.075V, that shows they really did alter the architecture for higher clocks, considering it takes ~1.25V to get 1500MHz with an RX 580.
 
Awesome, I prefer "blower-style" anyway. My real question, though, no bsing around: will my EVGA G2 650W really struggle to run an RX Vega GPU?

With a stock quad-core CPU, I don't think 650W total will be a problem for a stock air-cooled card. It looks like a water-cooled card with an OC could burn 350-450W.

In the words of the late Roy Scheider: 'We're gonna need a bigger boat'.
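The PSU question above is really just budget arithmetic, so here's a quick sketch. The CPU and system-base figures are assumptions for illustration (a stock quad core around 95W plus ~60W for board, RAM, drives and fans); the GPU figures are the ones from the thread.

```python
# Quick PSU headroom check for a 650 W unit. Component draws other
# than the GPU are assumptions for the sketch, not measurements.

PSU_W = 650
SYSTEM_BASE_W = 95 + 60  # stock quad-core CPU + board/RAM/drives/fans

for gpu_label, gpu_w in [("stock air-cooled Vega", 300),
                         ("water-cooled Vega, overclocked", 450)]:
    total = SYSTEM_BASE_W + gpu_w
    headroom = PSU_W - total
    status = "OK" if headroom >= 0 else "over budget"
    print(f"{gpu_label}: ~{total} W total, {headroom:+} W headroom -> {status}")
```

Even the worst case nominally fits a 650W unit, but with only ~45W of headroom for a heavily overclocked water-cooled card, and transient spikes on top of that, a bigger supply would be the safer bet.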
 
If he really did get 1600MHz at 1.075V, that shows they really did alter the architecture for higher clocks, considering it takes ~1.25V to get 1500MHz with an RX 580.

They did, alright; PcGamesHardware.de covers part of it in their review.

Will be interesting to see how all this affects RX Vega as well, especially since those don't need to focus on the workstation and compute loads that seem to need more voltage for stability.
 
Yeah, the Pro applications may need the higher voltage. I thought it might be the games that needed it.

To be honest, if it crashed on pro applications, I wouldn't run it undervolted for gaming either, as at some point one scene or one game will trip it up.
 
There seems to be a level of stubborn ignorance with the Vega cards.

It seems that AMD were pretty determined to have HBM and have probably spent a lot of time and resources getting it to work.

They should probably have used GDDR5 or GDDR5X and concentrated on the GPU core: basically a larger 580 with faster memory.

I really want to replace my now-single 290X, but it still seems like there isn't going to be a sensible AMD upgrade that lets me keep using FreeSync on my monitor. Only time will tell.
 