• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed: 207 votes (39.2%)
  • (on) Overcrowding, standing room only: 100 votes (18.9%)
  • (never ever got on) Chinese escalator: 221 votes (41.9%)

  Total voters: 528
Status
Not open for further replies.
It's not that obvious for professional software either, because the Vega FE doesn't have certified drivers and is no faster than $800 Nvidia Quadro cards that do.
Why do you keep spreading that same lie? The $5,500 P6000 is having a hard time against the Vega FE in some programs:
https://forums.overclockers.co.uk/posts/31020834

I hear you; I'm getting pretty fed up with reading "ohhh, they had professionals in mind when building Vega, gaming was an afterthought", an excuse being perpetuated by tech YouTubers and the like. Everything indicates the contrary: they noticed Vega wasn't going to do tremendously well in gaming compared to the competition, so they shifted the marketing towards pro workloads and adapted as best they could.
No one designs a high-end GPU for gaming anymore; they're all made for professional workloads, HPC and so on, because a high-end gaming-only chip can't pay for itself. Nvidia's advantage is that it can cut more out of its professional cards, whereas AMD has to use the same chip in both the pro and consumer markets.
 
It's very plausible the Vega 56 will consistently beat the 1070 if the Vega 64 consistently matches the 1080.

The 1070 is actually heavily cut down: it has 0.75x the shaders of the 1080 (the 1080 has 33% more). It also has lower memory bandwidth and a slower memory type, so there's no chance of overclocking it to match the 1080.

On average it's ~25% slower than the 1080, and this can grow to 30%+ in highly optimised games like DOOM: https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Mini/28.html

The Vega 56 is much less cut down: it has 0.875x the shaders of the Vega 64 (the 64 has ~14% more). If it can clock to the same level, and there's no reason to believe it can't, its drop relative to the Vega 64 will be smaller than the 1070's relative to the 1080.

Therefore the only way the Vega 56 won't beat the 1070 on average is if it clocks terribly for some reason, or if the Vega 64 consistently loses to the 1080 by ~10%. From what we know at the moment, neither seems likely.

Also I say this as a 1070 owner.
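The shader-ratio argument above can be sanity-checked with a few lines of Python. The shader counts are the published ones for these cards; the hedged assumption is that performance scales roughly with shader count at equal clocks, ignoring bandwidth and clock differences, so this is back-of-envelope only.

```python
# Rough sanity check of the shader-count argument.
# ASSUMPTION: performance scales roughly with shader count at equal
# clocks -- this ignores memory bandwidth and clock deltas entirely.

CARDS = {
    "GTX 1070": 1920, "GTX 1080": 2560,   # Pascal shader counts
    "Vega 56": 3584,  "Vega 64": 4096,    # Vega shader counts
}

def shader_ratio(small, big):
    """Fraction of the bigger card's shaders that the smaller card keeps."""
    return CARDS[small] / CARDS[big]

print(f"1070 vs 1080:     {shader_ratio('GTX 1070', 'GTX 1080'):.3f}x shaders")
print(f"Vega 56 vs 64:    {shader_ratio('Vega 56', 'Vega 64'):.3f}x shaders")
# 0.750x vs 0.875x: the 56 gives up proportionally less of the full chip
# than the 1070 does, which is the core of the argument above.
```

Under that (admittedly crude) scaling assumption, the 56's deficit to the 64 should be roughly half the 1070's deficit to the 1080.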
The 64 in the slides is the liquid version, so it has much higher clock speeds than the 56, which will have a big impact on performance?
 
I would be surprised at anyone spending £3k and not doing some basic research. Yes, the 1080 Ti is the best GPU by some margin, but it's hardly a guaranteed 4K ultra/60 fps card.
 
I managed fine at 4K with my 1070/1080, to be honest. You just need to know what to turn off or lower without losing IQ. The way I see it, in most games at 4K, optimising the sliders for fps still looks better than 1440p with everything whacked up to maximum. With many effects named Ultra or Nightmare these days you can't see the difference in play versus the setting below, yet the fps difference is like going from a 1080 to a 1080 Ti. So by optimising and unlocking all that fps, I get a free upgrade :D




It really does look good right Loadsa? I would have bought it had it been £500.
^ Oh dear, even if you did want one for gaming it looks like you might not be able to get one then.

Same with most GPU launches. The 1080 launch was pretty bad, thinking back, especially for those wanting an AIB-cooled card. And AMD launch supplies haven't been great for as long as I can remember.
 
I would be surprised at anyone spending £3k and not doing some basic research. Yes, the 1080 Ti is the best GPU by some margin, but it's hardly a guaranteed 4K ultra/60 fps card.

I recommended it to him as it's the best 4K card for his budget. 4K just isn't single-card ready yet, and SLI/CrossFire suck, so for me it was the best option. I do the research for my friends mainly, but when you spend that much you expect better than what's currently possible. I guess what he needs to do is try the same settings on his 290X to see the gains.
 
AMD Radeon RX Vega 64 to be great for mining



https://videocardz.com/71591/rumor-amd-radeon-rx-vega-64-to-be-great-for-mining

Not good news if true :(

Also...

https://www.overclock3d.net/news/gp...vels_of_mining_performance_on_amd_s_rx_vega/1

"
Retail staff reports high levels of mining performance with AMD's RX Vega

A member of Overclockers UK staff, Gibbo, has reported that AMD's Vega series cards offer some insane levels of mining performance with hash rates of between 70 and 100 per card. To put this into perspective RX 580 GPUs can achieve hashrates of around 26-29. "
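To put rough numbers on the quoted figures (the report gives no units, so this only compares the ranges as reported, per card), a quick ratio check:

```python
# Ratio check on the quoted per-card figures (units as reported).
vega_range = (70, 100)    # RX Vega hash rate range, per the Gibbo report
rx580_range = (26, 29)    # RX 580 hash rate range, per the same report

lo = vega_range[0] / rx580_range[1]   # worst-case Vega vs best-case 580
hi = vega_range[1] / rx580_range[0]   # best-case Vega vs worst-case 580
print(f"Vega is roughly {lo:.1f}x to {hi:.1f}x an RX 580 for mining")
```

If those numbers hold up, miners get roughly a 2.4x to 3.8x uplift per card over an RX 580, which is exactly why people are worried about gaming stock.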
 
Well... someone just walked into the office and handed me a temporary card: a 4870. "Been sat on a shelf, so no idea if it works," he said... but it saves me having to buy a crappy card until I get a Vega 56 :) Awesome. This means my Ryzen build can probably go ahead now, as I won't (at least for the moment) need to spend money on a graphics card. Things are looking a little bit more promising :D
 
Also...

https://www.overclock3d.net/news/gp...vels_of_mining_performance_on_amd_s_rx_vega/1

"Retail staff reports high levels of mining performance with AMD's RX Vega..."

That didn't take them long to get an "article" out on it :p
 
Vega is the disappointment I expected from AMD. Let's be fair: if it was so good, AMD would be letting benchmarks slip, touting its performance from the rooftops and laying claim to a top graphics part named Vega. So far it's late, not as fast as it should have been, power hungry, and has trouble beating year-old Pascal 1070 and 1080 cards.
Yet again AMD fail to deliver the touted goods. Can I buy a Vega card today? No. Do I want to buy a slower card? No. But I do wish Vega was all it was cracked up to be, to offer Nvidia some competition.
 
Vega is the disappointment I expected from AMD. Let's be fair: if it was so good, AMD would be letting benchmarks slip, touting its performance from the rooftops and laying claim to a top graphics part named Vega. So far it's late, not as fast as it should have been, power hungry, and has trouble beating year-old Pascal 1070 and 1080 cards.
Yet again AMD fail to deliver the touted goods. Can I buy a Vega card today? No. Do I want to buy a slower card? No. But I do wish Vega was all it was cracked up to be, to offer Nvidia some competition.
It doesn't matter now how it games if it's really good for mining!

wuhJWVX.jpg


In an Adored accent just for @Gerard :D
 
Also...

https://www.overclock3d.net/news/gp...vels_of_mining_performance_on_amd_s_rx_vega/1

"Retail staff reports high levels of mining performance with AMD's RX Vega..."
I didn't see the sauce, only the numbers :p
 
Also...

https://www.overclock3d.net/news/gp...vels_of_mining_performance_on_amd_s_rx_vega/1

"Retail staff reports high levels of mining performance with AMD's RX Vega..."
Lovely! lol

Yep, Gibbo will be pricing the Vega 56 not at £375, but £500 now.
 
I can't seem to determine: will the water-cooled version be available for purchase by itself when the AIO partners release their products?

I don't need a new monitor (and that monitor is buggy with FreeSync), but I do need a new CPU/mobo.
 
I can't seem to determine: will the water-cooled version be available for purchase by itself when the AIO partners release their products?

I don't need a new monitor (and that monitor is buggy with FreeSync), but I do need a new CPU/mobo.

As far as I understand, only the normal 64 and 56 versions are purchasable outside the bundles. If you want the Aqua or limited edition, you have to go with a bundle. I've heard several sources say this, and it's how I understand it as well.
 
I agree.
Please take a breath and look at what you're saying.

1 - a game developer uses 6-bit colours, as that's compatible with every display
2 - owners of 8/10-bit panels have colours going unused as a result
3 - some other developer makes software to use the unused colours, and as a result contrast is increased, there's more vibrancy, etc.

The hardware you're talking about in these situations is already present, just not utilised without HDR.

There is no specific hardware needed for HDR processing as such; the hardware required is just hardware that can output the extended range of colours. So basically the only requirement for 10-bit HDR is a 10-bit panel.
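The bit-depth arithmetic behind that argument is simple enough to show. This only illustrates how many colours each panel depth can represent; it says nothing about brightness or dimming (which the reply below raises), so treat it as the colour-count side of the claim only.

```python
# Representable colours per channel bit depth: a 6-bit panel leaves a
# huge chunk of the 8/10-bit colour space unused, which is the point
# the numbered list above is making.
for bits in (6, 8, 10):
    per_channel = 2 ** bits       # levels per R/G/B channel
    total = per_channel ** 3      # three independent channels
    print(f"{bits:>2}-bit panel: {per_channel:>4} levels/channel, "
          f"{total:,} colours")
```

So a 10-bit panel can address about 64x the colours of a 6-bit one (1.07 billion vs roughly 262 thousand), purely from the wider per-channel range.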

Of course, if you are a commercial company wanting to maximise your revenue, you will want to make HDR a profitable feature, and so you convince people they need a new TV for it. You make deals with companies that output to TVs (such as console manufacturers) so that they only enable HDR on HDR-marketed displays. This can be achieved by checking for the presence of a so-called HDR chip.

This is not complicated; it's happened numerous times in the past and will happen again.

It's a similar thing with 3D graphics as well.

A CPU can generate the same visuals as, say, a 1080 Ti; the difference is it does so a *lot* slower.
Early 3D games offered a software 3D mode where the game was still 3D but ran with lower performance. Nowadays no such software mode is available; instead, games just check for the presence of a GPU that can accelerate 3D graphics.

G-Sync is a very recent example as well.

Nvidia chose a path that required a chip, and display companies jumped on it because they could sell monitors at a premium. Now FreeSync has come out, which ironically also carries a premium, albeit a smaller one. Nvidia can't just start supporting FreeSync, as they very likely have commercial agreements with the G-Sync display manufacturers to keep demand up for those products.

It is all about money.
Do you understand what local dimming is, and why OLED is best for HDR?

I'll give you a hint, it has nothing to do with colour bit depth.
 