Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Why do you keep spreading that same lie? The $5,500 P6000 is having a hard time in some programs against the Vega FE

It is not that obvious for professional software, because the Vega FE doesn't have certified drivers and is no faster than $800 Nvidia Quadro cards with certified drivers.
No one designs a high-end GPU for gaming anymore. They're all made for professional workloads, HPC etc. A high-end gaming-only chip can't pay for itself. Nvidia's advantage is that they're able to cut more stuff out of their professional cards, whereas AMD has to use that same chip in both the pro and consumer markets.

I hear ya, I'm getting pretty fed up with reading "ohhhh they had professionals in mind when building Vega, gaming was an afterthought", and with this excuse being perpetuated by tech youtubers and such. When everything indicates the contrary: they noticed Vega wasn't going to do tremendously well for gaming compared to the competition, so they shifted the marketing towards pro workloads and adapted as best they could.
Is there any benefit to them doing both CPUs and GPUs though? It's not as if GPUs run faster on AMD CPUs; in fact, for years even AMD GPUs ran faster on Intel CPUs!
The 64 from the slides is the liquid version, so it has much higher clock speeds compared to the 56, which will have a big impact on performance?

It's very plausible the Vega 56 will consistently beat the 1070 if the Vega 64 consistently matches the 1080.
The 1070 is actually very cut-down: it has 0.75x the shaders of the 1080 (put another way, the 1080 has 33% more shaders). It also has lower memory bandwidth and a worse type of memory, so there's no chance of overclocking it to match the 1080.
On average it's ~25% slower than the 1080, and this can grow to 30%+ in highly optimised games like DOOM: https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Mini/28.html
The Vega 56 is only 12.5% cut down, with 0.875x the shaders of the Vega 64. If it can clock to the same level, and there's no reason to believe it can't, it will be less of a drop from the Vega 64 than the 1070 is from the 1080.
Therefore the only way the Vega 56 won't beat the 1070 on average is if it clocks terribly for some reason, or if the Vega 64 consistently loses to the 1080 by ~10%. And from what we know at the moment, neither of those seems likely.
Also I say this as a 1070 owner.
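The shader-ratio reasoning above can be sketched in a few lines of Python. The shader counts are the published specs (GTX 1070 = 1920, GTX 1080 = 2560, Vega 56 = 3584, Vega 64 = 4096 stream processors); everything else is just the arithmetic from the post:

```python
# Sketch of the cut-down arithmetic from the post above.
# Shader counts are the published specs for each card.
def cut_down(part: int, full: int) -> tuple[float, float]:
    """Fraction of the full chip's shaders the cut-down part keeps,
    and the percentage cut."""
    keep = part / full
    return keep, (1 - keep) * 100

keep_1070, cut_1070 = cut_down(1920, 2560)  # 0.75, 25.0
keep_v56, cut_v56 = cut_down(3584, 4096)    # 0.875, 12.5

print(f"1070 keeps {keep_1070:.3f} of the 1080's shaders ({cut_1070:.1f}% cut)")
print(f"Vega 56 keeps {keep_v56:.3f} of the Vega 64's shaders ({cut_v56:.1f}% cut)")
```

So on shader count alone the Vega 56 gives up half as much relative to its big brother as the 1070 does, which is the core of the argument that it should lose less performance.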
I managed fine at 4K with my 1070/1080, to be honest. You just need to know what to turn off or lower without losing IQ. The way I see it, in most games at 4K, with the sliders optimised for fps, the game still looks better than 1440p with everything whacked up to maximum. These days, with many effects named Ultra or Nightmare, you cannot see the difference while playing versus the setting below, yet the performance difference in fps is like going from a 1080 to a 1080 Ti. So by optimising and unlocking all that fps, I get a free upgrade.
It really does look good right Loadsa? I would have bought it had it been £500.
^ Oh dear, even if you did want one for gaming, it looks like you might not be able to get one then.
http://www.guru3d.com/articles_pages/amd_radeon_r9_nano_review,22.html
It was the SLOWEST out of the lot and not cheap either.
I would be surprised at anyone spending £3k and not doing some basic research. Yes, the 1080 Ti is the best GPU by some margin, but it's hardly a guaranteed 4K ultra/60fps card.
AMD Radeon RX Vega 64 to be great for mining
https://videocardz.com/71591/rumor-amd-radeon-rx-vega-64-to-be-great-for-mining
Not good news if true!
Also...
https://www.overclock3d.net/news/gp...vels_of_mining_performance_on_amd_s_rx_vega/1
"
Retail staff reports high levels of mining performance with AMD's RX Vega
A member of Overclockers UK staff, Gibbo, has reported that AMD's Vega series cards offer some insane levels of mining performance with hash rates of between 70 and 100 per card. To put this into perspective RX 580 GPUs can achieve hashrates of around 26-29. "
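Taking the quoted figures at face value (the report doesn't state units, so these are just the raw per-card numbers as given), the relative mining advantage works out roughly like this:

```python
# Hash-rate ranges from the quoted report (units not stated in the quote).
vega = (70, 100)   # per Vega card, as reported
rx580 = (26, 29)   # per RX 580, as reported

worst = vega[0] / rx580[1]  # Vega's low end vs the RX 580's high end
best = vega[1] / rx580[0]   # Vega's high end vs the RX 580's low end

print(f"Vega is roughly {worst:.1f}x to {best:.1f}x an RX 580 at mining")
# Vega is roughly 2.4x to 3.8x an RX 580 at mining
```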
That didn't take them long to get an "article" out on it!
It doesn't matter now how it games if it's really good for mining!

Vega is the disappointment I expected from AMD. Let's be fair: if it was so good, AMD would be letting benchmarks slip, touting its performance from the rooftops and laying claim to a top graphics part named Vega. So far it's late, not as fast as it should have been, power hungry, and has trouble beating year-old Pascal 1070 and 1080 cards.
Yet again AMD fail to deliver the touted goods. Can I buy a Vega card today? NO. Do I want to buy a slower card? NO. But I do wish Vega was all it was cracked up to be, to offer Nvidia some competition.
I didn't see the sauce, only the numbers.
Lovely! lol
I can't seem to determine: will the water-cooled version be available for purchase by itself when the AIO partners release their products?
I don't need a new monitor, and that monitor is buggy with FreeSync, but I do need a new CPU/mobo.
Do you understand what local dimming is, and why OLED is best for HDR?

I agree.
Please take a breath and look at what you're saying.
1 - the game developer uses 6-bit colour, as that's compatible with every display
2 - owners of 8/10-bit panels have colours going unused as a result.
3 - some other developer makes software to use the unused colours, and as a result contrast is increased, there's more vibrancy, etc.
The hardware you talk of in these situations is already present, just not utilised without HDR.
There is no specific hardware needed for actual HDR processing; the hardware required is just hardware that can output the extended range of colours. So basically the only requirement for 10-bit HDR is a 10-bit panel.
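The bit-depth point is just arithmetic: an n-bit-per-channel RGB panel can address (2^n)^3 distinct colours, which is where the jump in range comes from:

```python
# Distinct colours addressable by an n-bit-per-channel RGB panel:
# 2**bits levels per channel, cubed across R, G and B.
def colours(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

for bits in (6, 8, 10):
    print(f"{bits}-bit panel: {colours(bits):,} addressable colours")
# 6-bit panel: 262,144 addressable colours
# 8-bit panel: 16,777,216 addressable colours
# 10-bit panel: 1,073,741,824 addressable colours
```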
Of course, if you are a commercial company wanting to maximise your revenues, you will want to make HDR a profitable feature, and so you convince people they need a new TV for it. You make deals with companies that output to TVs (such as console manufacturers) so that they only enable HDR on HDR-marketed displays. This can be achieved by checking for the presence of a so-called HDR chip.
This is not complicated; it's happened numerous times in the past and will happen again.
It's a similar thing with 3D graphics as well.
A CPU can generate the same visuals as e.g. a 1080 Ti; the difference is it does it a "lot" slower.
Early 3D games made a software 3D mode available where the game was still 3D but with lower performance. Nowadays no such software mode is available; instead games just check for the presence of a GPU that can accelerate 3D graphics.
GSYNC is a very recent example as well.
Nvidia chose a path that required a chip, and display companies jumped on it as they could sell monitors at a premium. Now FreeSync has come out, which ironically also carries a premium, though a smaller one. Nvidia cannot just start supporting FreeSync, as they very likely have commercial agreements with the G-Sync display manufacturers to keep demand up for those products.
It is all about money.