AMD VEGA confirmed for 2017 H1

Yeah, which is why I think the end performance could be really good, at least in this test. I expect much higher clocks if the process has matured like has been reported. Even at these low clocks it's beating out a GTX 1080 easily enough.

Hopefully we'll get 1450-1500MHz retail clocks with a little room for tweaking too :) (but looking at the 480's clocks, maybe not...)
 
That Vega card that is said to play Doom at 4K on Ultra doesn't even seem to be HBM2, so hopefully the HBM2 version will be more powerful.
 
Given the Doom demo, Vega 10 looks to be around 1080 levels in favourable games, and probably worse in other titles and in DX11.

This is what I'm half expecting too.

I'd be happier if it could match the 1080 in DX11 titles, but it needs to be a bargain considering Nvidia have had that performance available for a year and are currently cutting prices as well as gearing up for a refresh.
 
I'm just hoping Vega 11 will be at 1070 levels for less money.

Despite my intentions of going big with a 1080 Ti or big Vega, even I might be tempted to give the big guys a miss this time if smaller Vega offers 1070 performance for around £300 (i.e. less than what a 1070 costs). Also because I don't feel like spending £800 on a GPU... just out of principle.
 
I don't think it is meant in that way - rather, it is the "flagship" variant of Vega but not fully specced/clocked up. They could be completely wrong, but I've seen a few people in dev-centric circles claiming that what AMD has been showing off is something like a 2/3rds-spec GDDR5 or GDDR5X card that is being used for testing next-gen console and mobile development.

I don't know what dev-centric circles you're roaming around in, but whoever told you that has zero knowledge of chip design. HBM and GDDR memory interfaces are very different, and you would need to build both into a chip to use both. No one would ever do that, because it's so much wasted space.
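To put the "very different interfaces" point into rough numbers: one HBM2 stack talks over a 1024-bit bus at low per-pin speeds, while a single GDDR5 chip uses a 32-bit bus at much higher per-pin speeds, so the on-die PHYs are physically nothing alike. A back-of-the-envelope sketch (the pin speeds are ballpark 2016/17 figures, not tied to any particular card):

```python
# Back-of-the-envelope peak bandwidth per memory device.
def peak_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Peak bandwidth in GB/s = pins * Gbps-per-pin / 8 bits-per-byte."""
    return bus_width_bits * gbps_per_pin / 8

# Ballpark pin speeds (assumptions, not specs of any shipping card):
hbm2_stack = peak_bandwidth_gbs(1024, 2.0)  # one HBM2 stack: 1024-bit @ 2 Gbps
gddr5_chip = peak_bandwidth_gbs(32, 8.0)    # one GDDR5 chip: 32-bit @ 8 Gbps

print(f"HBM2 stack: {hbm2_stack:.0f} GB/s")  # 256 GB/s
print(f"GDDR5 chip: {gddr5_chip:.0f} GB/s")  # 32 GB/s
print(f"GDDR5 chips to match one stack: {hbm2_stack / gddr5_chip:.0f}")
```

By that arithmetic you'd need eight GDDR5 chips (a 256-bit controller) to match one HBM2 stack, which is why carrying both PHYs on one die would be such a waste of area.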
 
I don't know what dev-centric circles you're roaming around in, but whoever told you that has zero knowledge of chip design. HBM and GDDR memory interfaces are very different, and you would need to build both into a chip to use both. No one would ever do that, because it's so much wasted space.

Pretty sure AMD themselves said their chips were designed to make use of HBM/HBM2/GDDR5/GDDR5X. I know little about this tbh, but if Polaris is capable of using HBM then you would think Vega would be able to use GDDR.

http://wccftech.com/amd-confirms-polaris-will-feature-hbm2-and-gddr5/
 
Pretty sure AMD themselves said their chips were designed to make use of HBM/HBM2/GDDR5/GDDR5X.

http://wccftech.com/amd-confirms-polaris-will-feature-hbm2-and-gddr5/

No, they say the architecture is compatible with both, which is of course right. But you only ever implement one of the two in the chip. Too many people are mixing up architecture and implementation. GCN also had the ability to support DP at up to a 1:2 rate, but only Hawaii had this implementation, whereas most chips were just 1:16 capable.
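The architecture-versus-implementation distinction is easiest to see with the DP rate: the FP32 formula is the same for every GCN chip, but the FP64 number depends entirely on which ratio that particular chip implements. A quick illustrative sketch (round clocks, not exact product specs):

```python
# Theoretical GCN throughput: each shader issues one FMA (2 FLOPs) per clock.
# The DP ratio is what a specific chip implements, not what GCN could support.
def tflops(shaders, clock_mhz, dp_ratio):
    fp32 = 2 * shaders * clock_mhz * 1e6 / 1e12
    return fp32, fp32 / dp_ratio

# Hawaii's implementation supports the full 1:2 rate:
fp32, fp64 = tflops(2816, 1000, dp_ratio=2)
print(f"Hawaii @ 1:2  -> {fp32:.1f} TFLOPS FP32, {fp64:.2f} TFLOPS FP64")

# Most other GCN chips implemented only 1:16, e.g. Fiji:
fp32, fp64 = tflops(4096, 1050, dp_ratio=16)
print(f"Fiji   @ 1:16 -> {fp32:.1f} TFLOPS FP32, {fp64:.2f} TFLOPS FP64")
```

Same architecture, wildly different FP64 numbers, purely down to implementation choices.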
 
Why are you telling me that as a reply?

Even though the Duo is two Nano cores, it's called the Pro Duo? As I said in the post that you replied to, the Pro Duo uses Nano cores, which are the better Fury X cores; it's got nothing to do with the Fury Pro cards.

You mixed them up again. The Fury Pro Duo (the dual-GPU card) is using 2 Fury Pro cores of 3584 shaders each.
The Nano is using a Fury X core of 4096 shaders.
 
I don't know what dev-centric circles you're roaming around in, but whoever told you that has zero knowledge of chip design. HBM and GDDR memory interfaces are very different, and you would need to build both into a chip to use both. No one would ever do that, because it's so much wasted space.

But as you said, the design supports both. If they are looking at both next-gen consoles (which are rumoured to still use GDDR5, though probably Polaris rather than Vega, but who knows) and desktop PC applications, it is likely there is some development overlap. It's also possible some mid-level developer involved just knows it as a "next gen" GPU and assumes Vega.

Anyhow, now I've looked at the image, they are either wrong or talking about something different, as that demo card plainly doesn't have the circuitry for GDDR5X or the layout for GDDR5.
 
And they call themselves enthusiasts :D

You know I haven't even read a single review of the Duo?
When it came out I was on a four-week holiday in Greece. When I came back at the end of May the dust had settled and the GTX 1080 was mainstream.
And I bought one a few months later for my birthday :o

I also found the card pointless. It was more expensive than two Fury Xs, and having only just come off the 295X2 in January 2016, I was completely uninterested in dual-GPU cards.
 
I also found the card pointless. It was more expensive than two Fury Xs

I gotta agree with that. Plus it was so late out of the gate, and it had the grunt, to the point where 4GB definitely wasn't enough when CrossFire was working well. I'm surprised we haven't seen the Duo with the sort of discounts we saw the 295 getting at the end of its stock life.

I'm glad it was you who was wrong and not me; I was starting to worry that I'd tripped up and fallen into an alternate reality with subtle differences. Wouldn't be the first time.
 
I gotta agree with that. Plus it was so late out of the gate, and it had the grunt, to the point where 4GB definitely wasn't enough when CrossFire was working well. I'm surprised we haven't seen the Duo with the sort of discounts we saw the 295 getting at the end of its stock life.

I'm glad it was you who was wrong and not me; I was starting to worry that I'd tripped up and fallen into an alternate reality with subtle differences. Wouldn't be the first time.

Relax, it happens to us all, especially after a whole day at work plus four hours driving on the damn motorways.
At least I just saw the VideoCardz post regarding the "rumoured" Vega.
If true, and given it's still in testing, it looks nicely placed closer to the 1080 Ti than the 1080.
We shall see...
 
First RX Vega OpenCL benchmark leaked!

https://videocardz.com/67242/amd-vega-with-64-compute-units-spotted

The GTX 1080 Ti is faster than RX Vega in the OpenCL benchmark.

1200MHz max boost clock? I thought Vega was supposed to boost up to 1500MHz? :confused:
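Worth noting: the CU count and clock in these database leaks are simply whatever the driver reports through the standard OpenCL device-info queries, so an engineering-sample driver can easily report a lower clock than the final retail boost. A minimal sketch of the same query using pyopencl (assuming an OpenCL runtime and driver are installed):

```python
# The leaked "64 CUs @ 1200MHz" is just standard OpenCL device info.
import pyopencl as cl

for platform in cl.get_platforms():
    for dev in platform.get_devices():
        print(f"{dev.name}: {dev.max_compute_units} compute units, "
              f"max clock {dev.max_clock_frequency} MHz")
```

On pre-release hardware the reported max clock is whatever the test driver exposes, so the 1200MHz figure says little about retail boost clocks.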

Looks like 2015 all over again: Nvidia is shipping a range of 38 different 1080 Ti cards compared to about 30 for the 980 Ti, which means Nvidia will sell a lot of 1080 Tis in the coming months, while RTG will probably ship 1000 to 2000 RX Vega X cards worldwide in a month.

Poor Vega!
 