AMD VEGA confirmed for 2017 H1

Yup. The only reason to switch from a Ti is if the DX12 performance absolutely smashes it...

Hopefully that will be the case, since that's Nvidia's problem and AMD's strength right now.

And what's the point of Volta being the biggest arch change since Fermi if it doesn't solve their DX12 issues?
 
Hopefully that will be the case, since that's Nvidia's problem and AMD's strength right now.

And what's the point of Volta being the biggest arch change since Fermi if it doesn't solve their DX12 issues?

It will.... and if it doesn't! :eek:

That would be the biggest proof yet that DX12 is a fail. If Nvidia, the number one GPU manufacturer, can't be bothered to not just support it but drive it and lead its development, then it clearly isn't going to be successful.
 
Yup. The only reason to switch from a Ti is if the DX12 performance absolutely smashes it...
Unlikely. While AMD cards do get a boost in DX12, Nvidia cards win almost as many DX12 titles as Polaris does, and the average performance difference typically works out at a couple of percent. There seems to be a lot of misinformation that Nvidia cards are somehow bad at DX12, when the truth is they simply don't see as much of a speed-up relative to DX11 because their DX11 performance is already relatively so much faster. It's not that AMD cards are so great at DX12, just that they underperform in DX11.

The ironic thing is that if Vega is all it is cracked up to be, then its DX12 performance will be much closer to its DX11 performance, i.e. it won't have those weaknesses any more.
 
We'll only get DX12 once Nvidia give us a DX12 card; until then, devs won't touch it. Just like Mantle: AMD had to get rid of that as no one would touch it, and now look at it, now that Nvidia have taken it on board. What a turnaround, eh?
 
Common sense tells us it will be a 1080 competitor and it is down to the price really. Big Vega towards Chrimbo I reckon and that will be the Ti competitor.

I really hope they do not wait until Crimbo for big Vega. By then the 2070 will be around the corner and 1080 Ti performance will be £400 at most.

I ended up sending back my 4K G-Sync monitor. I was underwhelmed by AOC's quality control: it had three pieces of dust stuck behind the anti-reflective coating, and the blacks were somewhat poorer than on my Dell and even my current LG. G-Sync is nice, but not £360 nice, which is the price difference between it and my FreeSync LG, which I picked up in the members market.

Just hope Vega is at least 1080 performance with good price-to-performance, or the FreeSync will go to waste as I will end up moving to a 2070. Lol.
 
Shame HBM hasn't taken off and replaced GDDR completely; I was looking forward to cards all being the length of the R9 Nano. :(

I am very glad HBM has failed to take off and has not replaced GDDR completely. I don't think HBM will last much longer: in the next few years, when next-generation PCs adopt the future DDR5 memory standard with its support for 3D-stacked memory chips, Hynix, Micron and Samsung will develop a next-generation GDDR6X or GDDR7 based on DDR5 technology, which will put stacked GDDR memory chips on graphics cards and make HBM obsolete, just like what happened with Rambus RDRAM.

I have not been very impressed with HBM since AMD launched the Fury X back in 2015 with HBM1; it was too expensive and could not be mass-produced. Then a year later Hynix unveiled the HBM2 chip, an absolutely massive chip measuring 91.99mm², compared to HBM1's tiny 39.94mm² for 1GB, which explains why the RX Vega chip had only two huge 4GB HBM2 chips rather than four chips for 16GB. I wonder what the HBM3 chip size will be? Possibly over 183.98mm² for one 8GB chip, and RX Navi would need a massive chip package to have space for two massive HBM3 chips.

I am very impressed with GDDR5 and GDDR5X chips, and very surprised to see that next-generation GDDR6 still has the SAME die size as both GDDR5 and GDDR5X, with 190 micro-bumps. An HBM1 chip has 3,982 micro-bumps, an HBM2 chip has over 5,000 micro-bumps, and an HBM3 chip will probably have up to 10,000 micro-bumps.

The R9 Nano's length was just a marketing gimmick; there are already similarly compact graphics cards like the GTX 970 Mini, GTX 970 ITX, GTX 1060 Mini, GTX 1060 ITX, GTX 1070 Mini, GTX 1070 ITX, GTX 1080 Mini etc. from ASUS, EMTEK, GALAX, Gigabyte, KFA2, MSI and Zotac.
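
Those micro-bump counts are mostly down to interface width, which is the whole point of HBM: each stack talks over a 1024-bit bus, versus 32 bits per GDDR5 chip. As a rough back-of-the-envelope sketch of what that buys (using the published Fury X and GTX 980 Ti memory configurations, not figures from this thread):

    // Rough memory bandwidth comparison: wide-and-slow HBM1 vs narrow-and-fast GDDR5.
    // Figures are the published Fury X and GTX 980 Ti configurations.
    #include <cstdio>

    int main() {
        const double hbm_bus_bits  = 4 * 1024.0; // four HBM1 stacks, 1024-bit each
        const double hbm_gbps_pin  = 1.0;        // 500 MHz double data rate = 1 Gbps per pin
        const double gddr_bus_bits = 384.0;      // 980 Ti aggregate GDDR5 bus width
        const double gddr_gbps_pin = 7.0;        // GDDR5 effective data rate per pin

        // Bandwidth (GB/s) = bus width (bits) x per-pin rate (Gbps) / 8 bits per byte
        std::printf("Fury X  (HBM1) : %.0f GB/s\n", hbm_bus_bits * hbm_gbps_pin / 8.0);   // 512 GB/s
        std::printf("980 Ti (GDDR5) : %.0f GB/s\n", gddr_bus_bits * gddr_gbps_pin / 8.0); // 336 GB/s
    }

HBM1 gets more bandwidth out of a 1 Gbps pin rate than GDDR5 gets out of 7 Gbps, purely by being over ten times wider, and that width is what all those extra micro-bumps are for.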
 
Big Vega towards Chrimbo I reckon
Big Vega is H1 and the other derivatives follow on. Xmas is integrated Vega, and maybe there are laptops with Vega in autumn, not sure if I read that. If HBM2, and even the card overall, runs cooler, then it's an asset in small enclosed cases and for premium laptops, though I'd guess Nvidia still wins on performance per watt?
We'll only get DX12 once Nvidia give us a DX12 card; until then, devs won't touch it. Just like Mantle: AMD had to get rid of that as no one would touch it, and now look at it, now that Nvidia have taken it on board. What a turnaround, eh?

Mantle has gone into Vulkan, I thought, and Vega will move that forward, I think, more than DX12 especially.
 
We'll only get DX12 once Nvidia give us a DX12 card; until then, devs won't touch it. Just like Mantle: AMD had to get rid of that as no one would touch it, and now look at it, now that Nvidia have taken it on board. What a turnaround, eh?

Maxwell and Pascal are DX12 cards. Devs won't touch it because it is a fundamentally flawed concept and a big step back from what developers actually want.
 
I am very glad HBM has failed to take off and has not replaced GDDR completely. I don't think HBM will last much longer: in the next few years, when next-generation PCs adopt the future DDR5 memory standard with its support for 3D-stacked memory chips, Hynix, Micron and Samsung will develop a next-generation GDDR6X or GDDR7 based on DDR5 technology, which will put stacked GDDR memory chips on graphics cards and make HBM obsolete, just like what happened with Rambus RDRAM.

I have not been very impressed with HBM since AMD launched the Fury X back in 2015 with HBM1; it was too expensive and could not be mass-produced. Then a year later Hynix unveiled the HBM2 chip, an absolutely massive chip measuring 91.99mm², compared to HBM1's tiny 39.94mm² for 1GB, which explains why the RX Vega chip had only two huge 4GB HBM2 chips rather than four chips for 16GB. I wonder what the HBM3 chip size will be? Possibly over 183.98mm² for one 8GB chip, and RX Navi would need a massive chip package to have space for two massive HBM3 chips.

I am very impressed with GDDR5 and GDDR5X chips, and very surprised to see that next-generation GDDR6 still has the SAME die size as both GDDR5 and GDDR5X, with 190 micro-bumps. An HBM1 chip has 3,982 micro-bumps, an HBM2 chip has over 5,000 micro-bumps, and an HBM3 chip will probably have up to 10,000 micro-bumps.

The R9 Nano's length was just a marketing gimmick; there are already similarly compact graphics cards like the GTX 970 Mini, GTX 970 ITX, GTX 1060 Mini, GTX 1060 ITX, GTX 1070 Mini, GTX 1070 ITX, GTX 1080 Mini etc. from ASUS, EMTEK, GALAX, Gigabyte, KFA2, MSI and Zotac.

HBM3 seems to be a step back from HBM2 going by some reports, more aimed at lower-power devices. It is supposed to be designed for much cheaper manufacturing but with lower bandwidth, much lower than GDDR5.
 
Maxwell and Pascal are DX12 cards. Devs won't touch it because it is a fundamentally flawed concept and a big step back from what developers actually want.
True that. I read an article from a dev saying that DX12 was a whole new headache to work with and DX11 was much easier and less time-consuming. The big companies like UBI and EA are only going to tack DX12 onto a title with little effort. No idea if Vulkan is any easier, but whilst DX12 is this time-consuming, titles will be few and far between, with little effort put in when it is used!
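
For a flavour of what that headache looks like in practice: under DX11 the driver tracked CPU/GPU synchronisation for you, while under DX12 the developer drives it explicitly with fences. A minimal sketch of one such wait, assuming a device and command queue created elsewhere, with error handling omitted:

    // DX12 makes the developer spell out CPU/GPU synchronisation that the
    // DX11 driver used to handle implicitly. Minimal flush-and-wait sketch.
    #include <windows.h>
    #include <d3d12.h>

    void WaitForGpu(ID3D12Device* device, ID3D12CommandQueue* queue)
    {
        ID3D12Fence* fence = nullptr;
        device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

        // Ask the queue to signal the fence once all submitted work completes...
        queue->Signal(fence, 1);

        // ...then block the CPU until that signal actually arrives.
        if (fence->GetCompletedValue() < 1)
        {
            HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
            fence->SetEventOnCompletion(1, done);
            WaitForSingleObject(done, INFINITE);
            CloseHandle(done);
        }
        fence->Release();
    }

Every queue submission, resource upload and swap-chain flip needs this sort of bookkeeping in DX12, which is the sort of extra work the article was complaining about.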
 
But devs are touching DX12; we have the games. They're not little indie games either; we have big, well-known titles that are also showing great results so far.
DirectX 11 didn't just take off right away either. Hell, I remember DX9 games still coming out in numbers while DX11 was readily available; it took DX11 a good year or two, maybe even longer, before DX9 started to fade out. The same will happen with DX12: once the game engines have all been made DX12-ready, games will start to come faster. After all, we know from what is already available on the internet that DX12 and Vulkan do speed up game development once the DX12 APIs are within the engine.
The longer process is getting it ready.
 
wow just wow lol

Yup, we've already told Microsoft to shove DX12 up their arse; we just do not want it, we just ain't bloody interested, and the devs know it too.

WHAT DO WE WANT, DX11!, WHEN DO WE WANT IT, NOW!, WHAT DO WE WANT, DX11!, WHEN DO WE WANT IT, NOW!....................................

But devs are touching DX12; we have the games. They're not little indie games either; we have big, well-known titles that are also showing great results so far.
DirectX 11 didn't just take off right away either. Hell, I remember DX9 games still coming out in numbers while DX11 was readily available; it took DX11 a good year or two, maybe even longer, before DX9 started to fade out. The same will happen with DX12: once the game engines have all been made DX12-ready, games will start to come faster. After all, we know from what is already available on the internet that DX12 and Vulkan do speed up game development once the DX12 APIs are within the engine.
The longer process is getting it ready.

Have we hell got the games; we've got a couple of DX11s, patched with a bit of 12 later.

Until Nvidia give us a DX12 card, that's all we're going to keep on getting: the odd DX11'er here and there, patched later with a bit of 12.

Devs will not use DX12 if only about 10 people can do it; it's an absolute waste of time. They'll just stick to 11, which everyone can do, and with everyone keeping on buying DX11 cards up in droves, that ain't going to change.
 
Big Vega is H1 and the other derivatives follow on. Xmas is integrated Vega, and maybe there are laptops with Vega in autumn, not sure if I read that. If HBM2, and even the card overall, runs cooler, then it's an asset in small enclosed cases and for premium laptops, though I'd guess Nvidia still wins on performance per watt?


Mantle has gone into Vulkan, I thought, and Vega will move that forward, I think, more than DX12 especially.

Vulkan is just Mantle. Khronos took it from AMD when they had to get rid of it, as no one was interested; they just didn't want to know about it, because of Nvidia, as they didn't want anything to do with it either, and with them owning the market...

Once Khronos took it, it was no longer under AMD, so Nvidia then went 'right, we'll ave some of that now then', and that's why it's the talk of the town now, and the games are a-coming, in their droves.

Unless DX12 gets taken up by them too, then it's going to stay as a Mantle under AMD, I'm afraid.

Maxwell and Pascal are DX12 cards. Devs won't touch it because it is a fundamentally flawed concept and a big step back from what developers actually want.

More DX11 really. As everyone says, they're crap at 12 and only AMD are good at it, and that's why they all bang on about it, saying how great their DX12 performance is etc., when it's just not going to be used until Nvidia do it as well, à la Mantle (which is now Vulkan), tessellation...

Everyone knows that if Nvidia were as good as, if not better than, AMD at DX12, we wouldn't be sitting here, still to this day, with just a couple of DX11ers patched with a bit of 12 later; we'd be knee-deep in bloody DX12 games.
 
(gif)

Gold :D
 
DX12 will take off when Nvidia wants it to take off. If Nvidia wants it to be dead, then it will be dead, no matter how much people like it or hate it. AMD can only sit in the audience and watch Nvidia decide the fate of DX12.
 