
AMD R9 Fury X2 - Paper Launch This Month - Available Q1 2016 - But Could Face Potential Delays

We finally have a concrete date for the public disclosure of AMD's upcoming dual-GPU flagship, the Fury X2. The graphics card, originally revealed at the Fury launch event and later spotted in shipping manifests at Zauba, is going to be launched sometime this month (December). According to the publication Benchlife.info, however, the event will only be a paper launch, with real market availability sometime in Q1 2016. While a paper launch in December usually means market availability by January, the source warns that due to scheduling issues in production the graphics card could face potential delays.

http://wccftech.com/amd-radeon-r9-fury-x2-launch/

Oh dear :/
 
What's the point??

Unless Nvidia and AMD only have retail 14nm/16nm-based cards at the end of next year (which could happen), it seems a bit of a late launch.
 
It is very late indeed. I originally thought it would hit the market around Black Friday, and now it's delayed to Q1 2016...
Luckily I didn't wait and just bought a Fury instead; I don't have the patience to wait that long lol
 
What's the point in an X2 when they can't get any Crossfire drivers working properly?

Because they haven't tried, as there's nothing in it for them. Historically, drivers have always been very good at the launch of a dual-GPU card.

People just assume that AMD write drivers for them because they bought two cards; it doesn't work like that. AMD only bother when there's a payday involved for them.
 
AMD promote Crossfire as a feature of their cards, so they should be supporting it with driver updates. There's always something in it for them, such as positive publicity when things work as they should. They need that right now. People going onto forums and slating them for not supporting something that they promote will only drive more people away from AMD.
 
What's the point of a paper launch when the cards are not available until much later? Why not launch the cards when people can actually buy them?
 
With so few Crossfire users there is little point wasting money on developing the drivers.

Crossfire would never be enough to keep AMD in business.

You have to remember that these technologies are years old now and they just haven't caught on in the mainstream. Pointless tech = waste of cash. See also PhysX, Hydra, etc.

Having said that, though, the issue isn't really with AMD. It's with the game devs, and if they can't be bothered to add support for people running more than one card, there's little AMD can do. They can't hold them at gunpoint and force them.

I remember reading an interview with the devs of BioShock where they said they would literally need to compile a version for people using one card, a version for people using two cards, a version for people using three cards, ad nauseam.

And the financial rewards were nowhere near enough to make them want to bother. The longer Nvidia and AMD go without embracing the game devs and getting involved in projects, the less support we'll see for Crossfire and SLI.

Whilst no big loud announcements have been made about ditching SLI and Crossfire, they've certainly made it loud and clear to me that spending cash on a second GPU is pointless. Not only that, there's a high chance it will actually break your game and ruin it completely.

I was a staunch advocate of both until ATI got caught lying and Nvidia just stopped bothering. My last SLI arrangement was last year, where I had nothing but issues with G-Sync and SLI, poor game support for SLI, and games not even running because of SLI.

Enough is enough.
 
Crossfire/SLI isn't a good idea, as the technical challenges are simply too big to overcome, and it offers a worse gaming experience than single cards.

Luckily the die shrink next year will allow some really powerful cards to come out.
 
It was meant to be released in the "fall" and now it's pushed to winter. Can't see what the delay would be, Fury cards are easy to come by now, unless it's to do with the liquid cooler and making sure it isn't some 50p POS that whines.
 
Disappointed that it's still using plain old Crossfire over copper traces; why can't they find some way to join the interposers or something?
 
Considering how much of a PITA it was to get the interposer and HBM working with half-decent yields, I highly doubt they want to make it any more complicated than it needs to be.
 
There are lots of reasons they can't just join the interposers, the primary one being maximum interposer size. There are definitely ways to do it, but the entire ecosystem would have to be set up for giant dies, which is basically not going to happen.


As for why now: it's ready, it's a product, and regardless of how much it sells, it will beat the living pants off any other single card in a lot of games. That won't ever be bad marketing.

Personally I don't think we'll see Pascal (of any meaningful size, 350-450mm²) till Q3; if it's Q2, it will be more like late June than early April. So if the Fury X2 is available in January, there is no reason not to launch it just because Pascal is supposedly imminent... in any meaningful sense there is almost no chance of that. We might see a low-end part first and earlier, but I doubt that as well.

The issue of SLI/Crossfire is difficult; I've said for a while that DX11 development is effectively dead. Nvidia flogged a dead horse putting a lot of effort into basically working around DX11 in the past 18 months, but that also brought more issues than ever in launch titles: if a game dev is working to DX11 and Nvidia is in essence breaking DX11 to find more performance, you're going to have more trouble.

The focus is, and has been for a while, on DX12. Crossfire/SLI will be the biggest sufferers, as they take so much extra work; however, they might be dramatically better and stronger under DX12. So we'll see if the switch to focusing on and working with DX12 drivers also brings vastly improved multi-card performance as a more standard feature of games. It's quite possible SLI/Crossfire will work hugely better over the coming months than ever before.
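
As a rough illustration of what that shift hands to the developer (a generic C++ sketch against the standard DXGI/D3D12 headers, not code from any actual game): under DX12's explicit multi-adapter model the application enumerates every GPU itself and creates a device per adapter, where DX11 AFR hid the second card behind a driver profile.

    #include <d3d12.h>
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    // Enumerate hardware adapters and create one D3D12 device per GPU.
    // Distributing the frame across these devices is the game's job now.
    std::vector<ComPtr<ID3D12Device>> CreateDevicePerGpu()
    {
        ComPtr<IDXGIFactory4> factory;
        CreateDXGIFactory1(IID_PPV_ARGS(&factory));

        std::vector<ComPtr<ID3D12Device>> devices;
        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0;
             factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                continue; // skip WARP / software adapters

            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device))))
                devices.push_back(device);
        }
        return devices;
    }

Every queue, cross-device copy and sync point between those devices then has to be written and maintained by the game itself, which is exactly the extra work being described above.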
 
That's rather an extreme statement to make; remind me how many DX12 titles are out now or are due to be released in the next 12-18 months.

If anything, multi-GPU in DX12 looks shaky. There's a shift on the driver front away from the vendors making games work (including multi-GPU) and onto the game developers. The emphasis will also be more on the developer to maintain support, especially for new cards.

We saw this partly with Mantle and Thief. The Mantle patch came along and it performed great for the AMD cards of the time, as that's what the game was programmed for, but then the R9 285 came out and performance in Mantle was much worse (you can see this here: http://techreport.com/review/26997/amd-radeon-r9-285-graphics-card-reviewed/7). This is not a dig at Mantle, but a worry about "to the metal" API performance in games post-release, on cards released after the game.

"To the metal" APIs are great for larger devs who have the resources and the inclination to get the most out of PC hardware, but smaller teams, or projects where PC is simply not the priority, are going to stick with DX11.
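
To illustrate that worry with a hypothetical sketch (the device IDs and render paths below are placeholders, not from Thief or any real engine): a game tuned close to the metal typically keys its fast path off the GPU IDs it was profiled on before release, so a card launched afterwards, like the R9 285's Tonga chip, silently lands on the slower generic path unless the developer keeps patching.

    #include <cstdint>

    enum class RenderPath { TunedLowLevel, GenericDx11 };

    // 0x1002 is AMD's PCI vendor ID; the device list is an illustrative
    // stand-in for whatever GPUs the studio actually profiled pre-release.
    RenderPath ChoosePath(uint32_t vendorId, uint32_t deviceId)
    {
        const uint32_t profiledDevices[] = { 0x67B0, 0x6798 }; // placeholders

        if (vendorId == 0x1002)
            for (uint32_t known : profiledDevices)
                if (deviceId == known)
                    return RenderPath::TunedLowLevel;

        // Anything newer than the shipped list falls through to here.
        return RenderPath::GenericDx11;
    }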
 
Will this be pushed for VR though? I remember reading something about LiquidVR using one GPU per display, and if the VR headsets aren't dropping until early next year they could be waiting on them.

It would make sense from a marketing point of view to use the hype of the VR release to sell these cards instead of releasing them now, when there are only a few popular games with shoddy multi-GPU support.
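
On the LiquidVR point, the sketch below is purely conceptual and does not use LiquidVR's actual API; it just shows the shape of the one-GPU-per-eye idea in plain D3D12, reusing the hypothetical CreateDevicePerGpu() helper from the earlier sketch so each eye gets its own device and command queue.

    #include <d3d12.h>
    #include <wrl/client.h>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    struct EyeRenderer
    {
        ComPtr<ID3D12Device>       device; // GPU dedicated to this eye
        ComPtr<ID3D12CommandQueue> queue;  // its own submission queue
    };

    // Assumes CreateDevicePerGpu() (earlier sketch) found at least two GPUs;
    // left eye renders on GPU 0, right eye on GPU 1.
    std::vector<EyeRenderer> SetUpStereo(std::vector<ComPtr<ID3D12Device>> gpus)
    {
        std::vector<EyeRenderer> eyes;
        for (size_t eye = 0; eye < 2 && eye < gpus.size(); ++eye)
        {
            D3D12_COMMAND_QUEUE_DESC qd = {};
            qd.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;

            EyeRenderer r;
            r.device = gpus[eye];
            if (SUCCEEDED(r.device->CreateCommandQueue(&qd, IID_PPV_ARGS(&r.queue))))
                eyes.push_back(r);
        }
        return eyes;
    }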
 
Great post above on Mantle and the shift of responsibility to developers.

DX12 means developers will focus much more on GPUs that have the highest market share, and they will become more and more dependent on assistance from the hardware vendors and third-party libraries like GameWorks.
 