AMD Polaris architecture – GCN 4.0

I wonder where the Fury X2 will fit into all this. It's still to be released, and we're already talking about Polaris, five months out.

I think the Fury X2 will be here very soon. AMDMatt has probably just obtained two of them, engineering samples or early units from the OEMs, since he sold his four Fury Xs here very recently.
 
In all likelihood the Fury X2 will be AMD's top single-card offering for a long time, outperforming the top-end Polaris by a fair bit (in games where it works).

Either there won't be a dual Polaris card or there won't be one for at least 12 months, long enough for dual Fiji to have a good run.
 
You keep saying "at launch", yet looking at your original post, with the question in red, there is no mention of "at launch".

Marked in green: Fiji did make three cards, but not at launch.

Fury X launched June 24th
Fury launched July 14th
Nano launched August 27th

Marked in orange: Tahiti started with only two.

7970 launched Jan 9th
7950 launched Jan 31st

and so on.


If you insist on only comparing the launch cards, you will find your original points to be incorrect, and this is why I assumed you were counting all the cards made from a GPU die.

But when has a high-end GPU ever produced four SKUs? I can't think of any examples.

Fiji made three cards, but before that two cards from one chip was the norm. Tahiti was two to start with, then they introduced Tahiti LE some time later. Hawaii/Grenada was two. Pitcairn was two. Tonga was, until recently, a single card in each generation.

Expecting two GPUs to fill the entire range is not something we've ever seen before, so I'm curious why people are making these predictions with such conviction that they must be true.
 
Got a feeling the Fury cards will coexist with the first Polaris cards: we'll see a new high-end flagship as well as a lower-power, more budget-oriented card, while the Nano, Fury, Fury X etc. coexist with them at least until stock is depleted. The 39X cards will likely be phased out sooner than the current Fury ones. Then, over the next year, we'll see the whole stack replaced with die-shrunk cards.

The new stack will look like this at first, imho:

Fury X 'X2' (dual GPU), aimed at VR
Polaris (single GPU), flagship
Fury X (high end)
Fury / Nano (mid/high)
Polaris 'lightweight' (single GPU), energy-efficient low to mid end

39X cards phased out first.

Then more cards released over the next year, and a whole new stack completed.
 

With Fiji, if you mean existing inventory, I can't see 4GB cards selling at all in 2016, especially if Nvidia's Pascal high end has 8GB+. The first time around, people were already holding off because of the 4GB limit; a year later this is only going to be worse, and on a £300+ card it's not really acceptable.

If you mean new Fiji-based cards with 8GB+ of HBM2, I got the impression that these would be more expensive to manufacture, due to the large die size, than the new, smaller Polaris chips.

As well as having fewer features, being hotter and louder, etc.

All in all, apart from dual Fiji, it does seem that Fiji, whichever way you spin it, won't compete well in 2016 against 14nm competitors. Its performance might also end up too close to a cut-down big Polaris.

How would you keep single-chip Fiji in the Polaris era?
 
AMDMatt has probably just obtained two of them, engineering samples or early units from the OEMs, since he sold his four Fury Xs here very recently.

Still got two actually, Dave.

 
Why are people so obsessed with having 6-8GB of VRAM?

In the long run, higher amounts of RAM as standard mean higher-quality textures can be used. Even the most graphically impressive AAA games' textures look like puke at the moment.

That said, there have been a few ways to improve this for years, but it means running DX11 as the lowest API.
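
To put some rough numbers on the texture point (my own back-of-envelope figures, not anything from this thread): an uncompressed RGBA8 texture costs 4 bytes per texel, BC7 block compression brings that to roughly 1 byte per texel, and a full mip chain adds about a third on top. A minimal sketch under those assumptions:

```python
# Rough VRAM footprint of a single square texture, with a full mip chain.
# Illustrative assumptions: RGBA8 = 4 bytes/texel uncompressed,
# BC7 block compression ~ 1 byte/texel, mip chain adds ~1/3 on top.
MIP_OVERHEAD = 4 / 3

def texture_mib(size, bytes_per_texel):
    """Footprint in MiB of a size x size texture including mips."""
    return size * size * bytes_per_texel * MIP_OVERHEAD / 2**20

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: {texture_mib(size, 4):6.1f} MiB raw, "
          f"{texture_mib(size, 1):5.1f} MiB BC7")
```

That works out to roughly 21 MiB per 4096x4096 texture even compressed, so a scene with a few hundred unique high-resolution materials lands in the multi-gigabyte range, which is why texture quality tends to track how much VRAM mainstream cards ship with.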
 
For the card lineup, there's one more possibility.
What if Polaris 10 and 11 are not just different chips but slightly different architectures?
Like Polaris 10 being the GDDR5 variant with two cards (each with a cut-down version) built on it, and Polaris 11 the HBM2 variant for the top chips?
 
In the long run, higher amounts of RAM as standard mean higher-quality textures can be used. Even the most graphically impressive AAA games' textures look like puke at the moment.

That said, there have been a few ways to improve this for years, but it means running DX11 as the lowest API.


For one, current-gen consoles have 8GB (unified). So why not?

8GB for everything, isn't it? We have system RAM; they don't. That's another thing that annoys me: people recommending 32GB for gaming rigs... LOL, why? 8-16GB would be enough for a good few years yet, and you could spend the extra £100 on a better CPU or GPU.

If you're going to get an 8GB card but only ever use 50% of it, what's the point? Obviously once 4K becomes mainstream and, as Mauller said, textures get bigger, then maybe, but are there really any games out there that use over 4GB?
 

On a single GPU, not really, but once you go multi-GPU you will run into VRAM issues at 4K. Having said that, even for a single GPU it is always better to be future-proofed.
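
As a very rough sketch of the resolution side of that (my own illustrative numbers; note too that with AFR multi-GPU each card mirrors the full working set, so two 4GB cards still behave like one 4GB pool):

```python
# Approximate per-frame render-target cost at a given resolution.
# Illustrative assumptions: RGBA8 targets (4 bytes/pixel) and ~6
# G-buffer/post-processing targets; real engines vary widely.
def render_targets_mib(width, height, num_targets=6, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * num_targets / 2**20

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_targets_mib(w, h):.0f} MiB in render targets")
```

The targets themselves roughly quadruple going from 1080p to 4K, and engines usually scale their texture streaming pools up alongside, which is where the real multi-gigabyte pressure comes from.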
 
8GB for everything, isn't it? We have system RAM; they don't. That's another thing that annoys me: people recommending 32GB for gaming rigs... LOL, why? 8-16GB would be enough for a good few years yet, and you could spend the extra £100 on a better CPU or GPU.

If you're going to get an 8GB card but only ever use 50% of it, what's the point? Obviously once 4K becomes mainstream and, as Mauller said, textures get bigger, then maybe, but are there really any games out there that use over 4GB?

Nobody recommends 32GB system RAM for gaming ;)

Pretty much everyone will say that 8GB is fine. Would I go lower than 8? Well, you could do, but why? You won't save yourself much money, and it's nice to have the extra should you need it.

But going beyond 8GB of system RAM currently has no benefit in games.

Having said that, even for a single GPU it is always better to be future-proofed.

Yup. I'd rather the PC had the potential for better textures than the consoles, not the other way around ;)

We had 3GB cards in 2012... I don't see that 8GB is pushing the boat out too far. And I don't think 4GB on a high-end 2016 card is acceptable.

Personally, I won't buy a 4GB card again.

Anyway, AMD would seem to agree, as they put 8GB on the 390 cards. They could easily have stuck to 4GB (it would even have made Fiji look better in comparison), but they went with 8GB.
 
In the long run, higher amounts of RAM as standard mean higher-quality textures can be used. Even the most graphically impressive AAA games' textures look like puke at the moment.

Developers design for the mass market, and that's currently 1-2GB to 4GB.

AMD has shown that latency can be lowered with HBM and memory compression, so simply adding more memory, as Nvidia does, isn't a solution for a gamer or a developer.

Why, for example, did AMD add 8GB to the 390? To make it different. Adding 12GB to the Titan X is the same idea: to make it different.

Can a developer add better textures? Sure, but then they have the issue of the time and money that takes. Then there's what hardware the customer is running, and in the end they design for consoles. The market dictates that Xbox, PS4 etc. are the design targets; DICE's Battlefront shows us that. So a PC version will always be a compromise, because the developer is designing for another market.

So anyone buying a 12GB or an 8GB card won't get any real use out of it, as 4GB of HBM has been shown to be superior to GDDR5 thanks to better memory handling. Games won't need 8GB until HDR-enabled games arrive with DX12 and engines that have it built in.

The reality is that developers cut corners and won't make games that are awesome on PC. True, Battlefront looks absolutely stunning and sounds amazing, but the gameplay is crap, so why even play it? And the players agree: 12,000 players online at peak on PC, 103,000 on PS4, 64,000 on Xbox!

If I were DICE, thinking cash cow, I'd design it for consoles any day of the week and sell crap, since kids today buy anything; consoles outsell PC 10 to 1. You won't see games made for PC anymore, though that might change with VR (not likely). If I can sell 160,000 copies on console versus 12,000 on PC, guess which one I design for?

AMD at least is bringing us new HDR screens, with DP 1.3 and a whole new design in Polaris. Most of my time online is spent playing Grim Dawn, a DX9-engine game, as I simply refuse to buy Battlefront because it's made by DICE. The artwork of Grim Dawn, however, is vastly superior to Battlefront's; it's not about pixels all the time.
 
Well said, Flopper. I've been playing GD for about two years now; the game's brilliant. I also bought an Xbone at Crimbo for my boys and got Star Wars Battlefront for it; I doubt I'll ever buy it for PC, as it holds little interest for me.

I'm finding it harder daily to justify gaming on PC now; it pretty much comes down to title availability. Maybe once the new generation of cards comes, along with HDR monitors and VR, there might be another surge into PC gaming. As it stands right now, it appears console is king with regard to the target audience for most developers.
 
I can't say I enjoy console gaming; I tried, but it isn't for me.
PC, however, with Eyefinity or the like, is an absolutely superb, immersive experience. I'm currently running only one screen, as my three others got too old, so I bought a 1440p 144Hz monitor. I have nearly 400 hours in Grim Dawn and still no level 85 character.

For the future I plan to have Eyefinity again, or VR whenever it's mature, which isn't this year. Since AMD, along with the market, suddenly went HDR, all my plans had to be revised: now I need a new HDR screen, then two more for Eyefinity, with a Polaris GPU and later a Zen CPU.
True, I will mainly play Grim Dawn or similar games, and mostly write and use Facebook, but when your computer works the way you want, that's something I can enjoy immensely as I spend time with it. I had my other screens for eight years, so I'm not one to update or upgrade often. With Zen, Polaris and HDR, it seems this is the year to update.

I'm impressed that AMD has silicon up and running, so this seems to be a launch they'll get right this time. The Fury one they messed up.
 
Games won't need 8GB until HDR-enabled games arrive with DX12 and engines that have it built in.

So, using your own argument... why would you buy a card in 4-6 months' time with 4GB?

Isn't DX12 just around the corner? We're going to start getting DX12 games later this year or the beginning of next...

Why buy a next-gen Polaris card, supporting DX12, with 4GB of memory?

You were the one who said DX12 needed 8GB :p And since that's the future, buying a 4GB card couldn't be said to make much sense, no?
 