AMD Polaris architecture – GCN 4.0

Associate
Joined
10 Jul 2009
Posts
1,559
Location
London
Ahhhhh FuryX2, beyond defunct at this stage ;) :p

Honestly, it has been defunct since last December. As soon as they delayed it the first time, it was too late to market. If it had been released in September, I would have gotten one for myself. But now, so close to the Polaris announcements, I just cannot justify buying a last-gen product.
 
Soldato
Joined
18 Oct 2002
Posts
11,038
Location
Romford/Hornchurch, Essex
Any dual card is defunct when Crossfire and SLI don't work correctly with *all* games. The only games in my gaming library I got working with Crossfire (when I had 4x 7970s for Bitcoin mining) were Half-Life 1 and 2, but who needs 400-600fps?!
 
Associate
Joined
8 Jan 2014
Posts
445
Any dual card is defunct when Crossfire and SLI don't work correctly with *all* games. The only games in my gaming library I got working with Crossfire (when I had 4x 7970s for Bitcoin mining) were Half-Life 1 and 2, but who needs 400-600fps?!

Not an issue in VR. The Fury X2 is basically a niche VR-market product.
 
Soldato
Joined
30 Nov 2011
Posts
11,356
Not an issue in VR. The Fury X2 is basically a niche VR-market product.

What? How is it not an issue? VR games don't just magically work in Crossfire; they still need support - are there any VR games that actually support multi-GPU right now?

Both AMD and Nvidia have released additional APIs for enabling per-eye SLI/Crossfire, but no game is actually using them yet, so yes, game support is still a massive issue for VR.
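For what it's worth, the appeal of those per-eye APIs (AMD's LiquidVR affinity multi-GPU, Nvidia's VR SLI) is easy to sketch. Here's a toy Python model - hypothetical timings, not a real GPU API - of why splitting eyes across GPUs cuts latency, where classic alternate-frame rendering (AFR) only raises throughput:

```python
# Toy latency model (illustrative only, not a real GPU API).
# Assumes both eyes cost the same render time and two GPUs can
# genuinely work in parallel with no sync overhead.

def frame_latency(eye_render_ms, n_gpus, mode):
    """Time (ms) from frame start until BOTH eye views are finished."""
    if mode == "single":           # one GPU renders left eye, then right
        return 2 * eye_render_ms
    if mode == "afr":              # AFR alternates whole frames between GPUs:
        return 2 * eye_render_ms   # higher fps, but each frame still waits
                                   # on a single GPU rendering both eyes
    if mode == "per_eye":          # one GPU per eye, rendered in parallel
        return eye_render_ms if n_gpus >= 2 else 2 * eye_render_ms
    raise ValueError(mode)

# With 8 ms per eye: single GPU and AFR both deliver a frame 16 ms after
# it starts; per-eye multi-GPU delivers it in 8 ms.
print(frame_latency(8, 1, "single"))   # 16
print(frame_latency(8, 2, "afr"))      # 16
print(frame_latency(8, 2, "per_eye"))  # 8
```

That latency difference is the whole argument: VR cares about motion-to-photon time, so AFR-style Crossfire buys little, and the per-eye path only exists if the game engine explicitly codes for it.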
 
Associate
Joined
10 Jul 2009
Posts
1,559
Location
London
And people are overblowing the supposed xfire/SLI support problem. I have a multi-GPU setup and am happy with it, since all the games which have problems being run at 4K on a single GPU support xfire, and those which don't support xfire run fine on a single GPU.
 
Caporegime
Joined
24 Sep 2008
Posts
38,280
Location
Essex innit!
http://wccftech.com/amd-gdc-2016/

This year’s GDC promises to be rather special. High-ranking execs of AMD have been teasing about the event for quite some time now and throwing around words like “Spicy”. The schedule of the event was published recently and most of the events look innocuous – but hidden among these could be a surprise announcement or tease. Considering GDC is a very good place to get the attention of millions of gamers, we will probably see a demo at the very least (possibly of 14nm Polaris and/or Fury X2) – if not a full-blown information release.
 
Soldato
Joined
7 Feb 2015
Posts
2,864
Location
South West
AMD will be talking a lot about low-abstraction API coding at GDC. They are also giving a big talk on GPUOpen and mGPU coding. So hopefully they will add mGPU code and examples to GPUOpen, and we should see some decent mGPU support coming as standard.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
We'll find out in 9 days ^^^^

I bet you they don't say much/anything about Polaris at GDC :p It's going to be all about GPUOpen and that kind of thing.

They may have some "demos" set up, like the power efficiency demo, but they will not release any concrete information, as we're still too far from release. We still won't know price or performance by the end of it.

You wait and see :p
 
Associate
Joined
10 Jul 2009
Posts
1,559
Location
London
The problem is the AMD Twitter guys know how to talk the talk, but when it comes to us tech geeks, they won't say anything we don't already know :) I do hope they will surprise us with Polaris info, but I really doubt it. I expect an endless push of the recent Ashes bench results as evidence of their fine work ;)
 
Soldato
Joined
23 Jul 2009
Posts
14,102
Location
Bath
The problem is the AMD Twitter guys know how to talk the talk, but when it comes to us tech geeks, they won't say anything we don't already know :) I do hope they will surprise us with Polaris info, but I really doubt it. I expect an endless push of the recent Ashes bench results as evidence of their fine work ;)

+1

Remember Fury X? Plenty of fluff and no performance news until we got the benchmarks from reviewers.

Plenty of shouting about async performance in Ashes will likely be the order of the day.
 
Associate
Joined
18 Nov 2014
Posts
185
http://wccftech.com/sk-hynix-hbm2-mass-production-q3-2016/

So they are saying 4GB HBM2 stacks in Q3 '16 and 8GB HBM2 stacks in Q4 '16.
I read some time ago about Samsung being a little ahead of Hynix in HBM2 production; even so, this would imply that HBM2 GPUs will not be on the market until Q3 (guess), and even then there WILL be stock issues.

I'm getting curious now about what Polaris 10 and 11 will be and what type of memory they will be using.
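For context on what those stack sizes mean, here's my own back-of-the-envelope arithmetic from the JEDEC HBM2 spec (not from the article): each stack has a 1024-bit interface, and at the spec's 2.0 Gbps-per-pin data rate that works out to 256 GB/s per stack; a "4GB stack" is four 8Gb DRAM dies high, an "8GB stack" is eight.

```python
# Back-of-the-envelope HBM2 numbers (my arithmetic from the JEDEC spec,
# not figures quoted in the linked article).

def stack_bandwidth_gbs(pins=1024, gbps_per_pin=2.0):
    """Peak bandwidth of one HBM2 stack in GB/s (bits -> bytes)."""
    return pins * gbps_per_pin / 8

def stack_capacity_gb(dies_high, die_gbit=8):
    """Capacity of one stack built from 8Gb DRAM dies."""
    return dies_high * die_gbit / 8

print(stack_capacity_gb(4))       # 4.0  -> the Q3 '16 "4GB stack"
print(stack_capacity_gb(8))       # 8.0  -> the Q4 '16 "8GB stack"
print(stack_bandwidth_gbs())      # 256.0 GB/s per stack
print(4 * stack_bandwidth_gbs())  # 1024.0 GB/s for a 4-stack GPU
```

So even a hypothetical four-stack Polaris/Fury-successor would have 16GB and ~1TB/s on paper, which is why the production timing matters so much.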
 
Associate
Joined
4 Nov 2013
Posts
1,437
Location
Oxfordshire
http://wccftech.com/sk-hynix-hbm2-mass-production-q3-2016/

So they are saying 4GB HBM2 stacks in Q3 '16 and 8GB HBM2 stacks in Q4 '16.
I read some time ago about Samsung being a little ahead of Hynix in HBM2 production; even so, this would imply that HBM2 GPUs will not be on the market until Q3 (guess), and even then there WILL be stock issues.

I'm getting curious now about what Polaris 10 and 11 will be and what type of memory they will be using.


Yeah a "little": http://www.digitaltrends.com/computing/samsung-mass-producing-hbm2/

They started mass production in January.
 
Associate
Joined
4 Nov 2013
Posts
1,437
Location
Oxfordshire
It doesn't make any sense. Hynix is developing the standard, and they license production to Samsung. So Hynix is waiting several quarters to start production while Samsung is pumping volumes out already.

Samsung $$$$ > Hynix $$$$, I guess.
Samsung is a bloody giant compared to Hynix, and they don't license it; HBM is an open JEDEC standard.
I think as soon as the standard was accepted, the truckloads of money started rolling.
 