AMD Polaris architecture – GCN 4.0

Associate
Joined
8 May 2014
Posts
2,288
Location
France
Wonder why they are doing DVI again, they stopped that with Fury.

3x DP1.3, 1x HDMI 2.0 & 1x DVI. What's the benefit of removing DVI? None, but you gain more prospective buyers by adding it. Besides, Polaris is low-to-mid range; even if by some miracle it turns out more powerful than Fury or the 980 Ti, it will still be mid range when Vega arrives at the end of the year.
AMD said they wanted to bring VR-ready GPUs under $350, and most people south of that price point still have DVI.
 
Soldato
Joined
30 Nov 2011
Posts
11,356
Maybe they lied about the 10 being GDDR5 then ... that, or they found a way to dramatically simplify the circuitry and cram all the chips onto an improbably small PCB. That case is very small.

Edit - if it's **Nano** sized and has more than 4GB of RAM (which it'd need for the bandwidth with GDDR5), there is no way it is not HBM(1).

The 970 Mini is within a few mm of being "nano sized"; all you would have to do is give it a 512-bit interface and double the size of each chip and you could do it.

It would be a little bigger than a Nano, but not massively so.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
The 970 Mini is within a few mm of being "nano sized"; all you would have to do is give it a 512-bit interface and double the size of each chip and you could do it.

It would be a little bigger than a Nano, but not massively so.

Seriously? Each chip has a given bandwidth; you can't widen the bus just by doubling the chip size. Double the bus and you double the number of traces, significantly increase power usage and double the number of chips required.

8 x 32-bit GDDR5 chips give a 256-bit bus; 8 x double-capacity chips still give a 256-bit bus; 16 x 32-bit chips give a 512-bit bus.

You can put two 32-bit chips on a single set of traces, where the memory controller treats them as effectively one chip. That gives you more capacity from more chips but the same bandwidth; and, as above, you can use more chips for more bandwidth. You can't do the opposite, though: you can't get more bandwidth from the same number of chips.
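The chip-count arithmetic above can be sketched as follows. The 32-bit-per-chip interface width is standard for GDDR5, but the 7 Gbps per-pin data rate is an illustrative assumption, not a figure from any specific card:

```python
# Sketch: GDDR5 bus width scales with chip count, not chip capacity.
def bus_width_bits(num_chips, bits_per_chip=32):
    """Each GDDR5 chip exposes a fixed-width interface (typically 32 bits)."""
    return num_chips * bits_per_chip

def bandwidth_gb_s(num_chips, gbps_per_pin=7.0, bits_per_chip=32):
    """Bandwidth (GB/s) = bus width in bits x per-pin data rate / 8 bits per byte."""
    return bus_width_bits(num_chips, bits_per_chip) * gbps_per_pin / 8

print(bus_width_bits(8))     # 8 chips  -> 256-bit bus
print(bus_width_bits(16))    # 16 chips -> 512-bit bus
print(bandwidth_gb_s(8))     # 224.0 GB/s at an assumed 7 Gbps per pin
```

Doubling chip capacity changes only the total VRAM; only adding chips (and traces) widens the bus.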
 
Soldato
Joined
7 Feb 2015
Posts
2,864
Location
South West
If anything, they would be using the new high-density GDDR chips Samsung was talking about a while ago: greater capacity on a smaller bus.

So 8 x 8 Gb chips give 8 GB on just a 256-bit bus.
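As a quick sanity check on that arithmetic (chip densities are quoted in gigabits, capacities in gigabytes, so 8 Gb = 1 GB):

```python
# Sketch: total VRAM from chip count and per-chip density.
def total_vram_gb(num_chips, density_gbit):
    """Capacity scales with chip density; bus width does not (8 Gb = 1 GB)."""
    return num_chips * density_gbit / 8

print(total_vram_gb(8, 8))  # 8 chips x 8 Gb each = 8.0 GB on a 256-bit bus
print(total_vram_gb(8, 4))  # 8 chips x 4 Gb each = 4.0 GB on the same bus
```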
 
Last edited:
Man of Honour
Joined
13 Oct 2006
Posts
91,746
True, but the 980 is 4GB too, and no one moans about that.

The 980 came out a good bit earlier though - if it had come out around the time the Fury X did and had been the halo product at that time, it would have faced the same criticism about only being 4GB - likewise the 980 Ti if it had come out with 4GB.
 
Associate
Joined
24 Nov 2010
Posts
2,314
4GB is fine for 4K (for the Nth time), unless for some reason you pointlessly pile AA onto very high PPIs for no benefit. But 4GB of GDDR5 in a Nano-sized package on a card (P10) with performance that appears to exceed a Fury X will not work: it either needs more RAM (likely 8GB, i.e. more chips) to achieve the bandwidth needed not to drag the chip down (meaning it can't be Nano sized), or it needs HBM.
 
Soldato
Joined
18 Oct 2002
Posts
11,038
Location
Romford/Hornchurch, Essex
4GB is fine for 4K (for the Nth time), unless for some reason you pointlessly pile AA onto very high PPIs for no benefit. But 4GB of GDDR5 in a Nano-sized package on a card (P10) with performance that appears to exceed a Fury X will not work: it either needs more RAM (likely 8GB, i.e. more chips) to achieve the bandwidth needed not to drag the chip down (meaning it can't be Nano sized), or it needs HBM.

That's the point of what I said though. Everyone moans that Fury is only 4GB, yet it's fine if it's only 4GB on an Nvidia card. When pushing over the 4GB buffer on, let's say, the 980 Ti, the frame rate drops off pretty fast because the card isn't powerful enough to push 6GB of data.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
The only person I see pushing the need for more than 4GB is Kaap, and his examples consistently get shown up. Shadow of Mordor and XCOM 2 use way over 4GB of memory at 4K if you enable the uncompressed textures. If you use the setting below that, which is identical quality with compressed textures, it uses below 4GB and runs great on a Fury X.

Games don't generally ship with uncompressed textures; it's just a waste of download bandwidth, storage space on an SSD, and processing efficiency, for absolutely no reason.

These uncompressed textures appear as if by magic in Nvidia GameWorks titles, and then Nvidia guys on forums run around telling everyone you must have more than 4GB for 4K.

There is absolutely no reason at all to use uncompressed textures in games. For photo editing in Photoshop, with images using several gigabytes of data, absolutely, use uncompressed raw data. For game textures, none, absolutely none.

If people can't contain themselves and refuse to use what are essentially identical but more efficient settings, that is really down to them.


Effectively, in Shadow of Mordor, High textures are the highest quality and Ultra is just a less efficient version of High textures, nothing more or less. So would Kaap also not recommend newer drivers, which are more efficient in games? He's purposefully telling everyone they need more memory just to use a less efficient setting within a game.
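To put rough numbers on the compression argument: a single 4096x4096 texture stored uncompressed as RGBA8 is 64 MiB, while common block-compressed formats shrink it by 4-8x at near-identical visual quality. The sizes below are format arithmetic, not measurements from any particular game:

```python
# Sketch: per-texture VRAM cost, uncompressed vs. block-compressed.
def texture_mib(width, height, bytes_per_pixel):
    """Size in MiB of a single mip level at the given bytes per pixel."""
    return width * height * bytes_per_pixel / (1024 * 1024)

uncompressed = texture_mib(4096, 4096, 4)    # RGBA8: 4 bytes per pixel
bc7 = texture_mib(4096, 4096, 1)             # BC7: 1 byte per pixel (4:1)
bc1 = texture_mib(4096, 4096, 0.5)           # BC1: 0.5 bytes per pixel (8:1)

print(uncompressed, bc7, bc1)  # 64.0 16.0 8.0
```

A few dozen uncompressed textures like that is all it takes to blow past a 4GB buffer, which is exactly why shipping games block-compress by default.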
 
Man of Honour
Joined
21 May 2012
Posts
31,922
Location
Dalek flagship
The only person I see pushing the need for more than 4GB is Kaap, and his examples consistently get shown up. Shadow of Mordor and XCOM 2 use way over 4GB of memory at 4K if you enable the uncompressed textures. If you use the setting below that, which is identical quality with compressed textures, it uses below 4GB and runs great on a Fury X.

Games don't generally ship with uncompressed textures; it's just a waste of download bandwidth, storage space on an SSD, and processing efficiency, for absolutely no reason.

These uncompressed textures appear as if by magic in Nvidia GameWorks titles, and then Nvidia guys on forums run around telling everyone you must have more than 4GB for 4K.

There is absolutely no reason at all to use uncompressed textures in games. For photo editing in Photoshop, with images using several gigabytes of data, absolutely, use uncompressed raw data. For game textures, none, absolutely none.

If people can't contain themselves and refuse to use what are essentially identical but more efficient settings, that is really down to them.


Effectively, in Shadow of Mordor, High textures are the highest quality and Ultra is just a less efficient version of High textures, nothing more or less. So would Kaap also not recommend newer drivers, which are more efficient in games? He's purposefully telling everyone they need more memory just to use a less efficient setting within a game.

Games are what they are and that is not going to get any better.

If you want to use the maximum settings, you use suitable hardware; that is the user's choice.

As for drivers, I do use the latest ones; sadly there are no drivers yet that will allow a Fury X to perform well at 1080p in the ROTTR bench. DX11 performance is very poor and DX12 performance is twice as bad.

I measured a Fury X trying to use 7GB of memory in the ROTTR bench in DX11 @ 1080p.
 
Soldato
Joined
9 Dec 2006
Posts
9,286
Location
@ManCave
As much as Kaap goes on about 4GB not being good enough for 4K, he is right.

There are many instances where 4GB is the limiting factor.

I bought a PG278AQ (which I love) but 980 SLI just cannot do it justice. As soon as I hit around 3.9GB+, FPS drops and it doesn't feel right even with G-Sync; I've seen 60s drop to 35-45fps.

Games such as:
Witcher 3 - even on High/Ultra
Assassin's Creed Syndicate - on Medium/High
Shadow of Mordor - on High

The above is without ANY AA; hitting 4GB in top-end AAA titles at 4K is easy, and even 4K still needs AA in most games.

Many easier games like WoW and Rocket League are fine.

I'm not regretting my PG278AQ, but I've gone back to my PG278Q (1440p) for now until I buy my two Pascal cards.
 
Last edited:
Soldato
Joined
9 Dec 2006
Posts
9,286
Location
@ManCave
That's the thing though: if you're buying a card for 4K you NEED 4GB+ to make it look as good as 1440p.

4K with 4GB = Medium settings on average, so:
NO HBAO
NO AA
NO special effects

Or

1440p with 4GB = High/Ultra:
HBAO+
AA
All effects

99% of the time 1440p looks better thanks to HBAO and the other effects. Medium textures at 4K look horrid, like something off a PS2.

So yes, you can game at 4K with 4GB, but expect the experience to be very mixed. I love 4K, but until Pascal, 1440p looks better and smoother.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
So... the whole Capsaicin event is done, and we learned... nothing? Nothing about price, performance, or release date.

When's the next "event" then? Computex in May?
 
Associate
Joined
4 Nov 2013
Posts
1,437
Location
Oxfordshire
So... the whole Capsaicin event is done, and we learned... nothing? Nothing about price, performance, or release date.

When's the next "event" then? Computex in May?

Because it was the Game Developers Conference... so they talked about games and VR, and just gave us a sneak peek at the hardware.
 