AMD Polaris architecture – GCN 4.0

Associate
Joined
26 May 2012
Posts
1,582
Location
Surrey, UK
Looks like I sparked a fire with the DVI thing. Apologies folks, I didn't intend to come across as a jerk with that, but I will admit I favour DisplayPort. I was asking mostly because folks who buy the very pricey cards tend to have DP or HDMI monitors. Whoever said that folks with low-to-mid-range GPUs tend to own monitors with DVI is absolutely right. But that was my question: if Polaris 10 has DVI, would that mean it's only the mid-range card?

*OFF-TOPIC - feel free to ignore*
FYI, I've never owned a monitor with just DVI, so I'm not too familiar with them. The earliest monitors I had were all VGA (aka D-sub) only. Then a couple of years back I bought a 1080p monitor that had all the I/O options (including DVI). Aside from last year, when I connected an HDMI laptop to the monitor via a converter cable (the HDMI port was already in use), I've never used it, since most high-end GPUs have DP and the low-end ones have VGA. The way I see it, anyone buying such a GPU is likely to have a monitor with DP or HDMI. That, and old technology gets phased out. Which is why I thought it strange for next-gen (high-end) GPUs to have DVI.

But yeah, I will agree that including an older connection doesn't hurt anyone, other than those who want to hard-mod their GPUs into a single slot. I just thought it strange that old tech is still being catered to, possibly without need. I don't mean to offend DVI users (rather, I'm curious: when did you buy your monitor?).

So... the whole Capsaicin event is done, and we learned... nothing? Nothing about price, performance, or release date.

When's the next "event" then? Computex in May?

We didn't learn nothing. We learnt that Polaris 10 is at least as powerful as a Fury X, as small as a Nano, and will probably have DP 1.3, HDMI 2.0 and DVI. Judging by the inclusion of DVI, Polaris 10 is likely a mid-range card to be priced at 390/970 levels. With that in mind, Vega is likely to be the top-end product à la Fury X/980 Ti.

Considering they had a demo, we can be confident Polaris will release within a few months.
 
Soldato
Joined
30 Dec 2011
Posts
5,554
Location
Belfast
That's the thing though: if you're buying a card for 4K, you NEED 4GB+ to make it look as good as 1440p.

4K with 4GB = medium (that's what you're limited to on average), so:
NO HBAO
NO AA
NO special effects

Or
1440p with 4GB = high/ultra
HBAO+
AA
All effects

99% of the time 1440p looks better thanks to HBAO and the effects. Medium textures at 4K look horrid, like something off a PS2.

So yes, you can game at 4K with 4GB, but expect the experience to be very mixed. I love 4K, but until Pascal arrives, 1440p looks better and smoother.

This is all subjective. I have yet to play any game at 4K with a single Fury/FreeSync combo at less than a mix of high to max settings. I can tolerate FPS from 35+ with FreeSync, but without it even 45 FPS looks like crap. Turning off all these pointless and, IMHO, hideous special effects makes games run far smoother and look far better:

Blur
DoF
Vignette
Lens flare etc

Even Witcher 3 was mostly max settings, apart from the pointless DoF, vignette and whatever other "special effects". Of course my opinion is subjective, but please don't try to claim that the settings YOU feel are necessary at 4K are what everyone else needs to be comfortable with.
 
Soldato
Joined
28 May 2007
Posts
10,102
This is all subjective. I have yet to play any game at 4K with a single Fury/FreeSync combo at less than a mix of high to max settings. I can tolerate FPS from 35+ with FreeSync, but without it even 45 FPS looks like crap. Turning off all these pointless and, IMHO, hideous special effects makes games run far smoother and look far better:

Blur
DoF
Vignette
Lens flare etc

Even Witcher 3 was mostly max settings, apart from the pointless DoF, vignette and whatever other "special effects". Of course my opinion is subjective, but please don't try to claim that the settings YOU feel are necessary at 4K are what everyone else needs to be comfortable with.

I also turn these effects off, as they spoil the image and add nothing bar lower FPS. Just about everyone I know turns them off before even playing a game. Film grain is another one that's creeping in.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
There are basically two games that 'need' more than 4GB at 4K, and in both the top texture setting is simply uncompressed versions of the same textures used at the lower setting. Meaning that if you use high instead of ultra you get an identical experience... but half the memory usage. An option was added (in Nvidia GameWorks games) that magically drops performance without increasing IQ at all.

These options were added by a company pushing high-end cards with more than 4GB, creating a reason to need more than 4GB of memory, nothing more or less.
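To put some rough numbers on that (purely illustrative, a made-up single texture rather than any specific game): an uncompressed 4096x4096 RGBA8 texture is around 64 MiB before mipmaps, while the same texture block-compressed as BC3/DXT5 is a quarter of that, and BC1/DXT1 an eighth. A quick Python sketch of the arithmetic:

# Rough texture-memory arithmetic (illustrative only; real games mix formats and resolutions).
def texture_mib(width, height, bytes_per_pixel, mipmaps=True):
    # A full mip chain adds roughly one third on top of the base level.
    base = width * height * bytes_per_pixel
    return base * (4 / 3 if mipmaps else 1) / (1024 ** 2)

print(f"uncompressed RGBA8: {texture_mib(4096, 4096, 4.0):.0f} MiB")  # ~85 MiB with mips
print(f"BC3/DXT5 (4:1):     {texture_mib(4096, 4096, 1.0):.0f} MiB")  # ~21 MiB with mips
print(f"BC1/DXT1 (8:1):     {texture_mib(4096, 4096, 0.5):.0f} MiB")  # ~11 MiB with mips

Multiply that gap across a few hundred textures and you can see how an "ultra" uncompressed pack balloons VRAM use without the source art actually being any more detailed.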
 
Man of Honour
Joined
21 May 2012
Posts
31,922
Location
Dalek flagship
There are basically two games that 'need' more than 4GB at 4K, and in both the top texture setting is simply uncompressed versions of the same textures used at the lower setting. Meaning that if you use high instead of ultra you get an identical experience... but half the memory usage. An option was added (in Nvidia GameWorks games) that magically drops performance without increasing IQ at all.

These options were added by a company pushing high-end cards with more than 4GB, creating a reason to need more than 4GB of memory, nothing more or less.

Give it a rest, 4GB is not enough @2160p in a number of games now.

Unlike you, I am not biased on the subject; I own 12 cards with 4GB and it is just not enough for 4K.

Just because you don't use an mGPU setup @2160p does not mean that no one should.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
Give it a rest, 4GB is not enough @2160p in a number of games now.

Unlike you, I am not biased on the subject; I own 12 cards with 4GB and it is just not enough for 4K.

Just because you don't use an mGPU setup @2160p does not mean that no one should.

Give it a rest, biased, lol. Firstly, how can I be biased against more than 4GB? I recommended against wasting money on the 8GB version of a 290X before there was ever a Fury X or a Titan X... but don't let that stop you.

Then there's you yourself: you've stopped saying which games need it and just broadly state it. You also repeatedly use the illogical argument that if you haven't seen the card physically in person you know less about it. Like your inability to understand why a core 30% larger with the same power output will run cooler, which is a basic fact, while calling Fury X a hotter-running core than Hawaii... but you know better, because you've physically taken the cooler off one, which apparently trumps mere physics and huge amounts of evidence.

You cited TWO games the last time you actually named games for this spurious claim, Shadow of Mordor and XCOM 2, two games whose top texture setting is simply uncompressed versions of the same textures used at the lower setting. Both are Nvidia games, which is a complete coincidence, and both came out just as Nvidia pushes higher-priced cards with more memory.

If there were an 8GB version of the Fury X, I'd recommend the cheaper 4GB one. There is an 8GB version of the 290X/390, and in both cases I DO recommend the cheaper 4GB version. I have told people who are already convinced they will go Nvidia to get a 980 Ti, because the extra memory of the Titan X is worthless for gaming.

If you care at all about IQ, you don't need 4+ GB. If you only care about pushing every option to the max regardless of the IQ difference, then you need more than 4GB, but only because Nvidia have worse memory usage (they have since the X800) and they will pay devs to increase memory usage to make themselves look better.
 
Soldato
Joined
30 Nov 2011
Posts
11,356
If you aren't interested in multi-GPU, then don't even bother talking 4K. It's pointless running 4K on low settings when a lower resolution on high actually looks better. I think most people buying into 4K realise what requirements come with it.
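Just to put the resolutions in perspective (raw pixel counts only, nothing engine-specific, so treat it as back-of-envelope): 4K pushes 2.25x the pixels of 1440p and 4x 1080p, which is roughly the extra grunt you need before you even touch the settings. A quick sketch of the sums:

# Raw pixel counts per resolution and the size of one 32-bit (RGBA8) buffer at that resolution.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels, {count * 4 / 2**20:.1f} MiB per 32-bit buffer")
print("4K vs 1440p:", round(pixels["4K"] / pixels["1440p"], 2), "x the pixels")
print("4K vs 1080p:", round(pixels["4K"] / pixels["1080p"], 2), "x the pixels")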
 
Associate
Joined
25 Aug 2015
Posts
120
Give it a rest, 4GB is not enough @2160p in a number of games now.

Unlike you, I am not biased on the subject; I own 12 cards with 4GB and it is just not enough for 4K.

Just because you don't use an mGPU setup @2160p does not mean that no one should.

Well, my experience going from a 4GB Nvidia card to a 6GB Nvidia card at 1440p was that it was a worthwhile upgrade. Yes, I do like to turn everything up to max, and 6GB made a difference. From my own personal experience, games look better with the graphics options turned up, and for that I needed more than 4GB.
 
Man of Honour
Joined
21 May 2012
Posts
31,922
Location
Dalek flagship
Give it a rest, biased, lol. Firstly, how can I be biased against more than 4GB? I recommended against wasting money on the 8GB version of a 290X before there was ever a Fury X or a Titan X... but don't let that stop you.

Then there's you yourself: you've stopped saying which games need it and just broadly state it. You also repeatedly use the illogical argument that if you haven't seen the card physically in person you know less about it. Like your inability to understand why a core 30% larger with the same power output will run cooler, which is a basic fact, while calling Fury X a hotter-running core than Hawaii... but you know better, because you've physically taken the cooler off one, which apparently trumps mere physics and huge amounts of evidence.

You cited TWO games the last time you actually named games for this spurious claim, Shadow of Mordor and XCOM 2, two games whose top texture setting is simply uncompressed versions of the same textures used at the lower setting. Both are Nvidia games, which is a complete coincidence, and both came out just as Nvidia pushes higher-priced cards with more memory.

If there were an 8GB version of the Fury X, I'd recommend the cheaper 4GB one. There is an 8GB version of the 290X/390, and in both cases I DO recommend the cheaper 4GB version. I have told people who are already convinced they will go Nvidia to get a 980 Ti, because the extra memory of the Titan X is worthless for gaming.

If you care at all about IQ, you don't need 4+ GB. If you only care about pushing every option to the max regardless of the IQ difference, then you need more than 4GB, but only because Nvidia have worse memory usage (they have since the X800) and they will pay devs to increase memory usage to make themselves look better.

In a year's time, would you like to buy a 4GB card? I may even give you a special discount on one of mine, but I suspect that in a year's time you will have an 8GB card, lol.

As for games that use more than 4GB, I have posted GPU-Z screenshots on a number of occasions for various games; perhaps if you called your Watch Dogs off you might spot them. People could be committing Grand Theft Auto right in front of you and you would not know what was going on. :D
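GPU-Z screenshots are the easy way on Windows. For anyone who would rather log usage over a session than screenshot it, something like the sketch below works on an Nvidia card (it just polls nvidia-smi, so it assumes the Nvidia driver tools are installed and on the PATH; AMD cards need GPU-Z or the Radeon overlay instead, so treat this as a rough example, not a universal tool):

# Minimal VRAM-usage logger via nvidia-smi (Nvidia cards only; rough sketch).
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    for idx, line in enumerate(out.strip().splitlines()):
        used, total = (int(v) for v in line.split(", "))
        print(f"GPU {idx}: {used} / {total} MiB in use")
    time.sleep(5)  # sample every 5 seconds while the game is running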
 
Man of Honour
Joined
21 May 2012
Posts
31,922
Location
Dalek flagship
The very same games that even a 980 Ti can't play at usable FPS at those settings? Yay, you need 4.5GB of VRAM to play at 20 FPS. No thanks. And don't bring up mGPU; I'm not spending £1,200 when 99% of games don't work with SLI/CrossFire, just to claim 4GB is not enough.

This is the bit that DM never understands.

Every gamer will have a different view of how they want their games to run, and they should be free to act on it rather than having to accept that what is good for DM is good for everyone else.
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
It doesn't; look at the Fury non-X: a full-size cooler on a small board. The GTX 970/980 are really small boards but sport full-size coolers.

That's what gives the Sapphire Fury Tri-X an advantage over the Strix Fury: having nothing blocking the third fan at the back allows airflow to pass through a lot more easily, resulting in cooler temps at lower fan speeds, which makes it quieter as well.

I love my Fury Tri-X; to live with, it's close to perfect: silent and cool.
If they do a next-gen replacement with at least 8GB of RAM and 50-75% more grunt, that'll be my 3440x1440 card.
 