
AMD Polaris architecture – GCN 4.0

Soldato
Joined
26 Aug 2004
Posts
5,052
Location
South Wales
I think some are expecting way too much from the next-gen high-end cards. I mean, if newer titles are pushing a 980 Ti quite hard at 1080p, then the new cards will be decent for 1440p for a little while, but for 4K? No chance. Obviously, if you don't care about maintaining a high frame rate then it isn't going to matter much.

This is partly the reason why I haven't even stepped up to 1440p: it's not even fully conquered with one high-end GPU yet, which is pretty evident now. Plus I'm used to 100+ Hz and wouldn't be going back to 60 in a hurry.

Dealing with 4K means SLI/CrossFire, which has its own issues I'd rather not bother with. So someone like me is stuck if they want 4K, at least.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
Even if those "lower settings" are only a few minor tweaks, such as GameWorks effects, or extreme tessellation that makes no difference to IQ?

I obviously run textures, shadows, lighting, HBAO etc. all at max. Those are the main goodies IMO, and they all look so much better at 1440p.

People often max everything needlessly because they believe "maxed" means best.

For a decade we've had games where some high settings look terrible or reduce IQ while others look great.

DOF/blur, certain shadow settings, film grain, sun glare: a lot of effects cost performance AND reduce IQ, and frankly I have to laugh at people who insist on "maxing" settings by enabling all of these things and giving up performance to do so.


The one thing I'm starting to really, really dislike about game devs is that they constantly include settings that reduce IQ. Devs spend time screwing around with motion blur and enable it by default, and 95% of people hate it. FXAA: how a single dev has ever included it in a game I don't know, and how any dev makes it the default AA method is beyond comprehension. Using SweetFX to sharpen FXAA in, for instance, FO4 gives a night-and-day difference in IQ for an almost invisible performance cost, while they ship light rays that are frequently inaccurate and come with a needlessly performance-sapping top setting.

Gamers really aren't vocal enough about this; we should have petitions telling devs to stop using FXAA without sharpening afterwards, because it looks terrible with the blur. Probably 90% of gamers (meh, 99%) won't go and mess with SweetFX and inject settings into a game, and frankly they aren't getting the experience they deserve. The mad thing is that sharpening FXAA is maybe 20 minutes' work for a dev after perhaps five years making a game... yet it's ignored.

I'd take 1440p, push for a 70fps+ average, and disable DOF/blur, excessively performance-killing shadows, awful post-processing effects, and poor-quality AA/glare/over-the-top effects, over 1080p with everything set to high regardless of whether it actually looks good.

So many people believe very high or ultra automatically looks better than high/medium because logically it should; in reality that isn't true for every effect, in most games these days.
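For anyone curious what "sharpening after FXAA" actually does: it's basically an unsharp mask, the same idea behind SweetFX's LumaSharpen (this is a hypothetical sketch of the concept in Python, not SweetFX's actual shader code). You blur the image slightly, then add back the difference between the original and the blur, which restores edge contrast that FXAA smeared away:

```python
def sharpen_1d(luma, strength=0.65):
    """Unsharp mask on a 1D strip of luma values in [0, 1]:
    subtract a 3-tap box blur from each sample and add the
    difference back, scaled by `strength` (both hypothetical
    names for illustration)."""
    n = len(luma)
    out = []
    for i in range(n):
        # clamp neighbours at the edges of the strip
        left = luma[max(i - 1, 0)]
        right = luma[min(i + 1, n - 1)]
        blurred = (left + luma[i] + right) / 3.0
        # add back the detail the blur removed, clamped to [0, 1]
        out.append(min(1.0, max(0.0, luma[i] + strength * (luma[i] - blurred))))
    return out

# A soft edge, like the ones FXAA leaves behind: the dark side gets
# darker and the bright side brighter, so the edge reads as crisper.
edge = [0.2, 0.2, 0.4, 0.6, 0.8, 0.8]
out = sharpen_1d(edge)
```

It really is that small, which is the point of the rant: a post-pass like this costs next to nothing and mostly undoes FXAA's blur.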
 
Last edited:
Soldato
Joined
3 Sep 2010
Posts
2,846
I think some are expecting way too much from the next-gen high-end cards. I mean, if newer titles are pushing a 980 Ti quite hard at 1080p, then the new cards will be decent for 1440p for a little while, but for 4K? No chance. Obviously, if you don't care about maintaining a high frame rate then it isn't going to matter much.

This is partly the reason why I haven't even stepped up to 1440p: it's not even fully conquered with one high-end GPU yet, which is pretty evident now. Plus I'm used to 100+ Hz and wouldn't be going back to 60 in a hurry.

Dealing with 4K means SLI/CrossFire, which has its own issues I'd rather not bother with. So someone like me is stuck if they want 4K, at least.

I was running BF4 at 115 fps in Eyefinity (5040x1050) with a single AMD card. I don't understand what you're talking about. I still own kids, at age 52, in spite of the slight input lag Eyefinity adds.

Polaris is going to be superb. The die shrink is what's making most of the performance boost anyway; as we saw with 28nm, there's only so much engineers can do with moving electrons.

Since I'm highly sensitive to latency, I run single cards, as SLI is too poor in that regard. AMD has stepped up on latency and their cards totally blow Nvidia's out of the water.

4K is viable already with the Nano. With Polaris we're seeing some amazing progress in that arena, and it's just the start. Cards and drivers are already up and running; great days ahead.
 
Associate
Joined
8 May 2014
Posts
2,288
Location
france
Polaris: HDR, DisplayPort 1.3... 4K 120Hz. A CrossFire flagship Polaris will probably get you 120fps at 4K; if not, 3-way definitely will.
Most high-end cards today stomp on 1440p. Even an R9 390 handles it pretty well on a £250 budget, and Fury and 980 Ti get most games to 60fps or close at high settings; a few tweaks and you can get a steady 60fps average.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
HDR uses a LOT more bandwidth to the monitor. From what I recall, HDR at 4K won't be doable at 120Hz on DP 1.3: the link will handle 120Hz (maybe 144 or even more) at 4K without HDR, or 60Hz with HDR.

The sweet spot for gaming will probably continue to be 1440p (either normal or ultrawide) at 144Hz with HDR, or 4K/120Hz without HDR.
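The back-of-the-envelope maths supports this. A rough sketch, assuming DP 1.3's HBR3 link (32.4 Gbit/s raw over four lanes, 80% usable after 8b/10b encoding) and ignoring blanking overhead, which adds a few percent in real timings:

```python
# Rough check of the DP 1.3 bandwidth claim. Assumptions: HBR3 gives
# 32.4 Gbit/s raw over four lanes, 8b/10b encoding leaves 80% usable,
# and blanking overhead is ignored (real timings need a few % more).
LINK_GBPS = 32.4 * 0.8  # ~25.92 Gbit/s of usable payload

def needed_gbps(width, height, hz, bits_per_channel):
    # three channels (RGB) per pixel
    return width * height * hz * bits_per_channel * 3 / 1e9

for label, bpc, hz in [("4K 120Hz SDR (8-bit)", 8, 120),
                       ("4K 120Hz HDR (10-bit)", 10, 120),
                       ("4K 60Hz HDR (10-bit)", 10, 60)]:
    gbps = needed_gbps(3840, 2160, hz, bpc)
    verdict = "fits" if gbps <= LINK_GBPS else "too much"
    print(f"{label}: {gbps:.1f} Gbit/s -> {verdict}")
```

So 8-bit 4K/120Hz squeezes in at roughly 23.9 Gbit/s, while 10-bit HDR at the same refresh needs about 29.9 Gbit/s and blows the budget; drop to 60Hz and HDR fits easily.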
 
Associate
Joined
8 May 2014
Posts
2,288
Location
france
HDR uses a LOT more bandwidth to the monitor. From what I recall, HDR at 4K won't be doable at 120Hz on DP 1.3: the link will handle 120Hz (maybe 144 or even more) at 4K without HDR, or 60Hz with HDR.

The sweet spot for gaming will probably continue to be 1440p (either normal or ultrawide) at 144Hz with HDR, or 4K/120Hz without HDR.


Yes: 4K 144Hz SDR, 4K 60Hz HDR, or 5K 60Hz SDR, that's as far as DisplayPort 1.3 goes; otherwise go Thunderbolt.
There's no content for HDR anyway. Games need to be made with HDR in mind; I don't think there's any kind of backward compatibility.
But until then, 4K at 120Hz is pretty sweet.
 
Associate
Joined
3 Jan 2010
Posts
1,379
Yes: 4K 144Hz SDR, 4K 60Hz HDR, or 5K 60Hz SDR, that's as far as DisplayPort 1.3 goes; otherwise go Thunderbolt.
There's no content for HDR anyway. Games need to be made with HDR in mind; I don't think there's any kind of backward compatibility.
But until then, 4K at 120Hz is pretty sweet.
Dang, it could have been awesome if we could have it all without paying G-Sync prices too :0. Looks like it will be 60Hz but with HDR for me. I tend to run single cards so I don't expect millions of fps, but I love having some nice screen tech to make the image better, so 4K HDR sounds like the next way forward for me.
 
Soldato
Joined
22 Aug 2008
Posts
8,338
There are cabling standards that can provide the needed bandwidth, although DP 1.4 is the likely way forward:

On January 27, 2016, VESA announced that DisplayPort version 1.4 will support the Rec. 2020 color space.

https://en.wikipedia.org/wiki/Rec._2020
http://www.prnewswire.com/news-rele...ons-and-richer-display-content-300210420.html
https://en.wikipedia.org/wiki/DisplayPort#1.4

DP 1.3 hasn't even arrived in monitors yet, so who knows how long we'll have to wait.

I'm gonna switch to a 4K120 OLED this year, so I'll be fine.
 
Last edited:
Soldato
Joined
3 Sep 2010
Posts
2,846
Great news. And the $4999 price tag is a good sign I think, considering it's 4K 120 Hz, and 30".

Perhaps we'll see some reasonably priced 27" 1440p ones over the next year.

4K OLED makes sense; 1440p doesn't.
Reasonably priced OLED happens maybe in 2018. Unless they've made a huge breakthrough in OLED production, it will continue to be expensive.

OLED with HDR at 120Hz+ in 4K would be a dream for sure, but I'm not hopeful about the pricing there.

Polaris will run it with great brightness, starlike even
 
Soldato
Joined
7 Feb 2015
Posts
2,864
Location
South West
4K OLED makes sense; 1440p doesn't.
Reasonably priced OLED happens maybe in 2018. Unless they've made a huge breakthrough in OLED production, it will continue to be expensive.

OLED with HDR at 120Hz+ in 4K would be a dream for sure, but I'm not hopeful about the pricing there.

Polaris will run it with great brightness, starlike even

1440p and 1600p will be the only resolutions where we'll see HDR at 120-144Hz; DP 1.3a doesn't support anything higher than 60Hz HDR at 4K.
Although I guess some may make 120Hz HDR 4K monitors with dual DisplayPort inputs, if that's possible.
 
Last edited:
Soldato
Joined
26 Aug 2004
Posts
5,052
Location
South Wales
I was running BF4 at 115 fps in Eyefinity (5040x1050) with a single AMD card. I don't understand what you're talking about. I still own kids, at age 52, in spite of the slight input lag Eyefinity adds.

Polaris is going to be superb. The die shrink is what's making most of the performance boost anyway; as we saw with 28nm, there's only so much engineers can do with moving electrons.

Since I'm highly sensitive to latency, I run single cards, as SLI is too poor in that regard. AMD has stepped up on latency and their cards totally blow Nvidia's out of the water.

4K is viable already with the Nano. With Polaris we're seeing some amazing progress in that arena, and it's just the start. Cards and drivers are already up and running; great days ahead.

Yeah, I'm not saying you can't get a decently playable experience at 4K; I was just having a bit of a rant, wishing single cards were a lot more powerful than they are. More fps is always a good thing :)
 
Last edited:
Associate
Joined
2 Aug 2009
Posts
1,108
Location
UK
Will somebody ELI5 HDR monitors? Is it a new panel type named after high dynamic range? Will it support high refresh rates and low response times?
 
Soldato
Joined
3 Sep 2010
Posts
2,846
Will somebody ELI5 HDR monitors? Is it a new panel type named after high dynamic range? Will it support high refresh rates and low response times?

I'm not sure what you're asking, but it's not a new panel type: OLED supports high refresh rates and low latency, and adding HDR allows a wider dynamic range of contrast, colour and light.

All supported by AMD's Polaris GPUs.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Great news. And the $4999 price tag is a good sign I think, considering it's 4K 120 Hz, and 30".

Perhaps we'll see some reasonably priced 27" 1440p ones over the next year.

Wow, that's cheap. Let me check down the side of the sofa.

Hmm... wait... that feels like money... there, 20p! Maybe next year, then.
 