
Next Gen Display Experiences! HDR Polaris graphics

What does the graphics card do where HDR is concerned? I thought it was monitor based?

The GPU still needs to be able to process the HDR content. The grunt of a current GPU may be enough, but it depends on the software doing the processing.

Basically, at the moment we transmit in 8-bit, which offers 16.7 million shades; 10-bit HDR10 offers around 1.07 billion, and 12-bit Dolby Vision around 68.7 billion. The source content needs to be HDR, and then the TV processes from the source.
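The shade counts quoted above fall out of simple arithmetic: an n-bit panel has 2^n levels per colour channel, and three channels per pixel. A quick sketch:

```python
# An n-bit signal has 2**n levels per channel; a pixel has three
# channels (R, G, B), so total shades = (2**n) ** 3.
def total_shades(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel
    return levels ** 3

for bits, label in [(8, "8-bit (SDR)"),
                    (10, "10-bit (HDR10)"),
                    (12, "12-bit (Dolby Vision)")]:
    print(f"{label}: {total_shades(bits):,} shades")
# 8-bit (SDR): 16,777,216 shades
# 10-bit (HDR10): 1,073,741,824 shades
# 12-bit (Dolby Vision): 68,719,476,736 shades
```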

Of course, with gaming the PC is the source, so the GPU has to render the game in HDR to send to the TV/monitor. That is my understanding of it.
 
Would they need to redo all the textures, do you think?
It needs to display enough colours, right?

I believe a lot of the textures used in games already offer a greater dynamic range and are actually rendered with HDRR (high dynamic range rendering); the extra range is just not visible due to GPU/monitor limitations.

I believe Valve first showed off HDRR back in 2004.

AMD has supported 30-bit monitors/10-bit colour since their R300 range. NVIDIA also supports it in the form of their Quadro cards, although it does not appear on the 900 range. Not sure about the 10 series.

Now TVs are finally catching up to this, and thus AMD is starting to push it for people to see.

An OLED HDR 4k/5k with Freesync/Gsync would be awesome, but it is normally input lag that causes issues until things mature further.
 
A 4k 120Hz OLED HDR monitor is the nirvana, with freesync/g-sync as appropriate.

Of course, with the (extra) burden of HDR we will all need 1280TI cards by then, plus I can't see HDR OLED monitors being affordable for another few years yet.
 
A 4k 120Hz OLED HDR monitor is the nirvana, with freesync/g-sync as appropriate.

Of course, with the (extra) burden of HDR we will all need 1280TI cards by then, plus I can't see HDR OLED monitors being affordable for another few years yet.

The card itself will push out an HDR picture assuming it has DP 1.3 or DP 1.4, as it is bandwidth that we are lacking, not processing power for the source content.

As for 4k 120Hz, I would suggest we will be pushing for silly refresh rates as bandwidth increases before 8k hits the market, so we might even end up with 4k 240Hz. Volta should be out by the time we are pushing towards any of this, although with it so far rumoured to stick with a 16nm chip, that could maybe hold things back.
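The bandwidth point can be sketched with back-of-the-envelope numbers. This is a rough check only: it ignores blanking intervals, chroma subsampling and DSC compression, and assumes the usual figure of ~25.92 Gbit/s effective payload for DP 1.3/1.4 HBR3 (32.4 Gbit/s raw minus 8b/10b coding overhead), so real requirements are somewhat higher than shown.

```python
# Rough uncompressed video bandwidth: pixels/second * bits/pixel.
# Ignores blanking intervals and compression, so real links need more.
def video_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    bits_per_pixel = 3 * bits_per_channel  # RGB, no chroma subsampling
    return width * height * hz * bits_per_pixel / 1e9

DP13_EFFECTIVE_GBPS = 25.92  # HBR3: 32.4 Gbit/s raw minus 8b/10b overhead

for hz in (60, 120, 144):
    need = video_gbps(3840, 2160, hz, 10)
    fits = "fits" if need <= DP13_EFFECTIVE_GBPS else "exceeds"
    print(f"4k {hz}Hz 10-bit needs ~{need:.1f} Gbit/s ({fits} DP 1.3/1.4 payload)")
```

On these rough numbers, 4k 60Hz 10-bit fits comfortably, while 4k 120Hz and up exceeds the uncompressed payload, which is why DP 1.4 leans on DSC for the higher modes.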
 
4K OLED HDR VRR 144Hz ULMB Ultrawide when?

Maybe you don't need the ULMB with OLED. Or the 144Hz upper limit. So probably not the VRR either... 4K OLED HDR Ultrawide when?...
 
I read that HDR gives you little when used with a monitor; it's more suitable for a large-screen TV viewed at a regular/average viewing distance, as it loses impact when you sit close to the screen.

I'm going to respectfully but utterly disagree with this. HDR gives you a dramatic improvement to how good an image looks. I've not seen it in games but in all image-based work and movies, it makes a real difference.

I sit around 60cm away from my monitors. 4K is fairly meaningless for me on 24" monitors. Okay, not meaningless but definitely in the "sure, if it's cheap" range on my scale of values. But HDR I will notice immediately and the difference will be very noticeable. I've not bothered getting a new monitor for 4K, I will for HDR.
 
What does the graphics card do where HDR is concerned? I thought it was monitor based?

Graphics cards need to be able to send more detailed information to the monitor. It's not the case that the monitor sees two instructions to make a pixel green and decides to split them up into slightly different brightnesses of green through some cleverness, but rather that the graphics card must tell it to do pixel x,y at value This and its neighbour at value That.
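A toy illustration of that extra per-pixel precision (my own sketch, not from the thread): quantise a smooth brightness ramp at 8 and at 10 bits and count how many distinct output levels the display is actually asked to show. The 10-bit signal distinguishes neighbouring values that the 8-bit signal collapses into one, which is what causes visible banding in smooth gradients.

```python
# Quantise a smooth 0..1 gradient at a given bit depth and count the
# distinct output levels the display would be asked to show.
def quantise(values, bits):
    levels = 2 ** bits - 1
    return [round(v * levels) for v in values]

gradient = [i / 9999 for i in range(10000)]  # smooth ramp, 10,000 samples
print(len(set(quantise(gradient, 8))))   # 256 distinct levels
print(len(set(quantise(gradient, 10))))  # 1024 distinct levels
```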

Honestly, I think HDR support is the number one reason not to buy an older graphics card at the moment.
 
Don't you think the average consumer is going to be confused as hell, though?
4k, UHD, Dolby Vision, HDR... most people are just getting their heads around HD, and even then the older generation are still happy with SD; no way are they going to know what HDR is!
The only way it would change would be the removal of SD altogether, a bit like when they switched off analogue TV signals :)
 
Depends if Screen vendors actually implement it across a wide range.

Free-Sync at 1440p is pretty dire; I know, I was recently looking and gave up.

G-Sync is even worse.

https://forums.overclockers.co.uk/showthread.php?t=18739709

Sorry, slightly off-topic but in response to your old thread I recommend the Dell U2515H for 1440p. It can be found for around £250 and although it's not g-sync or freesync I genuinely haven't had any issues with screen tearing/ghosting etc. It's an absolutely cracking monitor for the price.
 