Next Gen Display Experiences! HDR Polaris graphics

The vast majority of PC gamers (and gamers in general) get along with 60Hz displays just fine.

The majority of gamers also get along just fine on integrated graphics. The majority of gamers don't know any better.

The majority of people who try higher refresh rate monitors won't go back to 60Hz.
 
Well, the new G6 and I guess E6 are meant to be 35ms, that's not so bad.
I do think the Samsung has better colours though! And yeah, half the response time.

Not sure where you tested/compared, but I would say that OLED should always produce better colours, especially out of the box. I don't know which Samsung model you are referring to either.

If you have only seen them in different shops, I would honestly disregard that, as there are too many external factors to consider. You want a proper demo with them calibrated at a proper audio/visual store, or to be allowed to take them home to demo in your own space, calibrated to that room.

35ms is better, but until they are down to 20ms I think I will be waiting personally.

As for the Samsung, I don't know which model you are talking about directly, but in all honesty they have never been known for great colour reproduction.
 
Hell yeah. 4K all the way for me. I had a 144Hz IPS FreeSync monitor for a few days and it was nothing to write home about, as I do not play much FPS anyway, much less competitive FPS.

The IQ difference between my Dell's 2160p and the 1440p Asus MG279Q was night and day, not to mention the Asus had severe backlight bleed.

I will take IQ every time; this is why I look forward to HDR :D

That's fine, IQ is the most important thing to you. To some people smooth, tear-free gaming is more important.

You were unlucky with the Asus; my first one was bad, but the one I have now is perfect. No backlight bleed, very little IPS glow.

And you are right, FreeSync isn't for everyone. I made that point in an earlier post: if you aren't the type to notice the difference between 60Hz and 144Hz when using a monitor, then FreeSync isn't going to make a difference, as you know. So Humbug should just buy a good 1440p monitor, be done with it and forget about FreeSync.
 
As for the Samsung, I don't know which model you are talking about directly, but in all honesty they have never been known for great colour reproduction.
I used to love how vibrant Samsung TVs' colours are... but after spending lots of time looking at other brands' TVs as well, I now find Samsung's colour a bit OTT; it looks almost artificial and far from natural.
 
I used to love how vibrant Samsung TVs' colours are... but after spending lots of time looking at other brands' TVs as well, I now find Samsung's colour a bit OTT; it looks almost artificial and far from natural.

It's the "Loudness War" of displays, imo. (If the term isn't familiar, I recommend searching for it, as it's interesting.)
 
I used to love how vibrant Samsung TVs' colours are... but after spending lots of time looking at other brands' TVs as well, I now find Samsung's colour a bit OTT; it looks almost artificial and far from natural.

Aye, they are often oversaturated and push brightness to make the picture look artificially better. I can see why people may like this, but it loses detail and depth compared to the properly calibrated colour that other brands now get much closer to out of the box.
 
Not sure where you tested/compared, but I would say that OLED should always produce better colours, especially out of the box. I don't know which Samsung model you are referring to either.

If you have only seen them in different shops, I would honestly disregard that, as there are too many external factors to consider. You want a proper demo with them calibrated at a proper audio/visual store, or to be allowed to take them home to demo in your own space, calibrated to that room.

35ms is better, but until they are down to 20ms I think I will be waiting personally.

As for the Samsung, I don't know which model you are talking about directly, but in all honesty they have never been known for great colour reproduction.

The new Samsung, I'm bad with model names.
KS something.
On contrast the OLED will always win, watching anything in the dark,
but the Samsung is brighter and I think the colours look better,
and not just by a little bit.
No, I've not been lucky enough to have a side-by-side test, so I'm going by memory, one that can't remember model names :p
 
Why? My monitor burns my retinas out at 75% brightness. Why do I want/need a 1000-nit display?

Okay, for a TV I can understand it, but not PC monitors.

Current displays do not operate in the same way.

First of all, HDR moves from 8-bit to 10-bit colour, increasing the palette from 16 million to over 1 billion colours, and pairs it with a wider colour gamut. So finally proper, more natural colour tones can be generated, as can better definition across the dynamic range from the sharpest whites to the darkest blacks, while maintaining much better detail.
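To put quick numbers on those colour counts (a back-of-the-envelope check, not from the articles linked below): 8 bits per channel gives 2^24 combinations, 10 bits gives 2^30.

```python
# Sanity check of the colour counts quoted above: three channels
# (R, G, B) at 8 bits each vs 10 bits each.
for bits, label in [(8, "SDR, 8-bit"), (10, "HDR10, 10-bit")]:
    colours = (2 ** bits) ** 3
    print(f"{label}: {colours:,} colours")
# SDR, 8-bit: 16,777,216 colours
# HDR10, 10-bit: 1,073,741,824 colours
```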

TVs have more processing options to adjust the output than monitors perhaps will have; see the video below to get an idea of how Dolby Vision (technically superior) on an OLED with fewer nits compares to HDR10 with additional processing on a 1000+ nit LED.

So far LG is the only main backer in the US/EU market to adopt Dolby Vision.
No games have yet been announced to use it that I am aware of, and it likely requires licensing fees for the developer; there does not seem to be the demand for it over standard HDR for games. The UHD Alliance's "Ultra HD Premium" certification is the alternative attempt to form some kind of standard. Sony, for example, did not produce a model this year under Ultra HD Premium, as the main spec calls for 1000 nits and Sony's is only approx 750 IIRC, to fully support HDR10 usage. Models from Philips and others are also sold as HDR compatible but again do not have the specs to offer 1000 nits. Perhaps by next year the models they offer will join Samsung SUHD under the Ultra HD Premium standard. This way it helps a consumer know they are at least buying a model with the specifications to fully accommodate it.

So a model with 1000 nits can potentially produce more range within the scenes shown than a monitor with 600, and while it perhaps doesn't make a massive difference at these levels, it does mean that quality will gradually increase. Unlike resolution, which is harder to judge from a distance, HDR will make more of an impact on general image quality.
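As a rough illustration of that 600 vs 1000 nits point, here is a sketch comparing dynamic range in photographic stops. The peak and black-level figures are assumptions picked for the example, not measurements of any real panel:

```python
import math

# Dynamic range in stops = log2(peak luminance / black level).
# Peak/black pairs below are assumed values for illustration only.
panels = {
    "SDR LCD, 450 nits peak, 0.45 nit black": (450, 0.45),
    "HDR LCD, 600 nits peak, 0.05 nit black": (600, 0.05),
    "HDR LCD, 1000 nits peak, 0.05 nit black": (1000, 0.05),
}
for name, (peak, black) in panels.items():
    print(f"{name}: ~{math.log2(peak / black):.1f} stops")
# ~10.0, ~13.6 and ~14.3 stops respectively: 600 vs 1000 nits is a
# smaller step than SDR vs HDR, matching the point above.
```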

HDR is best suited for relaxed or darker room environments.


Good HDR DEMO content available HERE

See Nvidia Gameworks Blog articles also.

This video goes into all the tech detail on the subject from Nvidia.

The Xbox One S will be the first device to have a fully released title built with HDR in mind, with Horizon 3. I am not aware if the PC version will support HDR from release or if it comes later by patch, but certainly Microsoft are bringing HDR via DirectX 12. I hope Digital Foundry or others look into it all in detail come the release. Other games were announced at E3 to use it on PC too, but in time it will become more adopted.


It will make a big difference on good displays...

From the Nvidia Gameworks Blog article...
HDR on the left, standard on the right. Notice the lack of detail in high-contrast areas on the standard image, and how much more natural HDR appears, with no detail lost in light and shadow areas.
 
I can see the benefit in HDR, but surely nits are how bright something is? My screen has a max brightness of around 450, so why isn't a 1000-nit screen way too bright for a monitor?

Yes, I think the better dynamic range is fantastic, but why can't you have an HDR screen with just 450 nits?

My gfx card and monitor are both already 10-bit anyway, so it's really only the dynamic range gained.
 
The new Samsung, I'm bad with model names.
KS something.
On contrast the OLED will always win, watching anything in the dark,
but the Samsung is brighter and I think the colours look better,
and not just by a little bit.
No, I've not been lucky enough to have a side-by-side test, so I'm going by memory, one that can't remember model names :p

If it was a KS model, they are the top end of the market bar none, with the range-topper being £17k, and we don't know the OLED model.

I would say they don't look brilliant in extremely lit sales rooms in places like Currys, although John Lewis are pretty good at showing them off well. So that may well be the case.

Pop into a local Richer Sounds or Sevenoaks and just ask them to compare, and I think you will be pleasantly surprised.
 
Hell yeah. 4K all the way for me. I had a 144Hz IPS FreeSync monitor for a few days and it was nothing to write home about, as I do not play much FPS anyway, much less competitive FPS.

The IQ difference between my Dell's 2160p and the 1440p Asus MG279Q was night and day, not to mention the Asus had severe backlight bleed.

I will take IQ every time; this is why I look forward to HDR :D



Yeah, it works perfectly fine for me.

Sure, I would not mind 120Hz+, but not at the expense of IQ.

How is the 1070 paired with a 4k monitor? What kind of frames do you get at what settings?
 
I can see the benefit in HDR, but surely nits are how bright something is? My screen has a max brightness of around 450, so why isn't a 1000-nit screen way too bright for a monitor?

Yes, I think the better dynamic range is fantastic, but why can't you have an HDR screen with just 450 nits?

My gfx card and monitor are both already 10-bit anyway, so it's really only the dynamic range gained.

The greater nit count is required to be able to produce areas on a screen, or in a scene, ranging from sharp whites to the darkest shadows. Only the required areas of the display use the maximum brightness.

It's not about total screen brightness, or display settings that blind you with oversaturated colours and blooming whites. Your screen cannot produce the detail or colour gamut range with standard RGB that Rec. 2020 provides with HDR. Everything can become more natural looking.
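To see why peak nits are about highlights rather than overall brightness, here is a minimal sketch of the PQ transfer function (SMPTE ST 2084) that HDR10 uses to map a signal value to absolute luminance. The constants come from the ST 2084 spec; the rest is purely illustrative:

```python
# PQ (SMPTE ST 2084) EOTF: normalised signal value in [0, 1] -> nits.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Luminance in nits for a normalised PQ signal value."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f} -> {pq_to_nits(s):8.1f} nits")
# Roughly 5, 92, 983 and 10000 nits: half the signal range covers the
# shadows and midtones, and only the top codes reach highlight levels.
```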

You are looking at it the wrong way; watch some of the videos and read about it in the links if you want to learn.
See the difference below in the sky colours and in the shadow detail on the hillside.

 
I think anyone into movies will choose Dolby Vision right now?

Does feel like another G-Sync/FreeSync thing :(
 
I can see the benefit in HDR, but surely nits are how bright something is? My screen has a max brightness of around 450, so why isn't a 1000-nit screen way too bright for a monitor?

Yes, I think the better dynamic range is fantastic, but why can't you have an HDR screen with just 450 nits?

My gfx card and monitor are both already 10-bit anyway, so it's really only the dynamic range gained.

HDR requires the higher nits to give greater shades, contrast and depth to the picture. It does not, however, mean that the whole screen is three times as bright as a non-HDR screen; the peak brightness is only used where the scene needs it.

Above, someone posted some further info that reinforces this. You can have an HDR screen with fewer nits, but it wouldn't be as good, because it would be compressing the detail/information down to its range rather than reproducing it naturally.
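A minimal sketch of that compression, using a generic Reinhard-style tone-mapping curve (what a real TV does is more sophisticated and vendor-specific; the 1000/450-nit figures are assumptions for illustration):

```python
# Squeeze content mastered for 1000 nits onto a 450-nit panel with an
# extended Reinhard curve. Note how highlight separation collapses.
def tone_map(nits: float, content_peak: float = 1000.0,
             panel_peak: float = 450.0) -> float:
    L = nits / panel_peak               # luminance relative to panel white
    w = content_peak / panel_peak       # brightest input that must fit
    Ld = L * (1 + L / (w * w)) / (1 + L)
    return Ld * panel_peak

for nits in (100, 450, 800, 1000):
    print(f"{nits:4d} nits in -> {tone_map(nits):5.1f} nits out")
# 800 and 1000 nits in (200 nits apart) come out at ~392 and 450 nits
# (~58 nits apart): highlight detail is flattened, not reproduced.
```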

Just because the graphics card and screen are 10-bit doesn't mean the source you are providing is encoded in 10-bit, so unfortunately that isn't the whole story either.

You need the source (game/movie etc.) to be 10-bit encoded, the processor (GPU/Blu-ray player) to be able to process the encoded data, and then the display (TV/monitor) to be able to understand what the processor sends it to show the content.

You don't always have a separate processor; your TV may be able to process the content itself, but that is new tech as well, and the source is then Netflix or other online services at this time.
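A quick illustration of the "source must be 10-bit" point: quantise a smooth brightness ramp as an 8-bit and a 10-bit source and count the distinct levels that survive (the coarse version is what shows up as banding on screen):

```python
# Count how many distinct levels a smooth 0..1 ramp keeps after being
# quantised to a given bit depth.
def distinct_levels(bits: int, samples: int = 100_000) -> int:
    steps = (1 << bits) - 1
    return len({round(i / (samples - 1) * steps) for i in range(samples)})

print(distinct_levels(8))   # 256  -> visible banding over HDR's range
print(distinct_levels(10))  # 1024 -> four times finer gradations
```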
 
That's fine, IQ is the most important thing to you. To some people smooth, tear-free gaming is more important.

You were unlucky with the Asus; my first one was bad, but the one I have now is perfect. No backlight bleed, very little IPS glow.

And you are right, FreeSync isn't for everyone. I made that point in an earlier post: if you aren't the type to notice the difference between 60Hz and 144Hz when using a monitor, then FreeSync isn't going to make a difference, as you know. So Humbug should just buy a good 1440p monitor, be done with it and forget about FreeSync.

I noticed the difference, but it was not as big a deal for me versus image quality.

As for the Asus, even if the backlight was perfect, the IQ was just nowhere near my Dell's. You have to see them side by side to see what I mean. The colour, image sharpness and quality are just much better.
 
I think anyone into movies will choose Dolby Vision right now?

Does feel like another G-Sync/FreeSync thing :(

I don't think so. Dolby are not trying to lock anyone into it, but it does need different processing to produce the better results. They are just ahead of the others in performance at this time.

Anyone can choose to use Dolby Vision; they just need to pay for the licence. With G-Sync you are locked to Nvidia, which would be like locking LG to Dolby, for instance. That won't happen though.

I would choose Dolby Vision if I had the corresponding parts to complete the loop, source material being the biggest part at this time.

If people produce the content for Blu-ray and online services to use Dolby Vision, then the rest will fall into place; if not, then HDR10 will win out. I believe we will go this route, as you can get HDR certified with HDR10, which is free in comparison to the Dolby licence.
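For context on the HDR10 side of that trade-off: HDR10 attaches one set of static metadata to a whole stream (the mastering display info plus MaxCLL/MaxFALL), while Dolby Vision can carry dynamic per-scene metadata on top. A rough sketch with made-up values, not a spec dump:

```python
from dataclasses import dataclass

# Static metadata HDR10 ships once for an entire film; the display
# tone-maps everything against these fixed figures, whereas Dolby
# Vision can re-target scene by scene. Values below are illustrative.
@dataclass
class HDR10StaticMetadata:
    max_mastering_luminance_nits: float  # peak of the mastering display
    min_mastering_luminance_nits: float  # black level of that display
    max_cll_nits: int                    # brightest single pixel in stream
    max_fall_nits: int                   # brightest frame-average level

movie = HDR10StaticMetadata(1000.0, 0.005, 950, 400)
print(movie)
```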

It's kind of the same reason that LG have not managed, or have chosen not, to go down the THX-certified route for their OLED range, compared to Panasonic, who have, with the first OLED screen to do so. That was possible due to their custom processor and work with a Hollywood specialist in grading films for colour.
 
How is the 1070 paired with a 4k monitor? What kind of frames do you get at what settings?

Doom gets 50-60fps with everything on Ultra; I do have a few things unchecked though, like depth of field and blurring effects, which I do not care for. They are there to cost fps and make IQ worse, for me.

Crysis 3 with everything on Very High was getting 30-40fps, and that is still probably one of the most demanding titles at 4K.

The 1070 plays 99% of my Steam library above 60fps.

If you want a minimum of 60fps and must have every setting maxed on triple-A 2016 titles, though, even a 1080 won't be enough. But if, like me, you are willing to tinker, get rid of silly effects that make things look worse and skip settings that bring no IQ benefit, then the 1070 is good enough :)

That said, I think the cards that will be perfect for 4K will be big Vega and big Pascal, which I will likely get next year.
 
Snip from the Eurogamer article on GOW with HDR...

"As striking as the E3 stage demo was, it's the 25-minute demonstration shown off later that impressed us the most. The game's co-op mode was demonstrated using both the Xbox One and PC versions of the game thanks to the Xbox Play Anywhere initiative.

The PC was connected to a massive 98-inch 4K LCD display with the game operating at this extremely high resolution and it looked great.

Next to this, however, was an Xbox One S connected to a 65" HDR-capable LG OLED display. Despite the higher resolution on PC, the strength of HDR combined with the remarkable performance of the OLED display resulted in a breathtaking presentation. We had our doubts about HDR displays but, after the demo, we immediately started price-checking HDR-capable OLED TVs. It was that impressive."
 