
Next Gen Display Experiences! HDR Polaris graphics

As I said in the thread I linked, people who don't spend mega £ on GPUs to drive 1440p at high frame rates to keep the game smooth are also not going to spend mega £ on a screen.

I will. I want to see photos at the best quality I can when I work on them. Ditto for videos or movies I want to watch. Neither requires a high-end GPU.

When I game I've done so happily on my R9 285 at 1080p. I believe the image quality gain from HDR is far more than the quality gain I get from a 24" monitor at 4K when it's situated 60cm from my face. At that distance and size, 4K is just an "I'll take it if it's offered" deal, not something I particularly care about. And whilst I'm sure 144Hz might be better than 60Hz, I don't know that it's something that cries out to me to have. I'll factor it in if the cost difference is small next time I buy a monitor, but it's not something I need.

But HDR I am sold on. I might hold off if it's very expensive, but it's definitely something I want. I set far more value on it than I do on a monitor being huge or having a super-high refresh rate.

EDIT: Also, everything the great GoogalyMoogaly said. :D
 
Although I want to see it as cheap as possible of course, I think this isn't necessarily true.

A monitor can last you 3 GPU changes easily. So I think people may be inclined to spend up to double what they'd spend on a GPU.

That would be £600: a GPU budget of £300 = £600 on screens. I don't buy that.

Would you really consider a 60hz monitor for gaming? Because those are the only 1440p monitors available for under £300. The Acer XF270 is one of the cheapest 1440p 144hz monitors out there, and as an added bonus it's an IPS monitor.

Yes, I'm not interested in 144Hz as I don't do competitive gaming. I have a 60Hz IPS screen right now, have had it for 4 or 5 years, and it has gorgeous image quality. That matters far more to me than 130+ FPS gameplay; to me that's the same as 60, it has no value to me.
 
Until I bought my 1080 I had always spent double to triple as much on my screen as on my card.

After all I stare at it for long periods.

Obviously it needs to be matched with the card, but a poor screen won't do a 1080 any justice, or vice versa.
 
That would be £600: a GPU budget of £300 = £600 on screens. I don't buy that.

Yes, I'm not interested in 144Hz as I don't do competitive gaming. I have a 60Hz IPS screen right now, have had it for 4 or 5 years, and it has gorgeous image quality. That matters far more to me than 130+ FPS gameplay; to me that's the same as 60, it has no value to me.

144hz isn't just for competitive gaming. I presume you want freesync for smoothness? The difference between using a monitor at 60hz and one at 144hz is huge. It's so much smoother, there's less motion blur, and it's just so much nicer to use from both a gaming and a general desktop point of view.

There used to be the excuse for not buying 144hz monitors that they were all TN, but that's not the case anymore. There are 144hz IPS monitors now.

If you are the kind of person who doesn't notice the difference between using a 60hz monitor and a 144hz monitor, then don't waste your time buying a freesync monitor, as you are not going to see the difference there either.

So if I were you, I'd just buy a good 1440p IPS monitor and be happy.

I don't understand the other nonsense though. What has your GPU choice got to do with the price you pay for the monitor? The only reason to buy an expensive GPU is if you game, or maybe do workstation 3D modelling type work. But there are tons of reasons for buying a good monitor: movies, photo editing, video editing, more desktop space for work, nicer to look at, more features, etc.

Spending money on a good monitor is money well spent. A good monitor will last through several computer builds. Why would anyone cheap out on a monitor??
 
Would you really consider a 60hz monitor for gaming?

Above 60Hz gaming is still a tiny niche.

Obviously it's the goal for most gamers, but high-refresh monitors are currently either expensive or TN panels, and they require a luxury-grade card to drive.


That would be £600: a GPU budget of £300 = £600 on screens. I don't buy that.

Yes, I'm not interested in 144Hz as I don't do competitive gaming. I have a 60Hz IPS screen right now, have had it for 4 or 5 years, and it has gorgeous image quality. That matters far more to me than 130+ FPS gameplay; to me that's the same as 60, it has no value to me.

A couple of years ago I bought a £250 card and two £240 monitors (24" 1080p IPS). I've bought two more cards since, a GTX 780 and now a 1070, but haven't changed the monitors yet.

I would currently consider buying a monitor up to ~£600 if it was something special, like 144 Hz 34" IPS freesync/gsync. Or maybe I'll settle for 27" 2560x1440, but again 144 Hz freesync/gsync. And I guess I should get HDR too for future proofing.

Also, although I bemoan paying a premium for 144 Hz (and gsync for that matter), there is a thing your brain does which increases the perceived quality of an image if it's moving (like how you suddenly notice how low-res a movie is if you pause it), so if you had an identical panel running at 60 or 144, you would perceive the 144 as higher image quality.

If everything pans out like I imagine, I'd like to end up with either the top AMD Vega card, or the one down from top, and a large 1440p+ 144 Hz freesync IPS HDR monitor (shame OLED likely won't be on the cards).
 
144hz isn't just for competitive gaming. I presume you want freesync for smoothness? The difference between using a monitor at 60hz and one at 144hz is huge. It's so much smoother, there's less motion blur, and it's just so much nicer to use from both a gaming and a general desktop point of view.

There used to be the excuse for not buying 144hz monitors that they were all TN, but that's not the case anymore. There are 144hz IPS monitors now.

If you are the kind of person who doesn't notice the difference between using a 60hz monitor and a 144hz monitor, then don't waste your time buying a freesync monitor, as you are not going to see the difference there either.

So if I were you, I'd just buy a good 1440p IPS monitor and be happy.

I don't understand the other nonsense though. What has your GPU choice got to do with the price you pay for the monitor? The only reason to buy an expensive GPU is if you game, or maybe do workstation 3D modelling type work. But there are tons of reasons for buying a good monitor: movies, photo editing, video editing, more desktop space for work, nicer to look at, more features, etc.

Spending money on a good monitor is money well spent. A good monitor will last through several computer builds. Why would anyone cheap out on a monitor??

A sense of perspective: a £250 to £300 screen is not a cheap one. There are plenty of very good quality monitors around that price, and I don't see why a good FreeSync screen must be priced higher than that.

I have also explained several times now that driving 1440P @ 140 FPS would require some serious and expensive GPU power.

With FreeSync there is no need for that.
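
To put a rough number on that, here's a quick back-of-the-envelope sketch (my own illustration; GPU load doesn't scale perfectly with pixel count, so treat the ratios as ballpark only):

[code]
# Raw pixel throughput of a few resolution/refresh targets, relative to
# 1080p at 60 fps. GPU cost doesn't scale exactly linearly with pixels,
# so these are rough multipliers only.

targets = {
    "1080p @ 60":  (1920, 1080, 60),
    "1440p @ 60":  (2560, 1440, 60),
    "1440p @ 144": (2560, 1440, 144),
    "2160p @ 60":  (3840, 2160, 60),
}

base = 1920 * 1080 * 60
for name, (w, h, hz) in targets.items():
    print(f"{name}: {w * h * hz / base:.1f}x the pixels per second of 1080p60")
[/code]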
 
Yes, there is a standard agreed upon now.

10-bit color.
90% P3 color gamut
1000 nits brightness

Current monitors are approx 250-350 nits.
From what I've read some of the first HDR monitors will be around 600, perhaps some of the more expensive models will reach 1000.
Certainly it is expected that CES 2017 will show more options, with the latest 4K HDR TV models going up to 2000.
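
For a sense of what those numbers mean in practice, here's a minimal sketch (my own illustration, using the figures quoted in this thread rather than official certification numbers):

[code]
# Back-of-the-envelope comparison of today's 8-bit/~350-nit monitors
# against the 10-bit/1000-nit HDR target quoted above.

def steps_per_channel(bits: int) -> int:
    """Distinct code values per colour channel at a given bit depth."""
    return 2 ** bits

sdr_bits, hdr_bits = 8, 10
sdr_peak, hdr_peak = 350, 1000  # nits, using the numbers quoted in this thread

print(f"8-bit panel : {steps_per_channel(sdr_bits)} steps per channel")
print(f"10-bit panel: {steps_per_channel(hdr_bits)} steps per channel "
      f"({steps_per_channel(hdr_bits) // steps_per_channel(sdr_bits)}x finer gradation)")
print(f"Peak brightness headroom: {hdr_peak / sdr_peak:.1f}x ({sdr_peak} -> {hdr_peak} nits)")
[/code]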
 
Would you really consider a 60hz monitor for gaming? Because those are the only 1440p monitors available for under £300. The Acer XF270 is one of the cheapest 1440p 144hz monitors out there, and as an added bonus it's an IPS monitor.

Hell yeah. 4K all the way for me. Had 144Hz IPS FreeSync for a few days and it was nothing to write home about as I do not play much FPS anyway, much less competitive FPS.

The IQ difference between my 2160p Dells and the 1440p Asus MG279Q was night and day, not to mention the Asus had severe backlight bleed.

I will take IQ every time, this is why I look forward to HDR :D

The vast majority of PC gamers (and gamers in general) get along with 60hz displays just fine.

Yeah, it works perfectly fine for me.

Sure I would not mind 120Hz+ but not at the expense of IQ.
 
Current monitors are approx 250-350 nits.
From what I've read some of the first HDR monitors will be around 600, perhaps some of the more expensive models will reach 1000.
Certainly it is expected that CES 2017 will show more options, with the latest 4K HDR TV models going up to 2000.



1000-2000 nits will be OK if you're watching/playing in the dark, but 3000+ nits should be OK otherwise.
 
I have also explained several times now that driving 1440P @ 140 FPS would require some serious and expensive GPU power.

That is true and disappointing at the same time. While I'm still good with my current monitor being 120Hz @ 1080p, what you said makes it difficult for me to decide on my next upgrade, be it 1440p or 4K.

Not an easy decision by any means, and I am not going back to 60Hz now. Ideally a 4K screen that could scale perfectly, without interpolation, to at least 1440p would be amazing; unfortunately, they don't work that way.
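
The scaling point is easy to see with a quick check (my own illustration): 1440p doesn't divide evenly into a 3840x2160 panel, whereas 1080p does.

[code]
# Why 1440p can't be shown pixel-perfect on a 4K panel: the scale factor
# isn't an integer, so the panel has to interpolate. 1080p maps 2:1 exactly.

panel_w, panel_h = 3840, 2160
sources = {"1080p": (1920, 1080), "1440p": (2560, 1440)}

for name, (w, h) in sources.items():
    fx, fy = panel_w / w, panel_h / h
    exact = fx.is_integer() and fy.is_integer()
    verdict = "pixel-perfect integer scaling" if exact else "needs interpolation"
    print(f"{name} on a 2160p panel: {fx}x scale -> {verdict}")
[/code]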

Still, I am fine with what I have for at least another year. Need to see what happens with GPUs, but yes, 1440p and high refresh is tough to drive well.
 
Is there an HDR standard at the moment? How many bits are we talking? Nvidia cards can do 12-bit colour depth. This sort of thing has been coming along in drivers for a while. I would assume it'd be down to drivers and connection bandwidth?
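
On the bandwidth question, a rough back-of-the-envelope sketch (my own figures for illustration; it ignores blanking intervals and link-encoding overhead) shows why the connection matters as much as the driver once you add bits per channel:

[code]
# Rough uncompressed video data rates vs approximate usable link bandwidth.
# Blanking intervals and encoding overhead are ignored, so real
# requirements are somewhat higher than these figures.

def data_rate_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

links_gbps = {              # approximate payload bandwidth after encoding overhead
    "HDMI 2.0":        14.4,
    "DisplayPort 1.4": 25.9,
}

for bpc in (8, 10, 12):
    need = data_rate_gbps(3840, 2160, 60, bpc)
    fits = [name for name, cap in links_gbps.items() if cap >= need]
    print(f"4K60 @ {bpc}-bit: ~{need:.1f} Gbit/s -> {fits if fits else 'needs chroma subsampling'}")
[/code]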

I posted about this earlier. There are two HDR standards at the moment, HDR-10 and Dolby Vision. HDR-10 is 10-bit and 1000 nits, and Dolby Vision is 12-bit and 4000 nits.

Now, the GPU can handle the data in the sense that it can process it, and this has likely been the case for a number of years; however, the connection types haven't existed, so at the moment the only GPUs that support full HDR as it is intended are the AMD 300 series and up.

Currently only the professional line of Nvidia GPUs supports HDR, although it may well be possible for Pascal to do it with a firmware update.

I should also point out that AMD only supports HDR-10, not Dolby Vision. Dolby Vision requires further processing hardware and thus is not supported by most HDR TVs, nor by any GPUs. It is better quality, but the cost is higher, and I feel that HDR-10 will be the standard agreed on over the next year or so.

Further to that, a lot of films are now recorded in Dolby Vision and so these can go on UHD Blu-ray; however, the player then needs to support it as well. Again, most if not all at the moment are HDR-10 spec.

Hope that helps a little.
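
As a footnote to the 10-bit vs 12-bit point: both HDR-10 and Dolby Vision encode luminance with the SMPTE ST 2084 "PQ" curve, so the extra Dolby Vision bits buy finer steps rather than a different range. A minimal sketch (my own illustration, using the published PQ constants and assuming full-range code values for simplicity):

[code]
# SMPTE ST 2084 (PQ) EOTF: maps a normalised code value to absolute luminance.
# Both HDR-10 (10-bit) and Dolby Vision (12-bit) use this curve; the extra
# bits buy finer luminance steps over the same 0-10,000 nit range.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Luminance in cd/m^2 (nits) for a normalised PQ signal in [0, 1]."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

for bits in (10, 12):                  # HDR-10 vs Dolby Vision bit depths
    codes = 2 ** bits                  # assuming full-range coding for simplicity
    levels = [pq_to_nits(c / (codes - 1)) for c in range(codes)]
    below = max(v for v in levels if v <= 1000)
    above = min(v for v in levels if v > 1000)
    print(f"{bits}-bit: {codes} code values, step size near 1000 nits ~ {above - below:.1f} nits")
[/code]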
 
Or ~540 nits with better blacks :cool:

Someone posted Nvidia's HDR SDK the other day on this forum (Sorry, I forget who) and to my surprise, it caused a 980Ti to output HDR to my 55EG960V-ZD.

That isn't true HDR though. Nvidia do not officially support HDR with anything but their professional line-up of cards; at least that was true for anything up to the 9 series, but I do believe they have added HDR support in Pascal.

Anything prior, though, sorry, but it isn't HDR. Further to that, as the standard is not agreed yet (though HDR-10 is going to be the most likely, as it's easier to implement), the current Nvidia SDK is not in line with it, and so they would not be able to market that as HDR.
 
Current monitors are approx 250-350 nits.
From what I've read some of the first HDR monitors will be around 600, perhaps some of the more expensive models will reach 1000.
Certainly it is expected that CES 2017 will show more options, with the latest 4K HDR TV models going up to 2000.

Why? My monitor burns my retinas out at 75% brightness. Why do I want/need a 1000-nit display?

Okay, for a TV I can understand it, but not for PC monitors.
 
I posted about this earlier. There are two HDR standards at the moment, HDR-10 and Dolby Vision. HDR-10 is 10-bit and 1000 nits, and Dolby Vision is 12-bit and 4000 nits.

Now, the GPU can handle the data in the sense that it can process it, and this has likely been the case for a number of years; however, the connection types haven't existed, so at the moment the only GPUs that support full HDR as it is intended are the AMD 300 series and up.

Currently only the professional line of Nvidia GPUs supports HDR, although it may well be possible for Pascal to do it with a firmware update.

I should also point out that AMD only supports HDR-10, not Dolby Vision. Dolby Vision requires further processing hardware and thus is not supported by most HDR TVs, nor by any GPUs. It is better quality, but the cost is higher, and I feel that HDR-10 will be the standard agreed on over the next year or so.

Further to that, a lot of films are now recorded in Dolby Vision and so these can go on UHD Blu-ray; however, the player then needs to support it as well. Again, most if not all at the moment are HDR-10 spec.

Hope that helps a little.

That's a shame as I'm going to make an educated guess that the cost difference between the two is pretty small as an overall percentage of the final product. Reminds me of Firewire vs. USB. The former was much superior because it handled the connections itself and therefore didn't trouble the CPU in the slightest - it just fed you your data. Whereas USB was more of a dumb socket that the CPU had to manage (and still does). The cost difference on a motherboard was something like 20p for the manufacturer, iirc. And USB won because of cost!

Given that the early buyers of this technology are probably going to be the well-heeled tech enthusiasts, you would think you'd be able to win the standards war on this with the Dolby system, but if no manufacturers produce consumer products for it... :(
 
The problem is that although we as enthusiasts want this, and you can get Dolby Vision branded TVs, they are premium, and we are talking hundreds of quid of difference, not pennies, unfortunately.

The problem is really that, officially, HDR10 is the one maintained by the Ultra HD Blu-ray certification, so although there is Dolby Vision content and there are films recorded in it, due to the extra data it isn't certified on Blu-ray releases. The minimum spec for any branded HDR TV is HDR10, whereas Dolby Vision is an option.

Dolby Vision is also capable of playing HDR10, but not the other way around. The cost isn't just the part, either; Dolby have a licence fee that needs paying, due to the cost of research etc. That is how they make their money, not from products.
 
It also doesn't help that manufacturers don't state in adverts whether it's HDR10 or Dolby Vision, so a lot of people will buy HDR and not know there are two versions at this time.

The cheapest HDR Dolby Vision TV I have seen is the LG 49UH770, which retails at around £899, which isn't bad, but you can get an equivalent HDR10 TV from LG for £100 less.
 
The problem is that although we as enthusiasts want this, and you can get Dolby Vision branded TVs, they are premium, and we are talking hundreds of quid of difference, not pennies, unfortunately.

The problem is really that, officially, HDR10 is the one maintained by the Ultra HD Blu-ray certification, so although there is Dolby Vision content and there are films recorded in it, due to the extra data it isn't certified on Blu-ray releases. The minimum spec for any branded HDR TV is HDR10, whereas Dolby Vision is an option.

Dolby Vision is also capable of playing HDR10, but not the other way around. The cost isn't just the part, either; Dolby have a licence fee that needs paying, due to the cost of research etc. That is how they make their money, not from products.

It also doesn't help that manufacturers don't state in adverts whether it's HDR10 or Dolby Vision, so a lot of people will buy HDR and not know there are two versions at this time.

The cheapest HDR Dolby Vision TV I have seen is the LG 49UH770, which retails at around £899, which isn't bad, but you can get an equivalent HDR10 TV from LG for £100 less.

Hmmm. I didn't realize it was that great a difference. Though I'm of the opinion that when you're spending upwards of £700 on a TV, the extra £100 is probably worthwhile to get a significant boost. The more you spend, the smaller a percentage that premium is.

I appreciate Dolby needing to recoup their costs but I hope (against the odds it sounds) that it doesn't stop this becoming a common standard for consumer level products.
 
Since only Panasonic and LG are providing OLED, which addresses the response time issue most monitors have, I assume you are talking about a backlit LED model by Samsung?

Well, the new G6 and I guess the E6 are meant to be 35ms, that's not so bad.
I do think the Samsung has better colours though! And yeah, half the response time.
 
A sense of perspective: a £250 to £300 screen is not a cheap one. There are plenty of very good quality monitors around that price, and I don't see why a good FreeSync screen must be priced higher than that.

I have also explained several times now that driving 1440P @ 140 FPS would require some serious and expensive GPU power.

With FreeSync there is no need for that.

Why would you need 140fps on a 60hz monitor?

And FreeSync can only work up to the max refresh rate of the monitor. If you want a bigger range you are going to have to get a higher-Hz monitor. That's the way sync tech works.
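
To illustrate the point, here's a minimal sketch (my own illustration, with example ranges such as a 48Hz lower bound): adaptive sync only helps inside the panel's variable refresh window, so a 60Hz panel gives you a far smaller window than a 144Hz one.

[code]
# Illustrative adaptive-sync behaviour: the panel can only match frame rates
# that fall inside its variable refresh window (the ranges below are examples).

def sync_behaviour(fps, vrr_min, vrr_max):
    if fps > vrr_max:
        return f"capped at {vrr_max}Hz (or tearing) - panel can't refresh faster"
    if fps < vrr_min:
        return "below the VRR window - falls back to LFC or vsync/stutter"
    return "panel refresh tracks the frame rate exactly"

for fps in (40, 75, 140):
    print(f"60Hz panel (48-60Hz window)   @ {fps} fps: {sync_behaviour(fps, 48, 60)}")
    print(f"144Hz panel (48-144Hz window) @ {fps} fps: {sync_behaviour(fps, 48, 144)}")
[/code]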

As I pointed out, the Acer 1440p freesync monitor is one of the cheapest 144hz, 1440p monitors out there.

It sounds like you don't know what you want or what you are trying to say.
 