1080p gaming on a 4K Monitor

Associate · Joined: 15 Oct 2014 · Posts: 746 · Location: Somerset, England
Hey guys,

Having been very disappointed with the QC on the new PG279Q (sent mine back), I am seriously considering the Asus PA328Q, as I spend most of my time on the desktop anyway and I'm only a casual gamer.

My question is: can you easily run all games at 1080p on a monitor like this? I know they won't look wonderful, but as long as they run smoothly I'm starting to wonder if that would even bother me.

I would be using the monitor for programming, digital art, web browsing and games like BF4. Also, are there any cons in terms of scaling on the desktop I should be aware of?


Many thanks! :)
 
Assuming your 4k monitor can run 1920x1080 @ 120Hz (many 4k 60Hz monitors may be capable of that; check the product manual):

Rendering at 4k and downsampling to a native 1080p screen is exactly 4x supersampling; conversely, since 3840x2160 is exactly twice 1920x1080 in each dimension, a 4k screen can scale 1080p perfectly by mapping each source pixel to a 2x2 block, without needing to use a 1:1 aspect option. Unfortunately, G-Sync 4k monitors will add multiple frames of input lag, as G-Sync does not support hardware display scaling, and Nvidia's GPU scaling has been adding extra unwanted frames of input lag for ages (AMD handles this much better). A FreeSync 4k monitor or a traditional scaler should scale to 1080p nicely with minimal input lag, as long as the full-screen OSD option is used.

According to Eizo's manual on how 1:1 works, using the 1:1 option via display scaling can add up to 2 frames of input lag (still far less than Nvidia's GPU scaling does).
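To illustrate the "perfect scaling" point above: when the target is exactly twice the source in each dimension, nearest-neighbour (integer) scaling just duplicates each pixel into a 2x2 block, with no blending at all. This is a toy sketch of that idea in Python, not a model of any real monitor's scaler.

```python
# Why 1080p -> 4k can be lossless: each source pixel maps to an exact
# 2x2 block, so integer scaling reproduces the image with no interpolation.

def integer_scale_2x(image):
    """Double a 2-D list of pixel values by pixel duplication."""
    out = []
    for row in image:
        doubled = [p for p in row for _ in range(2)]  # duplicate columns
        out.append(doubled)
        out.append(list(doubled))                     # duplicate the row
    return out

src = [[10, 20],
       [30, 40]]
print(integer_scale_2x(src))
# Every value appears as a clean 2x2 block - no blended in-between pixels.
```

Whether a given monitor actually takes this path for 1080p input is a separate question, as the later posts in this thread point out.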
 
In my case, unless the monitor is at native 4k it just does not look good. Always better playing at the monitor's native res.
 
It's really not worth it. Interpolation completely destroys the perceived contrast ratio of the display.

Try it for yourself: if you have a 1920x1080 monitor, run a custom resolution of 960x540... then sit back and enjoy all the misty, vaseline-smeared loveliness :-)

As TNA said: always native.
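The "misty" look described above comes from interpolation blending neighbouring pixels: a hard black-to-white edge gains grey in-between values, which is exactly the perceived-contrast loss being complained about. Here is a 1-D bilinear-resampling toy (an assumption for illustration, not any particular monitor's algorithm):

```python
# Sketch of why an interpolating scaler looks soft: bilinear resampling
# mixes adjacent pixels, so sharp edges pick up intermediate grey values.

def bilinear_upscale_1d(row, factor):
    """Upscale a 1-D row of pixel values by linear interpolation."""
    n = len(row)
    out = []
    for i in range(n * factor):
        pos = i / factor
        left = int(pos)
        right = min(left + 1, n - 1)
        t = pos - left
        out.append(round(row[left] * (1 - t) + row[right] * t))
    return out

edge = [0, 0, 255, 255]              # a hard black-to-white edge
print(bilinear_upscale_1d(edge, 2))  # a grey 128 appears at the edge
```

A duplicating (integer) scaler would instead output only 0s and 255s, keeping the edge hard.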
 

Lol @ vaseline-smeared loveliness... well, I'm glad I know it's not a good idea then :D
 

Indeed... although I do understand your logic, as I went through the same thought process: 3840x2160 for the desktop and older games, 1920x1080 for more demanding AAA games and FPS.

...I then went on to consider a larger 4K display but running the demanding games at 3840x1645... you'll keep the pixel integrity but change the aspect ratio to 21:9. You'll save approx 2 million pixels to render, but the downside is you'll obviously have black horizontal bars to contend with.

..compromises....always compromises. It's like buying a house really... :)
 

1920x1080 is a quarter of 3840x2160 though, so there is no interpolation and it just looks as if you were running a 1080p monitor.
 

Lovely theory, and in an ideal world, yes, but unfortunately most UHD monitors still use interpolation to display 1920x1080.
 

What interpolation is happening, and does it really have such a major effect? I personally can't tell the difference. I don't understand: if it's an exact multiple, there's no need for any, surely?
 
Too bad really; CRT was better in that respect. Kind of like viewing DVDs on a 1080p TV: they actually look better on a 720p screen, I guess, since there's less to interpolate.
 

It isn't about there being a need for any. Unfortunately, most monitors don't distinguish between 1920 x 1080 and resolutions that aren't exact multiples, and they apply exactly the same interpolation process regardless. I cover this in my reviews of UHD monitors, and I have reviewed several now, with the count climbing.
 
1080p 60Hz looked pretty good on my PB287Q that I had, in terms of gaming.
1440p looked pretty decent on the Windows desktop, although a bit soft.

But didn't dropping to 1080 cause a noticeable drop in contrast ratio in your eyes? It certainly did for me.

I could put up with the decrease in resolution and increase in aliasing, but the blacks looked so grey I thought that I had developed cataracts.
 

Looked the same to me.
 
Wat? I'm a bit lost. So to run any resolution below the monitor's native one, must the monitor have an internal scaler or not? I knew that G-Sync monitors don't have one, so they simply shouldn't be able to scale and run at a lower res; is that correct? At the same time, I thought that Windows or the Nvidia/AMD control panel was handling all that, so it simply shouldn't show resolutions not supported by the monitor/scaler when you go to change it. Correct again? Wrong on both? Welp :confused:
 

Ignoring 1:1 mapping with no scaling.

I don't think there are any 4K 60Hz panels that will do 1080p at a true 120Hz (there are a few TVs, but that isn't true 120Hz); the best I've seen is 75-80Hz.

Most 4k panels do fairly well with the 1920x1080 resolution, but there is still a slight softening of the image at the very least. For gaming it's fairly usable, and a lot better than the results of 1080p scaled on a native 2560x1440 panel.

The input latency tends to take quite a hit though, never mind G-SYNC, etc.
 
The contrast ratio probably looked wrong because you had not selected the correct limited or full range option, which sometimes happens when using 1080p. Nvidia at least now has the option in the control panel to select full range, which is what most monitors expect.

The PA328Q has the best scaling you can get for 1080p. Yes, it's interpolated, but it's of very high quality, and with the Vivid Pixel option at 25 it looks superb. I use a PS4 and an XBone on it, as well as a PC at 4k. I would buy this monitor without a doubt; I love mine. Not all monitors are equal in terms of scaling, but this one is certainly top notch. If no interpolation took place, the image would look very blocky at the close range monitors are obviously used at.

The input lag at 1080p is just as good as it is at 4k, under 10ms.
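The limited/full range mismatch mentioned above is easy to see numerically: if the GPU sends limited-range RGB (16-235) to a monitor expecting full range (0-255), black arrives as 16 and white as 235, which is exactly why blacks look grey and contrast collapses. This uses the standard TV-range mapping, not any vendor-specific behaviour:

```python
# Full-range (0-255) to limited-range (16-235) RGB mapping.

def full_to_limited(v):
    """Map a full-range pixel value (0-255) into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

print(full_to_limited(0))    # black is sent as 16 -> displayed as dark grey
print(full_to_limited(255))  # white is sent as 235 -> displayed as dim white
```

Selecting full range on both the GPU and the monitor keeps black at 0 and restores the panel's native contrast.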
 
If what you guys are saying is true, then it might persuade me to look further into 4k again.

The problem I have is that the next monitor I buy will have either G-Sync or FreeSync. I'm interested in the upcoming XB321HK, as it's a 4k IPS with G-Sync, but I'm still not convinced that a 4k G-Sync panel will display 1080p as efficiently as a native 1080p one.

Does anyone own the Acer XB280HK, and have they tried gaming at 1080p?
 