Forcing 1440p on a 1080p monitor

Had the temptation to purchase a new monitor; my friend has all the top hardware inside his rig, and he has the ASUS ROG SWIFT PG278QR monitor, so I can assume that's up there too.

I recently stumbled across the AMD control panel settings though that lets you force higher than supported resolutions on your monitor, Virtual Super Resolution and GPU Scaling.

I am now running my monitor successfully at 2560x1440 @ 100 Hz.

My monitor is the Acer GD245HQ, a 24" 120 Hz panel.
The GPU is an R9 290.

My questions are:

Am I really at that resolution, or what exactly is going on?
If I am, is this any different from actually using a 1440p monitor?
If not, or not much different, is it actually worth spending £500 on a new monitor?
 
If it's anything like DSR on Nvidia cards, it works by rendering the image at the higher resolution and then downsampling it to your screen's resolution. So the FPS you see is what you would get if you bought a 1440p monitor, but the resolution you are actually seeing is still 1080p. People use it because downsampling can improve image quality.
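In case it helps to see the idea concretely, here's a minimal Python sketch of that downsampling step (my own illustration, not AMD's or Nvidia's actual code; it uses numpy and a crude box filter, whereas VSR/DSR use tuned filter kernels):

```python
import numpy as np

def downsample(frame: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Average each block of rendered pixels into one displayed pixel.

    A crude box filter -- real VSR/DSR use tuned kernels, but the principle
    is the same: several rendered pixels are blended into each pixel the
    panel can actually show, which is what smooths aliased edges.
    """
    in_h, in_w, channels = frame.shape
    out = np.zeros((out_h, out_w, channels), dtype=np.float32)
    for y in range(out_h):
        y0 = y * in_h // out_h
        y1 = max((y + 1) * in_h // out_h, y0 + 1)
        for x in range(out_w):
            x0 = x * in_w // out_w
            x1 = max((x + 1) * in_w // out_w, x0 + 1)
            # Each output pixel = mean of the source pixels it covers
            out[y, x] = frame[y0:y1, x0:x1].mean(axis=(0, 1))
    return out.astype(frame.dtype)

# Scaled-down stand-in for a 2560x1440 render shown on a 1920x1080 panel
# (same 4:3 ratio between render resolution and display resolution).
render = np.random.randint(0, 256, (144, 256, 3), dtype=np.uint8)
displayed = downsample(render, 108, 192)
print(render.shape, "->", displayed.shape)  # (144, 256, 3) -> (108, 192, 3)
```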

Using a 1440p monitor would mean you get the same performance but would also see the higher resolution.
 
Posts like this hurt my soul.
 
:l

The software hasn't added more pixels, friend.

In the same way that a driver update doesn't turn your R9 290 into an R9 Fury X. :p
 
Posts like this hurt my soul.
Were you born with this knowledge? :/ You were just as ignorant as this person at one point.

Anyways OP, running a higher resolution than your monitor natively supports is what people call downsampling or supersampling. The game really is rendering at that higher resolution internally, with all the performance demands that brings, but your monitor can only display its native resolution, so the image gets downscaled to fit.

The primary reason for doing this is AA. This is *the* most effective form of anti-aliasing, though obviously it is also by far the most costly. It also produces a *slightly* sharper image. How noticeable these things are depends on your sensitivity to them; not everybody is bothered by aliasing to the same degree. It might not be worth it for you, and you may prefer to lower the resolution to increase your performance. It can also depend on which game you're playing, as not all games suffer from aliasing equally.
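To put a rough number on "by far the most costly" (my own back-of-the-envelope sum, assuming shading cost scales roughly linearly with pixel count, which real frame times only approximate):

```python
# Rough cost of rendering at 1440p instead of 1080p: fill-rate and
# shading work scale roughly with the number of pixels per frame.
native = 1920 * 1080   # 2,073,600 pixels per frame
vsr    = 2560 * 1440   # 3,686,400 pixels per frame
print(f"{vsr / native:.2f}x the pixels per frame")  # 1.78x
```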

But no, it is not quite comparable to playing it on a true 1440p monitor, where you not only get all those same benefits, but the image also sees a *substantial* increase in clarity due to being able to actually display all those pixels.
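And if you want to put a number on that clarity gap, pixel density is the usual yardstick. A quick sketch (the 27" figure for the PG278QR is my assumption, as it isn't stated in the thread):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density: diagonal resolution in pixels divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f"{ppi(1920, 1080, 24):.0f} PPI")  # ~92, a 24" 1080p panel like the GD245HQ
print(f"{ppi(2560, 1440, 27):.0f} PPI")  # ~109, a 27" 1440p panel like the PG278QR
```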
 