Different display resolutions

Associate · Joined 2 Jul 2021 · Posts: 256 · Location: United Kingdom
Hi everyone, I'm quite new to gaming so please excuse me being a noob! I have a gaming monitor that is 1080p. I had a PC that plugged into it and worked fine.

I have upgraded to a laptop that has a 2560×1440 screen. I have set all the games to 1080p in the settings and they all work fine on my external monitor. Occasionally I like to game just on the laptop screen, and I've noticed that most games switch over to the 2560×1440 resolution automatically when I do.

I assume that is normal and good. Flight Sim doesn't automatically change back, though: when I plug in my external monitor with the screen extended, it still thinks it is running at 2560×1440 even though it's displaying on the 1080p monitor. Is there anything wrong with that? I can change it back to 1080p in the settings, but it seems to display fine on the 1080p screen even when it's set to 2560×1440.

I am just trying to understand how it all works. The guy in the shop said you can use different display resolutions, but I don't fully understand. Cheers!
 
If you are running at a higher resolution than your monitor, the image will just be downscaled, so you will not be getting any benefit, but your frame rate will be lower. You do have super resolution, which runs the game at a higher resolution and then properly downscales it, but as far as I am aware it might not be doing that in all cases, and in your case it might be doing a more primitive downscaling. You're best off just running at the native resolution.

If you run it at a lower resolution the image will look blurry, as 1080 doesn't divide evenly into 1440. You can do this if you are having performance issues, but it is not ideal. If you are having more serious issues then you could try half the resolution, which is a perfect factor, but that's 720p and doesn't look great.

Just run at the native resolution, and if it doesn't switch then change it manually. If you are running an older game and getting really high frame rates then you could use super resolution. I prefer to use whole factors for it, so 2160p for a 1080p screen or 2880p for a 1440p screen; you don't have to, though, and can select anything. But honestly, unless you are getting something like 400 fps it's not really worth it, as it can cause issues.
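To make the whole-number factor point a bit more concrete, here's a quick Python sketch (just illustrative, using the vertical resolutions mentioned above) that flags which combinations scale by a clean integer factor:

```python
# Quick illustrative sketch (not from the game or driver, just arithmetic):
# checks whether one resolution is a whole-number multiple of the other,
# which is when plain downscaling/upscaling stays sharp.

def is_integer_scale(a: int, b: int) -> bool:
    """True when the larger resolution is an exact multiple of the smaller."""
    hi, lo = max(a, b), min(a, b)
    return hi % lo == 0

# Vertical resolutions from the examples in this thread.
pairs = [
    (1440, 1080),  # 1440p output on a 1080p monitor: 1.33x, not clean
    (720, 1440),   # half of 1440p: exactly 2x, clean
    (2160, 1080),  # 2x "super resolution" for a 1080p screen
    (2880, 1440),  # 2x "super resolution" for a 1440p screen
]

for render, display in pairs:
    ratio = max(render, display) / min(render, display)
    note = "whole-number factor" if is_integer_scale(render, display) else "non-integer, tends to look soft"
    print(f"{render}p vs {display}p screen: {ratio:.2f}x -> {note}")
```

Running it shows 1440p on a 1080p panel lands on a 1.33x ratio (soft), while the half-resolution and 2x super resolution cases all come out as clean integer factors.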
 
Thank you very much.
 