
How would a 1080p monitor display an output of 1280 x 1024?

Discussion in 'Monitors' started by GodAtum, Sep 11, 2016.

  1. GodAtum

    Wise Guy

    Joined: Sep 27, 2009

    Posts: 1,367

    How would a 1920x1080 monitor display an output of 1280 x 1024? I have a Dell R610 with a VGA output and need to buy a monitor to connect to it.
     
  2. Falkentyne

    Gangster

    Joined: Apr 7, 2006

    Posts: 204

    Either GPU scaling or display scaling.

    Since the monitor has a hardware scaler, you may get marginally better quality than using the GPU. You can use ToastyX's Custom Resolution Utility (CRU) to create a custom 1280x1024 resolution (not sure whether LCD automatic or CRT standard timing is the right choice) and enter your refresh rate.
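
    Just to show how the refresh rate relates to the timing you end up with, here's a rough Python sketch. The blanking numbers are the common VESA/DMT timing for 1280x1024 @ 60 Hz, not necessarily what CRU's automatic modes will generate:

    ```python
    # Rough sketch: how refresh rate, totals and pixel clock relate for a
    # fixed timing. Values are the common VESA/DMT 1280x1024 @ 60 Hz timing;
    # CRU's automatic options may produce slightly different blanking.
    h_active, h_front, h_sync, h_back = 1280, 48, 112, 248
    v_active, v_front, v_sync, v_back = 1024, 1, 3, 38

    h_total = h_active + h_front + h_sync + h_back   # 1688 pixels per line
    v_total = v_active + v_front + v_sync + v_back   # 1066 lines per frame
    pixel_clock_hz = 108_000_000                     # 108 MHz

    refresh_hz = pixel_clock_hz / (h_total * v_total)
    print(f"{h_total} x {v_total} total @ {refresh_hz:.2f} Hz")  # ~60.02 Hz
    ```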

    Then, at the refresh rate you specified, you can use the monitor OSD to select aspect, 1:1 centered, or full-screen scaling. For resolutions and refresh rates you did not add as custom resolutions, your *GPU* scales them up to 1920x1080 instead, so you have to use the scaling settings in the video card's driver control panel (full/aspect/1:1).
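
    To make the difference between the modes concrete, here's a rough Python sketch (the names and layout are mine, not anything from CRU or the driver) of what each mode works out to for 1280x1024 on a 1080p panel:

    ```python
    # Rough sketch of what each scaling mode does with a 1280x1024 (5:4)
    # signal on a 1920x1080 (16:9) panel.
    SRC_W, SRC_H = 1280, 1024
    PANEL_W, PANEL_H = 1920, 1080

    def scaled_size(mode):
        if mode == "full":    # stretch to fill the panel; image looks squashed
            return PANEL_W, PANEL_H
        if mode == "aspect":  # scale up until one axis hits the panel edge
            factor = min(PANEL_W / SRC_W, PANEL_H / SRC_H)
            return round(SRC_W * factor), round(SRC_H * factor)
        if mode == "1:1":     # no scaling; image centered with black borders
            return SRC_W, SRC_H
        raise ValueError(mode)

    for mode in ("full", "aspect", "1:1"):
        w, h = scaled_size(mode)
        print(f"{mode:>6}: {w} x {h} "
              f"(black bars: {PANEL_W - w} px total width, {PANEL_H - h} px total height)")
    ```

    So aspect scaling gives you a 1350x1080 picture with bars on the left and right, and 1:1 gives an unscaled 1280x1024 box in the middle of the screen.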

    If the monitor's OSD reports the actual resolution (e.g. 1280x1024), the monitor is doing the scaling in hardware and its own scaling settings apply. If the OSD reports 1920x1080 even though the source is 1280x1024, then it's the GPU doing the scaling. Entering an EDID override in ToastyX CRU will force the refresh rate you specify to be display scaled.

    Note: display scaling of non-native resolutions may not work in DirectX 10+ games; sometimes the game switches to a refresh rate other than the one specified in the EDID override, in which case you need to force GPU scaling instead.