Hypothetical question about resolution.

Would forcing a console to output at a lower resolution increase framerate of a game?

I remember back in the old days of PC gaming, people used to run at low resolutions to get higher fps. Can you do this on consoles?
 
Yep, it's what the Xbox 1 does with most games.

OOOOHHHH shots fired. lol

Nah but seriously, yes lower res = more fps, has been since the dawn of time on all platforms.
 
No.

When you change the output resolution all that happens is the fixed resolution rendered output from the GPU gets put through a hardware scaler to match the desired settings.

e.g. Most games are 900p. Setting 1080p means your game gets rendered at 900p and then upscaled to 1080p. Setting 720p means your game gets rendered at 900p and then downscaled to 720p.

It's always been this way on all consoles to date, with some quirks here and there around the old 50/60hz PAL/NTSC conversions.
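To put rough numbers on the scaler point above, here's a quick sketch (Python, purely illustrative; the 900p render resolution is just the example figure from this thread): the GPU's per-frame workload tracks the render resolution, while the scaler only resizes the finished frame to match the output setting.

```python
# Illustration: the GPU shades a fixed number of pixels per frame set by
# the *render* resolution; changing the console's output setting only
# changes how the hardware scaler resizes that finished frame.
render = (1600, 900)  # example: a game that renders internally at 900p

def pixels(res):
    w, h = res
    return w * h

for output in [(1280, 720), (1920, 1080)]:
    scale = pixels(output) / pixels(render)
    print(f"output {output[0]}x{output[1]}: GPU still shades "
          f"{pixels(render):,} px per frame, scaler resizes by {scale:.2f}x")
```

Whatever output you pick, the 1,440,000 shaded pixels per frame (and hence the GPU cost) don't change.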
 
No.

When you change the output resolution all that happens is the fixed resolution rendered output from the GPU gets put through a hardware scaler to match the desired settings.

e.g. Most games are 900p. Setting 1080p means your game gets rendered at 900p and then upscaled to 1080p. Setting 720p means your game gets rendered at 900p and then downscaled to 720p.

It's always been this way on all consoles to date, with some quirks here and there around the old 50/60hz PAL/NTSC conversions.

I thought it wasn't necessarily the case. Googling the answer got a lot of mixed answers.
 
if a game is 30fps at 900p.

you cannot switch to 720p output and now hit 45-60fps.

same way as if you switch to 1080p it won't now be 15fps.

there isn't a single console game with variable fps at differing resolutions i would think.

tomb raider however has variable fps @ 1080p on the ps4 (30-60fps) but it's locked to 30fps on xb1.

e.g. switching to 720p on tomb raider won't make it guaranteed 60fps on either console it will still be varied on ps4 and 30fps on xb1.

a game will be made to run @ a consistent fps.

destiny for example is 30fps on ps3, ps4, xbox 360 and xb1.

it could easily be 60fps on ps4 but it's not because it was never designed to be
 
there isn't a single console game with variable fps at differing resolutions i would think.

I believe one of the Gran Turismos did. Could be wrong, but there's definitely at least one or two games out there that could run at different resolutions (last gen, that is).
 
The old days? People still do it now: 800x600 on Battlefield 4 to cheat :eek:

And strip away all of the terrain on very low graphics settings. I stopped playing online FPS on PC because so many people use tactics like this just to give themselves the edge.

I know some folk using XIM on PS4 now as well, pretty much cheating as well.
 
And strip away all of the terrain on very low graphics settings. Stopped playing online fps on PC so many people using various tactics like this just to give them the edge.

I know some folk using XIM on PS4 now as well, pretty much cheating as well.

Yes, agreed, it's a bit unfair, but what do you do? They have to win at all costs. Personally I think playing a game with totally stripped-back gfx at stupid low res to get the edge is a bit of a shame, but as I said, they want to win and get their kill ratios up.
 
And strip away all of the terrain on very low graphics settings. Stopped playing online fps on PC so many people using various tactics like this just to give them the edge.

I know some folk using XIM on PS4 now as well, pretty much cheating as well.

I remember doing this once in Quake 3 Arena to get extremely high FPS after reading about it on Quake3Arena, oh boy it looked fugly but the advantage was huge!

I couldn't put up with the graphics though.
 
I thought Black Ops 3 dynamically adjusts resolution to smooth frame rate.

Some of the new titles (BO3 & Halo 5 to name two) are using dynamic resolution to hit a target FPS (or at least that is the aim). It is something we are probably going to see more and more as engines get more feature rich.

One thing to factor in with consoles is that the developer will be developing the game with a set target in mind. This could be a certain graphical level or a framerate target. How they achieve it will be a careful balancing act, factoring in their own knowledge/resources and the system's limitations (Xbone ESRAM, for example)*.

Simply dropping the resolution after the fact won't automatically increase FPS as the "balance" originally used to meet the initial target will be thrown out (CPU limitation being one example).

*I am massively oversimplifying the above, but it is fascinating to read how some developers go about building a game on the consoles (Eurogamer's Q&A with the developer of Titanfall is one good example).
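The dynamic resolution idea mentioned above (BO3, Halo 5) can be sketched like this. This is a toy controller of my own, not any engine's actual code; the 30 fps budget, step size, and scale bounds are all assumed numbers: when a frame misses its time budget, drop the render scale; when there's comfortable headroom, raise it back.

```python
# Toy dynamic-resolution controller (assumed behaviour, not any engine's
# real implementation): nudge the render scale so frame time stays under
# the budget for the target framerate.
TARGET_MS = 1000 / 30           # 33.3 ms budget for a 30 fps target
MIN_SCALE, MAX_SCALE = 0.7, 1.0  # e.g. never drop below 70% of full res

def adjust_scale(scale, last_frame_ms):
    if last_frame_ms > TARGET_MS:          # missed budget -> shade fewer pixels
        scale -= 0.05
    elif last_frame_ms < TARGET_MS * 0.9:  # clear headroom -> sharpen back up
        scale += 0.05
    return min(MAX_SCALE, max(MIN_SCALE, scale))
```

The key point for this thread: the engine is changing the *render* resolution frame by frame, which is exactly what the console's output setting cannot do.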
 
There is an important distinction between rendering resolution and output resolution. Rendering resolution is what the game engine and Xbox 'compute' the graphics at, which is usually 900-1080p as everyone is saying. This is the fundamental aspect affecting the framerate.

Output resolution is the resolution you set the xbox to output to, upscaling or downscaling the rendered image. This doesn't affect framerate (meaningfully anyway). I'm not sure if setting the rendering resolution would technically be possible on a console (maybe based on Halo 5's dynamic resolution stuff) but it might open a can of worms in terms of online games' performance.
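To make the output-resolution half of that distinction concrete, here is a minimal nearest-neighbour scaler sketch (my own illustration; real hardware scalers use better filtering, but the principle is the same): it resizes an already-finished frame and never touches how that frame was rendered.

```python
# Nearest-neighbour scaling of a finished frame, where the "frame" is
# just a 2D list of pixel values. This is conceptually what the console's
# scaler does to a 900p render on its way to a 1080p or 720p output.
def scale(img, out_w, out_h):
    in_h, in_w = len(img), len(img[0])
    return [[img[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

small = [[1, 2],
         [3, 4]]
big = scale(small, 4, 4)  # upscale 2x2 -> 4x4: each pixel duplicated
```

Note the render (`small`) is an input here, already complete; scaling it up or down afterwards costs the GPU's 3D pipeline nothing.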

The only console game I've seen that has graphics settings is Saints Row 2, and even then I think it is only vsync.
 
Would forcing a console to output at a lower resolution increase framerate of a game?

I remember back in the old days of PC gaming. People used tovrun at low resolutions to get higher fps. Can you do this on consoles?

Only if a game is GPU limited, which in the case of the PS4/Xbone is generally not the case due to the laughably slow CPUs they have. Most recent games have tended to be CPU bottlenecked, which is why developers are able to bump up the resolution on PS4 (due to its much better GPU) without it impacting the frame-rate at all.
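A crude model of that GPU-limited vs CPU-limited point (my simplification, with made-up timings, not measured data): a frame can't finish before both the CPU and GPU work are done, so frame time is roughly the larger of the two. Lowering resolution only shrinks the GPU part.

```python
# Simplified frame-time model: frame time ~ max(cpu_ms, gpu_ms),
# so cutting GPU work (e.g. lower render resolution) only helps
# when the GPU side is the larger of the two.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

# CPU-bound: halving GPU work changes nothing (~30 fps either way).
print(fps(33.3, 20.0))
print(fps(33.3, 10.0))
# GPU-bound: lowering resolution does lift the framerate.
print(fps(10.0, 33.3))
print(fps(10.0, 16.7))
```

This is why "just set 720p" can't turn a CPU-bottlenecked 30 fps game into a 60 fps one.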
 
I very much doubt it with consoles. That's the good part about PCs: you can customise.

Like the joy of playing Unreal at 320 x 240 so it would work without a hardware GPU! It looked like pure fudge, but at least it was playable, unlike 640 x 480 which ran at about 5fps. lol
 
No.

When you change the output resolution all that happens is the fixed resolution rendered output from the GPU gets put through a hardware scaler to match the desired settings.

e.g. Most games are 900p. Setting 1080p means your game gets rendered at 900p and then upscaled to 1080p. Setting 720p means your game gets rendered at 900p and then downscaled to 720p.

It's always been this way on all consoles to date, with some quirks here and there around the old 50/60hz PAL/NTSC conversions.

This +1....
 