What Does 1080p Mean For Game Developers?

Microsoft's game dev blog Ozymandias has some thoughts and stats on High Definition Game Rendering:
Many developers, gamers, and journalists are confused by 1080p. They think that 1080p is somehow more challenging for game developers than 1080i, and they forget that 1080 (i or p) requires significant tradeoffs compared to 720p. Some facts to remember:
  • 2.25x: that's how many more pixels there are in 1920x1080 compared to 1280x720
  • 55.5%: that's how much less time you have to spend on each pixel when rendering 1920x1080 compared to 1280x720 - the point being that at higher resolutions you have more pixels, but they necessarily can't look as good
  • 1.0x: that's how much harder it is for a game engine to render a game in 1080p as compared to 1080i - the number of pixels is identical so the cost is identical
    There is no such thing as a 1080p frame buffer. The frame buffer is 1080 pixels tall (and presumably 1920 wide) regardless of whether it is ultimately sent to the TV as an interlaced or as a progressive signal.
  • 1280x720 with 4x AA will generally look better than 1920x1080 with no anti-aliasing (there are more total samples).
Any game could be made to run at 1920x1080. However, it is a tradeoff. It means that you can show more detail (although you need larger textures and models to really get this benefit) but it means that you have much less time to run complex pixel shaders. Most games can't justify running at higher than 1280x720 - it would actually make them look worse because of the compromises they will have to make in other areas.

1080p is a higher bandwidth connection from the frame buffer to the TV than 1080i. However the frame buffer itself is identical. 1080p will look better than 1080i (interlaced flicker is not a good thing) but it makes precisely zero difference to the game developer. Just as most Xbox 1 games let users choose 480i or 480p, because it was no extra work, 1080p versus 1080i is no extra work. It's just different settings on the display chip.
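For anyone who wants to check the figures quoted above, here is a quick back-of-the-envelope sketch in Python. The resolutions and the 4x AA sample count come straight from the quoted post; the 60fps target at the end is my own assumption, used only to turn the pixel ratio into a concrete per-pixel time budget:

```python
# Pixel-count arithmetic behind the figures quoted above.

RES_720P = (1280, 720)
RES_1080 = (1920, 1080)   # same frame buffer whether output as 1080i or 1080p

pixels_720 = RES_720P[0] * RES_720P[1]      # 921,600
pixels_1080 = RES_1080[0] * RES_1080[1]     # 2,073,600

print(f"Pixel ratio, 1080 vs 720p: {pixels_1080 / pixels_720:.2f}x")       # 2.25x
print(f"Less time per pixel at 1080: {1 - pixels_720 / pixels_1080:.1%}")  # ~55.6%

# "More total samples" claim: 720p with 4x AA vs 1080 with no AA.
samples_720_4xaa = pixels_720 * 4           # 3,686,400 samples
print(samples_720_4xaa > pixels_1080)       # True

# Illustrative only: per-pixel time budget at an assumed 60 fps target
# (the 60 fps figure is my assumption, not from the blog post).
frame_ns = 1e9 / 60
print(f"720p: {frame_ns / pixels_720:.1f} ns per pixel")    # ~18.1 ns
print(f"1080: {frame_ns / pixels_1080:.1f} ns per pixel")   # ~8.0 ns
```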
 
All makes sense but there is one thing that confuses me about the difference between progressive and interlaced. So if interlaced displays half a frame every 60th (or 50th) of a second, so one complete frame every 30th of a second, does that mean progressive just displays one complete frame every 30th of a second? Surely that would mean games can go no faster than 30fps? I must be wrong about something here, I'm just not sure what.
 
Kreeeee said:
1080p can run at 60fps, it's just that 1080p renders the whole screen while 1080i renders alternating lines (odds then evens then odds ...). 1080i doesn't use that much less processing power and if you pop over to hardocp or OcUK monitors you can read all about it :)
Ok, I think I know where I was getting confused. I was thinking that 1080i would render half of one frame, then the other half, when really it renders half of one frame, then half of the next. Makes sense now.
 
Psyk said:
All makes sense but there is one thing that confuses me about the difference between progressive and interlaced. So if interlaced displays half a frame every 60th (or 50th) of a second, so one complete frame every 30th of a second, does that mean progressive just displays one complete frame every 30th of a second? Surely that would mean games can go no faster than 30fps? I must be wrong about something here, I'm just not sure what.

Exactly right if games were 30p, but games run at 60p (or 50p for PAL), so you get much smoother motion than at 60i/50i, and a higher effective resolution because it's displaying a full frame, not half a frame.
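A minimal sketch of the timing difference being described, assuming a 60Hz signal and a game that renders a new frame every tick (field order and naming are simplified for illustration):

```python
# Illustrative timeline of what the display receives each 1/60 s tick,
# assuming the game renders a new full frame every tick.

def progressive(frames):
    """60p: every tick transmits all the lines of the current frame."""
    for tick, frame in enumerate(frames):
        yield tick, "all lines", frame

def interlaced(frames):
    """60i: every tick transmits only half the lines (odd, then even),
    and each field comes from a different rendered frame."""
    for tick, frame in enumerate(frames):
        field = "odd lines" if tick % 2 == 0 else "even lines"
        yield tick, field, frame

frames = [f"frame {i}" for i in range(4)]
for tick, lines, frame in interlaced(frames):
    print(f"60i  t={tick}/60 s: send {lines} of {frame}")
# 60i  t=0/60 s: send odd lines of frame 0
# 60i  t=1/60 s: send even lines of frame 1
# 60i  t=2/60 s: send odd lines of frame 2
# ...
for tick, lines, frame in progressive(frames):
    print(f"60p  t={tick}/60 s: send {lines} of {frame}")
# 60p  t=0/60 s: send all lines of frame 0
# ...
```

Either way the engine produces 60 full frames a second into the same 1920x1080 buffer; the difference is only in which lines the display chip sends out each tick.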
 
1080p has several frame rates depending on what the filmer decides to use - 24fps for films, 25 or 50fps for PAL, 30 or 60fps for NTSC.

I love how all these console developers are whinging about 1080p and saying it's not much better than 720p. On the PC side we've had 1200p for years! I can tell the difference between 1080p and 720p quite easily.
 
Echo toxin said:
I love how all these console developers are whinging about 1080p and saying it's not much better than 720p. On the PC side we've had 1200p for years! I can tell the difference between 1080p and 720p quite easily.

  • 2.25x: that’s how many more pixels there are in 1920x1080 compared to 1280x720
  • 55.5%: that’s how much less time you have to spend on each pixel when rendering 1920x1080 compared to 1280x720—the point being that at higher resolutions you have more pixels, but they necessarily can’t look as good
  • 1.0x: that’s how much harder it is for a game engine to render a game in 1080p as compared to 1080i—the number of pixels is identical so the cost is identical
    There is no such thing as a 1080p frame buffer. The frame buffer is 1080 pixels tall (and presumably 1920 wide) regardless of whether it is ultimately sent to the TV as an interlaced or as a progressive signal.
  • 1280x720 with 4x AA will generally look better than 1920x1080 with no anti-aliasing (there are more total samples).
 
I'm sure I read somewhere that 1080i won't be that different, if at all, because flat-screen LCDs are progressive in nature or something like that, so it deinterlaces, I think.

*gets coat
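That's roughly right: a fixed-pixel progressive panel has to deinterlace an interlaced signal before it can show it. As a purely illustrative sketch, the simplest method (weave deinterlacing) just interleaves two consecutive fields back into one frame; real TVs use more sophisticated motion-adaptive deinterlacers:

```python
# Weave deinterlacing: rebuild a full frame by interleaving the lines of
# two consecutive fields (even-line field + odd-line field).
# Works perfectly for static images; moving objects show "combing",
# which is why real deinterlacers are motion-adaptive.

def weave(odd_field, even_field):
    """odd_field holds lines 1, 3, 5, ...; even_field holds lines 0, 2, 4, ..."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

# Toy 4-line "image": field contents are just labels here.
even_field = ["line 0", "line 2"]
odd_field  = ["line 1", "line 3"]
print(weave(odd_field, even_field))
# ['line 0', 'line 1', 'line 2', 'line 3']
```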
 
Echo toxin said:
1080p has several frame rates depending on what the filmer decides to use - 24fps for films, 25 or 50fps for PAL, 30 or 60fps for NTSC.

I love how all these console developers are whinging about 1080p and saying it's not much better than 720p. On the PC side we've had 1200p for years! I can tell the difference between 1080p and 720p quite easily.

We've had 1536p for years.
 
Echo toxin said:
1080p has several frame rates depending on what the filmer decides to use - 24fps for films, 25 or 50fps for PAL, 30 or 60fps for NTSC.

I love how all these console developers are whinging about 1080p and saying it's not much better than 720p. On the PC side we've had 1200p for years! I can tell the difference between 1080p and 720p quite easily.
Yes, you can tell the difference quite easily, but that's when comparing like for like, with the same graphics detail settings. What the article is saying is that if you go from 720p to 1080p on the same hardware, you may have to turn down some graphics settings and features to get it to run as well as it does at 720p, meaning 1080p could make a game look worse than it potentially could.

It's all great running things at 1080p++, but I'd rather run at an acceptable framerate and a perfectly fine lower resolution on high detail than run a game on a higher resolution with less detail.
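To put a rough number on that tradeoff: if a game were entirely pixel-shader or fill-rate bound, frame time would scale with pixel count, so a hypothetical 60fps at 720p would drop to roughly 27fps at 1080p with identical settings. A minimal sketch of that estimate (the 60fps baseline is an assumed figure for illustration, and real games are never purely pixel-bound):

```python
# Rough estimate of the frame-rate hit from rendering more pixels,
# assuming frame time scales linearly with pixel count
# (i.e. the game is entirely pixel-shader / fill-rate bound).

def scaled_fps(base_fps, base_res, new_res):
    base_pixels = base_res[0] * base_res[1]
    new_pixels = new_res[0] * new_res[1]
    return base_fps * base_pixels / new_pixels

# Assumed baseline: 60 fps at 1280x720 (illustrative, not from the thread).
print(f"{scaled_fps(60, (1280, 720), (1920, 1080)):.1f} fps")  # ~26.7 fps
print(f"{scaled_fps(60, (1280, 720), (1920, 1200)):.1f} fps")  # ~24.0 fps
```

In practice CPU work, vertex processing and bandwidth don't all scale with resolution, so the real hit is smaller, but it's exactly the kind of compromise the quoted post is talking about.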
 