Screen Tearing with PS3 & 360 games.

but turning v-sync on does tax the GPU quite a bit;

?

Turning "v-sync" on means that, from a code point of view, you wait for the vertical sync before switching which framebuffer is displayed - i.e. the program sits idle until the video hardware is about to start drawing from the top of the screen again. So with double buffering, if you wait for the v-sync, the GPU will usually be idle for part of each frame, not stressed out. (Although with triple buffering you tend to just get the GPU rendering another frame for the display queue...)
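A minimal sketch of that wait, assuming a 60Hz display; `next_vblank` is just illustrative arithmetic here, not a call into any real graphics API:

```python
import math

REFRESH_MS = 1000 / 60   # one vertical blank every ~16.7 ms at 60Hz

def next_vblank(now_ms):
    """Time of the next vertical-blank boundary at or after now_ms."""
    return math.ceil(now_ms / REFRESH_MS) * REFRESH_MS

# If the GPU finishes rendering at t = 10 ms, a v-synced buffer swap
# sits idle until the vblank at ~16.7 ms before flipping the buffers:
print(round(next_vblank(10), 1))  # 16.7
```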
 
Well I was probably talking out of my bum once again. But on PC games when I turn vsync on, my frame rate suffers, and I thought it was because it taxes the GPU some more.
 
Well I was probably talking out of my bum once again. But on PC games when I turn vsync on, my frame rate suffers, and I thought it was because it taxes the GPU some more.

It reduces the load on the GPU, as it's sitting idle waiting for the vertical sync.

But yes, the frame rate is reduced, because it won't start drawing the next rendered frame until the monitor has finished displaying the current one.

Let's take an example: say your monitor refreshes at 60Hz - that means it takes 16.6ms to display a frame.

If the game engine renders a frame in 10ms, it's running at 100fps. Now if you just swap in the next frame as soon as it's ready, to get 100fps, the monitor will only draw about two-thirds of each frame before it switches to the next - this results in the tearing effect that we're talking about.
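Plugging in those numbers (10ms render against a 60Hz scan-out - both just the example figures from above), the fraction of each frame the monitor actually finishes before the buffer changes works out like this:

```python
REFRESH_MS = 1000 / 60   # monitor takes ~16.7 ms to scan out one frame
RENDER_MS = 10           # engine renders a frame in 10 ms (100fps)

# With v-sync off, the displayed buffer changes every RENDER_MS, so the
# monitor only scans out this fraction of each frame before the switch:
fraction_drawn = RENDER_MS / REFRESH_MS
print(round(fraction_drawn, 2))  # 0.6, i.e. about two-thirds of each frame
```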

Tearing can also result if it takes longer than the refresh time. Say it takes the engine 20ms to render a frame: with a monitor refresh of 16.6ms, the monitor draws the frame, then starts drawing the same frame again for about 3.4ms before it flips to the next - again, it will tear. This is the usual case on the PS3 and 360, as they're doing too much rendering work, or the code is too slow to do everything in the time required to match the refresh.
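The slow case with the same example numbers: a 20ms render against a ~16.7ms refresh leaves a sliver of re-scanned old frame before the flip, and that mid-scan flip is the tear:

```python
REFRESH_MS = 1000 / 60   # ~16.7 ms per refresh at 60Hz
RENDER_MS = 20           # engine takes 20 ms per frame (50fps)

# The monitor finishes scanning the old frame, then re-scans it for the
# leftover time until the new buffer is ready, then flips mid-scan:
leftover_ms = RENDER_MS - REFRESH_MS
print(round(leftover_ms, 1))  # 3.3
```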

With v-sync on, as the PC can generally render faster than the monitor refresh rate, it will result in a lower fps - it will match the monitor refresh. Going back to the example of it rendering quicker - say 10ms to render a frame - with v-sync enabled it has to wait until the monitor is ready to display the next frame, so it sits idle for the remaining 6.6ms of that refresh. Your fps will drop to match the monitor refresh - 60Hz / 60fps. But there won't be any tearing, as the frame being displayed doesn't change while the monitor is partway through displaying it.
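Both cases fall out of one rule - with simple double-buffered v-sync, a frame can only be presented on a vblank boundary, so the effective frame time is the render time rounded up to a whole number of refreshes. A sketch (assumed 60Hz timings, illustrative function only):

```python
import math

REFRESH_MS = 1000 / 60   # 60Hz monitor, ~16.7 ms per refresh

def vsync_fps(render_ms):
    """Effective frame rate when buffer flips only happen on a vblank."""
    # Render time rounded up to a whole number of refresh intervals;
    # the remainder of the last interval is spent sitting idle.
    intervals = math.ceil(render_ms / REFRESH_MS)
    return 1000 / (intervals * REFRESH_MS)

print(round(vsync_fps(10)))  # 60 - renders in 10 ms, idles for the rest
print(round(vsync_fps(20)))  # 30 - misses one vblank, waits for the next
```

This is also why a game that hovers around 50fps with v-sync off can lock to 30fps with it on - every frame that misses the vblank waits a whole extra refresh.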
 
V-sync issues are a big problem for many of us with large HDTVs, as the tearing is very noticeable and ruins the experience in places.

Worst examples I can give are Dead Rising, Splinter Cell Double Agent and Assassin's Creed on the 360, & Drake's Fortune on the PS3.

Strange thing is that if I switch the 360 from VGA to Component, the tearing is much harder to notice (although still there), but then you lose the sharper PQ of VGA. Wonder if this issue is related to some HDTVs running @ 59.94Hz and some @ 60.0Hz, with the sync rate slightly different on different inputs.

We should just be given an in-game option to enable v-sync, even if it means 30fps - which is still good enough anyway if it's constant.
 
Wonder if this issue is related to some HDTVs running @ 59.94Hz and some @ 60.0Hz, with the sync rate slightly different on different inputs.

No. TVs and Monitors are dumb creatures :) They display the picture that is fed to them, that's all.

They don't create any sync information or timings; all that information (vertical sync and horizontal sync pulses etc.) is fed to them as part of the input video signal.
 
No. TVs and Monitors are dumb creatures :) They display the picture that is fed to them, that's all.

They don't create any sync information or timings; all that information (vertical sync and horizontal sync pulses etc.) is fed to them as part of the input video signal.
Well, they must handle the inputs slightly differently then, as Component on the 360 has a lot less tearing than VGA. HDTVs must process these inputs in slightly different ways, otherwise you'd get the same amount of tearing on both sockets.
 
We should just be given an in-game option to enable v-sync, even if it means 30fps - which is still good enough anyway if it's constant.

You do in Bioshock and Saints Row, but part of the beauty of console gaming is that we don't / shouldn't have to arse about sacrificing IQ settings to avoid framerate dips.

That being said, I rate Bioshock and Saints Row as two of the best games on the 360, and I played them both with v-sync disabled...
 
It's just like that PC demo ".kkrieger" that was released a few years ago - a 3D game with an engine on par with the likes of Quake 3, yet it only takes up 96k! It shows what can be done when something is so drastically optimised! (http://212.202.219.162/kkrieger)

That's incredible :eek: I didn't realise you could cram so much into just 96k!
 
Heh yeah it is pretty amazing, I'm surprised you haven't seen it. It's been knocking around for years.

Reminds me of the old demo scene on the Amiga and then the PC - some amazing stuff was created back then.
 