Refresh rate & LCDs

I'm sure this must have been asked before, but I've searched and am still slightly confused :)

With my new LCD display I can set it to 60Hz or 75Hz. 60Hz is recommended. But why? What disadvantage is there to running at 75Hz? :confused: I like to play with vsync on, so 75Hz would be preferred.....so why does everyone say to run at 60Hz?

Please help an LCD newbie ;)

Oh, my new monitor is an HP L1940T 19" and I absolutely adore it. My wife got one for her PC a while back and I've been glaring jealously at it ever since. So I had to treat myself to one with some Xmas money ;) Lovely screens - not a single dead pixel on either one!
 
LCDs don't work like CRTs do. While a CRT will literally refresh the entire screen 75 times a second at 75Hz, an LCD will only refresh what actually changes, pixel by pixel. On an LCD the "refresh rate" is more related to the frequency that the panel runs at, which is why you should run them at the recommended setting, usually 60Hz. Most monitors "downscale" the signal to 60Hz no matter what it's set at, resulting in a reduction in quality.
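
For a sense of scale, here's what each setting means in frame time (a trivial sketch, nominal rates only):

Code:
# Frame interval at each nominal refresh rate: with vsync on, this is
# the fastest a new frame can be shown.
for hz in (60, 75):
    print(f"{hz}Hz -> {1000 / hz:.1f}ms per frame")
# 60Hz -> 16.7ms per frame
# 75Hz -> 13.3ms per frame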

Generally you will want to play with vsync off anyway as it can cause problems in some games.
 
Another way to look at the difference between refreshing on a CRT and a TFT is that on the CRT the pixel is a phosphor blob that gets hit by an electron beam, goes bright, then starts decaying/dimming, so it needs "refreshing", while on a TFT the pixel is "switched" on and stays that way until "switched" off.
 
Why do we still see line tearing on an LCD without vsync when the screen is no longer updated in lines?
 
MadMatty said:
Why do we still see line tearing on an LCD without vsync when the screen is no longer updated in lines?

It is still updated a pixel at a time :). An LCD is still a serial device, and if the information in the graphics card buffer changes mid-frame then you "could" see a mismatch, i.e. a "tear".
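
To put rough numbers on that serial transmission (a sketch assuming a 1280x1024 panel at 60Hz, and ignoring blanking intervals, which real pixel clocks have to allow for):

Code:
# Rough scanout arithmetic: how long one frame takes to travel from the
# graphics card to the monitor, pixel by pixel. Assumes 1280x1024 @ 60Hz
# and ignores blanking intervals.
width, height, refresh_hz = 1280, 1024, 60

pixels_per_frame = width * height                      # 1,310,720 pixels
frame_time_ms = 1000 / refresh_hz                      # ~16.7ms to send one frame
pixel_clock_mhz = pixels_per_frame * refresh_hz / 1e6  # ~78.6MHz minimum

print(f"one frame takes ~{frame_time_ms:.1f}ms to transmit")
print(f"pixel clock of at least {pixel_clock_mhz:.1f}MHz")

That's plenty of time for the buffer to change while a frame is still in flight.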
 
That's odd, because double/triple buffering should prevent that from ever happening. The frame shown on the screen for games is one that has been completely finished one or two frames in the past. In windowed mode tearing could be explained by the buffer being blitted to another graphics buffer by the OS, but surely not in fullscreen mode?

If output on an LCD doesn't happen in scanlines like on a CRT, then the worst that should happen when running higher than the internal 60fps is frame skipping. Can someone please clarify why tearing happens on an LCD?
 
MadMatty said:
Why do we still see line tearing on an LCD without vsync when the screen is no longer updated in lines?
Tearing occurs because the contents of the framebuffer (on the video card) change while the screen is drawing, so you get part of one frame and part of the next one, hence the tear. This tells me that the data isn't sent from video card to monitor as a whole package, but rather it must be sent in either pixels or lines, and it must take a period of time to send it all. So the problem is that in the time the frame data is being sent from video card to monitor, it can change: the game can flip the new frame from the backbuffer onto the framebuffer (which is instantaneous btw), and the data isn't locked down in any way.

So it doesn't matter how the screen draws (and indeed LCDs may still update the pixels with a line scan), you will still get tearing unless the whole screen can either be transmitted instantly, or the data can be locked down until it's all been sent (which is what v-sync achieves, but at the cost of restricting access to the framebuffer and limiting when the backbuffer is available for your game to draw onto). Another solution would be for the frame data to be copied to another buffer to protect its integrity during transmission to the monitor.

I'm sure video card manufacturers could fix this if they really wanted to. I guess they don't see tearing as a big problem; after all, you even get tearing on brand new consoles like the 360.

So to sum up, it's not a monitor draw problem, it's more of a data transmission problem.
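
Here's a toy model of that (Python, with the screen shrunk to 8 lines and the flip point picked arbitrarily): scanout reads the front buffer a line at a time, and nothing stops the flip landing mid-read.

Code:
# Toy model of a tear: scanout reads the front buffer one line at a
# time, and the game flips a finished frame in mid-scanout.
LINES = 8
frame_a = ["A"] * LINES   # frame currently being scanned out
frame_b = ["B"] * LINES   # next frame, already finished in the backbuffer

front = frame_a
displayed = []
for line in range(LINES):
    if line == 5:          # flip lands mid-scanout (point chosen arbitrarily)
        front = frame_b
    displayed.append(front[line])

print("".join(displayed))  # AAAAABBB - top of one frame, bottom of the next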
 
fish99 said:
So to sum up, it's not a monitor draw problem, it's more of a data transmission problem.

Yup, that's what V-Sync (vertical synchronisation) is for. Turn it off and you run the risk of the display/GFX card running out of sync, giving "tearing".
 
Could the effect be severely lessened if the game implemented a frame limiter that matched the vsync rate? That way vsync could be disabled so the up-and-down frame rate doesn't happen, and also the game never tries to draw more than the monitor can display, which in theory stops overdraw/tearing.
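
Something like this rough sketch is what I mean (REFRESH_HZ and render_frame() are just placeholders for illustration):

Code:
import time

# Rough sketch of the frame limiter idea: never start a new frame
# until a full refresh interval has passed.
REFRESH_HZ = 60
FRAME_TIME = 1.0 / REFRESH_HZ

def render_frame():
    pass  # stand-in for the game's rendering work

next_deadline = time.perf_counter()
for _ in range(600):                 # ten seconds' worth of frames at 60fps
    render_frame()
    next_deadline += FRAME_TIME
    sleep_for = next_deadline - time.perf_counter()
    if sleep_for > 0:
        time.sleep(sleep_for)        # wait out the rest of the interval
    else:
        next_deadline = time.perf_counter()  # running behind; don't try to catch up

(Though I suppose that only matches the rate, not the phase of the monitor's scan, so a tear could still land somewhere.)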
 
MadMatty said:
Could the effect be severely lessened if the game implemented a frame limiter that matched the vsync rate? That way vsync could be disabled so the up-and-down frame rate doesn't happen, and also the game never tries to draw more than the monitor can display, which in theory stops overdraw/tearing.

I think the Doom3 engine already does this.
 
The Doom3 engine is just limited to 60fps (unless you use a console command to remove it).

Just use V-sync with triple buffering on. Then you get no tearing and don't suddenly drop to half-fps if your PC can't handle it.
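
Rough numbers on why that half-rate drop happens with plain double buffering, and why the third buffer avoids it (a sketch assuming a 60Hz display and a game that needs 20ms per frame):

Code:
import math

# Toy arithmetic for the half-rate drop: a 60Hz display and a game that
# needs 20ms per frame (so it can only manage 50fps on its own).
REFRESH_MS = 1000 / 60   # ~16.7ms between vblanks
RENDER_MS = 20.0

# Double buffering + vsync: after finishing a frame the renderer has to
# wait for the next vblank before it can start again, so every frame
# costs a whole number of refresh intervals.
double_fps = 1000 / (math.ceil(RENDER_MS / REFRESH_MS) * REFRESH_MS)

# Triple buffering + vsync: the spare buffer means the renderer never
# stalls, so it keeps running at its natural rate.
triple_fps = 1000 / RENDER_MS

print(f"double buffered: {double_fps:.0f}fps")  # 30fps - the half-rate drop
print(f"triple buffered: {triple_fps:.0f}fps")  # 50fps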
 