
is there any way to stop screen tearing?

Because you can only fit one frame into each refresh your monitor makes. If your monitor's refresh rate is 60 Hz, the absolute maximum number of unique frames it can display per second is 60. If your graphics card were outputting more than 60 fps, frames would be fighting for space on each monitor refresh, hence the screen tearing.
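A rough way to picture this is a hypothetical Python sketch (the timings and names are made up for illustration): sample a single 60 Hz scanout top to bottom while a 100 fps GPU swaps buffers the instant each frame finishes. If more than one frame index shows up within one refresh, that's a tear line.

```python
REFRESH_MS = 1000 / 60   # one monitor refresh takes ~16.67 ms at 60 Hz
FRAME_MS = 1000 / 100    # the GPU finishes a frame every 10 ms (100 fps)

def frames_shown_during_one_refresh(refresh_ms, frame_ms):
    """Return which frame indices are visible during one scanout.

    The scanout is sampled top-to-bottom; every time the GPU swaps
    mid-scanout, the lower part of the screen shows a newer frame,
    which is exactly a tear line.
    """
    samples = 100  # sample the scanout at 100 vertical positions
    visible = []
    for i in range(samples):
        t = refresh_ms * i / samples          # time at this scanline
        visible.append(int(t // frame_ms))    # frame index shown there
    return sorted(set(visible))

# More than one frame index inside a single refresh means a visible tear.
print(frames_shown_during_one_refresh(REFRESH_MS, FRAME_MS))  # [0, 1]
```

With the GPU matched to the refresh rate instead (frame every ~16.67 ms), only one frame index appears per scanout, so no tear.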

so some people are saying too high an FPS causes tearing,

Frames per second has nothing to do with tearing (except with some frame rates making it potentially more or less obvious).

A game could output a flawless 60 fps without ever dropping a beat, and EVERY frame could still be torn. Vsync isn't about the frame rate but about precisely when the frames switch.

In other words, only vsync will properly stop tearing.

others are saying FPS has nothing to do with it :confused:
 
I currently have V-Sync and Triple Buffering forced on in my Nvidia control panel but what does Triple buffering actually do? I only forced it on because I was told it makes the input lag a little bit better...
 
so some people are saying too high an FPS causes tearing,

That's the whole point, which is why vsync doesn't allow the fps to go above the refresh rate (otherwise you'd get tearing).



others are saying FPS has nothing to do with it :confused:

What they mean is that regardless of the fps, you can still get tearing. In fact, even if the framerate were exactly 60 fps, you could still get tearing on a 60 Hz monitor, unless each frame coincides with each refresh, i.e. each frame is shown at exactly the same time the screen refreshes. You can test this yourself in certain games by limiting the fps to your refresh rate (say 60 fps): you'll still notice some tearing, because simply limiting the fps is not the same as vsync.

Vsync actually synchronises every frame output from your graphics card with each screen refresh, so that a whole frame is shown for every screen refresh (hence no tearing). The side effect of this is that the maximum fps you can show is the same as your monitor's refresh rate (e.g. 60 fps on a 60 Hz monitor).
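To illustrate why a frame cap isn't vsync, here's a hypothetical sketch (integer microsecond timings, names invented for the example): both cases below run at exactly 60 fps on a 60 Hz monitor, and the only thing that changes is where in the refresh each buffer swap lands.

```python
REFRESH_US = 1_000_000 // 60   # one 60 Hz refresh, in whole microseconds

def tears(swap_offset_us, refresh_us=REFRESH_US, refreshes=5):
    """True if any buffer swap lands mid-scanout (a visible tear).

    swap_offset_us is when the swap happens relative to the start of
    each refresh: 0 means it coincides with the vblank (which is what
    vsync enforces); anything else means the swap cuts across the
    scanout, even though the frame rate is still exactly 60 fps.
    """
    for n in range(refreshes):
        swap_time = n * refresh_us + swap_offset_us
        if swap_time % refresh_us != 0:   # swap mid-refresh => tear
            return True
    return False

print(tears(0))      # False: swaps aligned with vblank, no tearing
print(tears(7_000))  # True: still exactly 60 fps, but every swap tears
```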
 
So what's the 'rubbery' difference between 60 fps and 120 fps? And aren't most consoles locked to something like 25 fps for PAL? If so, do they suffer from this 'rubbery' effect?
 
I currently have V-Sync and Triple Buffering forced on in my Nvidia control panel but what does Triple buffering actually do? I only forced it on because I was told it makes the input lag a little bit better...

Double buffering, triple buffering, rendering N frames ahead, all of these things have pros and cons attached.

Mostly the difference between double and triple buffering is about letting the CPU get on with work instead of idling. If it's a game that is 'cpu hungry', forced idle times on the CPU aren't good.

With double buffering, you're showing one frame on the monitor while, in the background (on the 'back buffer'), you've finished drawing the next one coming up. Your CPU is now stuck with nothing to work on while it waits for the vsync flip to free up the space to start drawing the frame after that.

The drawback to rendering too much data ahead of time is that you run the risk of always playing in the past: whatever actions you take now, based on what you see on screen, could already be 1, 2 or more pre-generated frames out of date. If surplus frames aren't discarded when this happens, it can be felt as a kind of lagged 'elastic band' input delay. If they are discarded, the frame rate will feel less smooth, as it'll jump a bit irregularly.
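That 'playing in the past' effect can be sketched with a toy queue model (hypothetical Python, one queue slot per pre-rendered frame): the deeper the render-ahead queue, the older the frame that's actually on screen.

```python
from collections import deque

def steady_state_lag(queue_depth, refreshes=20):
    """Refreshes between building a frame and showing it, once the
    render-ahead queue is full. Deeper queues mean staler frames."""
    queue = deque()   # entries are (frame_index, refresh_when_built)
    built = 0
    lag = 0
    for refresh in range(refreshes):
        # The renderer keeps the queue topped up to queue_depth frames,
        # each built from the input state at the moment it was queued.
        while len(queue) < queue_depth:
            queue.append((built, refresh))
            built += 1
        # The monitor displays the oldest queued frame each refresh.
        _, built_at = queue.popleft()
        lag = refresh - built_at   # age of the frame now on screen
    return lag

print(steady_state_lag(1))  # 0: no render-ahead, the frame is fresh
print(steady_state_lag(3))  # 2: each shown frame is 2 refreshes old
```

At a 60 Hz refresh, each extra queued frame is roughly another 16.7 ms between your input and what you see.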

I'm loath to suggest there's a perfect setting out there; it often depends on the hardware and the software combined. You also have to be careful with games that have their own home-rolled frame management, because if you also apply your video card's control panel settings, you could needlessly double up.

The only general advice I'd give is: if the game has a feature to control anything along these lines, use it; if it doesn't, then force/override it with the graphics panel as needed.
 
Try downloading (Google it)

D3DOverrider standalone

and disable vsync in all your games, make D3DOverrider start with Windows (it helps with all things D3D) and set detection to low.

No more screen tearing or fps drops. :)

This seemed to fix it, other than I can't use triple buffering, as it makes the games lag :( more VRAM needed, I imagine.

Everything is on ultra at the moment (only 1024x768, as I left it all on defaults and just turned off in-game vsync) and it seems to be running smoothly :)
 
Why do you require over 60FPS? It's unnecessary unless you have superhuman vision and can notice the difference between 60 and 120...

No it's not. Mouse movements feel smoother with a higher FPS.

I used to play CoD4 at 333 fps, then one day I joined a modded server which capped the fps at 120 (to prevent certain trick-jumps) and I could immediately feel the difference, even before I turned the fps counter on. Movements just weren't as smooth.

To the OP: It seems to me like some combinations of fps and screen refresh cause very bad tearing. My mate says that if he caps his fps in CSS at a multiple of his refresh rate (120 Hz) he gets horrible tearing, so he has to offset it slightly.
 
Running above 60 fps is, for me anyway, something that feels easier on the eyes. Even if I can't 'see' the difference, I can assure anyone who suffers from migraines that it can be a great help.

Andi.

In CSS, for example, over 100 fps = a lot less recoil on weapons + better spray accuracy, so even if you "can't see the difference" there's still an advantage to higher framerates.
 