Is there any way to stop screen tearing?

Nope, they don't. The ATI option is only for OpenGL games, which 99% of games aren't; you need to download the D3DOverrider tool that comes with RivaTuner to do it on ATI cards.

Cue ATI fanboys justifying ATI's lack of such a simple option as being down to Microsoft's wishes, despite NVIDIA's forced v-sync option being 1) purely optional and 2) extremely useful.

OK then, seeing as you know why, tell us why AMD can force V-Sync in CCC for OpenGL but not for DX, if it wasn't for the guidelines? Because you have not justified your comment.

Explaining the way it is does not make anyone a fanboy, because I too wish CCC had the option to force V-Sync in DX games.

It seems all too common lately for some members to throw around the fanboy card to try to shut down any counter-comment, simply because they don't like the way things are.

A fanboy is someone who knowingly twists the truth and exaggerates, and you have not shown any of that.
 
So, from what I've deduced from all of this: if you go over the monitor's refresh rate, you get screen tearing?
So why have a v-sync option? Why isn't it just built into Windows and forced on everyone, as it will automatically lock it to the highest usable FPS anyway :confused:

Because there's much more to it than just framerate - a 60Hz vwait induces quite a bit of input latency, which for a lot of people is very noticeable - whereas personally I only notice tearing in areas with lots of parallel lines, like radiators and blinds, and would much rather have that than the slightly "elastic", bandy feeling of input lag.

One nice feature in the upcoming game from id Software - RAGE - is advanced v-sync, which is controlled at the software level and can be applied adaptively, allowing minimal tearing when the framerate is dropping, to keep performance smooth, and also helping to reduce noticeable input latency.
 
Test it for what, though? This is what I don't understand: what's the point in having a card that can do 121Hz+ if you get tearing whenever you actually use it? :confused:

Because most of the time the card will not send out the next frame too early, which is what causes screen tear. Some people will get a massive amount of it, whereas some people will be looking at this thread with their SLI 580s and saying "screen tear? what's that?" Sadly I am in the 'v-sync is compulsory' club and get huge screen tearing without it.

Also, every game has about a billion settings and requires a different amount of processing power to render each frame. For example, while I'm playing CS:S I get 300 FPS if I turn v-sync off, but in Crysis I only get 30-60 FPS with one of the custom configs.

Also, just to make sure everyone understands this (I'm sure most of you do, but it looks like some of you don't): v-sync does not lock the framerate to 60 FPS (or whatever your monitor can do). It prepares the next frame as usual, forces the graphics card to wait until the screen is ready for it, then sends it to the screen.

You could end up getting something like this:

frame 1: a
frame 2: b
frame 3: b
frame 4: c
frame 5: d
frame 6: d

So you can see it doesn't 'lock' the framerate.
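To make that concrete, here's a minimal C++ sketch simulating the double-buffered v-sync behaviour described above, with made-up render times for frames a-e (every number in it is invented for illustration); it prints exactly the a, b, b, c, d, d pattern:

#include <cstdio>

int main() {
    // Hypothetical render times (ms) for frames a..e; 'c' and 'e'
    // take longer than one 60Hz refresh (~16.67ms), so they miss a
    // vblank and the previous frame gets shown twice.
    const double renderMs[] = {10.0, 14.0, 20.0, 12.0, 25.0};
    const double refreshMs  = 1000.0 / 60.0;
    const int    numFrames  = 5;

    int    next     = 0;            // frame currently being rendered
    double doneAt   = renderMs[0];  // when it will be finished
    char   onScreen = '-';          // what the monitor is showing

    for (int refresh = 1; refresh <= 6; ++refresh) {
        double vblank = refresh * refreshMs;
        if (next < numFrames && doneAt <= vblank) {
            onScreen = 'a' + next;  // the new frame made it in time: swap
            ++next;
            // double-buffered model: the next render starts at the vblank
            if (next < numFrames) doneAt = vblank + renderMs[next];
        }
        // else: the frame missed this vblank, so the old one repeats
        std::printf("refresh %d: frame %c\n", refresh, onScreen);
    }
}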

*edit*
Because there's much more to it than just framerate - a 60Hz vwait induces quite a bit of input latency, which for a lot of people is very noticeable - whereas personally I only notice tearing in areas with lots of parallel lines, like radiators and blinds, and would much rather have that than the slightly "elastic", bandy feeling of input lag.

Ah yes, there's that too. Some games have it more than others. Try www.thehunter.com with v-sync forced on at the driver level and you'll see what he's on about.
 
Glad you posted that, as it also nicely illustrates why v-sync is bad for input latency. As you can see from frames 2/3 and 5/6, you have duplicate data, so you end up with an additional 16.6ms of lag at 60Hz on top of the existing 16.6+ms of lag - hence why it feels "rubber bandy", as the interval between input and the result being shown can vary so much. When the CPU/GPU is probably buffering up an additional 1-2 frames as well, you can hit 100ms of input latency, which is rather nasty.
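As a rough back-of-the-envelope sketch of where a figure like that comes from (the frame counts below are assumptions for illustration, not measurements):

#include <cstdio>

int main() {
    const double refreshMs = 1000.0 / 60.0;  // one 60Hz refresh, ~16.67ms

    // Assumed worst-ish case under v-sync: the input lands just after
    // a vblank, one vblank gets missed (a duplicated frame, as above),
    // and the CPU/GPU has buffered up 2 extra frames.
    double waitForVblank  = refreshMs;       // input waits for the next swap
    double missedVblank   = refreshMs;       // the duplicate frame adds one more
    double bufferedFrames = 2 * refreshMs;   // pre-rendered frame queue
    double scanout        = refreshMs;       // drawing the frame to the panel

    double total = waitForVblank + missedVblank + bufferedFrames + scanout;
    std::printf("~%.0fms input-to-screen latency\n", total);  // ~83ms
}

Add the game's own simulation and render time on top and the 100ms figure isn't far-fetched.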
 
Because there's much more to it than just framerate - a 60Hz vwait induces quite a bit of input latency, which for a lot of people is very noticeable - whereas personally I only notice tearing in areas with lots of parallel lines, like radiators and blinds, and would much rather have that than the slightly "elastic", bandy feeling of input lag.

Indeed.

Also, if there's an option to cap the frame rate, tearing can be reduced to near-unnoticeable levels if a multiple of the refresh rate is used.

A 120fps cap on 60Hz will look tear-free most of the time, etc...

But if you get the frame rate high enough - let's say 240, and I have been as high as 400fps - you just don't notice it.
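A rough way to see why, if you assume that what you notice at a tear line is how far the scene moved between the two frames meeting at it (the pan speed below is invented for illustration):

#include <cstdio>

int main() {
    // An object panning across the screen at an assumed speed:
    const double pixelsPerSecond = 1200.0;

    // The visible 'step' at a tear line is roughly how far the scene
    // moved between the two frames that meet there: one frame time.
    const double fpsValues[] = {60.0, 120.0, 240.0, 400.0};
    for (double fps : fpsValues) {
        double offsetPx = pixelsPerSecond / fps;  // movement per frame
        std::printf("%3.0f fps -> tear offset ~%.0f px\n", fps, offsetPx);
    }
    // At 60fps the mismatch at a tear is ~20px; at 400fps only ~3px,
    // which is why very high frame rates make tearing hard to spot.
}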
 
OK then, seeing as you know why, tell us why AMD can force V-Sync in CCC for OpenGL but not for DX, if it wasn't for the guidelines? Because you have not justified your comment.

Explaining the way it is does not make anyone a fanboy, because I too wish CCC had the option to force V-Sync in DX games.

It seems all too common lately for some members to throw around the fanboy card to try to shut down any counter-comment, simply because they don't like the way things are.

A fanboy is someone who knowingly twists the truth and exaggerates, and you have not shown any of that.

You don't help yourself when you swing out every time someone says anything vaguely negative about AMD/ATI, regardless of how constructive or well backed-up their post might or might not be.

Neither company has nailed v-sync and/or triple buffering support, but that's as much an MS issue as anything else - I believe NVIDIA defied MS guidelines to include the forced v-sync options.
 
You don't help yourself when you swing out every time someone says anything vaguely negative about AMD/ATI, regardless of how constructive or well backed-up their post might or might not be.

I don't swing out as soon as anything vaguely negative is said about AMD/ATI; the problem is that too often people add unnecessary crap to it.
His comment made claims like the one he tacked onto the end, and that needs challenging, because far too much misinformation gets generated.
Constructive and backed-up posts I don't have a problem with either - that's just your opinion, because you happen to share the same views as the other person at times, views which are usually far from backed up.


Nope, they don't. The ATI option is only for OpenGL games, which 99% of games aren't; you need to download the D3DOverrider tool that comes with RivaTuner to do it on ATI cards.

That's the truth, and that's all that's needed, and I would not have said a word.

Negativity is not the problem; it's all the BS that's thrown into the mix that is. The BS comes from brand preference, with users exaggerating and twisting the negatives or positives to be worse or better than they really are, and that in fact makes them more of what they're accusing other people of being.
 
Neither company has nailed v-sync and/or triple buffering support, but that's as much an MS issue as anything else - I believe NVIDIA defied MS guidelines to include the forced v-sync options.

Indeed, and I wish AMD did too, but they didn't, and that's that. I'm not going to throw a fit over a company sticking to guidelines, no matter how much I want a feature. If a feature is important enough to an individual, and the only way to get it is to use a GPU from the company that has it, then you use that brand. Now, if the GPU company were mine, I would add loads of features that defied MS guidelines, as long as they didn't break anything.

99% of the reason so many things work together is that everyone sticks to the guidelines; if everyone could just do as they liked all the time, you could kiss goodbye to all of that.
 
OK, now I'm even more confused. So I don't get screen tearing because the FPS is going over 60, but because of something else?
Maybe screen tearing is the wrong term and I'm experiencing something else, and got it confused with this.
 
Because most of the time the card will not send out the next frame too early, which is what causes screen tear. Some people will get a massive amount of it, whereas some people will be looking at this thread with their SLI 580s and saying "screen tear? what's that?" Sadly I am in the 'v-sync is compulsory' club and get huge screen tearing without it.
Also, there can be a really big difference between different 60Hz displays...

My 60Hz plasma is really terrible for showing up screen tearing and makes any game with v-sync disabled unplayable...
 
OK, now I'm even more confused. So I don't get screen tearing because the FPS is going over 60, but because of something else?
Maybe screen tearing is the wrong term and I'm experiencing something else, and got it confused with this.

The tearing is due to the GPU sending out frame data out of sync with the monitor's refresh, regardless of framerate - though it is sort of related to framerate.

The discussion then went off a little into the side effects of the different methods of controlling rendering to reduce or eliminate tearing.
 
The tearing is due to the GPU sending out frame data out of sync with the monitor's refresh, regardless of framerate - though it is sort of related to framerate.

The discussion then went off a little into the side effects of the different methods of controlling rendering to reduce or eliminate tearing.

So by selecting v-sync in games that have it, it fixes the issue with some form of magic behind the scenes, and it doesn't just cap the max FPS to 60 - that's just a side effect?
 
Just to point out that tearing can occur when the framerate drops below the monitor's refresh rate as well, not just when it's higher. In fact, it can occur even when the frame rate is exactly the same as the monitor's refresh rate (or at a multiple or factor of it), unless those frames are being sent to the monitor at the same time it refreshes, i.e. the frame output is in sync with the monitor's refreshes. Of course, whether you can notice the tearing in such cases is another matter.
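A small sketch of that last point: a frame rate exactly matching a 60Hz refresh, but with the buffer swaps landing out of phase with the scanout, tears on every single refresh, at the same height (the 5ms phase offset is an arbitrary assumption):

#include <cstdio>

int main() {
    const double refreshMs = 1000.0 / 60.0;
    const int    lines     = 1080;  // vertical resolution
    const double phaseMs   = 5.0;   // swap lands 5ms into each scanout

    // The monitor draws top-to-bottom over one refresh. If the buffer
    // swap lands mid-scanout, everything below that point comes from
    // the newer frame: a tear line, in the same place every refresh.
    for (int refresh = 1; refresh <= 3; ++refresh) {
        int tearLine = static_cast<int>(lines * (phaseMs / refreshMs));
        std::printf("refresh %d: tear at scanline %d\n", refresh, tearLine);
    }
    // 60fps on 60Hz, yet a persistent tear about a third of the way
    // down the screen. With phaseMs = 0 (in sync), no tear at all.
}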

Personally, I would like to see monitors with a variable refresh rate, dictated by the fps the graphics card is putting out (if that's even possible).
 
So can anyone explain what my settings should be in CCC (considering I'm stuck needing v-sync), or will this vary from game to game?
 
Why do you require over 60FPS? It's unnecessary unless you have superhuman vision and can notice the difference between 60 and 120...

My husband's an RAF pilot, and they do an FPS test with a white screen where you write down the randomly generated black letter you see. The FPS goes up and up until you can't see it any more. He can see up to 106FPS, so all the 60FPS stuff is actually rubbish; everyone is different. Personally I can't tell the difference between 70 and 80 fps, but my hubby can :eek:.

Nat.
 
My husband's an RAF pilot, and they do an FPS test with a white screen where you write down the randomly generated black letter you see. The FPS goes up and up until you can't see it any more. He can see up to 106FPS, so all the 60FPS stuff is actually rubbish; everyone is different. Personally I can't tell the difference between 70 and 80 fps, but my hubby can :eek:.

Nat.

Perhaps that's one of the reasons he's more suited to being an RAF pilot than the rest of us.
 
Frames per second has nothing to do with tearing (except that some frame rates can make it more or less obvious).

A game could output a flawless 60 fps without ever missing a beat, and EVERY frame could still be torn. V-sync isn't about frame rate but about precisely when the frames switch.

In other words, to properly stop tearing, only v-sync will do it.
 
Frames per second has nothing to do with tearing (except that some frame rates can make it more or less obvious).

A game could output a flawless 60 fps without ever missing a beat, and EVERY frame could still be torn. V-sync isn't about frame rate but about precisely when the frames switch.

In other words, to properly stop tearing, only v-sync will do it.

So why, with v-sync on, do my FPS not go over 60? And why, for people with 120Hz monitors with v-sync on, will it not go over 120? Some kind of coincidence?
 
So why, with v-sync on, do my FPS not go over 60? And why, for people with 120Hz monitors with v-sync on, will it not go over 120? Some kind of coincidence?

Because you can only fit one frame into each refresh your monitor makes; hence, if your monitor's refresh rate is 60, the absolute maximum number of unique frames you can fit in per second is 60. If your graphics card were outputting more than 60fps, you'd have frames fighting for space on each monitor refresh - hence the screen tearing.
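To put numbers on the 'fighting for space' idea, here's a small sketch counting how many new frames land inside each 60Hz scanout when the card pushes an assumed 100fps with no v-sync:

#include <cstdio>

int main() {
    const long refreshUs = 1000000 / 60;   // one 60Hz refresh, in microseconds
    const long frameUs   = 1000000 / 100;  // card pushing 100fps, no v-sync

    // Count how many new frames arrive during each monitor refresh.
    // More than one new frame in a single refresh means frames are
    // fighting for space in the same scanout: a tear somewhere on screen.
    long shown = 0;
    for (int refresh = 1; refresh <= 6; ++refresh) {
        long arrived = (refresh * refreshUs) / frameUs - shown;
        shown += arrived;
        std::printf("refresh %d: %ld new frame(s)%s\n",
                    refresh, arrived, arrived > 1 ? "  <-- tear" : "");
    }
}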
 