
Possible to globally cap FPS?

Might seem like a strange question, but can you force all DirectX games to limit themselves to a certain FPS? I know many games do it internally (Q4, Doom 3, GPL, COD4 for example), so it must be technically possible.


I basically want to limit many games to 30FPS.

Some of my games, notably driving ones (and MSTS), can't QUITE manage 60FPS all the time. If you use vsync, a game running at 55FPS looks horrible. It draws 50 frames each second which remain on screen for a single refresh, but 5 every second which hold on screen for twice as long. In terms of juddering, it almost makes a game at 55FPS look like one doing 5FPS (5 twitches a second). Turning vsync off stops this as no frames need to be "held", but then of course you get horrid tearing.
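(For anyone checking the arithmetic being described here, a minimal C++ sketch; the variable names are just illustrative:)

```cpp
#include <cstdio>

int main() {
    // The pattern described above: 55 frames into 60 refreshes means
    // 50 frames shown for one refresh each and 5 shown for two.
    int heldOneRefresh = 50, heldTwoRefreshes = 5;
    std::printf("frames per second:    %d\n", heldOneRefresh + heldTwoRefreshes);      // 55
    std::printf("refreshes per second: %d\n", heldOneRefresh + 2 * heldTwoRefreshes);  // 60
}
```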

30FPS looks fabulous compared to 50 or 55, so how do I tell DirectX "I want 30FPS, and once you've got 30 done, take the rest of the second off"? :P


Any thoughts?
 
No, you can't do this.
The closest you will get is by enabling vsync and setting your refresh rate for that resolution and colour depth to a given value.
 
Which will always be 60+ :(
Needs to be a value that even the trickiest games can exceed.

You'd think something like this would come in handy for developers, wouldn't you?

MSTS is the worst offender; doing 120MPH with the world going by in 5 double-length frames out the window gives me a headache after a while. At least MSTS isn't TOO bad with vsync turned off. For things like rFactor there's just no happy solution, because the camera view swings around so much that tearing is at its most distracting.
 
You can cap some games at 30fps, like the Call of Duty series, by putting in a command like com_maxfps 30, but if you play online you won't be able to play on public servers, as you'll be kicked off.

But you cannot cap using DirectX or anything else.
 
I think I feel better that the wise ones gathered here have come up with all the same answers I managed myself :D

It's ironic: FPS games have all the clever controls to set a frame limit, but don't need it, as the effects of a few doubled frames don't show up when dashing around in random directions. Yet driving games tend not to even have a console, and are the worst affected, owing to all the straight-line motion.

Was worth a try asking you lot anyway.
Cheers
 
Are you sure vsync can cause that? I always use vsync and have never seen anything like that happen; it just doesn't make sense.


Try forcing triple buffering using RivaTuner's tool, as this negates the lost framerate of vsync.
 
You'd think something like this would come in handy for developers, wouldn't you?

If it's something they are concerned about, they can easily implement a framerate limiter themselves within their game.

A driver-level FPS cap independent of vsync would be inherently difficult to implement, because frame presentation is still tied to the refresh rate. With vsync it's relatively simple: the graphics card just scans out whatever is in the framebuffer at regular intervals.
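As a rough illustration of the kind of in-game limiter mentioned above, here's a minimal C++ sketch that sleeps out the remainder of each 33.3ms slot; updateGame() and renderFrame() are hypothetical stand-ins, not any real engine's API:

```cpp
#include <chrono>
#include <thread>

// Hypothetical stand-ins for a game's own update and render calls.
void updateGame() {}
void renderFrame() {}

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameTime = std::chrono::microseconds(33333); // ~30fps target
    auto nextFrame = clock::now() + frameTime;

    for (;;) {
        updateGame();
        renderFrame(); // a real game would Present() to DirectX here

        // Sleep away whatever is left of this frame's slot, so frames
        // come out evenly spaced instead of as fast as the GPU allows.
        std::this_thread::sleep_until(nextFrame);
        nextFrame += frameTime;
    }
}
```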
 
Are you sure vsync can cause that? I always use vsync and have never seen anything like that happen; it just doesn't make sense.


Try forcing triple buffering using RivaTuner's tool, as this negates the lost framerate of vsync.


Vsync on:
there are 60 "ticks" to one second.
Frames may only be drawn on a tick.
You cannot draw 2 frames on any tick.
Therefore the only possible thing to happen when the machine can only generate 55FPS is for 5 of those frames to be held for 2 ticks = 60 ticks in total.

It's like playing 3 beats in a 4/4 bar... vsync off allows "triplet time", 3 evenly spaced beats, but vsync on will result in two notes that last a beat and one that lasts two.

I seem to be one of the few to notice it though. As I said, it's forward motion in TrainSim that REALLY makes it noticeable.

Besides, read any graphics tweaking guide and it will say that vsync eliminates tearing at the cost of potential occasional stuttering.
 
Try forcing triple buffering using RivaTuner's tool, as this negates the lost framerate of vsync.

I understand what you are saying, but it sounds dumb. Surely all forcing 30fps would do is mean that you only get a frame every other tick, thus just making the pause happen _every other frame_?
 
The thing is, when the game settles at 30fps, it looks just as nice as 60 really, much nicer than 55, because it's even and regular.

Like I say, most folks don't seem to notice it.

Triple buffering is of limited use in something like MS TrainSim because it will settle down at, say, 55 now and then for quite a while; triple buffering seems to help much more during momentary dips.

It's a nightmare of a game. I had a 2800XP with a Radeon 9800 Pro until I started playing it. Both boxes in my sig are almost entirely the result of a quest to get through Clapham Junction at 20+fps. Managed it at last, but still a LONG way from 60. Who needs Vantage when you have a cantankerous old MS bugfest that ignores all the best bits of your hardware and is full of 2008-spec addons?
 
Vsync on:
there are 60 "ticks" to one second.
Frames may only be drawn on a tick.
You cannot draw 2 frames on any tick.
Therefore the only possible thing to happen when the machine can only generate 55FPS is for 5 of those frames to be held for 2 ticks = 60 ticks in total.
This isn't quite the way that it works, I'm afraid. You need to think of the game update not in frames per second, but in the amount of time needed to render each frame.
For instance, if your machine can only push 55fps with vsync off, that means a frame takes on average 18.1ms to render. Enabling vertical syncing means that if the game update takes longer than 16.6ms, the frame is not sent to the screen on that vertical refresh but on the next one - i.e. at 33.3ms. 1 / 33.3ms = 30 frames per second. If your machine is too slow to make 30fps (e.g. 29fps without vsync) then with vsync your computer will miss both the 16.6ms and 33.3ms refreshes and the one at 49.9ms will be used instead - giving an effective 20fps.

To summarise - if your computer can only make 55 fps average in your game then vsync will force the game to 30 fps.
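The rule described above can be written as a one-liner: with double-buffered vsync, the render time is rounded up to a whole number of refresh intervals. A quick C++ sketch (assuming a 60Hz refresh and double buffering; not code from any actual driver):

```cpp
#include <cmath>
#include <cstdio>

// With double-buffered vsync, a frame that misses a refresh waits for the
// next one, so render time rounds UP to a whole number of 16.6ms intervals.
double effectiveFps(double renderMs, double refreshHz = 60.0) {
    double intervalMs = 1000.0 / refreshHz;
    return refreshHz / std::ceil(renderMs / intervalMs);
}

int main() {
    std::printf("%.0f fps\n", effectiveFps(18.1)); // 55fps uncapped -> 30 under vsync
    std::printf("%.0f fps\n", effectiveFps(34.5)); // 29fps uncapped -> 20 under vsync
}
```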
 
This isn't quite the way that it works, I'm afraid. You need to think of the game update not in frames per second, but in the amount of time needed to render each frame.
For instance, if your machine can only push 55fps with vsync off, that means a frame takes on average 18.1ms to render. Enabling vertical syncing means that if the game update takes longer than 16.6ms, the frame is not sent to the screen on that vertical refresh but on the next one - i.e. at 33.3ms. 1 / 33.3ms = 30 frames per second. If your machine is too slow to make 30fps (e.g. 29fps without vsync) then with vsync your computer will miss both the 16.6ms and 33.3ms refreshes and the one at 49.9ms will be used instead - giving an effective 20fps.

To summarise - if your computer can only make 55 fps average in your game then vsync will force the game to 30 fps.


This man has it correct :)

The 'twitching' you refer to could be when your framerate is hovering around 60fps. When the maximum possible output framerate is constantly dancing between 59 and 60fps (i.e. just above and below 16.6ms frametime) you can get some strange-looking behaviour as it switches between the 30fps and 60fps regimes.

Personally, I never enable vsync except in old games where I can guarantee >60fps constantly. The occasional switching from 60 to 30fps is FAR more annoying (to me) than a small amount of tearing. Plus, there is the input response issue ('mouse lag') when switching between the two regimes. At least with vsync turned off we don't have this issue.

I recently switched from my trusty CRT to a 24" LCD with 2ms response, and I was very pleasantly surprised at the lack of tearing. The LCD (running at 60Hz) is comparable in tearing to the CRT running at 100Hz. If by chance you're running a CRT, try pushing up the refresh rate. Over 100Hz, tearing becomes a lot less noticeable (except in strobe-lit areas, where it still sticks out like a sore thumb).


edit - we really should petition Microsoft (and NVIDIA / ATI) to get triple-buffered vsync enabled as a *working* option inside DirectX. That way we wouldn't have any of the tearing or framerate-drop issues.
 
This isn't quite the way that it works, I'm afraid. You need to think of the game update not in frames per second, but in the amount of time needed to render each frame.
For instance, if your machine can only push 55fps with vsync off, that means a frame takes on average 18.1ms to render. Enabling vertical syncing means that if the game update takes longer than 16.6ms, the frame is not sent to the screen on that vertical refresh but on the next one - i.e. at 33.3ms. 1 / 33.3ms = 30 frames per second. If your machine is too slow to make 30fps (e.g. 29fps without vsync) then with vsync your computer will miss both the 16.6ms and 33.3ms refreshes and the one at 49.9ms will be used instead - giving an effective 20fps.

To summarise - if your computer can only make 55 fps average in your game then vsync will force the game to 30 fps.
That's only assuming every frame takes longer than the refresh interval; there may only be a few frames a second that take longer than 16.6ms.
 
That's only assuming every frame takes longer than the refresh interval; there may only be a few frames a second that take longer than 16.6ms.
Yes, totally. As duff-man says, if it's right on the edge at that magical 16.6 ms mark then it'll bounce between 30 and 60 fps and will annoy the hell out of you ;)
 
Until we run out of VRAM.

There is no processing to do, so the only storage requirement is a single image. Uncompressed, at 1920*1200 with 32-bit colour, this is of size:

1920*1200*32/8 bytes ≈ 8.79MB.

I think you can spare 9MB of VRAM to get a nice framerate without tearing, can't you?!
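(That figure is just width × height × bytes-per-pixel; a two-line check:)

```cpp
#include <cstdio>

int main() {
    // One extra 1920x1200 back buffer at 32 bits (4 bytes) per pixel.
    long long bytes = 1920LL * 1200 * 32 / 8;
    std::printf("%lld bytes = %.2f MB\n", bytes, bytes / (1024.0 * 1024.0));
    // Prints: 9216000 bytes = 8.79 MB
}
```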
 
Well, all I can say is that 30 and 60FPS are smooth as glass, while 45, 50, 55 and so on look terrible, so we can assume from that that 55fps isn't actually just 30 as suggested; if it were, it'd be smooth. Only in MSTS though, because of the constant-speed forward motion; without that you'd never notice the effect at all.

Guess I just have to deal with playing it with vsync off :(

I could make a video, but it would need to be at 60fps to show what I'm talking about LOL

(and somehow, I think FRAPS might just upset MSTS a little; I'd have to record it off the hideous video-in on my other machine)
 
Well, all I can say is that 30 and 60FPS are smooth as glass, while 45, 50, 55 and so on look terrible, so we can assume from that that 55fps isn't actually just 30 as suggested; if it were, it'd be smooth. Only in MSTS though, because of the constant-speed forward motion; without that you'd never notice the effect at all.

Guess I just have to deal with playing it with vsync off :(

I could make a video, but it would need to be at 60fps to show what I'm talking about LOL

(and somehow, I think FRAPS might just upset MSTS a little; I'd have to record it off the hideous video-in on my other machine)

With vsync on, your output (as read by FRAPS) will be either 60 or 30. The only time it will read anything in between is when you have been switching between the two within the last two seconds.

FRAPS works by averaging over the last few frames of data (usually between one and two seconds' worth), so if you're seeing framerates which are NOT either 30 or 60, then you're getting the "transitioning" that we have been talking about. Hence the 'horrible flicking'.

This assumes you're using a monitor with a 60Hz refresh rate, like 99% of LCDs, of course.
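To see why an averaging counter reads in-between values, consider a window that straddles a 60-to-30 transition. A small C++ sketch (the one-second window is an assumption about how such counters behave, not FRAPS's documented internals):

```cpp
#include <cstdio>

int main() {
    // Half a second of 16.6ms frames followed by half a second of 33.3ms
    // frames: an averaging counter spanning both reads neither 30 nor 60.
    const double fastMs = 1000.0 / 60.0, slowMs = 1000.0 / 30.0;
    int fast = 30, slow = 15;                        // 0.5s at 60fps + 0.5s at 30fps
    double windowMs = fast * fastMs + slow * slowMs; // = 1000ms
    std::printf("counter reads %.0f fps\n", (fast + slow) * 1000.0 / windowMs); // 45
}
```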
 
Isn't there a simple solution? If the poster is so near 60fps that with vsync on it jumps around, then underclock the graphics card/CPU (depending on whether the game is CPU or GPU dependent), or just use the control panel to force some AA on the game (that will bring the framerate down).

For example, if I force 16xAAQ on The Witcher my rates drop from around 50fps to about 15fps. Play around a little and get a setting which doesn't give you any more than, say, 50fps, then switch on vsync and triple buffering and you will have a constant 30fps.
 
Isn't there a simple solution? If the poster is so near 60fps that with vsync on it jumps around, then underclock the graphics card/CPU (depending on whether the game is CPU or GPU dependent), or just use the control panel to force some AA on the game (that will bring the framerate down).

For example, if I force 16xAAQ on The Witcher my rates drop from around 50fps to about 15fps. Play around a little and get a setting which doesn't give you any more than, say, 50fps, then switch on vsync and triple buffering and you will have a constant 30fps.

If we all got a constant framerate throughout a game, that would be fine. In reality, framerates often vary by a factor of 4 or more throughout play. Adjusting detail levels such that the maximum framerate stays below 60 will definitely cause the minimum framerate to drop below 30 in places (leading to a 15fps output).
 