Frames per second

In games like CoD4 you want as high an FPS as possible - you actually see things a split second before anyone else (say they have 30 and you have 125; rough numbers in the sketch below).

And I get terrible pings in CoD4, minimum 75-80 :(
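A back-of-the-envelope sketch of that "split second" claim, purely the frame-interval arithmetic (monitor refresh, input polling and netcode are ignored here, so it's only a rough upper bound):

```python
# Gap between rendered frames at a given FPS. Ignores refresh rate, input
# sampling and netcode, so it only bounds how much sooner new information
# can appear on a high-FPS player's screen.

def frame_interval_ms(fps: float) -> float:
    """Milliseconds between consecutive frames at the given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 125):
    print(f"{fps:>3} fps -> a new frame every {frame_interval_ms(fps):5.1f} ms")

# Worst case, the 30 fps player waits ~33 ms for the next frame while the
# 125 fps player waits ~8 ms - a gap of tens of milliseconds, not whole seconds.
```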
 
Hopefully games in the future will all have some form of motion blurring. For instance, TF2 runs at quite a low framerate on most computers, but few people notice because of the motion blurring. If it's running at just 30fps on a 75Hz monitor, each frame stays on screen for an average of two and a half refreshes. But because each frame is blurred towards the next, the brain sees it as motion rather than a series of still images.

Basically a 30fps game with motion blurring looks smoother than a game running at 45fps without it. The 15fps drop in performance is worth the smoother-feeling gameplay.
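The refreshes-per-frame figure above is just the refresh rate divided by the frame rate; a tiny sketch of that arithmetic:

```python
# How many monitor refreshes each rendered frame stays on screen for, on average.
# A 30 fps game on a 75 Hz monitor repeats each frame for 75 / 30 = 2.5 refreshes.

def refreshes_per_frame(refresh_hz: float, fps: float) -> float:
    return refresh_hz / fps

for fps in (30, 45, 75):
    print(f"{fps} fps on a 75 Hz monitor -> {refreshes_per_frame(75, fps):.2f} refreshes per frame")
```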
 
As stated above, the reason commercial movies look smooth at 24fps is the motion blur that is apparent in any filmed production. If you pause a film you can see this effect. The same obviously does not occur in video games.
 
The same obviously does not occur in video games.

Apart from some :)

Crysis also employs motion blurring, but not quite as well as TF2. I didn't even notice it in TF2 until I took a screenshot while spinning rapidly. Test Drive Unlimited uses a simple method of putting old frames on top of the new frame with higher transparency the older they are; this works quite well but becomes obvious if there is a lot of movement between frames.
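That "old frames on top of the new one, fading with age" trick is basically a weighted blend of the last few frames. A minimal sketch of the idea (plain Python lists stand in for pixel buffers, and the weights are made up for illustration, not what any particular game uses):

```python
# Minimal sketch of accumulation-style motion blur: blend the last few frames
# over the current one, with older frames contributing less. Frames are flat
# lists of grey values just to keep the example self-contained.

from collections import deque

def blur_with_history(current, history, decay=0.5):
    """Blend `current` with previous frames; each older frame's weight shrinks by `decay`."""
    weights = [1.0] + [decay ** (i + 1) for i in range(len(history))]
    total = sum(weights)
    blended = []
    for idx, value in enumerate(current):
        acc = value * weights[0]
        for age, old_frame in enumerate(history):
            acc += old_frame[idx] * weights[age + 1]
        blended.append(acc / total)
    return blended

history = deque(maxlen=3)                          # keep the last three frames
frames = [[0, 0, 255], [0, 255, 0], [255, 0, 0]]   # a bright pixel "moving" left
for frame in frames:
    print(blur_with_history(frame, list(history)))
    history.appendleft(frame)                      # newest frame first, so index == age
```

With fast movement the old frames land far from the new one, which is exactly why the trail becomes visible - the same weakness the post above describes.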
 
sure is :)

I play RTCW a lot and find 125fps helps no end. I can notice the difference between 125 and 76 (the second lowest magic number), but that's mainly due to the character's movement rather than actual smoothness.
 
Also, at the cinema at least, each frame is flashed more than once (with a blanking shutter in between) to smooth the projection. I think it's three flashes per frame, so in effect you see 72 flashes per second.
 
Been reading in a few forums on here about frames per second in games, people wanting daft amounts like 150fps. Correct me if I'm wrong, but I thought our films were made at about 24fps as this is about the limit of what we can see.
So why the need for more?

In films/TV, frames also blur into each other, so the jump from one frame to the next is not too big, and if it is big there is normally a lot of blur. This gives a nice motion blur effect. With games you have harsh differences between frames, which makes the frames more easily noticeable; there is no motion blur (well, for the most part).

One should aim for a constant frame rate of about 75fps to match your monitor. 60 is OK, but you can be sensitive to it, like the 50Hz hum you can get from lights.
 
We can tell the difference between 200 and 400 fps quite easily.

It's pointless having 400fps for gaming though.
 
I thought the human eye could see at 60fps? :confused: Hence why some games, like Doom 3 for example, are limited to it.

The "frame rate" of the human eye isn't muhc more than 10FPS,, but it is not constant so to avoid aliasing we nee need to sample at around a factor 10 times that.

It takes around 100ms to fully process a visual image, which gives a maximum processing rate of about 10FPS. But certain things such as motion can be perceived much faster: even if the motion is not accurately observed and measured, it is detected as part of a fight-or-flight "early" vision system in the old cortex.
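The aliasing point is the same effect that makes wheels appear to spin backwards on film: sample a motion too slowly and the motion you perceive can be completely wrong. A rough sketch with made-up numbers:

```python
# Rough illustration of temporal aliasing: something rotating at a fixed rate
# looks very different depending on how often you sample it. Sampling well
# above the rotation rate preserves the motion; sampling near it aliases
# (the wheel appears slow, frozen, or spinning backwards). Numbers are
# arbitrary, chosen only to show the effect.

def apparent_step_degrees(rotation_hz: float, sample_fps: float) -> float:
    """Perceived rotation per frame, folded into the range [-180, 180) degrees."""
    true_step = 360.0 * rotation_hz / sample_fps
    return ((true_step + 180.0) % 360.0) - 180.0

wheel_hz = 23.0  # the real motion
for fps in (24, 30, 60, 240):
    step = apparent_step_degrees(wheel_hz, fps)
    print(f"sampled at {fps:>3} fps -> looks like {step:+7.1f} deg per frame")

# At 24 fps the 23 Hz wheel appears to creep slowly backwards; at 240 fps the
# sampling is fast enough that the motion is captured more or less faithfully.
```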
 
My cousin, who plays a lot of CSS, likes to have at least a hundred to be as competitive as possible, although I don't understand why, since LCDs can only produce up to a 75Hz refresh rate.
 
Ain't that the magic number for trick jumping?

On The Edge (Q2), you could get from the steps to the mega health without your feet ever touching the ground. :p

It was next to impossible to perform such jumps at lower fps. The difference between 60 fps and 90 fps in some games is quite noticeable.
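The "magic number" business comes from those engines stepping player physics once per rendered frame, so the jump arc is discretised differently at different frame rates. A crude sketch of the idea (made-up jump velocity and gravity, simple per-frame integration, nothing like the actual Quake movement code):

```python
# Crude sketch of why jump height can depend on frame rate when physics is
# stepped once per rendered frame: with a bigger timestep the discrete arc
# samples the true parabola more coarsely, so the peak you actually reach
# shifts a little. Jump velocity and gravity are invented values.

def peak_height(fps: float, jump_velocity: float = 270.0, gravity: float = 800.0) -> float:
    dt = 1.0 / fps
    height, velocity, peak = 0.0, jump_velocity, 0.0
    while velocity > 0.0 or height > 0.0:
        velocity -= gravity * dt      # apply gravity for this frame
        height += velocity * dt       # then move by the new velocity
        peak = max(peak, height)
        if height <= 0.0:
            break
    return peak

for fps in (43, 76, 125, 333):
    print(f"{fps:>3} fps -> peak jump height {peak_height(fps):6.2f} units")
```

In this toy version the reachable peak simply creeps up with the frame rate, which is enough to see why a jump that barely works at 125fps can be out of reach at 60fps.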
 
I can see the difference between 85Hz and 100Hz refresh rates on my monitor, but only when I've got used to 100Hz and then switch back to 85Hz.
 
A lot of dodgy information in this thread :)

There is no known precise limit to what the human eye can perceive, as it varies from person to person. So people chucking out figures like "60fps" or "75fps" are wrong (I can easily tell the difference between 75fps and 160fps). Like the poster above suggests, the real way to notice the difference is if you get used to playing with a constant high framerate and refresh rate. Play for a week with 125fps/125Hz and then switch back to 75fps/75Hz, and it feels less smooth in a game you are well in tune with.

Couple of good articles here: http://amo.net/NT/02-21-01FPS.html
 
Been reading in a few forums on here about frames per second in games, people wanting daft amounts like 150fps. Correct me if I'm wrong, but I thought our films were made at about 24fps as this is about the limit of what we can see.
So why the need for more?
Humans have correctly identified objects shown for 1/220th of a second. That's with experienced military fighter pilots, but it does show what the human visual system is capable of.

I doubt if anything above 100fps makes any difference, but the difference between 30fps and 60fps is clearly visible.

It's also worth noting, in the context of games, that the quoted figure is almost always the average fps rating. It is worthwhile having that higher than is of any real use, because it gives some indication of the minimum fps rating, which will of course be lower. It would probably be better to quote minimum fps on a game benchmark rather than average.
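A quick illustration of how the average can hide exactly the bit you feel (the frame times below are invented for the example):

```python
# Average vs minimum FPS from a list of per-frame render times. The average can
# look healthy while a few slow frames drag the minimum - the part you actually
# notice - much lower.

frame_times_ms = [8, 9, 8, 10, 9, 8, 45, 9, 8, 40, 9, 8]   # two nasty stutters

total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds
minimum_fps = 1000.0 / max(frame_times_ms)

print(f"average: {average_fps:.0f} fps")   # looks fine (~70 fps)
print(f"minimum: {minimum_fps:.0f} fps")   # what the stutter actually feels like (~22 fps)
```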

I doubt an LCD display could perform 150 refreshes per second, because the quoted response time is only for changing between two specific colours. The average pixel transition time in a game will be far higher.
 
It's variations in FPS that I notice more than anything. I've played CSS at 120+ fps and noticed the slowdown when it dropped to ~60, yet I've played Oblivion at around 30fps and it felt smooth as silk.
 
The human eye can't resolve the extremely high frame rates of PC gaming, but you can "feel" the difference: try playing at 200fps then restrict it to 30, and you'll know what I mean.

Plus, why the hate for people who can get 60+fps in Crysis? Willy waving? Jealous much?
 
A lot of dodgy information in this thread :)

There is no known precise limit to what the human eye can perceive, as it varies from person to person. So people chucking out figures like "60fps" or "75fps" are wrong (I can easily tell the difference between 75fps and 160fps). Like the poster above suggests, the real way to notice the difference is if you get used to playing with a constant high framerate and refresh rate. Play for a week with 125fps/125Hz and then switch back to 75fps/75Hz, and it feels less smooth in a game you are well in tune with.

Couple of good articles here: http://amo.net/NT/02-21-01FPS.html

It depends on what you call "perceive". Yes, we can tell the difference between 50 and 150Hz, but our brain takes 100ms to properly process an image, and it isn't digital and varies a lot.

Theoretically, a game could be made to be playable at 25fps, like TV, and with our super-sophisticated brains even 12fps ought to do...
 