• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Does FPS even matter?

It certainly does make a difference. I'm not sure how your eyes perceive x amount of frames, but there is absolutely, definitely a difference between, say, 30 and 60 fps.

I've read an article about the frame capacity of the human eye. I've heard the same, that it's roughly 30 frames per second.

This may well be true, but there is definitely a difference.

For example, on some Source servers/maps I can actually get up to 300fps, and it's certainly a smoother, more fluid experience.
 
I've read an article about the frame capacity of the human eye. I've heard the same, that it's roughly 30 frames per second.

I thought it was 24fps, as that is why movies are filmed at 24fps and why everyone wants to be able to show their Blu-rays at 24fps.
 
There's a big difference. I play CS at 100fps/100Hz (CRT), and if I were to go onto an LCD at 60/75Hz and the matching fps, the difference would be huge, at least in first-person shooters... maybe not so much in MMORPGs.
 
I can clearly detect the difference between 50/60 and 100.

In fact, I can detect a dip of 10fps, from 60 to 50.

I'm not sure whether the human eye can only actually 'see' 24 frames, with the in-between frames acting as filler that smooths our interpretation of motion. Either way, anything below 45fps feels like 'slowdown' to me, which explains why I can't play Xbox 360 or PS3 without getting eyestrain: to me, 30fps is VERY noticeable.
 
I thought it was 24fps, as that is why movies are filmed at 24fps and why everyone wants to be able to show their Blu-rays at 24fps.

The reason TV/films are 24fps is that there is motion blur to merge the frames; computer graphics doesn't have this, so you need more frames per second to get smooth movement.
 
30 was just an example; I'm not sure of the exact amount.

There are also different theories as to how the eyes react in different circumstances (night and day, for example). So I would imagine that watching a monitor emitting artificial light that is constantly changing colour, especially from light to dark, introduces a whole new set of variables as to how sensitive your eyes are to x amount of FPS.

Although that's a good point about movies. I wonder what watching a film at 30+ fps would be like?

To answer the first question, though: imagine there's Man 1 and Man 2, and they watch a man walking across a beach for 10 minutes while taking pictures with their cameras. Say Man 1 takes 5 pictures and Man 2 takes 48 pictures. The pictures are then put into flick books. Man 1's flick book would look very jumpy, as he only has 5 pictures, while Man 2's flick book would look nice and smooth, as he has 48. Yet over those 10 minutes the man walking across the beach still covered the same distance and took every step at the same time.
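Just to put rough numbers on that flipbook picture (purely a back-of-the-envelope sketch using the 5 and 48 picture counts from the example above):

# Sketch of the flipbook example: same 10-minute walk, different numbers
# of pictures, so different gaps between successive pictures.
duration_s = 10 * 60  # the walk lasts 10 minutes

for pictures in (5, 48):
    gap = duration_s / pictures
    print(f"{pictures} pictures -> one picture every {gap:.1f} s")

# For comparison, a game's frame interval at a few frame rates:
for fps in (24, 30, 60, 100):
    print(f"{fps} fps -> a new frame every {1000 / fps:.1f} ms")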

The reason TV/films are 24fps is that there is motion blur to merge the frames; computer graphics doesn't have this, so you need more frames per second to get smooth movement.

Ah yes, that's it; I remember reading something similar.

http://www.100fps.com/how_many_frames_can_humans_see.htm
 
I'm wondering how much influence a varying fps has on our perception of smoothness, hence the reason we all like to have V-sync on.

So a game that has an average of 45fps that fluctuates between 40 and 50 could appear choppier than a 30fps game on vsync?
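Purely as a thought experiment (the frame rates below are made up to match the question, not measured from anything), you could compare how much the frame time jumps around in the two cases:

# Toy comparison: frame times for a game bouncing between 40 and 50 fps
# versus one locked at 30 fps. Numbers are illustrative only.
fluctuating_fps = [40, 50, 42, 48, 45, 40, 50]  # hypothetical samples
locked_fps = [30] * 7

def frame_times_ms(samples):
    return [1000 / f for f in samples]

for name, samples in (("40-50 fps", fluctuating_fps), ("locked 30 fps", locked_fps)):
    times = frame_times_ms(samples)
    print(f"{name}: {min(times):.1f}-{max(times):.1f} ms per frame, "
          f"spread {max(times) - min(times):.1f} ms")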
 
I've got a 4870, so I can use vsync to eliminate tearing and get a constant 60fps. Butter smooth, yo \m/
 
Although that's a good point about movies. I wonder what watching a film at 30+ fps would be like?

This explains why increasing the fps in video is beneficial. I wonder if a similar principle applies to games?

'A new high-definition progressive scan format is not available for picture creation, but is currently being developed to operate at 1080p at 50 or 60 frames per second. This format will improve final pictures because of the benefits of "oversampling" and removal of interlacing artifacts.'

One other point about video: if I play a 50fps TV broadcast on my PC with the monitor set to 60Hz, I get stuttering in the video. I wonder if the same thing happens with games when there is no vsync, although it's possibly difficult to see?
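If I had to guess, the stutter comes from 50 source frames not dividing evenly into 60 refreshes. A quick sketch of what that looks like, assuming each video frame is simply held on screen for whole refreshes:

# Sketch: which source frame is on screen for each refresh when 50 fps
# video is shown on a 60 Hz display (frames held for whole refreshes).
source_fps, refresh_hz = 50, 60
refresh_ms = 1000 / refresh_hz

shown, frame, next_frame_due = [], 0, 0.0
for tick in range(12):  # 12 refreshes = one fifth of a second
    t = tick * refresh_ms
    if t >= next_frame_due:
        frame += 1
        next_frame_due += 1000 / source_fps
    shown.append(frame)
print(shown)  # most frames get one refresh, some get two -> uneven motion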
 
On CSS I have a constant 200fps or so, and while I couldn't see any difference between that and my old PC, which averaged around 50fps (same settings), I could certainly "feel" the difference: it felt smoother and the whole experience was smoother and more enjoyable.
 
The other thing I never see talked about much when this conversation comes up is the difference between passively viewing something and interactively viewing something. People always seem to want to compare watching a film at the cinema with playing a game on your PC, and imo there is a major difference, not just in the viewing environment but in the way your eyes, brain and the rest of your body react to the experience.

With cinema/film you're just watching it, but surely when playing games there's some kind of input/output loop between what you're doing, what you're seeing and how you're reacting?

If your brain knows it's moving the mouse at a certain speed and the screen updates aren't happening quickly enough, then you'll experience lag. Which is another thing: a lot of people talk about FPS lag, but isn't it really input lag? Do you need to factor in things like using a 3200dpi mouse sampling at 100Hz on USB?
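As a very rough way to picture the mouse side of it, you could just add up the worst-case wait for the next mouse poll and the next frame (this ignores drivers, the game's own processing and the display itself, so it's only a crude lower bound under those assumptions):

# Crude lower bound on input lag, counting only mouse polling and the
# frame interval. Real input lag has more stages that this ignores.
mouse_poll_hz = 100  # USB polling rate mentioned above
frame_fps = 60       # assumed frame rate

worst_poll_wait_ms = 1000 / mouse_poll_hz   # you move just after a poll
worst_frame_wait_ms = 1000 / frame_fps      # the input just misses a frame

print(f"mouse poll wait:  up to {worst_poll_wait_ms:.1f} ms")
print(f"frame wait:       up to {worst_frame_wait_ms:.1f} ms")
print(f"rough combined:   up to {worst_poll_wait_ms + worst_frame_wait_ms:.1f} ms")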

OK, with high FPS you may not actually be seeing every frame, but surely while playing you'd pick up on things not being as smooth as they could be. Obviously it's worse if the FPS isn't constant, no matter how high it gets?

Another q off the back of that is, do people who get motion sickness playing FPS games still get it if they are not playing but watching the same thing?


edit: but in answer to the OP, yes it does. Personally I can see a big difference between 30 and 60, but ideally I'd like to run vsync at whatever the highest refresh rate of the monitor I'm using is, with 60 being the minimum.
 
I'm wondering how much influence a varying fps has on our perception of smoothness, hence the reason we all like to have V-sync on.

So a game that has an average of 45fps that fluctuates between 40 and 50 could appear choppier than a 30fps game on vsync?

Vsync is only there to sync (there's the clue) the frame rate output by the GPU with the refresh rate of the monitor. Vsync is at 60 only with a refresh rate of 60 (most LCDs). If your fat Trinitron gaming CRT is refreshing at 85Hz or 120Hz then vsync attempts to lock the fps at 85 or 120. If the GPU can't keep up with that fps, it automatically drops down to the next divisible rate (so, I run NWN2 vsynced, almost always at 60fps, as my monitor's refresh rate is 60Hz; on occasion, with plenty of shadows etc., the GPU can't sustain 60fps, so it drops to 30fps, as 30 divides evenly into 60Hz). Triple buffering and other technologies open up more in-between rates (45fps, IIRC).

Vsync exists for one reason only: to avoid the GPU updating a frame part way through the monitor refreshing the screen. That is tearing, where you see halves of two different frames on your monitor at the same time = ugly.

Edit: The reason vsync really only gets used on LCDs (and not so much with CRTs, as in my example above) is that CRTs have such a quick response time that you don't notice tearing. I think. OK, I've done enough rambling. :D
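To make the "drops to the next divisible rate" bit concrete, here's a little sketch of plain double-buffered vsync (triple buffering and the like behave differently; the render times are just example figures):

# Sketch: with double-buffered vsync a finished frame can only be shown
# on a refresh boundary, so the effective rate is refresh_hz / n.
import math

def vsynced_fps(refresh_hz, render_ms):
    refresh_ms = 1000 / refresh_hz
    refreshes_per_frame = math.ceil(render_ms / refresh_ms)  # wait for the next boundary
    return refresh_hz / refreshes_per_frame

for render_ms in (12, 17, 25, 40):  # hypothetical GPU frame times
    print(f"{render_ms} ms/frame on 60 Hz -> {vsynced_fps(60, render_ms):.0f} fps")
# prints 60, 30, 30 and 20 fps respectively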

This explains why increasing the fps in video is beneficial. I wonder if a similar principle applies to games?

'A new high-definition progressive scan format is not available for picture creation, but is currently being developed to operate at 1080p at 50 or 60 frames per second. This format will improve final pictures because of the benefits of "oversampling" and removal of interlacing artifacts.'

One other point about video: if I play a 50fps TV broadcast on my PC with the monitor set to 60Hz, I get stuttering in the video. I wonder if the same thing happens with games when there is no vsync, although it's possibly difficult to see?

1080p50 and 1080p60 already send frames at 50 or 60 per second. The problem is that if the film makers have filmed at 24fps (and I can't remember why this is; some proper videophiles might jog my memory here), 24 doesn't multiply evenly into 50 or 60, hence the jerkiness every so many frames. My TV has a refresh rate setting of 72Hz, so if I send it 24p each frame is repeated 3 times, with no odd remainder frames causing jerkiness (just need a Blu-ray player now though :(). The deciding factor is really how the material was filmed originally.
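The arithmetic behind that, spelling out the repeat pattern for 24fps material on a 60Hz screen versus a 72Hz one:

# Sketch: how many refreshes each film frame gets on different displays.
def repeat_pattern(film_fps, refresh_hz, frames=6):
    pattern, carry = [], 0.0
    per_frame = refresh_hz / film_fps  # refreshes available per film frame
    for _ in range(frames):
        carry += per_frame
        shows = int(carry)             # whole refreshes given to this frame
        carry -= shows
        pattern.append(shows)
    return pattern

print("24 fps on 60 Hz:", repeat_pattern(24, 60))  # [2, 3, 2, 3, 2, 3] -> uneven, judder
print("24 fps on 72 Hz:", repeat_pattern(24, 72))  # [3, 3, 3, 3, 3, 3] -> even, smooth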
 
If you can't notice the fps difference, give up, seriously :p. Now seriously, this post smells of trollism, but we'll do the shizzle anyway :D. Basically, for single player, as long as you get 30fps you should be fine. Multiplayer is a whole different story :confused: 125 is what I aim for. I don't care if the eyes can't tell; you can feel it for sure when playing. It's like walking (30fps) versus skating on ice (125fps). It does make a hell of a difference whether you agree or not. Also, server lag can be a bit confusing to some novices. If you play often enough you should notice it all.

Most decent players do turn a lot of settings off. I don't mean me, because these days I suck :p
 
Computer Games and their industry driving use of Frames Per Second
It's easy to understand TV and movies and the technology behind them; computers are much more complex, the most complex part being the actual physiology/neuro-ethology of the visual system. A computer monitor of a smaller size is much more expensive than a TV CRT (cathode ray tube). This is because the phosphors and the dot pitch of computer monitors are much smaller and much closer together, making much greater detail and much higher resolutions possible. Your computer monitor also refreshes much more rapidly, and if you look at it through your peripheral vision you can actually watch the lines being drawn on the screen. You can also observe this technology difference when a computer monitor appears in the background of a TV broadcast.

A frame or scene on a computer is first set up by your video card in a frame buffer. The frame/image is then sent to the RAMDAC (Random Access Memory Digital-to-Analog Converter) for final display on your display device. Liquid crystal displays and FPD plasma displays use a higher-quality, strictly digital representation, so the transfer of information, in this case a scene, is much quicker. After the scene has been sent to the monitor it is rendered and displayed exactly as generated. One thing is missing, however: the faster you do this, and the more frames you plan on sending to the screen per second, the better your hardware needs to be. Programmers and game developers working strictly with computers can't reproduce motion blur in these scenes, so even though 30 frames are displayed per second, the scenes don't look as smooth as on a TV. Well, that is until we get to more than 30 FPS.

NVIDIA, a video card maker that recently purchased 3dfx, another video card maker, has just finished a GPU (Graphics Processing Unit) for Microsoft's Xbox. Increased rendering capability and memory, as well as more transistors and instructions per second, equate to more frames per second in a computer game or on computer displays in general. There is no motion blur, so the transition from frame to frame is not as smooth as in movies, at least at 30 FPS. For example, NVIDIA/3dfx put out a demo that runs half the screen at 30 fps and the other half at 60 fps. The result? There is a definite difference between the two scenes, with 60 fps looking much better and smoother than 30 fps.

Even if you could put motion blur into games, it would be a waste. The human eye perceives information continuously; we do not perceive the world through frames. You could say we perceive the external visual world as a stream, and only lose it when our eyes blink. In games, implementing motion blur would cause the game to behave erratically; the programming wouldn't be as precise. Take a game like Unreal Tournament: if motion blur were used, there would be problems calculating the exact position of an object (another player), so it would be really tough to hit anything with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned, that is, the object wouldn't exist at exactly coordinate XYZ. With exact frames, those without blur, each pixel and each object is exactly where it should be in the set space and time.

The overwhelming solution for more realistic gameplay, or computer video, has been to push the human eye past the misconception of only being able to perceive 30 FPS. Pushing the human eye past 30 FPS to 60 FPS and even 120 FPS is possible; ask the video card manufacturers, an eye doctor, or a physiologist. We as humans CAN and DO see more than 60 frames a second.

With computer video cards and computer programming, the actual frame rate can vary. Microsoft came up with a good way to handle this, allowing the frame rate to be locked, when they were building one of their games (Flight Simulator).

Found this article; it's quite old but pretty interesting on what we actually see in computer games. I always thought a constant 60 was going to be the smoothest, but according to this we can see more, so maybe I need a new GPU now lol. Ah, beaten to it by the man above.
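As an aside on the article's point about locking the frame rate: a frame limiter is basically just a sleep until the next frame slot. A minimal sketch of the general idea (not how Flight Simulator actually does it):

# Minimal frame limiter: cap a loop at a target frame rate by sleeping
# out the remainder of each frame slot. Illustrative only.
import time

def run_capped(target_fps=60, frames=10):
    frame_budget = 1.0 / target_fps
    next_deadline = time.perf_counter() + frame_budget
    for _ in range(frames):
        # ... update and render the frame here ...
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # wait out the rest of this frame's slot
        next_deadline += frame_budget

run_capped()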
 