Frames per second

This is down to the TV being 60Hz/50Hz and having to "pulldown" the image.

A 24Hz film on a 24Hz TV is fine. LCDs tend to run at 60Hz, so I turn on vsync so games play at 60. Any more and it can tear; any less than 40 and I can tell, especially in multiplayer FPS games.

You can even see past 60Hz. I could tell the difference between 60 and about 80, but past that you need to be a pro with a good CRT playing competitively.

Aye, it's also down to the interlacing of the image on normal television; watch the film at 24fps at a cinema or on HD and it'll look excellent.
 
Isn't FPS different from Hz?
I thought it was frequency/2, as it takes 2 cycles to get a frame up and then down ready for the next.
 
Maybe with interlaced signals, but a modern PC monitor locked at 60Hz will display 60 full frames per second with vsync enabled.
 
It is quite simple but hard to explain: Hz is simply frequency, how many times something occurs in a second, so it's pretty much the same thing as FPS, the number of frames in a second.
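
To put numbers on that, here's a tiny illustrative sketch (Python, just for the arithmetic): since both are "per second" rates, the time each frame gets on screen is simply the reciprocal of the rate.

```python
# Hz and FPS are both "events per second", so time-per-frame is the reciprocal.
for rate_hz in (24, 50, 60):
    frame_time_ms = 1000.0 / rate_hz
    print(f"{rate_hz} Hz -> one frame every {frame_time_ms:.1f} ms")
# 24 Hz -> 41.7 ms, 50 Hz -> 20.0 ms, 60 Hz -> 16.7 ms
```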

Films are traditionally shot at 24fps (probably due to ye olde cameras back in the day), but PAL (Europe) and NTSC (USA) TVs (CRTs) play at 50Hz and 60Hz respectively. Not so bad for the PAL peoples, as 24fps film can simply be sped up slightly to 25fps, which fits 50Hz exactly.

LCD TVs seem to be 60Hz, so to play 24Hz content on them properly you need a TV that accepts 24Hz, or one refreshing at a multiple of 24 (e.g. 120Hz); this apparently fixes it. Otherwise the frames have to be divided up unevenly to fit the 60Hz of the TV (3:2 pulldown), and you get juddering on slow panning scenes etc. Not what you want after you buy a £1000 TV and a £500 HD player.
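
To see where the judder comes from, here's a rough Python sketch (function name is mine, purely illustrative) that spreads film frames across display refreshes the way pulldown does:

```python
def pulldown_counts(film_fps, display_hz):
    """How many display refreshes each film frame occupies over one
    second, distributing any remainder as evenly as integers allow."""
    counts = []
    acc = 0
    for _ in range(film_fps):
        acc += display_hz
        shown = acc // film_fps   # whole refreshes this frame gets
        acc -= shown * film_fps   # carry the remainder forward
        counts.append(shown)
    return counts

print(pulldown_counts(24, 60)[:8])   # [2, 3, 2, 3, ...] -> uneven 3:2 cadence = judder
print(pulldown_counts(24, 120)[:8])  # [5, 5, 5, 5, ...] -> perfectly even
print(pulldown_counts(25, 50)[:8])   # [2, 2, 2, 2, ...] -> PAL film sped up to 25fps
```

The uneven 2-3-2-3 cadence on a 60Hz panel is exactly what your eye picks up on slow pans; at a multiple of 24 every frame gets the same screen time.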

Games simply output frames as fast as they can: stare at a wall and you get loads of FPS; look at a load of action and it will drop. Vsync locks the FPS to the refresh rate (60 on most LCDs), which looks great.
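
Conceptually, that cap looks something like this toy loop (Python; real vsync blocks on the GPU's buffer swap rather than sleeping, so this is only an approximation of the idea):

```python
import time

REFRESH_HZ = 60
FRAME_TIME = 1.0 / REFRESH_HZ  # ~16.7ms per refresh

def run_capped(render_frame, seconds=1.0):
    """Toy frame cap: render, then wait out the rest of the 1/60s slot.
    A cheap scene (staring at a wall) finishes early and just waits;
    a heavy scene that overruns the slot simply drops below 60fps."""
    frames = 0
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)
        frames += 1
    return frames

print(run_capped(lambda: None))  # ~60, no matter how fast rendering is
```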

Interlacing is when only half the lines on the screen refresh each pass (alternating odd and even lines) rather than all of them, thus requiring half the data. It looks OK, but it's not what you want if you can help it. Progressive scan refreshes every line, every frame.
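
A quick sketch of what "half the lines per pass" means (Python, illustrative only):

```python
def fields(frame):
    """Split a frame (a list of scanlines) into its two interlaced
    fields: even lines one pass, odd lines the next. Each field is
    half the data of a full progressive frame."""
    return frame[0::2], frame[1::2]

frame = [f"line {n}" for n in range(6)]
even, odd = fields(frame)
print(even)  # ['line 0', 'line 2', 'line 4'] -> first field
print(odd)   # ['line 1', 'line 3', 'line 5'] -> second field
```

This is also why the earlier frequency/2 intuition holds for interlaced signals: two fields, i.e. two cycles, make up one complete frame.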

If my uber explanation isn't good enough, Wikipedia is usually good for stuff like this.
 
No, because you are missing the point. Motion detection and odd-frame detection behaviour are not the same as perceiving fluid motion.

It still says we don't have a cap, but see things as a continuous stream that's only stopped when we blink. We aren't digital creatures, so it's impossible for us to have a specific and decisive cap.
 
Which is obvious. We have image data entering our brain continually. However, we are sensitive only to certain changes, and if something smoothly changes at a rate above about 10 FPS then our brain will interpret the changes as motion and not stuttering. There are people who can't perceive motion, and for them TV is simply a fast slide show.

There is a cap, which is determined by the minimum time required to do any processing on the input data. The fastest neurons fire roughly once every 5-10ms, and the signal has to pass through several layers of neurons to perceive even the most basic motion concepts.

Anyway, can you see the 50Hz flicker of fluorescent lights? NO!
 
There's always a lot of dumb stuff repeated when this is discussed.

How can people think that some cute/convenient value is a catch-all gospel when it depends entirely on what is being viewed (what kind of motion are these frames showing?) and what kind of device it's being viewed on?
 
It depends on what you call perceiving. Yes, we can tell the difference between 50 and 150Hz, but our brain takes around 100ms to properly process an image, though this isn't digital and changes a lot.

Theoretically, a game could be made to be playable at 25fps, like TV, and for our supposedly super-sophisticated brains, 12 FPS ought to do...

It doesn't really matter how long it takes our brain to process an image though, I mean even if it took 1000ms, that wouldn't mean that 1fps is acceptable. It would just be an added delay. It's much like the argument about latency, people say "human reaction time is 10-20ms, so there's no need for a ping lower than that". But the reality is that 0+10 is lower than 10+10.

As for TV (25/30fps) and film (24fps), I think part of the problem is that people aren't used to anything higher. I've watched some fast-action videos at higher framerates on a computer monitor, and they are noticeably smoother than a traditional 25/30fps viewing.
 
I've explained this before: the reason you need more FPS in games compared to TV/films etc. is simple, 'motion blur'.

"the most important factor in the theater is the artifact known as "motion blur". Motion blur is the main reason why movies can be shown at 24 fps, therefore saving Hollywood money by not having to make the film any longer than possible (30 fps for a full feature film would be approximately 20% longer than a film shown at 24 fps, that turns out to be a lot of money). What motion blur does is give the impression of more intervening frames between the two actual frames. If you stop a movie during a high action scene with lots of movement, the scene that you will see will have a lot of blur, and any person or thing will be almost unrecognizable with highly blury detail. When it is played at full 24 fps, things again look good and sharp. The human eye is used to motion blur (later on that phenomena) so the movie looks fine and sharp."

Basically, in a game you do not have naturally occurring motion blur; you see each and every frame clearly, and therefore notice the slow frames a lot more, mainly in fast-action first-person games that require you to pan the camera (your point of view) around quickly.

Companies have recently learned that integrating motion blur into games, to emulate what you see on TV or in the movies, can help give the impression that you are getting a constant frame rate. In the PC game Crysis, for example, many people found that even at around 25 frames per second the game, unlike many others at that frame rate, was quite playable; this was due to the implementation of the motion blur.

I think once game engines become better and can be programmed to emulate motion blur more intelligently, the need for such high frame rates in games will not be as important. The problem with PC games at the moment is that the way they show you your environment is too perfect; even in real life, using your very own eyes, when you pan across to view something quickly you get motion blur.
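
Here's a naive illustrative sketch of the idea being described (Python; real engines do velocity-based blur in shaders, this accumulation-style blend is just the simplest version of the concept):

```python
def motion_blur(frames, weight=0.5):
    """Naive accumulation-buffer blur: blend each new frame with the
    running history, so fast-moving objects leave a fading smear that
    hints at the motion between discrete frames."""
    history = None
    blurred = []
    for frame in frames:  # each frame: a row of pixel intensities
        if history is None:
            history = list(frame)
        else:
            history = [weight * h + (1 - weight) * f
                       for h, f in zip(history, frame)]
        blurred.append(list(history))
    return blurred

# A bright pixel moving one position per frame leaves a trail:
frames = [[255 if i == t else 0 for i in range(5)] for t in range(5)]
for row in motion_blur(frames):
    print([round(v) for v in row])
```

The trail is what sells the in-between motion to your eye, which is why a blurred 25fps can feel smoother than a perfectly sharp 25fps.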
 