GT5 @ 240FPS

I thought it was between 25-40 fps depending on the person.

On film, yes, but with 2D/3D computer-generated animation that isn't the case, because of the lack of the motion blur that occurs naturally on film and even in real life. I've explained this before: "the most important factor in the theater is the artifact known as 'motion blur'. Motion blur is the main reason why movies can be shown at 24 fps, saving Hollywood money by not making the film stock any longer than necessary (a full feature at 30 fps would use approximately 25% more film than one shot at 24 fps, and that turns out to be a lot of money). What motion blur does is give the impression of intervening frames between the two actual frames. If you pause a movie during a high-action scene with lots of movement, the frame you see will be heavily blurred, and any person or thing will be almost unrecognizable. Played back at the full 24 fps, things look good and sharp again. The human eye is used to motion blur, so the movie looks fine and sharp."
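For a sense of the numbers, the extra stock at a higher frame rate is simple arithmetic (illustrative figures only, assuming the same runtime either way):

```python
# How much more film stock a 30 fps feature needs vs a 24 fps one,
# for the same runtime. The runtime below is just an example figure.
runtime_s = 120 * 60                 # a 2-hour feature, in seconds
frames_24 = 24 * runtime_s           # frames shot at 24 fps
frames_30 = 30 * runtime_s           # frames shot at 30 fps
extra = frames_30 / frames_24 - 1    # fractional increase in stock
print(f"{extra:.0%} more frames at 30 fps")  # prints "25% more frames at 30 fps"
```

The ratio is runtime-independent: 30/24 = 1.25, so 30 fps always means 25% more frames for the same running time.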

Basically, games have no naturally occurring motion blur: you see each and every frame clearly, so you notice the transition between frames a lot more easily, mainly in fast-action first-person games that have you panning the camera (your point of view) around quickly. Companies have recently learned that integrating motion blur into games, to emulate what you see on TV or at the movies, can give you the impression of a constant frame rate. Take the PC game Crysis: many people found that even at around 25 frames per second it was quite playable, unlike most games at that frame rate, thanks to its clever implementation of motion blur. Once game engines become more advanced and can emulate motion blur more intelligently, such high frame rates in games won't be as important. The problem with computer games at the moment is that the way they show you your environment is too perfect; even in real life, using your very own eyes, you get motion blur when you pan across to view something quickly.
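To make the idea concrete, here is a minimal sketch of one of the simplest approaches, accumulation-style blur: blend the last few rendered frames so fast motion leaves a smeared trail that hints at the in-between positions the eye expects. This is only an illustration; frames here are plain lists of pixel intensities, and a real engine would do this (or a velocity-buffer variant) on the GPU per pixel.

```python
# Accumulation-style motion blur sketch: average the last n frames.
from collections import deque

def make_blur(n):
    history = deque(maxlen=n)   # ring buffer of the n most recent frames
    def blur(frame):
        history.append(frame)
        # average each pixel position across the buffered frames
        return [sum(px) / len(history) for px in zip(*history)]
    return blur

blur = make_blur(3)
# a bright pixel moving left to right, one step per frame
blur([255, 0, 0])
blur([0, 255, 0])
print(blur([0, 0, 255]))  # [85.0, 85.0, 85.0] -- the motion is smeared across all three pixels
```

The smear is exactly what sells the illusion of intervening frames: instead of the pixel snapping between positions, its energy is spread along its path.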
 
I love the misconceptions about human sight, and how readily people are willing to state hearsay as absolute, undeniable fact on here :D

I think you've been had orderoftheflame ;)
 
If I had a pound for every time I've read on forums that it's impossible for the human eye to distinguish between 60 fps and 100 fps in computer games ...... :D

I gave up replying a long time ago :)
 
I've explained this before, "the most important factor in the theater is the artifact known as 'motion blur'. Motion blur is the main reason why movies can be shown at 24 fps, saving Hollywood money by not making the film stock any longer than necessary (a full feature at 30 fps would use approximately 25% more film than one shot at 24 fps, and that turns out to be a lot of money).

I think that needs a smidgen of clarification: are you referring to the physical film stock, whether traditional film or digital, and not the actual runtime of the film?
It's been discussed on a few forums, but the physical cost of storing film isn't that high. It's more that you have to cater for the lowest common denominator, the actual cinema projection equipment, and cinemas are unwilling to invest in new technology unless forced to, because the punters don't seem to care. Film studios still have to cater for that, so 24 fps is lingering on for a few reasons, but cost of production isn't the major concern.


And games can do artificial motion blur if required, PGR4 being a prime example, which I think works quite well!
 
I think that needs a smidgen of clarification: are you referring to the physical film stock, whether traditional film or digital, and not the actual runtime of the film?
It's been discussed on a few forums, but the physical cost of storing film isn't that high. It's more that you have to cater for the lowest common denominator, the actual cinema projection equipment, and cinemas are unwilling to invest in new technology unless forced to, because the punters don't seem to care. Film studios still have to cater for that, so 24 fps is lingering on for a few reasons, but cost of production isn't the major concern.

I was referring to the physical stock; whether it's digital (which is a lot more expensive to store) or traditional, it's still more expensive to store.

And games can do artificial motion blur if required, PGR4 being a prime example, which I think works quite well!

Yes, they can, but not many games emulate motion blur efficiently; in fact, none can. Crysis is the closest I've seen, but it's still a long way off. The point of motion blur isn't the way it looks, it's how effective it is at creating the illusion of a perfectly smooth and consistent frame rate when a game runs at a lower frame rate, and no artificial motion blur in games is anywhere near that advanced at the moment.
 