I've done some video editing and video game programming, so I have some experience with this stuff. For the record:
- Traditionally motion pictures are shot at 24 fps. This will change going forward as more people shoot on HD digital rather than film.
- US Television (NTSC) is at 29.97 fps interlaced (59.94 fields per second)
- European TV (PAL) is at 25 fps (but is higher resolution per frame than NTSC)
- Hi-def TV formats are actually a bit flexible, from 23.976 fps up to 60. Several of those rates are supported mainly to make it easy to convert material meant for older TVs, old movie sources, etc. (a quick sketch of the exact numbers follows this list).
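Those odd-looking numbers (23.976, 29.97, 59.94) aren't rounding noise, by the way; they're the exact ratios 24000/1001, 30000/1001, and 60000/1001. Here's a quick sketch (Python, purely for illustration) of the exact rates and the per-frame time budget each one gives you:

```python
from fractions import Fraction

# Exact frame rates; the NTSC-family rates are the nominal rate divided by 1.001
rates = {
    "film (24p)":          Fraction(24),
    "NTSC film (23.976)":  Fraction(24000, 1001),
    "NTSC video (29.97)":  Fraction(30000, 1001),
    "NTSC fields (59.94)": Fraction(60000, 1001),
    "PAL (25)":            Fraction(25),
    "HD 60p":              Fraction(60),
}

for name, fps in rates.items():
    frame_ms = 1000 / fps  # duration of one frame (or field) in milliseconds
    print(f"{name:21s} {float(fps):8.3f} fps  ->  {float(frame_ms):6.2f} ms per frame")
```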
Many shenanigans ensue when converting between the various video playback formats. All of them rely on your brain doing a lot of the heavy lifting; what you see when you watch a movie isn't what's "really" there (and to some animals, it just looks like garbage). The responsiveness of human vision is highly variable depending on the person, the ambient light, your age, etc., but the low frame rate of movies (24 is actually quite low) has to do with the limits of technology from about 80 years ago. 60fps movies would look noticeably more awesome. Video games should always shoot for 60fps as a minimum; WoW fails here quite often, but whachagonnado.
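As one example of those conversion shenanigans: to get 24fps film onto 59.94-field NTSC, you use 3:2 pulldown, repeating each film frame as alternately three fields and then two (plus a ~0.1% slowdown to hit 23.976). A rough sketch of the cadence below, not any particular tool's implementation:

```python
def three_two_pulldown(frames):
    """Map a sequence of film frames onto interlaced fields using the
    classic 3:2 cadence: A-A-A, B-B, C-C-C, D-D, ...
    Each film frame contributes 3 fields, then the next one 2, alternating."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

# Four film frames become ten fields (five interlaced video frames),
# which is how 24 film frames stretch to cover 30 video frames (60 fields).
print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

The uneven cadence is why panning shots on film-sourced TV broadcasts can look subtly juddery; your brain smooths most of it over.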
Some fun reading:
http://en.wikipedia.org/wiki/HD_TV
http://en.wikipedia.org/wiki/Persistence_of_vision
http://en.wikipedia.org/wiki/Visual_perception
Enjoy!