Name: Anonymous 2005-07-03 22:36
I've noticed that when movies use "pan & scan" to map a wider filmed frame onto a smaller TV screen, the panning effect is very noticeable.
To me it looks like the movie is displayed at whatever its frame rate is, something like 24 frames per second (I don't know exactly what it's supposed to be), while the panning happens faster, say 30 or 60 times per second.
That's what looks so odd: the smooth panning of the pan & scan is out of place against the jerkier motion of the movie itself.
So my question is, why can't they slow the panning down to match the original frame rate? It would take longer to pan to a given spot, but then again it wouldn't look so bad. Anyone in the industry know?
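To illustrate what I mean, here's a little made-up sketch (the 24 fps and 60 Hz numbers and all the names are my own assumptions, not how any real transfer tool works). It just compares a pan offset that creeps a tiny bit every video field against one that only jumps when a new film frame shows up, which is the "match the original rate" idea I'm asking about.

FILM_FPS = 24      # assumed film frame rate
VIDEO_RATE = 60    # assumed NTSC field rate

def pan_offset_smooth(t, start_px, end_px, duration_s):
    # Pan interpolated at the video rate: moves a little on every field.
    frac = min(t / duration_s, 1.0)
    return start_px + (end_px - start_px) * frac

def pan_offset_stepped(t, start_px, end_px, duration_s):
    # Pan quantized to film frames: only jumps when a new film frame appears,
    # so the pan judders at the same cadence as the movie's own motion.
    film_frame = int(t * FILM_FPS)
    total_frames = int(duration_s * FILM_FPS)
    frac = min(film_frame / total_frames, 1.0)
    return start_px + (end_px - start_px) * frac

if __name__ == "__main__":
    # Pan the crop window 200 px over 2 seconds; print the offset at each video field.
    for field in range(int(2 * VIDEO_RATE) + 1):
        t = field / VIDEO_RATE
        print(f"t={t:.3f}s  smooth={pan_offset_smooth(t, 0, 200, 2.0):6.1f}  "
              f"stepped={pan_offset_stepped(t, 0, 200, 2.0):6.1f}")

The "smooth" column changes on every line, while the "stepped" column only changes every few lines, which is the difference I'm seeing on screen.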