

# Motion smoothing

Panning shots probably look awful if you're watching 24 fps content on a device running at 60 Hz (many consumer devices, computers, and Blu-ray players do this by default). Since 24 frames can't be spread evenly across 60 refreshes, you'll see motion judder, and it does look terrible.
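To make the judder concrete: on a 60 Hz panel each 24 fps frame has to occupy either 2 or 3 refreshes (the classic 3:2 pulldown), so on-screen frame durations alternate between roughly 33 ms and 50 ms instead of a steady ~42 ms. A tiny sketch of that cadence (my own illustration, plain Python):

```python
# Sketch: why 24 fps content judders on a 60 Hz display (3:2 pulldown).
# Each film frame is held for either 2 or 3 display refreshes, so the
# on-screen duration of successive frames is uneven.

REFRESH_HZ = 60                      # display refresh rate
FILM_FPS = 24                        # source frame rate
REFRESH_MS = 1000 / REFRESH_HZ

def pulldown_pattern(num_frames=6):
    """Return how many refreshes each film frame occupies (3, 2, 3, 2, ...)."""
    return [3 if i % 2 == 0 else 2 for i in range(num_frames)]

if __name__ == "__main__":
    ideal_ms = 1000 / FILM_FPS       # ~41.7 ms per frame at a true 24 Hz
    for i, refreshes in enumerate(pulldown_pattern()):
        shown_ms = refreshes * REFRESH_MS   # 50 ms or ~33.3 ms on a 60 Hz panel
        print(f"frame {i}: {refreshes} refreshes = {shown_ms:.1f} ms "
              f"(ideal {ideal_ms:.1f} ms)")
```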

However, 24 fps running at a true 24 Hz is perfectly fine, and you will not see flicker - at least not if produced by competent filmmakers. That's because the camera's shutter angle is normally set such that motion blur makes the motion look smooth. Real cinematographers also know exactly how fast they can pan with a given lens without causing any noticeable strobing.
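For anyone unfamiliar with shutter angle: per-frame exposure time is just (angle / 360) x (1 / fps), so the usual 180-degree shutter at 24 fps exposes each frame for 1/48 s, which is where that smoothing blur comes from. A quick illustration (my own, not from anyone above):

```python
def exposure_seconds(shutter_angle_deg: float, fps: float) -> float:
    """Per-frame exposure time: (shutter angle / 360) * (1 / fps)."""
    return (shutter_angle_deg / 360.0) / fps

# A 180-degree shutter at 24 fps exposes each frame for 1/48 s (~20.8 ms),
# which smears motion enough to look smooth at 24 fps.
print(exposure_seconds(180, 24))   # 0.02083...

# The same 180-degree shutter at 60 fps exposes for only 1/120 s,
# so each frame carries far less motion blur.
print(exposure_seconds(180, 60))   # 0.00833...
```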
Comparisons to video games completely miss the point. Most video games are unable to properly simulate motion blur (some try, but it doesn't usually work well), so you have to have high frame rates for things to look smooth. Since you need to react quickly in video games, high frame rates (combined with the lack of motion blur) also help preserve crisp edges on moving objects, so it's a practical advantage. And finally, video games generally try to simulate reality for the player, so players are more concerned that the technology makes the experience believable than with intentionally suspending disbelief (as in film or theater) to passively unpack the narrative nuance of artwork unfolding on a screen.

For those same reasons, sports also do well with higher frame rates, but when fiction (which is obviously disconnected from present reality) tries to use high frame rates, it falls into "uncanny valley" territory - much like the computer-generated humans of movies like "The Polar Express". As others have noted, a few directors have really tried to push HFR film to the public, but it has never been received well (whether or not HFR is recognized as being responsible for the uneasiness felt by audiences watching HFR content).

As for HFR content being "easily" downsampled - it really isn't, at least not with respect to motion blur (which is an essential point that many people miss in these discussions).
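To illustrate why downsampling doesn't get you filmic motion blur for free: simply dropping frames from, say, 120 fps footage leaves each surviving 24 fps frame with its original short exposure, so motion strobes; approximating a longer exposure means blending several consecutive source frames, which is only a rough stand-in for real optical blur. A sketch of both approaches (my own toy example, assuming frames are NumPy arrays):

```python
import numpy as np

def drop_frames(hfr_frames, src_fps=120, dst_fps=24):
    """Naive downsample: keep every (src_fps // dst_fps)-th frame.
    Each surviving frame still has the short exposure of the HFR capture,
    so motion looks strobed rather than filmic."""
    step = src_fps // dst_fps
    return hfr_frames[::step]

def blend_frames(hfr_frames, src_fps=120, dst_fps=24, blend=3):
    """Approximate a longer exposure by averaging `blend` consecutive
    source frames per output frame (roughly emulating a wider shutter).
    This only approximates true optical motion blur."""
    step = src_fps // dst_fps
    out = []
    for start in range(0, len(hfr_frames) - blend + 1, step):
        group = np.stack(hfr_frames[start:start + blend]).astype(np.float32)
        out.append(group.mean(axis=0))
    return out

# Example with dummy one-value "frames" whose content moves over time:
frames = [np.array([float(i)]) for i in range(120)]
print(drop_frames(frames)[:3])    # [0.], [5.], [10.]  - sharp, discrete steps
print(blend_frames(frames)[:3])   # [1.], [6.], [11.]  - averaged positions
```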
> I find the suggestion that videogames don't need to suspend disbelief, but films do, very strange.

True, the distinction may seem strange to some, and it's not always a clear one. Here's how I'd describe it in more detail (just my opinion, of course):

In film, we see a person depicted on the screen with certain characteristics that convey emotion (just as we do with a group of lines comprising a stick figure with similar emotion), but we are forced to activate parts of our brain that span the gap between what we see (which obviously isn't reality) and what that character might feel in his/her universe. To some degree that same experience happens in video games, but I would argue that instead of seeing the emotions and trying to understand them, we (as players) are the person feeling the emotion. In one, we are witnessing events happen to someone else (and exercising empathy), whereas in the other, we are experiencing the events first-hand (even in a third-person view, we're playing as if we are the character in view).

A first-hand experience feels most believable when it's actually realistic (hence the need for high frame rates), but a simple stick figure can feel "believable" if we're engaged and sharing the emotion we believe it to be feeling.

As a side note, maybe that's why I personally prefer classic (8-bit era) video games over modern 3D video games :)

An industry-standard media metadata interface for motion smoothing would be the best solution. It may not necessarily make it to the TV industry, though: although industry standards have been created and adopted widely before (such as USB), plenty of other standards have failed to be adopted (FireWire) despite technological superiority (data rates).
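Just to sketch what such a metadata interface might look like (purely hypothetical - the field names and values below are my own invention, not any existing standard): content could carry a small per-title hint that the TV consults before enabling interpolation.

```python
import json

# Hypothetical per-title display-hint metadata a streaming app or disc
# could hand to the TV; none of these field names are a real standard.
display_hints = {
    "content_type": "narrative_film",   # vs. "sports", "video_game", ...
    "native_frame_rate": 24,            # frames per second as mastered
    "motion_interpolation": "off",      # creator's intent: "off", "allow", "required"
    "note": "Preserve 24 fps cadence; do not apply motion smoothing.",
}

def should_smooth(hints: dict, user_override: bool = False) -> bool:
    """Decide whether the display applies interpolation for this title."""
    if user_override:                   # the viewer can always force it on
        return True
    return hints.get("motion_interpolation") != "off"

print(json.dumps(display_hints, indent=2))
print("Apply smoothing:", should_smooth(display_hints))  # False
```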
