
High TV Refresh Rates or Motion Interpolation - What Is It?

There are many articles about “refresh rates,” “motion smoothing,” and “motion interpolation” for modern consumer televisions, but I share this as the personal opinion of someone who has been a media professional since 1981 and a filmmaker since 1971 (when I was eleven years old). I’m also a dyed-in-the-wool nerd. I started a conversation about frame rates and their cultural effects/significance in 1981 (when I met my future wife, who had been a 30-frame-per-second videographer while I had been shooting 18/24fps film), so I’ve given the topic more thought than most. I’m not suggesting that I’m more correct - I’m just offering what I hope is a fresh perspective so that the reader can appreciate why the topic exists.

In December 2014, a friend emailed me a question for a co-worker who was shopping for a new television:

“...he asked me about refresh rates, which apparently range from 60hz to 600hz(!).  I know that there is a point beyond which it makes no difference at all to us poor humans, but can't remember what that point is.”

This page represents my response.


"Frame rate" refers to the number of individual still images either captured by a motion-picture/video camera during a time interval, or the number of images per interval sequentially presented to a viewer during presentation, whether to a projection screen or an electronic display. Typically, frame rates are expressed in "frames per second," or fps.


Currently, nearly all motion pictures and (U.S.) scripted (as opposed to reality, game, news, etc.) television programs are shot at 24 frames per second, whether acquired on film or on digital imaging sensors. 24fps was established as a standard at the end of the 1920s, when a projection rate needed to be ratified for the new sound films. So for most human beings, every motion picture they have ever seen was shot and presented at 24fps.

A couple of exceptions to this acquisition/presentation frame rate exist: When shooting footage intended to distend and dilate time, cameras record at frame rates higher than the playback rate. So an event shot at 48fps and played back at 24fps takes twice as long to view. This is known as overcranking the camera, and the resulting footage is what we all know as "slow motion," or sometimes "slo-mo." When footage is shot with the camera undercranked at a lower frame rate than the presentation rate, the action appears to speed up.

More recently, a new acquisition/presentation frame rate has seen limited success: motion pictures shot and projected at 48fps. I wrote a long email to you (which I turned into this web page) two years ago regarding the transition from film projection in public cinemas to digital projection. In that treatise, I discuss the changing of production and projection frame rates. Since then, two Hobbit movies have been photographed and distributed at 48fps (and a third is opening imminently) - which have been marketed under the moniker "HFR" (High Frame Rate). This has been possible because of the existing digital projector infrastructure used for 3D projection, which routinely presents footage at 144fps (24 frames per second, alternating left- and right-eye views, with each frame repeated 3 times = 144). Director James Cameron has reportedly been shooting his two sequels to Avatar in 48 or 60fps, and intends to use this same limited HFR exhibition infrastructure.

In the U.S., our old analog television frame rate was 30 frames per second (actually, color television is 29.97fps, but that's another story). Even with the apparently subtle difference between 24 and 30fps, the psychological and cultural impact of the two frame rates is surprisingly noticeable, even to the layperson. Because of our tradition of shooting soap operas on 30fps NTSC video and movies on 24fps film, viewers seeing 30fps presentation perceive the footage as being like video, even when it's a film projector running at 30 frames per second. Likewise, video shot and projected at 24fps is perceived more like film.

In the current U.S. digital ATSC television standard, frame rates of 23.976 to 60fps are supported. Most channels broadcast at either 30fps (1080i and 480p/i channels) or 60fps (720p channels). But movies and scripted TV shows - even those shot only for television - are still shot at 24fps, and are presented on 30 and 60fps video systems via a conversion process known as 3:2 pulldown.
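The cadence behind 3:2 pulldown is simple enough to sketch in code. Below is a minimal illustration (in Python; the function name is mine) of the frame-repetition pattern: every group of 4 film frames is spread across 10 interlaced video fields, which is exactly how 24fps yields 60 fields per second.

```python
def pulldown_2_3(frames):
    """Expand 24fps film frames into 60Hz video fields with a 2:3 cadence.

    Every 4 film frames become 10 fields: 24 fps * 10/4 = 60 fields/sec.
    """
    cadence = [2, 3, 2, 3]  # fields emitted per film frame, repeating
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * cadence[i % 4])
    return fields

# Four film frames A-D become ten video fields:
print(pulldown_2_3(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```

(Real pulldown also alternates odd and even fields and runs at 23.976/59.94 timing; this sketch shows only the repetition pattern.)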


This is actually pretty straightforward, if not necessarily intuitive or obvious. When traditional motion-picture cameras are configured to shoot at 24 frames per second, a semicircular spinning shutter disk exposes the strip of film to light for 180 degrees of its constant-speed rotation, allowing light to fall on the chemistry for about 1/48 of a second. During the interval of the shutter rotation when the film frame is occluded from the lens, an intermittent "claw" mechanism - like a sewing-machine "feed dog" - yanks the film down to the next unexposed frame region. But as anyone who has used a manually-operated still camera knows, 1/48 second is inadequate to freeze all but the slowest of motions during the exposure. As a result, 24fps motion-picture footage contains a large amount of motion-blur artifacts. But we've all been watching these artifacts for well over a century (silent film frame rates were even lower, between 16 and 20fps).
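The relationship between shutter angle, frame rate, and exposure time is a one-line formula. Here's a small sketch (Python; the function name is mine) confirming that a 180-degree shutter at 24fps exposes each frame for 1/48 of a second:

```python
def exposure_seconds(fps, shutter_angle_deg=180.0):
    """Exposure per frame = (shutter angle / 360) * (1 / frame rate)."""
    return (shutter_angle_deg / 360.0) / fps

print(exposure_seconds(24))   # 1/48 s, about 0.0208
print(exposure_seconds(48))   # 1/96 s - one reason HFR footage looks crisper
```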

Another quality that profoundly changes with frame rate is perceived noise reduction. As frame rate increases, the film grain, electronic noise, or digital compression artifacts in any given region of the frame are increasingly averaged out, and the viewer's eye/brain perceives the consistent "signal" - the intended visual information - more clearly.
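As a toy demonstration of that averaging effect, the sketch below (Python; the per-frame noise model is my assumption, not how any real display or eye works) simulates one pixel whose true value is corrupted by random noise each frame. The mean over more frames sits closer to the true value - the standard error shrinks as 1/sqrt(frames):

```python
import random

def averaged_pixel(true_value, frames, noise_sigma=10.0, seed=0):
    """Average `frames` noisy observations of a single pixel value.

    More frames per second means more samples to average, which is
    why higher frame rates visually smooth out grain and noise.
    """
    rng = random.Random(seed)
    samples = [true_value + rng.gauss(0.0, noise_sigma) for _ in range(frames)]
    return sum(samples) / frames

print(averaged_pixel(128, 24))   # near 128
print(averaged_pixel(128, 240))  # typically nearer still
```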


Many current HDTVs have a feature which artificially creates more distinct frames of video per second than actually exist. The highest frame rate of video currently distributed is 60fps (most is 30). In order to synthesize these additional frames, dedicated digital processors interpolate intermediate frames, algorithmically deriving where the pixels in the existing frames would appear in the synthesized in-between frames.
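Real sets estimate per-block motion vectors, but even the simplest possible interpolator - a straight cross-fade between two real frames - conveys the idea. A minimal sketch (Python; an actual TV's algorithm is far more sophisticated than this):

```python
def interpolate(frame_a, frame_b, count=1):
    """Synthesize `count` in-between frames by linearly blending pixel values.

    This naive blend ghosts moving objects; real motion interpolation
    tracks where pixels move and shifts them along estimated vectors.
    """
    inserted = []
    for k in range(1, count + 1):
        t = k / (count + 1)  # position between the two real frames
        inserted.append([(1 - t) * a + t * b
                         for a, b in zip(frame_a, frame_b)])
    return inserted

# One synthetic frame halfway between two 4-pixel frames:
print(interpolate([0, 0, 100, 100], [10, 20, 100, 0]))
# [[5.0, 10.0, 100.0, 50.0]]
```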

Since there's the (appropriate) assumption that TV buyers don't actually know what they're looking for, and don't have a cultivated aesthetic about images, TV marketing - like that of toothpaste and shampoo - is functionally a matter of selling an undifferentiated product. Shoppers can't tell one model from another. So features like Refresh Rates have become part of the Label Wars as you stroll past the dozens of black rectangles hanging on the wall at Best Buy. And Refresh Rates have predictably climbed as manufacturers attempt to provide prospective buyers with any kind of comparative data with which to make decisions.


Filmmaker Douglas Trumbull (among other things responsible for visual effects in 2001: A Space Odyssey and Blade Runner) has championed higher frame rates for acquisition and projection for over 30 years. His (now defunct) Showscan company developed high frame rate systems for several motion-rides (if you've seen the "Secrets of the Luxor Pyramid" show in the Luxor Hotel in Las Vegas, some of the presentation was 60fps 35mm film - which, at points, fairly convincingly looks like a live person on stage). I've heard Trumbull say that in testing, subjects report perceiving differences in frame rates approaching 300fps.


While it may sound like snobbery, my perspective about this is that I think we should respect the artistic intentions of the creators of the content and the technology created to produce that content. Countless engineers, craftsmen and artists (cinematographers combine all three disciplines) have labored and continue to labor to create an infrastructure of acquisition (cameras), recording (film, videotape, digital media), post-production (editing, visual-effects, sound), distribution (broadcast, narrowcast and physical media) and presentation (film, digital projectors, direct-view monitors) technologies which attempt to maintain the fidelity of the recording, and predictably reproduce whatever the desired image might be based upon myriad decisions by professionals.

While televisions which promise "higher refresh rates" via synthesized interpolated frames might create results which are appealing to some viewers, the practice violates the intentions of those who created the content. I liken this to viewing the masterworks of a renowned painter with special glasses that allow the viewer to change any or all of the colors, despite the fact that the painter spent half of the four months creating the piece searching far and wide for sources from which to make their pigments. Or perhaps buying a $1.7M LaFerrari and replacing what I'm assuming are probably custom tires for every corner (four custom tires) with big wheels and tires from Pep Boys that "look cool," and replacing what's probably a half-million-dollar engine with a big-block Chevy because it makes more horsepower. (And yes, people spent a lot more than $1.7M to make that movie look the way it does in the theater - keeping in mind that the actors and producers might still represent 80-90 percent of a budget.)


Good for you. As to whether going as high as the 600Hz apparently offered is desirable, I think you'll have to go and look at some footage on these TVs and compare. Be sure to look at the kind of footage you care about: sports, movies, TV shows. The perceived effect may be different on relatively slow-paced dramas versus panning around a hockey rink. I've seen some models that allow you to select from different Refresh Rates, and so far, I've never seen a TV with higher refresh rates that didn't allow you to disable the feature.


For those of us who have been in the media business for the years since film was dominant, 24fps is a part of our visual language. Look at any video or high-end still camera with "professional" aspirations, and it will offer "24P" as a video setting - 24 frames per second (progressive scanning). This is still perceived as being "film-like," and even to non-technical, non-artistic "money people" - the producers who handle the financing of movies and TV shows but probably know little of how things actually work - it's a currency of the business. So ingrained in us is the association of "quality" with motion pictures that people shoot some Food Network shows in 24fps video because they think it gives them some of the class and cachet of film (which they'll probably never shoot in their life).


Will frame rates for theatrical projection and televisions increase? Almost certainly, although it will take a while to make industry-wide changes. Even if some of us can tell the difference, or have cultural associations with qualities like frame rate and film-versus-video imaging, we'll eventually die off, and new generations (many of whom are already NOT watching televisions, but just binge-watching TV shows on their smartphones) will think of a century of 24fps films as quaintly as we do 14-16fps hand-cranked silent films.