Neuroscience Altering the Way Moving Images are Produced

One of the human factors that has always been highly relevant to display technology is the way images are perceived in the brain. While the art and technology of producing static and moving pictures have advanced rapidly, research into the neuroscience of perception has only slowly made its way into film and video production. Recent studies, however, may accelerate our understanding of how the brain processes images.

Filmmakers and scientists gathered this summer to explore the science of how movies are perceived in the brain, examining how modern filmmaking affects the mental and physical responses of the audience. Held in Hollywood, the two-day event, entitled “Movies in Your Brain: The Science of Cinematic Perception,” presented a forum to discuss how experiments in neuroscience can advance our understanding of cinema, and how cinema can advance our understanding of the human brain.

Among the topics covered in the Motion Picture Academy-sponsored event were using functional magnetic resonance imaging (fMRI) to measure the brain’s responses to films, eye-tracking to measure audience gaze focus, viewer blinking and its relationship to scene cuts, and production decisions such as resolution and frame rates.

One interesting research area involves studying the role blinking plays in the human visual system. While the basic purpose of blinking is to clean and lubricate the surface of the eye, blinking is now also being seen as a physical manifestation of a psychological “scene cut.” When we shift our gaze from one object to another, the brain subconsciously “blanks” the intervening images, so as not to produce an annoying blur across the retina. It turns out that a viewer’s eyes almost always blink when shifting gaze, a mechanism that may serve to minimize the processing load on the brain.

James Cutting, chair of the department of psychology at Cornell University, gave an overview of the history of perceptual and cognitive processing and its relationship to film editing, frame rates, projection, and scene and narrative structure. The selection of frame rates and resolution has long been a hotly debated topic among filmmakers and videographers, a situation that the introduction of Ultra-HDTV makes even more complex. The event included a screening of a silent short, filmed more than 30 times with identical camera moves and identical performances, at various frame rates between 24fps and 120fps. Not surprisingly, a less-than-scientific survey of the audience brought diverse reactions to the four frame rates shown.

Filmmakers also discussed eye-tracking studies showing that background visual elements outside the center of dynamic attention could be rendered using low-bandwidth CGI processes. A clip from the 2010 film Iron Man 2 gave a visual example of how viewers consistently focus their gaze on the primary subject element, to the almost total exclusion of everything else.
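The idea behind such gaze-contingent, low-bandwidth rendering can be sketched in a few lines. The snippet below is a minimal illustration, not any studio's actual pipeline: it blends a full-resolution image with a cheap block-averaged version, keeping full detail near an (assumed known) gaze point and coarse detail in the periphery. The function name, the 4×4 block size, and the linear falloff are all illustrative choices.

```python
import numpy as np

def foveated_render(image, gaze_xy, fovea_radius):
    """Blend full-resolution and coarse versions of a grayscale image:
    pixels near the gaze point keep full detail; the periphery gets a
    cheap 4x4 block-averaged substitute. Assumes image dimensions are
    multiples of 4."""
    h, w = image.shape
    block = 4
    # Coarse version: 4x4 block average, then nearest-neighbor upsample.
    coarse = image.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    coarse = np.repeat(np.repeat(coarse, block, axis=0), block, axis=1)
    # Distance of each pixel from the gaze point.
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])
    # Weight: 1.0 inside the fovea, falling linearly to 0.0 outside it.
    weight = np.clip(1.0 - (dist - fovea_radius) / fovea_radius, 0.0, 1.0)
    return weight * image + (1.0 - weight) * coarse
```

In a real system, the gaze point would come from an eye tracker per frame, and the periphery would be rendered at lower cost rather than downsampled after the fact; this sketch only shows the weighting idea.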

In addition to its interest among filmmakers, eye-tracking is an area of growing interest in video game hardware and software, advertising, and other markets. Earlier this year, Sony Magic Labs showed a prototype of various gaze-interaction game concepts for the PlayStation 4 that used eye-tracking technology. Advertising researchers are now using the technology to measure consumers’ attention and responses objectively, both at the point of sale and during the production of content. While some PC, tablet and smartphone manufacturers are starting to experiment with eye-tracking features, other devices have reached the point of practical maturity, such as the Tobii eye-tracking headset described earlier this year in Display Daily.

As product manufacturers and content producers are always interested in new ways to engage consumers, it should not be surprising to see new applications of perception science work their way to the marketplace. Hopefully, news and social media will serve as watchdogs to ensure that the technology develops outside of a “big brother” scenario.

About the Author: Aldo Cugnini