
Setting Standards for High Frame Rate Digital Cinema



The SMPTE working group on High Frame Rate was established eight months ago, and is co-chaired by Michael Karagosian, Kommer Kleijn, a Belgian cinematographer, and me. The working group was formed partly in response to the creative community and to several studios, which realized they would have to accommodate the eventual distribution of High Frame Rate (HFR) content. The studios distributing The Hobbit and Avatar 2 are keen to jump in and help work out solutions, but, in the end, the entire film community is going to have to grapple with the issues of HFR, including the exhibitors. Currently, SMPTE standards and recommended practices don't extend to higher frame rates or higher bit rates for compression, so the HFR working group was a necessary step toward changing that. There are now 50 to 60 people in the working group, and there is a huge amount of interest in this topic across the organization.

The working group is constructing a test plan and a shot list. We're trying to whittle that shot list down to something doable in size, without allowing it to grow exponentially. We hope to have help from the studios to sponsor some tests. We will need to test with the RED Epic, ARRI Alexa and Sony F65; we'll test with all these cameras at a variety of frame rates, including 24, 25, 30, 48, 50 and 60 Hz, as well as 72 and 120 fps, which will help create a library of test material.

(L-R) CATE BLANCHETT as Galadriel and IAN McKELLEN as Gandalf in New Line Cinema's and MGM's fantasy adventure 'THE HOBBIT: AN UNEXPECTED JOURNEY,' a Warner Bros. Pictures release. © 2012 WARNER BROS. ENTERTAINMENT INC. AND METRO-GOLDWYN-MAYER PICTURES INC.
Early screenings of "The Hobbit" raised controversy about whether 48 frames per second allows a cinematic look. At this point, however, tools for post and projection are just now becoming available, so the SMPTE Working Group on High Frame Rate Cinema still has a lot of work ahead of it.

Our primary area of interest is going to be material that stresses compression, both JPEG 2000 and MPEG. That's because, ultimately, a DCP is a JPEG 2000 compressed file, and both JPEG and MPEG files are on our list of deliverables. What stresses compression is different for each scheme. For JPEG 2000, when you have fine detail in darker background areas behind faces in the foreground, the encoder tends to preserve the fine detail in the faces at the expense of the fine detail in the background areas.
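As a rough intuition for that trade-off, here is a textbook rate-distortion allocation sketch of my own, not JPEG 2000's actual rate-control algorithm: given a fixed average bit budget, the MSE-optimal split gives each region bits in proportion to the log of its signal variance, so a low-variance dark background loses its fine detail before a brighter, higher-variance face does. The variance numbers are hypothetical.

```python
# Toy model (standard rate-distortion bit allocation, not JPEG 2000's actual
# rate control): regions with lower signal variance are assigned fewer bits,
# so fine detail in a dark background is quantized away before detail in a
# brighter face.
import math

def allocate_bits(variances, budget_per_sample):
    """MSE-optimal bits/sample per region for a fixed average bit budget."""
    geo_mean = math.exp(sum(math.log(v) for v in variances) / len(variances))
    return [budget_per_sample + 0.5 * math.log2(v / geo_mean)
            for v in variances]

# Hypothetical numbers: a bright detailed face vs. a dark detailed background.
face_bits, bg_bits = allocate_bits([100.0, 1.0], 4.0)
print(round(face_bits, 2), round(bg_bits, 2))  # face gets the larger share
```

The allocation averages back to the budget (4 bits/sample here); only the split between regions changes with their variances.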

MPEG has problems with fine detail in motion, across the frame or especially in rotation. If you took a camera and pointed it at a very large stadium crowd and rolled the camera as you panned it, you could really stress MPEG encoding.
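A quick way to see why rotation is the worst case (a toy model I put together for illustration, not part of the working group's tests): MPEG-style motion compensation predicts each block as a pure translation of the previous frame, and no translation can match a rotating field of fine detail, so the encoder is left with a large residual to spend bits on.

```python
# Toy sketch: block-based motion compensation models motion as translation.
# A fine-detail pattern that rotates between frames leaves a residual that
# no whole-pixel translation can remove.
import math

N = 32  # frame size in pixels

def pattern(x, y):
    """Fine-detail test pattern (2-pixel checkerboard)."""
    return (x // 2 + y // 2) % 2

def frame(angle):
    """Render the pattern rotated by `angle` radians about the frame centre."""
    c, s, h = math.cos(angle), math.sin(angle), N / 2
    img = [[0] * N for _ in range(N)]
    for y in range(N):
        for x in range(N):
            u = c * (x - h) - s * (y - h) + h
            v = s * (x - h) + c * (y - h) + h
            img[y][x] = pattern(int(u), int(v))
    return img

def best_translational_sad(prev, cur, search=3):
    """Best sum of absolute differences over all whole-pixel translations."""
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            sad = sum(abs(cur[y][x] - prev[(y + dy) % N][(x + dx) % N])
                      for y in range(N) for x in range(N))
            best = sad if best is None else min(best, sad)
    return best

still = best_translational_sad(frame(0.0), frame(0.0))    # no motion: exact match
rotated = best_translational_sad(frame(0.0), frame(0.2))  # ~11 degree rotation
print(still, rotated)  # rotation leaves a residual no translation removes
```

The still frame matches itself exactly (residual 0), while the rotated frame cannot be matched by any translation, which is exactly the rolling-camera stadium shot above in miniature.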

I'm interested in testing all the parameters of high frame rate and their effects on 3D viewing. There are some motion artifacts that I'm interested in testing. For example, when footage is acquired synchronously and then played back asynchronously, some subtle but odd things happen. With synchronously shot content played back asynchronously, objects moving across the frame change in depth, depending on which direction they are traveling and how fast they are moving across the frame. Objects moving from left to right can exhibit a different apparent z-depth than objects at the same distance from camera moving from right to left.
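A back-of-the-envelope model of that artifact (my own sketch, not from the SMPTE test plan): both eyes are captured at the same instant, but alternate-shutter projection shows one eye half a frame late. A horizontally moving object then picks up a spurious screen disparity proportional to its velocity, and the sign of that disparity flips with its direction of travel, which is why left-to-right and right-to-left movers sit at different apparent depths.

```python
# Illustrative model of sync-shot / async-played 3D: one eye is displayed
# half a frame after the other, so a moving object's delayed-eye image is
# "stale", adding a direction-dependent disparity error on top of the true
# stereo disparity. All numbers below are hypothetical.

def disparity_error(velocity_px_per_s, fps, delayed_eye="right"):
    """Spurious screen disparity (pixels) from a half-frame eye delay.

    velocity_px_per_s: horizontal screen velocity; positive = left to right.
    """
    dt = 1.0 / (2.0 * fps)            # half-frame delay between the two eyes
    error = velocity_px_per_s * dt    # stale position shown by the delayed eye
    return error if delayed_eye == "right" else -error

# Same speed and same distance from camera, opposite directions of travel:
left_to_right = disparity_error(+480.0, fps=24)  # positive disparity error
right_to_left = disparity_error(-480.0, fps=24)  # negative disparity error
print(left_to_right, right_to_left)
```

The equal-and-opposite errors push one object apparently behind the screen plane and the other in front of it, even though both are at the same physical distance; doubling the frame rate halves the error.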

I'm concerned with HFR in both 3D and 2D, but by virtue of doing the testing in 3D, we end up with bonus 2D content in the single eye of each stereo pair. That technique probably won't extend to 4K HFR testing, and I am looking forward to comparing 4K material shot through a 3D beam splitter with material shot clean. For now, we're going to try to do all the testing in 3D, because 3D is more relevant right now: all the big HFR movies coming out are in 3D.

I think it's really interesting to see, now that we're not paying by the foot for film and developing, how many of the constraints of our filmmaking past seem to be falling by the wayside, frame rate among them. I think it was two years ago at the NAB Digital Cinema Summit that Doug Trumbull and I both spoke and ended up giving almost the same talk. Doug broke it down into the four vectors of progress in motion imaging: increased resolution, increased bit rate, increased frame rate, and increased projection brightness.

Michael Keaton in Batman. copyright 1989 Warner Brothers. All Rights Reserved.
Dave Stump was part of a team that won an Academy Award for Technical Achievement for motion control systems developed for the Tim Burton Batman films.

HFR is just one of the four new vectors of motion imaging. Are all four vectors of equal importance? Nobody knows, but they all contribute to a more realistic viewing experience and ultimately it's just new tools for the creative community. This gives them the chance to make images that result in a more realistic viewing experience. You can always dumb down the tools and make it less realistic; you can always darken things, decrease resolution and so on. But it doesn't work in the opposite direction. If you don't start with all the vectors, you can't aspire to them.

I've shot things in HFR but then again I've been a VFX cinematographer for years. I used to work for Doug Trumbull in the Showscan days, which is part of what has made me so enthusiastic about high-frame rate cinema. When we used to do 60 fps 65 mm in Showscan, if you sat an audience down in front of that projection, frequently they couldn't tell what was real and what was projected. For example, Doug made a film with what appeared to be a janitor walking across a dimly lit stage, cleaning up and then noticing the audience is there. Until the first cut of the film, the audience didn't know it wasn't a real janitor walking around on stage but that it was part of the film. It was that startlingly real.

Up until now, the challenge of HFR filmmaking has been that you need more disk space, more digits, more storage. But all of that probably obeys Moore's Law. One of my fondest memories in retrospect is of working on Mars Attacks! (1996) at Warner Digital: one Friday night we had a party because we had just crossed one terabyte of storage in the facility. Back then, it was cause for wild celebration. Can you imagine? Now you look back and think: how did we ever make do with only one terabyte?

Mars Attacks! Digital Model, Animation and Design ILM. Copyright 1996. All Rights Reserved.
Mars Attacks! is one of over 70 features on which Dave Stump has served as Cinematographer, Visual Effects Supervisor, and Effects Cinematographer, among other roles.

Post production should be worried about HFR, because a lot of the post tools haven't been invented yet. If you want to play back your content, edit it, and synchronize sound with full-rate files, that's a huge problem. I remember the first time I did a Viper show, I wanted to shoot 4:4:4 and there wasn't any recording device I could beg, borrow or steal to record that signal. Now there are devices small enough to put in your wallet pocket that will record 4:4:4. So I suspect that it's only a matter of months now before the HFR post tools are invented; back then, it took years to invent the tools because manufacturers didn't appreciate the need for them. Technology wasn't changing as rapidly as it is now. Now people understand the speed at which things change, and the manufacturing community is quick to capitalize on a sales/marketing opportunity and invent all kinds of things at the drop of a hat. The marketplace has become a lot more responsive.

We're pushing to get these tests done as soon as possible, but there are a lot of wheels we have to turn. Ours is, of course, a very visual business and what will happen when we have the test materials in hand is we'll put the shots up on big screens, and then discuss and debate about them for several months. Ultimately the golden eyes and the money dynamics will settle on new standards and specs that will be put into recommended practice through the standardization process, which will enable a whole new generation of hardware and software.

A lot of the manufacturers are already moving forward with their research, and they're also helping the group to drive this forward. The fact that we have server companies and projector companies involved in the study group means that we are getting all kinds of valuable suggestions outside the cinematography community on which to base our testing. The eventual result will be that audiences will see brighter, more resolved, better looking pictures.

Outside the SMPTE group, I'm working on a few little things. I'm a big proponent of 4K and I'm doing some 4K shooting. I chased 4K cameras for years and years, back when DALSA and RED and Olympus and NHK were all pioneering ultra high def. I've been deeply enamored of 4K, and now that it's here, its arrival is a great moment for me. There was a time when bloggers on CML [the Cinematography Mailing List] called me "Mr. 4K." I think they were teasing me as much as anything else, because of the impossibility of 4K back then, but happily, 4K is not so impossible anymore.

Dave Stump ASC
Dave Stump, ASC has worked on numerous motion pictures and television productions as Director of Photography, Visual Effects Director of Photography, Visual Effects Supervisor, and/or Stereographer (including both live action and 2D-to-3D conversions), earning an Emmy and an Academy Award for Scientific and Technical Achievement.

His credits include The Last Stand, Immortals, Quantum of Solace, X-Men 1 & 2, Into the Blue, Batman Forever, Hollow Man, Stuart Little, The Sphere, Contact, Batman & Robin, Mars Attacks!, Stargate, Free Willy, and What Love Is, among many others.

Dave is a member of AMPAS, ATAS, ASC, PGA, IATSE, SOC, SMPTE and is currently co-chair of a SMPTE study group on the subject of High Frame Rate for digital cinema. He is also participating in the AMPAS Academy Color Encoding System (ACES) file format project where he has contributed significantly in the area of metadata.

In the American Society of Cinematographers, he is currently chair of the Camera Subcommittee of the ASC Technical Committee. Under his guidance, the Producers Guild of America and the American Society of Cinematographers recently completed the ASC/PGA Camera Assessment Series and the ASC/PGA Image Control Assessment Series: side-by-side comparisons of virtually all of the high-end digital cinema cameras against film, run through industry-standard 10-bit log and ACES workflows, and output to both film print and Digital Cinema Package.

ACADEMY AWARDS® is the registered trademark and service mark of the Academy of Motion Picture Arts and Sciences. ALL RIGHTS ARE RESERVED.

The Emmy name and the Emmy statuette are the trademarked property of The Academy of Television Arts & Sciences ("Television Academy") and the National Academy of Television Arts & Sciences ("National Academy").

Thanks also to Debra Kaufman for coordination and additional editing on this piece.

Special consideration to Warner Bros. for images from "THE HOBBIT: AN UNEXPECTED JOURNEY":

Title image: MARTIN FREEMAN as Bilbo Baggins in New Line Cinema's and MGM's fantasy adventure "THE HOBBIT: AN UNEXPECTED JOURNEY," a Warner Bros. Pictures release.

(L-R) CATE BLANCHETT as Galadriel and IAN McKELLEN as Gandalf in New Line Cinema's and MGM's fantasy adventure "THE HOBBIT: AN UNEXPECTED JOURNEY," a Warner Bros. Pictures release. © 2012 WARNER BROS. ENTERTAINMENT INC. AND METRO-GOLDWYN-MAYER PICTURES INC.


Re: Setting Standards for High Frame Rate Digital Cinema
by Tim Wilson
Allow me to re-introduce commenter Pierre Jasmin to the rest of you: he and partner Pete Litwinowicz won an Academy Award® for the design and development of the RE: Vision Effects family of optical flow-based image manipulation plug-ins.

A long-time Creative COW Leader, we asked Pierre to walk us through his part in the development of optical flow technology for Creative COW Magazine. You can find that article here.

Always nice to listen in to experts in visual technology share their thoughts. :-)

Tim Wilson
Vice President, Editor-in-Chief
Creative COW

Re: Setting Standards for High Frame Rate Digital Cinema
by Pierre Jasmin
Off the top of my head,

One issue is probably how to display synchronized cameras (both eyes captured at the same instant at shoot time) with alternate-shutter projection at playback time (so one eye is displayed half a frame later, basically). There may also be issues for some effects work in having time-offset streams, though. Shooting synced 48 fps and exporting two 24 fps reels offset from each other by half a frame in time is probably the most useful increment of all.

The second on the list is probably the capture, mastering and delivery method for all this. There are no smart stereo compression standards, even though there is a lot of redundancy between views (block-based tiles are bad for stereo), and one could say the DCI JPEG 2000 XYZ 12-bit scheme is also not the greatest idea. It always felt to me like a kludge designed as much for IP protection via complexity as for any fundamental technological efficiency as a media format.

That said, be wary of technical projections that switch from 24-48 to 72 fps playback on a still (locked-off) camera with little action. Our eyes are very good at detecting differences across such a straight cut, but on shots where the camera moves you don't resolve much of the difference... we quickly adapt and forget.

Yes, having more resolution and bit depth, and being able to count on more powerful projection, is all good.

On that note, it seems a 72-96 inch 2K TV (twice the pixels of 1080p) would be more viable than a 4K target (2x2 HD, 3840 x 2160 pixels) as a next-gen pro or consumer delivery format, in that it would be a normal increment from 480 to 720 to 1080... Jumping straight to 4K seems like building a staircase with a missing step, and it will take too much time to bring it even to the prosumer class. Maybe Cameron and Peter Jackson can afford 4K TVs in their living rooms, but most people on this site can't. Also, on a 72-96 inch wall-mounted display, the perceptual increment from 1080p to 2K is much larger than from 2K to 2160p; that second power-of-two increment is much harder to legitimate (other than for specialized purposes, like passive stereo displays, maybe?). Harder to legitimate without going to at least a 15-foot diagonal screen, I would say. And already with 84-inch screens, who needs to go to a theater...

I'm not sure, though, about Showscan as too much of a reference for some ideal or all-purpose target. Variable frame rate (playback-wise) ultimately seems more useful than shifting everything to high frame rates.

Pierre Jasmin

Re: Setting Standards for High Frame Rate Digital Cinema
by denise quesnel
This is a great article, with lots of valuable items here. We joined the HFR work group only recently.

One thing I'd like to impress upon:

"We're pushing to get these SMPTE tests done as soon as possible, but there are a lot of wheels we have to turn. Ours is, of course, a very visual business and what will happen when we have the test materials in hand is we'll put the shots up on big screens, and then discuss and debate about them for several months. "

What we have found in completing production on a short HFR 3D film is how extraordinarily different the 'perceived' reaction is from the true reaction of viewers when confronted with HFR shots in an actual narrative context, rather than viewing single test shots in comparative HFR. We are blogging about the process of filming an HFR 3D narrative here:

Granted, we only filmed each shot in 24, 48 and 60 but the results are turning out to be very interesting especially once spliced together in a variable HFR sequence.


Stereo 3D Post Production Specialist, adjunct researcher at the S3D Centre in Vancouver, BC.
The S3D Centre is a premier research, curriculum/training, production and post-production facility.
