Setting Standards for High Frame Rate Digital Cinema
COW Library : Cinematography : Dave Stump, ASC
The SMPTE working group on High Frame Rate was established eight months ago, and is co-chaired by Michael Karagosian, Kommer Kleijn, a Belgian cinematographer, and me. The working group was formed partly in response to the creative community and to several studios, which realized they would have to accommodate the eventual distribution of High Frame Rate (HFR) content. The studios distributing The Hobbit and Avatar 2 are keen to jump in and help work out solutions, but, in the end, the entire film community is going to have to grapple with the issues of HFR, including the exhibitors. Currently, SMPTE standards and recommended practices don't extend to higher frame rates or to the higher bit rates they require for compression, so the HFR working group was a necessary step toward addressing that. There are currently 50 to 60 people in the working group, and there is a huge amount of interest in this topic within the organization.
The working group is constructing a test plan and a shot list. We're trying to whittle that shot list down to something that is doable in size, without allowing it to grow exponentially. We hope to have help from the studios to sponsor some tests. We will need to test with the RED Epic, ARRI Alexa and Sony F65; we'll test with all these cameras at a variety of frame rates including 24, 25, 30, 48, 50 and 60 fps, as well as 72 and 120 fps, which will help to create a library of test material.
Early screenings of "The Hobbit" raised controversy about whether 48 frames per second allows a cinematic look. At this point, however, tools for post and projection are just now becoming available, so the SMPTE Working Group on High Frame Rate Cinema still has a lot of work ahead of it.
Our primary area of interest is going to be things that stress compression, both JPEG 2000 and MPEG. That's because, ultimately, a DCP is a JPEG 2000 compressed file, and both JPEG and MPEG files are on our list of deliverables. What stresses compression is different for each scheme. For JPEG 2000, when you have fine detail in darker background areas behind faces in the foreground, the encoder spends its bits preserving the fine detail in the faces at the expense of the fine detail in the background areas.
MPEG has problems with fine detail in motion, across the frame or especially in rotation. If you took a camera and pointed it at a very large stadium crowd and rolled the camera as you panned it, you could really stress MPEG encoding.
I'm interested in testing all the parameters of high frame rate and their effects on 3D viewing. There are some motion artifacts that I'm interested in testing. For example, when footage is acquired synchronously and then played back asynchronously, some subtle but funny things happen. With synchronously shot content played back asynchronously, objects moving across the frame change in depth, depending on which direction they are traveling and how fast they are moving across the frame. Objects moving from left to right can exhibit a different apparent z depth than objects moving across the frame at the same distance from camera, but moving from right to left.
I'm concerned with HFR in both 3D and 2D, but by virtue of doing the testing in 3D, we end up with bonus content in 2D as the single eye of a stereo pair. This technique probably won't extend to 4K HFR testing, and I am looking forward to comparing 4K material shot through a 3D beam splitter to material shot clean. For now, we're going to try to do all the testing in 3D, because 3D is more relevant for now: all the big HFR movies coming out are 3D.
I think it's really interesting to see that, now that we're not paying by the foot for film and developing, how many of the constraints of our filmmaking past seem to be falling by the wayside, the frame rate among them. I think it was two years ago at the NAB Digital Cinema Summit that Doug Trumbull and I both spoke and ended up giving almost the same talk. Doug broke it down into the four vectors of progress in motion imaging: increased resolution, increased bit rate, increased frame rate, and increased projection brightness.
HFR is just one of the four new vectors of motion imaging. Are all four vectors of equal importance? Nobody knows, but they all contribute to a more realistic viewing experience and ultimately it's just new tools for the creative community. This gives them the chance to make images that result in a more realistic viewing experience. You can always dumb down the tools and make it less realistic; you can always darken things, decrease resolution and so on. But it doesn't work in the opposite direction. If you don't start with all the vectors, you can't aspire to them.
I've shot things in HFR, but then again I've been a VFX cinematographer for years. I used to work for Doug Trumbull in the Showscan days, which is part of what has made me so enthusiastic about high frame rate cinema. When we used to do 60 fps 65mm in Showscan, if you sat an audience down in front of that projection, frequently they couldn't tell what was real and what was projected. For example, Doug made a film with what appeared to be a janitor walking across a dimly lit stage, cleaning up and then noticing the audience is there. Until the first cut of the film, the audience didn't know it wasn't a real janitor walking around on stage but that it was part of the film. It was that startlingly real.
Up until now, the challenge of HFR filmmaking has been that you need more disk space, more bits, more storage. But that all probably obeys Moore's Law. One of my fondest memories in retrospect is working on Mars Attacks! (1996) at Warner Digital: one Friday night we had a celebratory party because we had just crossed one terabyte of storage in the facility. It was cause for wild celebration. Can you imagine? Now you look back and think: how did we ever make do with only one terabyte?
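The storage pressure scales linearly with frame rate, which is easy to see with a rough back-of-the-envelope calculation. This sketch assumes an uncompressed 4K DCI frame in 12-bit RGB; the figures are illustrative, not from any SMPTE specification:

```python
# Rough, illustrative data-rate estimate for uncompressed HFR capture.
# Assumptions (illustrative, not from any spec): 4K DCI frame, RGB, 12 bits/channel.
WIDTH, HEIGHT = 4096, 2160
CHANNELS, BITS_PER_CHANNEL = 3, 12

# Payload per frame in bytes.
bytes_per_frame = WIDTH * HEIGHT * CHANNELS * BITS_PER_CHANNEL // 8

def data_rate_mb_per_sec(fps: int) -> float:
    """Uncompressed payload in megabytes per second at a given frame rate."""
    return bytes_per_frame * fps / 1e6

for fps in (24, 48, 60, 120):
    print(f"{fps:>3} fps: {data_rate_mb_per_sec(fps):,.0f} MB/s")
```

Under these assumptions a single 4K frame is roughly 40 MB, so going from 24 fps to 120 fps multiplies the sustained data rate by five, from under 1 GB/s to nearly 5 GB/s per eye.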
Post production should be worried about HFR, because a lot of the post tools haven't been invented yet. If you want to play back your content, edit it and synchronize sound with full files, that's a huge problem. I remember the first time I did a Viper show, I wanted to shoot 4:4:4 and there wasn't any recording device that I could beg, borrow or steal to record that signal. Now there are devices small enough to put in your wallet pocket that will record 4:4:4. So I suspect that it's now only a matter of months before the HFR post tools are invented; back then, it took years to invent the tools because manufacturers didn't appreciate the need for them. Technology wasn't changing as rapidly as it is now. Now people understand the speed at which things change, and the manufacturing community is quick to capitalize on a sales/marketing opportunity and invent all kinds of things at the drop of a hat. The marketplace has become a lot more responsive.
We're pushing to get these tests done as soon as possible, but there are a lot of wheels we have to turn. Ours is, of course, a very visual business and what will happen when we have the test materials in hand is we'll put the shots up on big screens, and then discuss and debate about them for several months. Ultimately the golden eyes and the money dynamics will settle on new standards and specs that will be put into recommended practice through the standardization process, which will enable a whole new generation of hardware and software.
A lot of the manufacturers are already moving forward with their research, and they're also helping the group to drive this forward. The fact that we have server companies and projector companies involved in the study group means that we are getting all kinds of valuable suggestions outside the cinematography community on which to base our testing. The eventual result will be that audiences will see brighter, more resolved, better looking pictures.
Outside the SMPTE group, I'm working on a few little things. I'm a big proponent of 4K and I'm doing some 4K shooting. I was chasing 4K cameras for years and years, back when DALSA and RED and Olympus and NHK were all pioneering ultra high def. I've been deeply enamored of 4K, and now that it's here, it's a great moment for me. There was a time when bloggers on CML [the Cinematographer's Mailing List] called me "Mr. 4K." I think they were teasing me as much as anything else, because of the impossibility of 4K back then, but happily, 4K is not so impossible anymore.
His credits include The Last Stand, Immortals, Quantum of Solace, X-Men 1 & 2, Into the Blue, Batman Forever, Hollow Man, Stuart Little, The Sphere, Contact, Batman & Robin, Mars Attacks!, Stargate, Free Willy, and What Love Is, among many others.
Dave is a member of AMPAS, ATAS, ASC, PGA, IATSE, SOC, SMPTE and is currently co-chair of a SMPTE study group on the subject of High Frame Rate for digital cinema. He is also participating in the AMPAS Academy Color Encoding System (ACES) file format project where he has contributed significantly in the area of metadata.
In the American Society of Cinematographers, he is currently chair of the Camera Subcommittee of the ASC Technical Committee. Under his guidance, the Producers Guild of America and the American Society of Cinematographers recently completed both the ASC/PGA Camera Assessment Series and the ASC/PGA Image Control Assessment Series: side-by-side comparisons of virtually all of the high-end digital cinema cameras against film, run through industry-standard 10-bit log and ACES workflows, and output to film print and Digital Cinema Package.