All eyes are on High Frame Rate cinema, the latest technology shift touted by such heavyweight filmmakers as Peter Jackson and Jim Cameron. But how many frames per second is ideal? How does HFR cinema change the workflow and the bottom line? This group of experts weighed in on why we should be excited by the opportunities of HFR cinema and what we can expect in day-to-day production and post workflows.
High frame rate (HFR) cinema isn't here yet, but it's one of the most talked-about topics in the media and entertainment space. The recent news that Warner Bros. has curtailed its 48 fps release of The Hobbit to a handful of large cities was cause for yet more conversation about the viability of HFR cinema.
Douglas Trumbull is both an expert and a pioneer in high frame-rate cinema, with his development, in the late 1970s, of the 60 fps/70mm Showscan format. More than thirty years later, he's as bullish as ever on the aesthetics of HFR cinema. "The higher the frame rate, the more realistic the image, and even more so with 3D," says Trumbull. "My interest is in hyper-cinema. By combining 3D with extremely high frame rate on an extremely large screen at extreme brightness, the result is more like live performance. This offers a new and unanticipated opportunity to make movies that are like live events. The viewer is in the movie, on the adventure."
Trumbull isn't the only advocate for HFR cinema. Most notably, James Cameron has taken up the cause and plans to produce a high frame rate sequel to Avatar. Trumbull and a videotaped presentation by Cameron were just two of the speakers at a panel on the topic at SIGGRAPH 2012.
Moderated by Christie CTO Paul Salvini, the panel also included John Helliker, director of the Screen Industries Research and Training Centre (SIRT) at Sheridan College in Toronto.
The general consensus was that HFR Cinema is both desirable and inevitable. In his presentation, Cameron showcased specially shot test footage that compared 24 fps, 48 fps and 60 fps versions of the same material. He railed against the judder and artifacts seen in the 24 fps footage, noting that the reason film standardized on 24 fps was because the industry's early producers were too cheap to pay for more film stock. Both he and several other speakers spoke about HFR Cinema as "lifting a restraint" that's been in place since film's earliest days. "We're here to explore its potential to deliver more immersive, impactful stories," said Salvini.
Cameron's producing partner Jon Landau extended the director's diatribe against 24 fps cinema. "We want to find technologies that disappear and transport the audience more into the narrative story," said Landau. "We thought 3D was one step in that direction. We have a responsibility as filmmakers to continue to push technology to tell stories in better ways, to tell stories that couldn't be told before and to drive people out of their homes into theatres." Landau also pointed out that HFR Cinema does not have to be in 3D, but can also raise the impact of 2D movies.
Poster advertising The 7th Voyage of Sinbad with the newest movie-making miracle... DYNAMATION.
Muren noted that he comes to this from a background of loving films. "I'd seen Doug's Showscan reel, and it looked pretty darn neat," he said. Muren experimented with high frame rates at home, by playing back Ray Harryhausen's classic stop-motion animated feature The 7th Voyage of Sinbad at higher rates. "A higher frame rate did smooth out the stop motion animation," he said. "Then I noticed the Cyclops character looked like a rubber puppet in front of a production screen, which wasn't as noticeable at 24 fps; it didn't look as obviously artificial." But that wasn't his only experiment. When he played back one of the old "Mac vs. PC" TV commercials at 120 fps, he found them funnier at the higher frame rate.
"I was looking at the close-ups and the performances are better at a high frame rate," he said. "You see the intent of the actors more clearly, what they tried to do. It was an incredible thing." His conclusion? "I think HFR is a great tool," he said. "It's closer to reality. You can always filter the camera or cut it back, all the things that cameramen have done to take the curse off video. But the audiences can connect more. Add 3D on top of that and you're there. I'm a big proponent of it."
Helliker spoke about the HFR research that he's conducting at Sheridan College, in partnership with Christie. "We're working in stereo 3D and live action and are working to establish an R&D center for HFR," he said. "We'll shoot HFR on different cameras and test the workflow and delivery." In addition to Christie, Sheridan's other partners are content production companies and creative professionals. "That's helped set our agenda," said Helliker. "For us, the two critical aspects are the filmmaking process, and how HFR adds to the language of film, just like composition and framing. We want to find how HFR impacts that. The other critical aspect is audience experience. We're going to display different types of material and get feedback on how it impacts the audience, by demographic, and look at that scientifically, to get feedback that can help filmmakers make the decisions."
The SIRT Centre at Sheridan is also working closely with international groups including SMPTE's HFR working group, and examining the relationship of HFR and shutter angle. "As the shutter angle goes up, the frame is captured over a longer period of time," Helliker said, as he presented footage showing 24 fps at a 180-degree shutter (normal capture) and 48 fps at 180-, 270- and 360-degree shutters. "You would expect an increase in motion blur, and it also affects other aspects of the image."
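Helliker's shutter-angle point can be made concrete with a quick calculation (my illustration, not from the panel): per-frame exposure time is the shutter angle's fraction of 360 degrees divided by the frame rate.

```python
def exposure_time(fps: float, shutter_angle: float) -> float:
    """Seconds of light captured per frame for a given frame rate
    and shutter angle (in degrees)."""
    return (shutter_angle / 360.0) / fps

# 24 fps at a 180-degree shutter: the traditional film look.
print(exposure_time(24, 180))   # 1/48 s, about 0.0208
# 48 fps at a 360-degree shutter captures the same 1/48 s per frame,
# so per-frame motion blur matches 24 fps at 180 degrees.
print(exposure_time(48, 360))   # 1/48 s, about 0.0208
# 48 fps at 180 degrees halves the exposure, reducing blur.
print(exposure_time(48, 180))   # 1/96 s, about 0.0104
```

This is why testing 48 fps at wider shutter angles matters: a 360-degree shutter at 48 fps keeps the per-frame blur of conventional 24 fps capture.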
RealD's Matt Cowan described the human factors of viewing 3D and HFR. "One of the interesting things is that HFR isn't new," he pointed out. "We see things at an infinite frame rate because there's motion everywhere. The media industry's job is to attempt to cheat to satisfy bandwidth limitations and present as realistic as possible an image. Just as Jim Cameron said, the industry has been stuck on 24 fps on film by practical camera speeds and the cost of film. But if we look back, TV has run at a higher frame rate: 50 fps in Europe and 30 or 60 in the U.S. In the early 1970s when video display terminals were in their infancy, IBM did research and found that 72 Hz was the flicker-fusion frequency that was comfortable for the viewer."
Sam Worthington as Jake Sully, Zoe Saldana as Neytiri in AVATAR.
Photo Courtesy of WETA, Twentieth Century Fox Film Corporation.
"Across the population, there is a big spread of what frame rate you need to avoid seeing flicker," he continued. "Some people don't see any flicker, others are sensitive to it. This led us, in the introduction of 3D, to look at flicker rates; we knew we were limited to 24 fps capture, but it caused significant artifacts. We looked at double flash, which allowed us to present 48 fps per eye, but a big portion of the population found that quite uncomfortable because the edges were soft, you had excessive flicker, and the dark spots between frames were troubling. At 72 fps per eye, you didn't see flicker but we still had motion artifacts including blurring."
"Satisfying the human visual system with between 55 and 60 fps is a necessary part of moving to the next level of experience," concluded Cowan. "The movement for high frame rate cinema will open up the visual experience and give us the possibility of new creative output, to say nothing of being able to go brighter as well. It heralds great things for the industry!"
Trumbull discussed his early experiences with HFR Cinema, developing 60 fps/70mm Showscan. "But I could never get it implemented in the motion picture industry because it required new screens and projectors," he said. "Although the studios liked it, they wouldn't put in any projectors unless all the movies were made that way. It was a Catch-22. I left Showscan many years ago, in a state of profound disillusionment."
Trumbull directing Christopher Walken and Natalie Wood on the set of 1983's Brainstorm, which was intended to be the first Showscan film. Paramount owned Trumbull's company and had offered some support for launching Showscan, but an upheaval at Paramount led to the picture moving to MGM. Then tragedy struck: Natalie Wood died. Image courtesy MGM.
Since then, however, Trumbull has revived Showscan to recreate it as a digital process. He shot tests three years ago; the new system offers 24, 60 and 120 fps. "We no longer have shutters in cameras or projectors, so it's possible to shoot with a 360-degree shutter," said Trumbull. "It captures 100 percent of what's in front of the camera. The shutter never closes and never misses anything. Since the frames are all contiguous, you can connect two frames and regain the blur."
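Trumbull's point about contiguous frames can be sketched as a simple average of adjacent frame pairs. In this toy example (my illustration, using 1-D "frames" as plain lists), two contiguous 48 fps frames recombine into one 24 fps frame whose blur spans both positions:

```python
def blend_to_half_rate(frames):
    """Average adjacent frame pairs: with a 360-degree shutter the frames
    are contiguous, so summing two 48 fps frames approximates one 24 fps
    frame with the motion blur of the full interval."""
    return [
        [(a + b) / 2 for a, b in zip(f1, f2)]
        for f1, f2 in zip(frames[0::2], frames[1::2])
    ]

# Toy 1-D "frames": a bright pixel moving one position per 48 fps frame.
hfr = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]
print(blend_to_half_rate(hfr))
# Two 24 fps frames, each smearing the pixel across two positions:
# [[0.5, 0.5, 0.0, 0.0], [0.0, 0.0, 0.5, 0.5]]
```

Real conversions are of course done on full image planes, but the principle is the same: nothing is lost in capture, so lower rates with blur can be derived later.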
Trumbull has another idea regarding high frame rate capture. "I realized you are no longer restricted to applying any frame rate globally to a movie," he said. "You can dynamically change it on any pixel, any scene, any character or object. Frame-integrated motion analysis allows you to pick the right frame rate for the scene or character." He is currently experimenting with this, using high-gain hemispheric screens and projecting 3D at 35 foot-lamberts. "Christie is helping with special lenses and we're shooting with many different cameras to figure out the parameters," he said.
Feature animation, pointed out Salvini, gives creatives complete control over many factors, including the cameras and the amount of motion blur. "It's very exciting to think about," said DreamWorks Animation's Wallen. "When you think of our experiences in animating characters, our audience may have an expectation that a character can hold a heavy object, which requires us to animate it in a very physical way. We create an imagined or fantasy situation that the audience sees as real but doesn't actually conform to the real world at all."
James Cameron directing on the set of AVATAR.
He plans to produce a high frame rate sequel.
"We're used to multiple frame rates," he added. "We do this all the time in VFX and animation, and anyone who creates video games will already be familiar with running different objects at different rates to create an overall effect. Putting this in the hands of cinematographers could be very exciting. There are costs for image generation, rendering being the obvious one. And it could slow down the animation process. Perhaps animators would still work at 24 fps and we could automatically interpolate. But the front and back end of our pipeline could handle the increased number of images in a smooth way."
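Wallen's suggestion of interpolating 24 fps animation up to a higher rate can be sketched naively as a linear blend between neighboring frames (production systems use motion-vector or optical-flow interpolation instead; `interpolate_to_double_rate` is a hypothetical helper for illustration):

```python
def interpolate_to_double_rate(frames):
    """Insert a linearly blended frame between each adjacent pair,
    roughly doubling the frame rate (naive: real interpolators track
    motion vectors rather than cross-fading pixel values)."""
    out = []
    for f1, f2 in zip(frames, frames[1:]):
        out.append(f1)
        out.append([(a + b) / 2 for a, b in zip(f1, f2)])
    out.append(frames[-1])
    return out

# Two toy 24 fps "frames" become three, with a blended tween in between.
print(interpolate_to_double_rate([[0.0, 0.0], [2.0, 2.0]]))
# [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]
```

The appeal for animation studios is that keyframe work stays at 24 fps while the extra frames are generated mechanically at the end of the pipeline.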
DreamWorks Animation's Beshears expressed his change in attitude about HFR. "Now that I've seen it here, it makes more sense than it did when I've seen it previously," he said. "I'm more positive about it. I've also learned that if we're going to use this, we have to take it shot by shot. In some cases comparing 24 fps with 48 fps, the 48 fps version was a less satisfying experience."
Even so, Beshears had some words of caution regarding how HFR cinema will impact the post production process. "A 3D movie has 150,000 frames, a massive amount of data to store and move around," he said. "Rendering can be daunting. With HFR cinema, that's a reality that will have to be addressed in the whole post process."
Visual effects in a high frame rate world was touched upon by Side Effects Software's Moore and Digital Domain's Grant. Moore discussed the experiments Side Effects Software has done in HFR cinema. "The nice thing about CG is that if you want to change the frame rate, it's easy to re-render," he said. "If you think of the cost of rendering 48 fps versus 24 fps, it seems that it would be double...but it isn't. At 24 fps, you're spending time computing motion blur, and you can reduce the cycles dedicated to motion blur at higher frame rates."
"Another big difference is that temporal resolution is very different between 24 and 48 fps content," Moore continued. "The effective resolution is much higher at higher frame rates and you begin to see a lot more detail, so defects are more apparent at 48 fps. Imperfections in a visual effect that you could have gotten away with at 24 fps, you can't at 48 fps. Quite a bit more effort is required, and the size of simulations goes up with higher frame rates. There are big implications for storage and computation time, not just render time. It's easy to get used to a 24 fps world and a frame-based approach, but if content needs to be retimed, switching to a time-based approach rather than frame-based might be a paradigm shift."
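Moore's point that 48 fps rendering costs less than double can be illustrated with a hypothetical cost model (the numbers below are illustrative assumptions, not Side Effects' figures): if motion-blur sampling scales with per-frame exposure, halving the exposure at 48 fps halves the blur work per frame.

```python
def render_cost(fps, base=1.0, blur_sample_cost=0.1, blur_samples_at_24=16):
    """Illustrative cost of rendering one second of footage:
    per-frame cost = base shading cost + cost of motion-blur samples,
    where blur samples shrink as per-frame exposure shrinks (1/fps)."""
    samples = blur_samples_at_24 * 24 / fps
    return fps * (base + blur_sample_cost * samples)

print(render_cost(24))  # 24 * (1 + 1.6) = 62.4 cost units per second
print(render_cost(48))  # 48 * (1 + 0.8) = 86.4 -- more, but well under 2x
```

Under these assumed parameters, going from 24 to 48 fps raises render cost by roughly 40 percent rather than 100 percent, which matches the shape of Moore's observation.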
Grant described what Digital Domain anticipates with regard to a VFX pipeline for HFR cinema. "There's a lot of extra work on the VFX side," he said. "As Luke [Moore] said, it's not simply rendering, but all the data to generate that image. And it's not just twice the data; there are one or two orders of magnitude more data feeding into that final image as it goes into the animation pipeline. That's something that has to be considered."
He also referenced Muren's description of how the Cyclops from The 7th Voyage of Sinbad looked more obviously like a puppet when played back at a high frame rate. "We haven't had to fix these things in post up until now," he said. "That means paint and roto for every frame...and now with two times the number of frames." Grant also noted that, just as toolsets helped make workflows smoother for 3D, something similar is now required for HFR cinema. "You'll have to see a lot of work done on the tool creator side to make HFR more successful," he said. He added that we're still in the very early days of HFR cinema. "No one has asked us at Digital Domain to even bid on an HFR project yet," he said. "There are challenges in terms of how you think about shooting a film when you're going from 2D to 3D and now, again, from 24 fps to 48 fps."
Of all the speakers, Park Road Post's Oatley was the only one with hands-on experience: he's working on the first commercial HFR feature film, The Hobbit. He described director Peter Jackson's strong desire to shoot HFR. "We were trying to achieve the director's vision," he said. "In the beginning of 2010, we looked into the commercial and technical viability of HFR cinema, at 48 and 72 fps -- knowing that we would still have to deliver a 24 fps version of the film."
"We started talking deeply with technology partners to work with us on creating and accepting HFR, Christie being one," he said. "After a certain amount of R&D, we realized 48 fps was playable on Gen 2 hardware, but 72 fps was a little outside the hardware's current capabilities. There were also issues around the bit-rate ceiling of DCPs given the time frame of The Hobbit's release. We also explored shutter angle, which is vital to the look and feel. It had to translate to 24 fps."
The company partnered with SGO, which had developed Mistika, a DI platform with a 3D toolset that also enabled Park Road Post to build its own tools. "RED was on the cusp of releasing the Epic cameras and we got a handful of these cameras and 3ality rigs and embarked on a week-long shoot to see what this would be like in a real-world scenario," Oatley said. "During that testing phase, we pulled together the first 48 fps dailies workflow and the ability to deliver to a 24 fps editorial workflow, because there are no real editorial tools to cut natively at 48 and translate that back to a 24 fps timeline. We had to match dailies turn-around times, to deliver to the production. We managed to do that through the development we did with Mistika."
Park Road Post's workflow began with a digital lab, where everything is moved into the company's in-house-developed asset management system. "Everything generated is tracked and moved through the asset management system. The Mistika environment develops dailies, which the stereographer and colorist work on simultaneously. We ran two to three hours of dailies screenings a day, projected in our cinema."
"HFR capture creates an avalanche of data," he continued. "We generated 6 to 12 TB of original camera data each day and we shot 6 days per week. We designed systems to cover that volume of data as quickly as we could. In 35mm terms, we processed 24 million feet." Post production on The Hobbit is still underway, and Park Road Post is, said Oatley, still committed to moving HFR cinema forward.
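Oatley's figures can be sanity-checked with some quick arithmetic (assuming standard 4-perf 35mm at 16 frames per foot; the other numbers are from his remarks):

```python
FRAMES_PER_FOOT = 16          # assumed: standard 4-perf 35mm
feet_processed = 24_000_000   # "In 35mm terms, we processed 24 million feet."
frames = feet_processed * FRAMES_PER_FOOT
print(f"{frames:,} frames")   # 384,000,000 frames

# Camera data at the stated 6-12 TB/day, shooting 6 days per week:
tb_per_day_low, tb_per_day_high = 6, 12
days_per_week = 6
print(f"{tb_per_day_low * days_per_week}-{tb_per_day_high * days_per_week} TB per week")
# 36-72 TB per week
```

Hundreds of millions of frames and tens of terabytes per week give a sense of why the dailies and storage systems had to be purpose-built.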
Peter Jackson, who filmed The Hobbit with RED Epic cameras shooting at 48 fps. Warner Bros. is considering a limited release at the higher frame rate.
Landau noted that, "we'll learn a lot from The Hobbit," and added, "Jim is still trying to learn. Initially, he thought 48 fps was enough, then he looked at higher frame rate. One thing we do think about HFR is that filmmakers don't need to come up with a consensus: people can choose to do what they want. The technology allows you to do that."
Muren was even more to the point. "There should never be a consensus on how to do this," he said. "Directors will be driving this. We should listen to them and give them what they need."
The enthusiasm over the possibilities of HFR cinema was tempered by concerns about its impact on VFX and post production toolsets and workflows. But if the past -- including the recent past, with the adoption of 3D -- is any indication, this won't hold anything back. In his videotaped presentation, Cameron went so far as to say that a move to 4K imagery was meaningless as long as the frame rate stayed at 24 fps. When it's finally released, The Hobbit will tell the studios a lot about how HFR cinema is received by the general public -- at least in the cities where the 48 fps version is released. If audiences are enthusiastic, everything will fall into place: studios will greenlight HFR projects, some directors will enthusiastically embrace it, hardware and software vendors will come out with the technology to handle it, and the VFX and post houses will deal with the consequences, as they always have.
We've got a lot of time and a lot of work and development to get to that place. Numbers will be crunched, from dollars to bit rates. Concerns will be aired. Successes will be hailed. Stories will be told. At Creative COW, we intend to continue the conversation.