Technology 2014 | Production, Post & Beyond: Part TWO
By Debra Kaufman
STEVEN POSTER, ASC, PRESIDENT OF THE INTERNATIONAL CINEMATOGRAPHERS GUILD, LOCAL 600 OF THE I.A.T.S.E.
And as the tools get better, faster and less expensive, we're finding out that this work is being done by crew people, whether it's in production or post production, whether it's on the set or in editorial. And that's pretty exciting. What it allows for is the image intent of the director and director of photography to be preserved in a way that we've never been able to control before. The director of photography can, on the set, interpret what the image should look like before it goes into post production, and that information can now flow through to the digital intermediate and the final translation of the image. No, we're not doing final color on the set. There is still work to do at the end, because you have to match scene to scene. But the exciting thing is that we can create dailies that look right. The tools for calibration are improving.
One of the great things we're going to see is the advent of workflows built around the ACES color management architecture. That's a very important development; it gives us a big enough bucket to put everything in so that every quality of the image can be preserved and used down the line. And it's a standard where we've never had a real standard before. So, we will be able to rely on calibration when it is fully implemented all the way through the system, from beginning to end, in prep, on the set, in the editing room and in post production and in working with VFX, compositing, animation. It's becoming a closed chain and that's the important part. The artistic intent, the communication of the story through the director's concept and through the director of photography's vision is now truly a possibility so that everyone can understand it every step of the way.
DR. SIEGFRIED FOESSEL, HEAD, MOVING PICTURE TECHNOLOGIES DEPARTMENT, FRAUNHOFER IIS
In 2014, we can plan on seeing the industry's first use of specialized lightfield camera arrays for production and the first integrations of lightfield processing with postproduction tools.
In 2013, the Interoperable Master Format (IMF) was proposed with new extended capabilities for storing 4K master formats or digital archive formats, essentially making it a future-proof storage standard for movie productions. The use of IMF in its extended App #2 flavor for postproduction will become prevalent in movie productions in 2014.
JOHN GALT, SENIOR VICE PRESIDENT OF ADVANCED DIGITAL IMAGING, PANAVISION ON 4K & THE FUTURE OF IMAGING TECHNOLOGY
A further confusion is that all the "4K" cameras introduced in 2013 use Bayer pattern sensors and employ some form of data compression even as they describe their outputs as "raw". This fancy footwork results in image data per frame of as little as 3MB. This is less than half the data per frame of an uncompressed high definition RGB camera image. However, when the compressed Bayer pattern is de-Bayered and decompressed, the RGB data is now 33MB per frame. This is the data rate that post-production and archivists must now deal with.
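Galt's numbers can be sanity-checked with back-of-envelope arithmetic. The sketch below assumes illustrative parameters (a 4096×2160 Bayer sensor, 10-bit samples, roughly 3.5:1 raw compression), not any particular vendor's specification:

```python
MB = 1_000_000

def megabytes(pixels, channels, bits):
    """Uncompressed size of one frame, in megabytes."""
    return pixels * channels * bits / 8 / MB

photosites_4k = 4096 * 2160                    # one Bayer sample per photosite

bayer_raw   = megabytes(photosites_4k, 1, 10)  # ~11.1 MB uncompressed Bayer
compressed  = bayer_raw / 3.5                  # ~3.2 MB "raw" as stored on disk
hd_rgb      = megabytes(1920 * 1080, 3, 10)    # ~7.8 MB uncompressed HD RGB
debayer_rgb = megabytes(photosites_4k, 3, 10)  # ~33.2 MB RGB after de-Bayering

print(f"compressed Bayer: {compressed:.1f} MB (< half of HD RGB at {hd_rgb:.1f} MB)")
print(f"de-Bayered 4K RGB: {debayer_rgb:.1f} MB per frame")
```

The point survives any nearby parameter choice: the compressed Bayer file on the camera is small, but the de-Bayered RGB data that post-production and archivists must handle is roughly ten times larger.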
The area of greatest concern to me, that we at Panavision can do nothing about, is the problem of theatrical projection of 4K images. With most television panel makers phasing out 2K for screen sizes above 55 inches, the big screen TV you buy next year will be 4K whether you want it or not! Contrast this with the four out of five theatrical screens that cannot project 4K. Since the exhibitors have only recently spent the money to convert from film to digital projection, this is unlikely to change for many years to come. We are faced with the specter that for the first time in history, television could exceed the image performance of theatrical projection.
Another problem with the one-dimensional metric 4K being a surrogate for resolution is that it completely ignores the contribution of optics to the process of creating a high resolution image. Almost all existing lenses for film formats do not achieve their best performance when used on digital cameras. There are a number of reasons for this. First, as any cinematographer knows, nothing thicker than a gelatin filter or a net should ever be placed behind a lens. Digital cameras, however, have various glass and crystalline materials between the lens and the imager that can total 3mm or more, compared to the 0.1mm of a gelatin filter. All this material behind a lens originally designed to image in free air causes chromatic and other imaging errors that were not visible on film or on lower resolution 2K cameras. These problems are more noticeable at smaller f-numbers and shorter focal lengths.
Although the migration from film to 2K digital, and now to 4K digital and beyond creates many imaging challenges, it also creates great opportunities for the lens designer and manufacturer. Freed from the constraints of film emulsions and spinning mirrors, the lens designer can now reinvent the cine lens. These new lenses will combine classical optics with electronics that will enable the cinematographer to explore a new creative landscape in depth, not just the two-dimensional post process of color correction.
The Bayer pattern images, with built-in image enhancement, have also caused many cinematographers to look to optics to differentiate their images. One of Panavision's most successful recent lens offerings, PVintage, is a series of classic lenses, some of them designs more than 50 years old, that we have rehoused and updated mechanically without changing the glass.
At the other extreme, Panavision has recently reintroduced the Primo line of optics, now called the Primo Vs, with aberration correction for digital imaging. Other lens manufacturers will probably do the same as they introduce new lens series, but no existing film optics are optimal when used on digital cameras. The other problem is that existing optics, from all manufacturers, were designed to image on color film emulsions that are at least 35 microns thick. Most 35mm cine lenses are optimized for a spatial frequency of around 20 line pairs/mm, whereas a 4K 35mm size sensor needs a lens optimized for 40 line pairs/mm. This implies a photo-site pitch of around 6 microns and a height of less than 5 microns. Just as 4K is four times the resolution of 2K, a lens with 80 percent contrast at 40 line pairs/mm has four times the performance of a cine lens at 20 line pairs/mm.
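Galt's figures follow from simple geometry. A minimal sketch, assuming a Super 35 image width of about 24.6mm (exact sensor widths vary slightly by camera):

```python
# Assumed Super 35 image width; exact widths differ per camera model.
sensor_width_mm = 24.6

# Photosite pitch for 2K (2048 photosites) and 4K (4096) across that width.
pitch_um_2k = sensor_width_mm / 2048 * 1000   # ~12 microns
pitch_um_4k = sensor_width_mm / 4096 * 1000   # ~6 microns

# Halving the pitch doubles the spatial frequency the lens must deliver
# (20 -> 40 line pairs/mm); doubling resolution in each dimension means
# four times the information, hence Galt's "four times the performance".
lp_mm_2k, lp_mm_4k = 20, 40
performance_ratio = (lp_mm_4k / lp_mm_2k) ** 2

print(f"2K pitch ~{pitch_um_2k:.0f} um, 4K pitch ~{pitch_um_4k:.0f} um")
print(f"lens performance ratio at equal contrast: {performance_ratio:.0f}x")
```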
Panavision has also recently introduced a new series of optics, the Primo 70 series, that are designed to cover the 70mm format with better than 80 percent contrast at 40 line pairs/mm. These lenses can also compensate for a behind the lens optical thickness of up to 7mm of glass (BK-7).
The future of imaging technology will see greater integration of the various components that make up an imaging system. Lenses will talk to cameras, and cameras will record not just images but also spatial and motion information, captured as metadata to facilitate post-production processes that today require heroic effort and vast quantities of money. Even as we gear up for a 4K future, the Japanese broadcaster NHK plans to introduce 8K television by 2018 and intends the 2020 Olympics to be broadcast nationwide in 8K, which, once again, is not twice the resolution of 4K but four times. The lens and camera designers will be kept very busy for the foreseeable future!
WILLIAM FEIGHTNER, CTO, COLORFRONT
Colorfront has been working on a number of different scenarios for distributed and collaborative workflows. At CineGrid, what we showed was the ability to connect over the standard Internet, working over the Amazon cloud service and playing back live 4K material. Distributed post is a trend because it's perceived as being desirable. Right now, we're all looking at how to distribute and collaborate with high quality images.
My personal take on high frame rates is that they're here to stay. There is nothing inherently special about 24 frames. But we have to remember that it's not an either/or scenario. We'll see different rates used for different things. Right now, there's a black and white argument over HFR, but it is just another tool to tell the story. There are discussions of variable frame rates within a single project, and even of different layers in a single image being at different temporal updates.
One of the other things driving more temporal updates is high resolution. Higher resolutions demand more temporal update. Standard TV in the U.S. has always had a temporal update of 60 fields a second. So, when it comes to frame rates, no one size fits all. That will be a big theme going forward.
With 3D, we had yet another blip, and then it went away because of improper usage. Dimensionalizing a project as an afterthought does not lead to long-term success. In most cases, it doesn't work well; it has to be integrated into the project from the start. This is the pattern we've seen throughout the history of 3D, but we keep learning a lot each time. Although the use of 3D has contracted, what's left is some validation of its use as a creative tool. When it's properly applied, it can really look good.
One of the big difficulties in implementing it for live capture is that it's extremely expensive to do properly. New tools are evolving that could potentially lower the cost. To put it simply, you don't necessarily need two high resolution cameras side by side; there are other ways to sense depth data and still have really good looking 3D. As 3D matures as an aesthetic choice for certain things, we'll see improvements in how it's implemented that lower the cost. A lot of things are being tested that can work well.
With 4K, the train left the station. The driving force for 4K is on the consumer side with larger and larger displays. TV manufacturers certainly tried to stimulate sales by offering 3D but that didn't seem to go far, and the 3D offerings have diminished. But they're pushing hard on 4K much more than they did on 3D. The cost for 4K TVs is not significantly higher than what we've been paying for 2K, and we will go beyond that probably to large video walls that could include all kinds of other functionalities. But, for now, 4K is around the corner. It's not a passing trend.
When you talk about ACES, I think the bigger picture is that standardization in imagery is much needed. We see total chaos out there with all the different cameras and distribution approaches. Our industry cannot afford those inefficiencies because they cost money, and we don't have the budgets to waste money. ACES offers a standardized image specification for a container that is large enough to be future-proofed. We already have a pretty good standard for digital cinema projection, but there are extreme variances in home delivery that cause a lot of complaints.
We badly need to keep working toward that point on the home side, accommodating new standards. For example, we're mastering for Rec 709, and very few home sets will accurately display Rec 709; most go far beyond it. It's an outdated standard and we need to implement something new. Most importantly, we need a goal post, something for the TV manufacturers to agree on, especially with larger color spaces and high dynamic range imagery.
NORMAN HOLLYN, PROFESSOR, EDITING TRACK HEAD, MICHAEL KAHN ENDOWED CHAIR, USC CINEMATIC ARTS ON 3D, 4K, & COLLABORATION/MEDIA MANAGEMENT
Author of The Lean Forward Moment and The Film Editing Room Handbook (4th Edition)
Instead, I see the postproduction world in 2014 dealing with two main issues – creating better ways to collaborate over great distances and helping us out of the media management logjam. On the latter point, we are now reaching the tipping point where media creators, both large and small, have been shooting and storing way too much data to keep track of it in any easy and profitable way. This is complicated by the disconnect between capture companies and post – codecs and capture techniques have always run ahead of our NLEs' ability to ingest, edit and finish with them.
The industry is finally focusing on what metadata is worth capturing up and down the entire chain, and how to do that. That will result in new hardware and software products for smaller users as well as the big ones, and we may see some companies merge who serve the different ends of the chain. I believe that this trend will pick up steam later in the year.
As for distance collaboration, my last three films have all been done over long distances – on two of them, I never even met the directors. This has its ups and downs. I can work with people all over the world who I never would have had the opportunity to work with before. On the other hand, we are missing the personal connection and the eye contact that can be so essential to a successful project. A number of people have been working on tools to solve some of these personal issues during the past year. I hope that we will start to see the fruits of their labor in the coming year.
DAVID STUMP, ASC ON ACES WORKFLOW
The adoption of log workflows is another great thing. It's astounding. The first time I worked with a log workflow, shooting my first Viper movie in 2002, everybody was so averse to anything log because they didn't know how to use the signal. Log was an orphan. And it was stillborn then. Now it's the easiest thing on the planet. Everybody knows it and expects it and if you're not doing a log output on your camera, you're well behind the times.
3D TV, UltraHD, HFR, and higher projected brightness levels – they are all inevitable, and we'll see them all emerge in combination. One missing factor that nobody is talking about that's a revolutionary technique is wider shutter angle. I'm a huge, huge proponent of wide shutter angle and I always have been. Did you know that the amount of blur in a single 24P frame is greater with a 180-degree shutter than the amount of blur in a 60P frame with a 360-degree opening? If you're going to go to high frame rates, why not leave the shutter open? It's still less blur than at 24P.
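Stump's comparison is easy to verify: per-frame motion blur for a subject moving at constant speed is proportional to exposure time, which is the shutter angle over 360 divided by the frame rate. A quick sketch:

```python
def exposure_s(frame_rate, shutter_angle_deg):
    """Exposure time per frame in seconds: (angle / 360) * (1 / frame rate)."""
    return shutter_angle_deg / 360.0 / frame_rate

blur_24p_180 = exposure_s(24, 180)   # classic 24p, 180-degree shutter: 1/48 s
blur_60p_360 = exposure_s(60, 360)   # 60p with the shutter wide open: 1/60 s

# A fully open 360-degree shutter at 60p still blurs less per frame
# than the standard 180-degree shutter at 24p.
assert blur_60p_360 < blur_24p_180
print(f"24p/180deg: {blur_24p_180:.4f} s, 60p/360deg: {blur_60p_360:.4f} s")
```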
Here's the trick to thinking about things such as high dynamic range, higher resolution, and higher frame rates. The human eye at its native refresh rate is between 200 and 300 Hz per receptor, per rod or cone, asynchronous. Nobody realizes that. The return on frame rate in cinema is that you get more and more information up to between 60 and 72 fps; above that, the resulting return for your investment drops off substantially. Humans generally can't see the difference above 72 fps; it's insignificant enough as to not be cost effective. Once you climb from 24 fps to 60, 66, 72 fps, you can test and measure the difference.
With regard to resolution, the human eye in retinal density, in terms of rods and cones, is equivalent to roughly 6K, per degree of viewing angle (and up to 8K if you're a jet pilot!). Once we get to 4K or 6K at 16 bits of dynamic range – like we're going to go to in ACES – with a refresh rate of 60 Hz with the possibility of a 360-degree shutter opening, we've reached the threshold of human visual acuity. There is nowhere to go from there. But we're quickly approaching that as an imagery criterion and we don't know it yet. There's nobody driving the imagery development as a whole, with the exception of the Academy and the ASC. Everything is creeping inexorably in that direction, and I don't see any need to go beyond that.
Read more: Dave Stump, ASC reports on the industry's progress toward setting standards for high frame rate digital cinema.
LAURENCE J. THORPE, SENIOR FELLOW IMAGING TECHNOLOGIES & COMMUNICATIONS GROUP
PROFESSIONAL ENGINEERING & SOLUTIONS DIVISION AT CANON
There was an increasingly vigorous industry discussion throughout 2013 on the role of higher frame rates in the context of 4K and UHDTV in general. Within the television discussions there was broad agreement that 30 fps was much too low for 4K motion imaging and that at least 60 fps was essential. 120 fps has been formally incorporated as an upper frame rate in the new ITU Recommendation ITU-R BT.2020 for UHDTV (both 4K and 8K). Convincing demonstrations by both BBC and NHK research on the benefits of higher frame rates are spurring a broader global examination. In the cinema world, it would appear that progress has been made in persuading some to consider shooting digital theatrical features at frame rates considerably higher than the long-established 24 fps. It remains to be seen what forms of features best lend themselves to such elevated frame rates. Time will tell all.
4K AND BEYOND
YURI NEYMAN, ASC, DIRECTOR OF PHOTOGRAPHY, PRESIDENT OF GLOBAL CINEMATOGRAPHY INSTITUTE ON "EXPANDED CINEMATOGRAPHY"
"Expanded Cinematography" is the combination of live and virtual cinematography which dominates the visual landscape of today's image making technologies in motion pictures, television, web-content, gaming and other visually based genres of performance and presentation – old, existing and the new which are yet to be discovered and developed.
New forms of relationship and interdependence between "traditional" cinematography, art direction, VFX/FX, "virtual cinematography" and previsualization will continue to evolve into additional varieties of technological and artistic pipelines and workflows, from previsualization through the incorporation of "traditional" post-production methods of color correction and editing into on-set operations, and on to post and internet distribution.
In general, we will see the continued advance of "Democratization" and of "Hybridization" in the professional image-building process.
Ever cheaper and more powerful cameras, lighting, editing software, hardware and equipment will allow more people (not necessarily knowledgeable or prepared for it) to own the tools and to produce more and more "fodder" for Facebook, YouTube, Vimeo, etc.
The process of "Hybridization" is evident when traditional and non-traditional methods of image building are mixed and "hybrid imagery" is created.
With ever-advancing creative and technological approaches to narrative script structure, the industry will continue to develop new methods of "visual storytelling." The fast-changing global, social, and cultural paradigm demands innovative approaches to meet the ever-changing needs of different kinds of audiences via many new and more "visually effective" storytelling outlets such as gaming, the web, and a few others. This has already led to changes in the cinematographer's profession.
The fundamental changes in cinematography have already occurred, and will continue to occur. None of the previous "revolutions" in cinematography can be compared with the ongoing Expanded Cinematography/Hybridization process. In all previous "revolutions" affecting the image-building process (such as sound, color, wide-screen, digital), however much visual expression changed, nobody questioned the cinematographer's authorship or authority in the creation of images. But that is exactly what is happening now.
Recent feature films have amplified the traditional role of the cinematographer and made even more people appreciative of the cinematographer's artistry and craft. However, today and going forward, images in film are no longer produced as the result of only the traditional tools of cinematography.
The Academy Awards selections clearly illustrate the issue. It is not a coincidence that for the last four years the Oscars for both Cinematography and Visual Effects were given to the same film. This consecutive trend is a strong confirmation of the new artistic and technical visual paradigm that is emerging, and will continue.
And we will all watch very attentively to see what happens in 2014 with the nominations and awards for Cinematography and Visual Effects for "Gravity." Current Academy president Cheryl Boone Isaacs stated: "Technology is changing the definitions of what we do." It will be an interesting year!
CURTIS CLARK, ASC ON HIGH FRAME RATE, 4K & ACES
As motion pictures, along with scripted TV dramas, migrate from film cameras to shooting with digital motion picture cameras using end-to-end digital imaging workflows, we're in danger of losing what many consider to be important constituents of our traditional photographic motion imaging. For better or worse, 24 fps image capture with 24fps projection has been the industry standard since 1929 and is deeply embedded in our aesthetic appreciation of the cinematic art form. Motion image blur and strobe effects from camera panning were not viewed as "artifacts" that necessarily needed to be (nor could be) eliminated. These "artifacts" were considered non-optional parameters that governed temporal resolution of motion picture film capture and projection. They, of course, have been routinely managed and dealt with as intrinsic components of the film motion-imaging platform to be controlled in an artful way by the cinematographer and director when composing and executing shots.
The spatial resolution of 35mm film negative has been considered generally sufficient, especially when digitally scanned as 4K. To a certain extent, HFR is an attempt to compensate for the limited spatial resolution of some digital cameras by adding temporal resolution when shooting and subsequently projecting higher frame rates, e.g., 48fps/60fps. HFR is sometimes used to overcome image artifact issues related to fast action in 3D presentations. For some filmmakers HFR tends to reproduce a "video look."
Filmmakers have been discovering the creative potential of digitally capturing and digitally projecting 4K images with four times the spatial resolution of 2K or HD video images. The ability to capture and render images with significantly enhanced image detail adds tremendous potential for creating more viscerally engaging images. This also applies to the new UHDTV (aka 4K TV) display platform. Of course, the optical performance of the lens is also a major factor in the reproduction of image resolution and sharpness, as well as contrast. Resolution is not the same as sharpness. Digital video cameras frequently use electronic sharpening to artificially enhance edge details, which contributes, along with video's 30fps, to the general perception of what is frequently referred to as a "video look."
Assuming that no "artificial," i.e., electronically enhanced, image sharpening is used on a 35mm 4K (preferably 16-bit) image scan, the 35mm film negative has a more natural sharpness combined with greater spatial resolution than is possible with HD or 2K digital image capture. Since film is still considered the motion picture imaging benchmark for achieving a "film look," digital motion picture cameras are frequently judged in relation to their ability to digitally replicate a high quality "film look." In addition to 4K spatial resolution, this should include the ability to capture naturally sharp images with a wide dynamic range of scene tones, along with greater color bit depth (preferably 16-bit) to enable greater precision in color grading for achieving optimal creative results. The Sony F65 and the Sony F55 are two of the latest generation of digital motion picture cameras that are able to record 4K images with 16-bit wide gamut color, along with a wide dynamic range of scene tones.
ACES (ACADEMY COLOR ENCODING SYSTEM)
2014 should see a more rapid adoption of ACES, especially as filmmakers and post facilities better understand both the creative and cost efficient advantages. ACES is the only color management system that not only protects, but also allows the filmmaker to take full creative advantage of wide dynamic range 16-bit wide gamut color images.
For a broader look at what recent trends can tell us about the future, please see
Technology 2014 | Production, Post & Beyond: Part ONE
What were the big technology trends in media and entertainment over the past year? What's going to be significant in the coming year? Those are questions that many of us are asking, so we went to some of our savviest Creative COW contributors to ask their opinions of where we've been and where we're going. In Part 1, we offer an overview of their perspectives.