B. Sean Fairburn, SOC is an HD DP with a wide range of credits as a Director of Photography and camera operator in television, film, commercials and documentaries. Among his film credits are Windtalkers, Once Upon a Time in Mexico, The Expendables and The Mechanic. TV credits include Super Bowl XXXVIII, Star Trek: Enterprise, House MD, American Dreams, and the documentary Inside the Space Station. As a U.S. Marine, Sean shot the beginning of Operation Iraqi Freedom by mounting an HD camera on the back of a Humvee and accompanying U.S. Marines from the breach into Iraq to the streets of Baghdad. The resulting footage led to Sean being awarded an Emmy for its inclusion in Dan Rather's 2007 documentary Combat Photographer. A pioneer in shooting HD, as well as a cinematography instructor, Sean is now CEO of MIO 3D.
Sean Fairburn, SOC
I'm always fascinated by the next, more difficult challenge. Anybody can be a pilot, but for me 3D has always been something more like space flight; very few people can work at the level of an astronaut. Several years ago, I was blessed to learn 3D from Peter Anderson, ASC, who's really the godfather of modern 3D production. Jim Cameron, Vince Pace, Steve Schklair, Steve Pezo, Gary Shino were all educated in 3D by him. I met Peter as an engineer, and he taught me 3D engineering and how to tune lenses and ultimately just the philosophy and concept of 3D.
I was hired to tune two HD cameras to match within an inch of their lives and color correct them live on the fly through a projection system for a Pepper's Ghost set-up. It was for a museum in Germany, with an actor playing the role of Bach. As he walked around, the set would change around him. One camera was shooting a black stage with an actor on it, and another was shooting a performance set with other actors. By melding the two, you create a ghost that can half-resolve between the two and then project it onto a set with a live stage. The color and density had to be seamless.
Working with Peter helped me become intimately aware of stereo. The more I worked in 3D, the more I discovered its Achilles' heel: the interocular distance isn't changeable later. Some of the math can be adjusted; some can't. What can't be adjusted forces you into a screen size appropriate for the scale. I ran into this problem repeatedly; producers or directors regularly asked how they could adapt their 3D material for a smaller screen, and the answer was always no, it doesn't work well. It's really only meant for one screen size.
Working with a (now 3ality) Element Technica Rig.
I've come up with a solution that gets around that problem. The genesis was in 2000, when I was working on a 3D project with Max Penner and Paradise VFX, and, in thinking about ways of solving this problem, I came up with a "piggyback" extra camera. At the time, it was tough to get three cameras to work, but technology has changed and cameras have gotten smaller, better and more consistent. A couple of years ago, I revisited this concept, and since then I've been working on MIO, or multiple interocular 3D. This allows the cinematographer to take a small, medium and wide interocular at the same time so, in post, you get to pick which pair is most appropriate for the shot.
MIO, or multiple interocular 3D, allows the cinematographer to take a small, medium or wide interocular at the same time.
Take, for example, a shot of someone walking up to the front door of a house. In the current way of shooting 3D, the first question you ask is where this is going to be shown, and once you answer that question, you're locked into the math. If your destination is the big screen, you shoot with a 1.75-inch interocular distance. As the actor walks closer, you'll have to squeeze the interocular to 0.33 inch to make that shot work. But when it comes time to put the scene on Blu-ray, you have very, very weak 3D on anything smaller than the big screen. It's practically unwatchable. The converse is also true. If you shoot just for cell phones and laptops, you'll start off with about a 3.5-inch interocular distance and move down to 2.0 inches. If I show that on a home theater or the big screen, the 3D is so strong it's unwatchable. You're really stuck.
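The screen-size dependence can be sketched with a simplified parallel-rig model (my illustration with assumed numbers, not Sean's actual rig math): the disparity recorded on the sensor depends on interocular, focal length and convergence distance, and is then magnified by the ratio of screen width to sensor width, so the same shot produces parallax proportional to screen size.

```python
# Simplified parallel-camera stereo model (illustrative assumptions:
# Super 35 sensor width, convergence set by horizontal image shift).
def screen_parallax_mm(io_mm, focal_mm, conv_m, subject_m,
                       sensor_w_mm=24.9, screen_w_m=10.0):
    """On-screen parallax in mm for a subject at `subject_m` metres.

    Positive = uncrossed disparity (subject appears behind the screen).
    """
    # Disparity recorded on the sensor, in mm (thin-lens approximation).
    sensor_disparity = focal_mm * io_mm * (1.0 / (conv_m * 1000)
                                           - 1.0 / (subject_m * 1000))
    # Magnify by screen width / sensor width.
    return sensor_disparity * (screen_w_m * 1000 / sensor_w_mm)

# The same 1.75-inch (44.5 mm) IO shot, viewed on two screens:
cinema = screen_parallax_mm(44.5, 35, conv_m=5, subject_m=10, screen_w_m=10.0)
phone = screen_parallax_mm(44.5, 35, conv_m=5, subject_m=10, screen_w_m=0.15)
# Parallax scales linearly with screen width, so depth that reads well
# on a cinema screen nearly vanishes on a phone, and vice versa.
```

Because the magnification is linear, a shot whose parallax is comfortable on a 10-meter screen produces a fraction of a millimeter of parallax on a phone, which is why the 3D goes flat; run the other way, small-screen interoculars blow past the eyes' divergence limit on the big screen.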
With three cameras running, one at small IO, one at medium IO and the third at wide IO, I have choices. Most shots -- like the example I gave of the actor walking to the front door -- will be dynamic and start from a wide IO. I may cut to something else -- but then I can jump to an intermediate IO and then a small IO as they knock on the door. Here's the fun part: I didn't have to move the cameras. The cameras were fixed. So the rig that would have been as big as half a Harley-Davidson is now something small and locked off, because the cameras don't have to move. Now I can easily move the rig around.
The biggest complaint that directors have about 3D isn't the interocular aspect, which they can't fix, but that the rigs are too big and need to be smaller and more mobile. Setting interocular and convergence is a 12-part trigonometry problem, and if any one part moves, it changes the math. How often are things moving in the shot? All the time. That's why it's a problem.
You want these three cameras side by side on a rig, and you get to pick whatever pair is appropriate for that shot or screen size as a general starting place. It's like being able to shoot multiple exposures all at the same time. In shooting, having a wide dynamic range is a safety factor. Having more latitude lets you be a little more free and easy with your exposure, knowing something else can help you out. This is a similar capability in 3D. You don't need to know all the math, but now you'll have a smaller and bigger safety net, and someone else can decide what it needs to be. Everyone who works in 3D will tell you the same thing: the choice you make for a shot is good for that shot, but what comes before and after in the final edit is unknown. By shooting multiple interocular with three cameras, you now have a choice where you did not have one before.
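One way to picture the choice you get in post is a selection sketch (my illustration with assumed comfort limits and camera spacings, not MIO 3D's actual logic): three side-by-side cameras yield three pairwise interoculars, and for a given target screen you take the widest pair whose worst-case parallax still stays comfortable.

```python
# Illustrative sketch of picking an IO pair in post (assumed numbers,
# not MIO 3D's actual selection algorithm). Three cameras side by side
# give three pairwise interoculars: A-B, B-C and A-C.
def best_io_mm(pair_ios_mm, focal_mm, conv_m, nearest_m,
               sensor_w_mm, screen_w_m):
    """Widest IO whose worst-case screen parallax stays comfortable."""
    mag = screen_w_m * 1000 / sensor_w_mm
    # Assumed comfort budget: ~3% of screen width, capped at the ~65 mm
    # human eye separation (beyond that, the eyes would have to diverge).
    budget_mm = min(0.03 * screen_w_m * 1000, 65.0)
    usable = [io for io in pair_ios_mm
              if focal_mm * io * abs(1 / (conv_m * 1000)
                                     - 1 / (nearest_m * 1000)) * mag
              <= budget_mm]
    # Fall back to the smallest pair if even that one is borderline.
    return max(usable) if usable else min(pair_ios_mm)

# Cameras at 0 mm, 20 mm and 64 mm give pairs of 20, 44 and 64 mm.
pairs = [20.0, 44.0, 64.0]
big_screen = best_io_mm(pairs, focal_mm=35, conv_m=5, nearest_m=2,
                        sensor_w_mm=24.9, screen_w_m=10.0)
laptop = best_io_mm(pairs, focal_mm=35, conv_m=5, nearest_m=2,
                    sensor_w_mm=24.9, screen_w_m=0.35)
```

With these assumed numbers the big screen forces the smallest pair while the laptop can take the widest one, which is the safety-net idea: the rig records all three, and the decision moves into post.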
This solution first occurred to me in September 2009 in the early morning hours. It took me a while to work the ideas into real-world applications and form factors. Now this shooting concept of multiple interocular, or MIO 3D, is patent pending and I'm talking to potential investors.
In the meantime, I've used it on several independent projects. The first was a sports documentary feature called A Day in the Dirt 3D, about an annual motocross race in Pala, California. The second was Dark Truths, a low-budget, all-greenscreen 3D project that I shot in the fall of 2011. I just finished shooting the indie movie Dead in 5 Heartbeats.
Dead in 5 Heartbeats.
Dark Truths, a low budget, all greenscreen 3D project that Sean shot in the Fall of 2011.
What I learned from shooting these three films is that this way of shooting 3D is incredibly fast. You no longer have to constantly check and adjust the alignment, do all the math, and sacrifice elements of shots because the math wouldn't work. I can now shoot things that wouldn't have worked before, because I know I have more range, more safety. That helps me move as fast as I want to move. The other element that I didn't put much effort into, but that I now see as being as big or bigger, is the form factor and size. 3D likes wide angles; it likes to be in the action.
Because of the size of the cameras I was able to use, and because I was able to fix them in place while I was shooting, the rigs can be very small; I could fit 3D rigs in the helmet on my head, like motorcycle body armor. I would put this on actors and they would perform the scene, then shoot again with the 3D camera on the other actor and get a true first-person perspective in 3D.
Because of the size of the cameras, Sean could fit 3D rigs in the helmet on his head.
This has massively changed how I think about coverage and storytelling. Once the actors embraced it, this became a new piece of storytelling power: being able to put the audience in the perspective of the actor, or another perspective they wouldn't have gotten before.
In Dead in 5 Heartbeats, the actor walks up to the motorcycle, gets on it, rides, pulls up to another location, walks into a bar and then gets into a fight -- all in one three-minute shot and all in 3D. This plays on the first-person gamer perspective that kids are so familiar with from playing first-person games. Now we have the real perspective of the first person that we can bring into the movie, and it's not a gimmick; it's the real actor. When he punches or kicks someone, it's really him. I can cheat if I want to get multiple takes, but I can also put it on a stunt man, or on myself so that I operate the camera as the actor. I can be in the driver's seat and turn and have a conversation with the actor in the passenger seat while I drive...with traditional coverage. It opens up things you haven't been able to do before in 3D.
This is the kind of mobility that you can't typically achieve with ordinary 3D setups.
Anyone who attended DGA Digital Days this year had a chance to see MIO 3D. I showed a reel there, with motocross riding, fishing, stunt planes and skateboarding. I have a guy doing backwards flips on a razor scooter in the skate park in Venice. You can't usually do those things.
I've used a pretty good variety of cameras: the Canon XF105; a Lumix GH1 hacked to 50 megabits a second; the Drift, a great little HD camera that is small and light; the Contour, another similar HD camera; and Canon 7Ds and 5Ds, depending on the focal length of the lens. The others I really enjoy are the Sony NEX FS100; the Sony P1, which I can put on a small beam-splitter rig; and the Silicon Imaging SI2K, which is also nice, although the cabling isn't ideal. For me, smaller size is really advantageous. MIO doesn't care what camera you use, but the smaller size lets me move it more easily.
Sean is designing a new helmet with integrated 3D built into it for the consumer market.
I've also shot commercials with MIO 3D, including one called "The Rally Fighter," about a new custom-designed off-road vehicle, for which I used a variety of different cameras. Having multiple cameras is very helpful. It's never about a single camera in a bubble with the best lens, sensor and so on. If that camera in a bubble suddenly has to be handheld, or attached to a helicopter or a boat, or mounted to a car, it's not attractive any more. The great images it takes make it impractical in many situations. The philosophy I take is, "Let's start by talking about what we want the shot to be and what the best camera is to get that shot." The amount of resolution is farther down the list of must-have features. An 8K shot I can't get is useless to me. Sometimes small and light matters more. When it doesn't matter, I'll happily put an ARRI Alexa or a RED on a beam splitter.
I'll shoot the bulk of the show with the best-quality camera I can get away with, but I won't sacrifice getting the shot because I have to have a big sensor. It was shocking to me that, seeing the material I shot on the big screen, I wasn't able to tell which shots were taken with the good cameras and which were from the throwaway cameras. At the very worst, these throwaway cameras function like a first-generation F900 -- and they cost $300. Granted, there's a lot of massaging on the front end with proper lighting, and also up-scaling to 2K on the back end, as well as color correction. There's still work that goes into it, but I guarantee I'd do that work anyway if I shot RAW.
Next up, I'm designing a new helmet for the consumer market with integrated 3D built into it. Take the GoPro concept -- a camera in a little box that can be put anywhere -- but instead of an extra box that attaches to the helmet externally, I want to make something more formfitting and aerodynamic that brings the camera to the center of your forehead for a truer first-person perspective. I am in discussions with an investor who loves technology about manufacturing the device, either as an all-in-one or as an appliqué. The goal is to bring 3D -- MIO 3D -- to the GoPro market.
Sean was honored with an Emmy® Award for Excellence in Cinematography for the extensive use of his war cinematography in Dan Rather's 2007 documentary "Combat Photographer."
The Emmy name and the Emmy statuette are the trademarked property of The Academy of Television Arts & Sciences (“Television Academy”) and the National Academy of Television Arts & Sciences (“National Academy”)