
Multicamera Madness!
A Creative COW Feature



Mitch Jacobson
Category-5 TV

New York, New York US
© CreativeCOW.net. All rights reserved.


Article Focus: Madness because everyone wants to do it. Madness because if you start on the wrong foot, you'll drive yourself mad. Here are some of the things that industry leader and multicam expert Mitch Jacobson has learned from the high-pressure, high-stakes world of rock and roll multicam; some of his favorite tools for improving your own multicam experience; and some surprising stories from multicam's origins...with Desi Arnaz and "I Love Lucy."


From a two-camera interview to a 26-camera concert special, multicamera production is being used more now than ever before. And with so many camera types, codecs and editing workflows for filmmakers to choose from, it’s hard not to suffer from Multicamera Madness! After being diagnosed recently with “multicamera-personality disorder,” I decided to write a textbook and reach out in workshops to others who share the passion and desire to master multicamera production and editing. At NAB this year, I will be sharing my experiences by presenting in-depth workshops and a keynote at Future Media Concepts’ Post|Production World Conference.

 

EDITING EPKs ON-SET: YOUR FIRST CUT IS YOUR LAST CUT

On-set cutting for concert tour kickoff events is some of the most exciting editing I’ve ever done. I’ve worked with bands including The Rolling Stones, Aerosmith, Paul McCartney, and U2, and the basics are the same. We create EPKs (electronic press kits) that combine cut pieces like story packages, interviews with the band, b-roll, and three or four multicamera-edited songs – and it's a super-fast turnaround, usually same day. You don't have any time for errors. You have to be completely organized, the same way you would if you were shooting and editing any feature package in one day. By the time you ingest, prep and edit your first cut, it’s your last cut, because the same-day satellite feed is constantly looming in your deliverable’s future.

Multicamera ENG crews prep for b-roll shooting. Courtesy MHP3.com, Live Nation and U2.com


Setting up gear on location is a big deal. You have to have everything you need, because you can’t rely on an edit suite down the hall, or somebody else jumping in to help out if something goes wrong. You have to have all your tools -- not just your editing equipment, but also software for graphics and sound. Then clients are going to throw things at you: Word files or PDFs or TIFFs, or files with notes about things that have to be re-conformed, so you really have to have your ducks in a row. I keep a full Adobe Production Premium package, Microsoft Office and a tool kit of various applications on the road.

The US kickoff event for the U2 360º Tour presented some technological challenges that showed how preproduction really comes into play.

"The Claw" Stage from U2 360º Tour EPK. Courtesy MHP3.com, Live Nation and U2.com

Even though they tour in the United States, their entire rig is based on European power -- all of their gear runs at a whopping 220 volts at 50 cycles, nearly twice the voltage of American power systems. In addition to that, they're doing everything in PAL, and we're NTSC. To do something as simple as adding a deck to record a line cut means converting the power AND the signal. On top of that, we were doing an HD package, but the band was actually set up for SD, 16x9 anamorphic. So we had to convert the power, we had to convert the format, and then we had to bump it to HD, just to get a line cut for editing—and have the full package done within hours of the concert!

On this particular U2 project, we shot with five VariCams (recording to tape) and combined them with the band's cameras -- U2 travels with 12 cameras, with robotics, cranes and dollies, the whole deal. We wound up with a massive amount of footage to organize, catalog, convert and have ready so that everybody could get what they needed immediately.

Browser From U2 360º Tour EPK. Courtesy MHP3.com, Live Nation and U2.com

Tape is a slower format for logging and loading than a tapeless format like P2. A really fast P2 workflow uses MXF4Mac and ingests to Final Cut Pro without having to rewrap or transcode to QuickTime. The AJA Ki Pro is also cool for going directly to Apple ProRes, and Telestream has the Pipeline system that I also use for quick turnaround and ingest-while-editing.
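
To give a rough sense of why tapeless logging is so much faster, here is a minimal sketch of the idea (my own illustration, not MXF4Mac's or any NLE's actual importer): on a P2 card the clips already exist as MXF files inside the card's CONTENTS folder, so "logging" can begin with a simple directory walk. The mount point below is hypothetical.

# Hedged sketch: list the video clips on a mounted P2 card.
# Illustration only -- not MXF4Mac's or Final Cut Pro's actual ingest code.
from pathlib import Path

def list_p2_clips(card_root):
    """Return (name, size in MB) for each video MXF on a P2 card."""
    video_dir = Path(card_root) / "CONTENTS" / "VIDEO"
    clips = sorted(video_dir.glob("*.MXF"))
    return [(c.name, c.stat().st_size // (1024 * 1024)) for c in clips]

# Hypothetical mount point for a mounted or offloaded card:
for name, size_mb in list_p2_clips("/Volumes/P2_CARD_01"):
    print(f"{name}\t{size_mb} MB")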

Editor Mitch Jacobson cutting U2 EPK backstage. Courtesy MHP3.com, Live Nation and U2.com


It really is frantic to try to get all this done in the middle of a concert. You can hear the band playing out there, everything is thumping, people are screaming, and everybody has their idea of what to put in the cut – “Try this, let's do that, let's do this, let's do that” -- it really is nuts. A friend of mine once said, "It's like jumping off a cliff and building your wings on the way down."

 

THE ADVENT OF THE MODERN EPK

Back in the 70s, there was a little band called Journey. You may have heard of them.


They actually developed the system that we know as IMAG, Image Magnification, for concerts. They started a company called Nocturne, which is still around today -- the largest touring IMAG company in the world. Basically, they put up big screens at their shows so that people in the back row could experience the concert the same as if they were in the front row. It's all about the close-ups: the tight shots of the hands, the faces, the singers. That led to having full multi-camera production teams on tour with the bands.

Then, somewhere along the line, the record companies discovered the need for electronic publicity that included elements like b-roll, interviews and music from the shows. They began to hire dedicated production companies to handle that, which is where Mark Haefeli Productions came in. All the legacy-type concert jobs I've done were produced and directed by Mark Haefeli, including the U2 360º Tour. Mark is a television pioneer who helped develop the modern EPK by combining ENG crews with the touring IMAG productions.

A number of years ago, Mark and his team were asked to shoot performance footage for the Rolling Stones, capturing their stadium concert experience for news organizations and promotional purposes: everything from selling tickets to creating commercials. While Mark knew there were already cameras in place at these shows, he realized there was a problem with the setup.

“Back then,” says Haefeli, “it was just cameras out there shooting the action as it happened and going directly to the big screens. There weren’t even any recording devices to record the feed. They recorded the audio, but they never recorded the mixed master of all the cameras together, which we now know as the line cut.”

Mark began to record a line cut off the switcher, with occasional ISO decks as well. He also began to supplement that with additional cameras that could divide and conquer to double-team a performance. For instance, five cameras shooting ENG-style packages during the day could reconfigure at night to complement the angles shot with the band’s IMAG cameras and maximize coverage.

U2 IMAG screen testing colorbars. Courtesy MHP3.com, Live Nation and U2.com


That’s because IMAG is looking for close-ups – hands playing the instruments, individual band members, etc. -- but for television, you also want group shots, reverse angles of the crowd, and big, global shots, plus people getting off on the music. So Haefeli’s cameras focus on that, and then that footage is combined with the IMAG feeds into these EPKs -- still the workflow that's used today.

View from the director's chair on the U2 360º Tour. Courtesy MHP3.com, Live Nation and U2.com


During the day, we take, say, five cameras, and they run around starting at 6 AM grabbing b-roll and interviews, so they shotgun out to the world and around the stadium or the event. After they're done with the shots for the day, they come back, regroup, and become performance cameras for the actual concert.

Mark Haefeli: “Essentially, we walk out of there with a complete show under our belts. With remixed audio tracks, as well; half an hour after the show was over we were able to go right into post and edit what ultimately looked and felt like a multi-million dollar concert production of tremendous size.”

ENTER THE MOVIOLA MONSTER

Desi Arnaz and his director of photography, Karl Freund, ASC, were the first to shoot filmed multicamera television programs, with "I Love Lucy" (1951). They pioneered the three-camera studio system that sitcoms employ today, invented the hanging light grid and crab dollies, and added a live studio audience. Their editor, Dann Cahn, A.C.E., wrote the bible for multicamera film editing and was the first to cut multicam -- using the Moviola Monster, a four-headed Moviola built for multiple film cameras and double-system sound.

Editor Dann Cahn, A.C.E. editing "I Love Lucy Show" (1951) Courtesy Dann Cahn, A.C.E.


"When I had signed up for the I Love Lucy job and arrived in my cutting room, two guys came in wheeling this new edit thing and I said to my assistant, What are we going to do with this monster? It won’t even fit in the cutting room. So we put it in the prop room and used it there. It was a Moviola with four heads––three for picture and one for sound. Its new name—The Monster—stuck." – Dann Cahn, A.C.E.

It was retired more than 30 years later on "Designing Women" in the late '80s, long enough for Dann’s son, Daniel Cahn, A.C.E., to grow up and also edit on the Monster.

Now, of course, we have the luxury of non-linear digital editing systems, but as much as things have changed, things have stayed the same. We still use the same studio systems and editing workflows that Desi and his team perfected in the ’50s. They are simply more refined and offer more options.

Editor Dann Cahn, A.C.E. (2009) at the Lucy-Desi Museum in Jamestown, NY with his original Moviola Monster. Photo courtesy Carrie Puchkoff.


 

[Ed. note: The photo used for this article's title graphic shows the Moviola Monster used to edit the first multicamera filmed program, "I Love Lucy" (1951). Courtesy Dann Cahn, A.C.E.]

 

MULTICAM MADNESS®! MASTERING MULTICAMERA TECHNIQUES

My book is called Mastering Multicamera Techniques: From Preproduction to Editing and Deliverables. It's published by Focal Press: 472 pages loaded with tips and techniques for shooting, syncing, editing and finishing multi-cam projects. The DVD also has over 20 angles of multicamera video clips from legends in the music industry, including a concert by Elton John shot specifically for this book. It’s like a love letter to the world of multicam, with techniques from every aspect of production and editing -- from run-and-gun ENG shoots, fly packs and remote trucks to feature-film-style productions, covering genres like sitcoms, concerts, reality, comedy and event programs.

The book is platform agnostic – Avid, Final Cut Pro, Premiere Pro, Vegas and EDIUS are all covered. One chapter deals with hardware, bandwidth and speed, using “multicamera math” to get your computer system singing instead of choking – it takes a lot of firepower to do that.
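
As a back-of-the-envelope example of that kind of multicamera math (my own illustration with approximate bit rates, not a formula from the book): multiply the per-stream data rate by the number of angles you need to play at once, then compare it to what your storage can actually deliver.

# Rough "multicamera math": will the storage keep up with N simultaneous streams?
# Approximate, ballpark bit rates; illustration only.

def aggregate_demand_mb_s(num_angles, stream_mbit_s):
    """Aggregate read demand in MB/s when playing num_angles streams at once."""
    return num_angles * stream_mbit_s / 8

demand = aggregate_demand_mb_s(12, 147)   # 12 angles of ProRes 422 at 1080i, roughly 147 Mbit/s each
print(f"Demand: {demand:.0f} MB/s")       # about 220 MB/s

for name, throughput in [("single FireWire 800 drive (~80 MB/s)", 80),
                         ("small striped RAID (~400 MB/s)", 400)]:
    verdict = "keeps up" if throughput >= demand else "will choke"
    print(f"{name}: {verdict}")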

Another section on music covers concerts and music videos but goes beyond rock and roll. We do case studies on projects from Aerosmith, McCartney, The Stones and Journey, plus The Metropolitan Opera’s worldwide live theatrical simulcasts, The Met: Live in HD, and Great Performances’ Carnegie Hall Opening Night. For these, they actually use the musical score as the video script! The director, the AD, the camera crew and everybody else has to read music, because they mark their camera cues right into the score. They refer to that as "the book," and the editors actually work from it as their script as well.

Below, Score from Leonard Bernstein’s West Side Story, featuring camera direction by Gary Halvorson (Courtesy WNET.org and Leonard Bernstein Foundation)


 

"WE LIVE AND DIE BY TIMECODE." – Ray Volkema, Editor, HDNet

Timecode is the essential, original metadata, and a lot of the time, people get it wrong. Our work in multicam is designed around getting it right. A couple of minutes spent doing something as simple as jam-syncing your cameras in the field can save hours in editing. High-end cameras all have capabilities for that, so there are no excuses.
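
Here is a minimal sketch of why jam-synced, time-of-day timecode pays off in the edit (my own illustration, not any NLE's grouping code): once every camera carries the same clock, lining up angles is simple frame arithmetic.

# Hedged sketch: with shared time-of-day timecode, offsets between angles
# are just frame arithmetic. Uses 30 fps non-drop for simplicity;
# real NTSC work is 29.97 drop-frame.

FPS = 30

def tc_to_frames(tc, fps=FPS):
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames, fps=FPS):
    ss, ff = divmod(frames, fps)
    mm, ss = divmod(ss, 60)
    hh, mm = divmod(mm, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# Two angles that started recording at different times of day:
offset = tc_to_frames("20:31:10:03") - tc_to_frames("20:31:02:15")
print(f"Camera B starts {offset} frames ({frames_to_tc(offset)}) after camera A")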

Assistant cameraperson preps a pair of Panavision Genesis® cameras on the set of "Law & Order" in Manhattan. Photo courtesy Mitch Jacobson.


 

But even many less capable cameras can still work with timecode, and even when they can’t, there are plenty of workarounds, add-on boxes, and so on. Some of the workarounds are simple and inexpensive – clapboards, handclaps, or flashlights – but timecode really does work better, and for multi-cam, that usually means time-of-day, free-run timecode.

I should mention that there are a lot of really cool iPhone slates out there. “Movie Slate” by PureBlend Software even lets you jam-sync multiple slates, and generates EDLs, XMLs and ALEs for Final Cut and Media Composer. You can even e-mail them right out of your phone to your editor.

Movie Slate iPhone app

 

Longitudinal timecode (LTC) is one way to go for these lower-end cameras that don’t handle timecode on their own. A signal gets recorded to the cameras' audio tracks, then picked up in post and converted to auxiliary timecode. A company called Ambient makes the Lockit Box, a high-quality timecode and sync generator. They also make the LANC Logger for smaller, lower-end cameras. (It actually works on high-end cameras too.) It plugs into the LANC jack, and generates, reads and converts timecode. It also creates XMLs that you can import into your editing software to start grouping clips right away.

The code for these devices was written in cooperation with big brains from around the world, such as Andreas Kiel of Spherico and Bouke Váhl of VideoToolShed.

Ambient has a lot of other cool products for timecode, and there are great multicam tools on the software side as well. Spherico's SequenceLiner takes timecoded clips and lines them up in your sequence on vertically stacked tracks. This gives you what we call a sync map of your multicamera show, from which you can then make your groups. And VideoToolShed has the AuxTCReader for converting LTC to Aux TC in FCP. (Avid does this natively.)

Timeline From U2 360º Tour EPK. Courtesy MHP3.com, Live Nation and U2.com

Spherico SequenceLiner works really well in conjunction with MXF4Mac, for P2 workflows. You can bring everything in really fast, without having to rewrap your QuickTimes, then use SequenceLiner to take all of those clips and put them into your timeline, vertically stacked, so that you can see where all your sync points are and make your group clips from there, saving literally hours of prep time.
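
The sync-map idea itself is simple enough to sketch (my own illustration of the concept, not SequenceLiner's actual code): pick the earliest start timecode as the timeline origin, then place each clip on its own vertical track at its offset from that origin. The clip names and durations below are made up.

# Hedged sketch of a sync map: stack timecoded clips on separate tracks,
# aligned to a common timeline origin.

FPS = 30

def tc_to_frames(tc, fps=FPS):
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

clips = [
    {"name": "CAM_A_001", "start_tc": "20:31:02:15", "dur_frames": 5400},
    {"name": "CAM_B_001", "start_tc": "20:31:10:03", "dur_frames": 5290},
    {"name": "CAM_C_001", "start_tc": "20:30:58:00", "dur_frames": 5520},
]

origin = min(tc_to_frames(c["start_tc"]) for c in clips)

for track, clip in enumerate(clips, start=1):
    offset = tc_to_frames(clip["start_tc"]) - origin
    print(f"V{track}: {clip['name']} at frame {offset}, length {clip['dur_frames']} frames")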

And all of these products work with time of day, free-run timecode, which as I’ve mentioned before, is the best way.

The problem is that not everybody does it this way, and for one reason or another you sometimes get footage that doesn’t have perfect sync. The real challenge is how you deal with the stuff that's not set up properly. That's where something like PluralEyes from Singular Software comes in. It analyzes the audio tracks of the cameras and syncs everything to that, making a sync map for grouping your multiclips. If you don't have audio on your cameras, then PluralEyes doesn't work, so if you realize in preproduction that you're not going to use the traditional free-run/time-of-day approach, you need to make sure you at least have audio going to all the cameras -- preferably the same audio.
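
The general idea behind audio-based sync is easy to sketch (a simplified illustration of the concept, not PluralEyes' actual algorithm): cross-correlate two cameras' audio and treat the lag with the strongest match as the sync offset.

# Hedged sketch of audio-based sync: estimate the offset between two cameras
# by cross-correlating their audio. Toy-sized arrays keep the brute-force
# correlation fast; real tools are far more robust than this.
import numpy as np

def estimate_offset_seconds(ref_audio, other_audio, sample_rate):
    """Return how many seconds other_audio lags behind ref_audio."""
    ref = ref_audio - ref_audio.mean()
    other = other_audio - other_audio.mean()
    corr = np.correlate(other, ref, mode="full")
    lag_samples = int(np.argmax(corr)) - (len(ref) - 1)
    return lag_samples / sample_rate

# Toy example: the same noise burst, with camera B starting 0.5 seconds later.
rate = 8000
burst = np.random.randn(rate)                        # one second of "room sound"
cam_a = np.concatenate([burst, np.zeros(rate // 2)])
cam_b = np.concatenate([np.zeros(rate // 2), burst])
print(f"Estimated offset: {estimate_offset_seconds(cam_a, cam_b, rate):.3f} s")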

Let's say you're using a little camera mic, and the cameras are at different distances from the sound they're picking up -- about 37 feet of distance is roughly one frame of lip-sync error at NTSC frame rates -- so if you have a bunch of cameras that don't share the same sound, syncing by audio could be off a little bit.
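
The arithmetic behind that rule of thumb is straightforward (approximate figures): sound travels about 1,125 feet per second, and NTSC video runs at 29.97 frames per second.

# Why roughly 37 feet equals one frame of lip-sync error at NTSC rates.
SPEED_OF_SOUND_FT_S = 1125.0   # approximate, at room temperature
NTSC_FPS = 29.97

feet_per_frame = SPEED_OF_SOUND_FT_S / NTSC_FPS
print(f"One frame of delay is about {feet_per_frame:.1f} feet of extra mic distance")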

And then you end up back at the old-fashioned way – matching by eye and by ear. You definitely get good at that kind of thing as a multi-cam editor. But my parting advice is to manage timecode and use it well, because it harnesses all the power of metadata to help you work quickly and efficiently in multi-cam editing.

So, grab a few cameras and sync ‘em up. It’s a great time to master multicam!


 

Mitch Jacobson is the owner and executive producer at Category-5 Entertainment, a creative editing boutique in New York City. Mitch is an Apple Certified Pro and specializes in Avid and Final Cut Pro systems. He has more than 25 years of experience cutting network TV programs and concert films for A&E, CBS, Fox Sports, E! Entertainment, PBS and others.

He is also a keynote presenter at the Post|Production World Conference at the National Association of Broadcasters Conference, April 12-15, 2010, in Las Vegas.

Mitch Jacobson

 

 

