
Mercedes-Benz and Friends Make 360-Degree Stadium Halo Board



Venice, California
©Debra Kaufman. All rights reserved.


A visit to the Atlanta Falcons games at Mercedes-Benz Stadium this past football season provided an unexpected treat: the world’s largest 360-degree cylindrical LED video screen at a sports venue. At 58 feet tall, the screen, dubbed the “Halo Board” because it surrounds the inner stadium, can display 20K resolution and gives new meaning to the term “immersive video.” Indeed, that’s what visitors saw – spectacular images of Mercedes-Benz’s latest automotive line-up.

The campaign was made possible through a partnership between creative companies The-Artery, The Astronauts Guild, and VR Playhouse, along with the latest technologies from RED Digital Cinema and Radiant Images.

The-Artery’s founder/creative director Vico Sharabani, who directed the campaign, says that the advertising agency, Merkley + Partners, originally brought him the project simply to brainstorm. He immediately gave them an important piece of advice: to regard the project as a panorama rather than virtual reality or 360-degree video. “The 360 components are definitely there,” he says. “But unlike 360-degree video, which would enable the viewer to look around at will, visitors to the stadium would only see a portion of the screen, based on where they are sitting or standing.”

One of the many considerations was framing the cars. He noted that if the cars were too close to the camera, they would drop out of the screen and likely cause lens distortion and stitching problems. But if they were too far away, their presence was diminished and the impact lost. Sharabani got the job.





Getting Started

Sharabani’s first step was to bring in what he called “the best partners” for cameras and stitching: VR production company The Astronauts Guild and post/systems integrator VR Playhouse. Once Astronauts Guild technical producer Scott Connolly and cinematographer Evan Pesses were on board, the conversation turned to cameras. “We wanted to maintain the commercial grade imagery we deliver on high-end broadcast commercials,” explains Sharabani. “We immediately eliminated a lot of lower grade cameras. There were so many variables and, all together, it pointed us to one solution … RED.”

To end up with a 20K master, the team needed high-resolution cameras. “We knew the RED WEAPON with the HELIUM 8K S35 sensor would be the camera of choice,” says Pesses. “Resolution was key to this project, and RED is in the resolution game.”

There were other reasons to choose the RED. “It’s always been a camera company at the forefront of the needs of new technologies,” says VR supervisor Ian Spohr. Pesses adds that the ability to run a master-slave format was crucial for this production. “We had one camera that would sync all the other cameras,” he says. “When the sun was going down and we needed to change settings, it was very useful to have that facility.” The WEAPON also had the dynamic range to handle the reflective, specular cars. “Honestly, it was the only option out of any other camera out there,” says Connolly. “The compact size was second to none.”


RED WEAPON with the HELIUM 8K S35 sensor


When the team began to shoot the content for the Halo Board, the stadium was still a construction site. Sharabani says they began planning every part of the job meticulously. The cars would be shot at the Willow Springs International Raceway near Lancaster, California. On a pre-production visit, Sharabani noticed a mountain to the west. “The sun disappeared behind that mountain an hour before sunset, which gave us an extended magic hour,” he recalls. “This is when we’d get perfect lighting in all directions.”

Pesses and Connolly, now working closely with VR Playhouse chief creative officer D.J. Turner, executive producer Leo Vezzali and Spohr, tested lenses for the RED cameras. The VR Playhouse team brought the existing previz into their Maya 3D software. “We would feed them specs and distances, and small test shots with different lenses,” says Connolly. “We’d hand them over to Leo and they’d get back to us with previz.” Turner reports that’s how they discovered that an 18mm lens under consideration wouldn’t get enough coverage or overlap for a good stitch.

After testing dozens of lenses, they decided on Cooke SR 14mm Cine Prime lenses. Pesses explains that because the screen was 20K x 1080, they only had to extract 1080 lines of vertical resolution. “But the cars had to fill the entire screen,” he adds. “We had to figure out the right distance for stitching, but also the right size lens that gave us the proper amount of coverage.”
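The coverage-versus-overlap tradeoff can be sketched with some back-of-envelope arithmetic. The figures below are assumptions, not from the article: a HELIUM 8K frame of 8192 x 4320 pixels, six evenly spaced cameras, and a 20K master taken as 20,480 pixels wide.

```python
# Rough stitch-coverage arithmetic for a six-camera 360 rig.
# Assumed values: 8192 px wide sensor frames, 20,480 px final panorama.
SENSOR_W = 8192          # horizontal pixels captured per camera
NUM_CAMERAS = 6
PANORAMA_W = 20480       # 20K master width

total_captured = SENSOR_W * NUM_CAMERAS        # raw horizontal pixels shot
unique_per_cam = PANORAMA_W / NUM_CAMERAS      # pixels surviving per camera
overlap_frac = 1 - unique_per_cam / SENSOR_W   # share of each frame that overlaps

print(f"raw horizontal pixels: {total_captured}")
print(f"unique pixels per camera: {unique_per_cam:.0f}")
print(f"overlap per camera: {overlap_frac:.0%}")
```

Under those assumptions, well over half of each camera's frame is redundant with its neighbors, which is what gives the stitch room to hide seams; a wider lens buys more overlap at the cost of smaller, more distorted cars at the frame edges.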

Solution provider/rental house Radiant Images found matching lenses – no small feat for six cameras – and readied its patent-pending Sense 9 modular camera rig, the world’s most adaptive and accurate 360-degree video capture system, with built-in patch panels which simplify power distribution, video stitch, gen-lock and time code sync.

Sharabani and the team continued with their previsualizations and spent a jam-packed month creating a robust, error-free pipeline that would stand them in good stead during production.

Planning a Complex Workflow

Nailing down the camera and lenses was just the beginning of constructing the workflow, say Connolly and Spohr, who spearheaded that job. One crucial task they had to address before production began was how to monitor six RED cameras shooting 8K at 60 fps on set. As director, Sharabani had to be able to see the output of each individual camera on the monitor. “Vico needed to direct the stunt drivers, not just for the best performance, but the best technical stitching,” says Pesses. “The closer you are to a lens in a 360-rig, the less overlap there is to work with. The further away, the better the overlap. If you want to do a close-up, the image has to be centered in the lens to avoid a stitch.”

Two feeds came out of each camera. One feed used a Decimator multi-camera switch that sent a single split-screen image of all six cameras to Sharabani’s monitor. The second feed took a byzantine path through converter boxes to HDMI and then into Teradek Spheres, which sent a stitched signal to another wireless transmitter, then to an iPad from which it was broadcast on an iCon specialty wireless system. “Both of these signals were received by our video playback at base camp, on two 42-inch TVs,” says Spohr.

Connolly, who notes that the client had also seen the CG previz, which made the rough live-stitch more comprehensible, says the production was “a data nightmare,” and credits DIT Jamie Metzger, who used four 16-TB drives to transfer data. “It was very convoluted and difficult, but we did it and it worked,” says Pesses.

The campaign was made up of four segments: “Blow By,” “Victory Lap,” “Huddle” and “Burnout.” “Victory Lap shows the cars going around the camera, which ironically sounds easy but was actually extremely difficult, because no matter what, they pass through a stitch point,” says Pesses. “The simpler the car move, the harder it was to hide the transition of the cameras in post.” In Burnout, the cars drift the entire time, and for Blow By, the drivers weave in and out making S-turns. Huddle has eight cars drive straight into the camera.

In addition to making sure that the close-up took place in the center of the lens, and being careful about where the split screen fell on the cars, the production team had other challenges. “It’s still a car commercial and you have to light cars and shoot them in the right direction,” Pesses relays. “The best time is when the sun is just below or above the horizon, so we shot for three days at sunset. In 360-degrees, you have a good side and a bad side, and there were eight cars, all in different colors. We also needed the right performance.”

“We rehearsed and prepared on many levels,” says Sharabani. “We were so prepared we knew exactly what we’d be doing at what time. It was so important to rehearse ahead of time, so that when I decided to change something on set, everyone knew why we were doing it and what the implications were.” The segments were all shot so that every person in the stadium would have at least a couple of the cars in their view. He also gives credit to the drivers. “This was a real team effort and we had to practice this a lot,” he says. “But the drivers were so good; they were able to weave in and out, close to the center of the lens when they were at the closest point.” Not only did they have the pressure to perform, but also only a 20-minute window to capture it. The fourth piece, Huddle, required them to re-rig. “The trick was that they had eight cars that had to come straight into the camera and, just like in Blow By, they each had to be in the center of the lens.”

To achieve this, they created a nodal version of the rig, often used to shoot panoramas. “We used four cameras for four cars, with two just for the overlap crossover,” says Pesses. “Then we’d turn the cameras around and do it again with the other four cars. The light changes during sunset about every five minutes, so each time we had to change direction, we had to change settings and/or filters.” Shooting 180 degrees at a time had another upside. “You usually have to set up the camera and hide,” says Turner. “But it allowed us to review and tweak on the spot.”

Tackling Post Production

At VR Playhouse, post production proceeded frame by frame. “We used Nuke for stitching,” says Turner. “It was the only software that could handle the payload, which was massive. We didn’t work with proxies because we’d have doubled the workload. We were working in 35K and 42K.” To do so, says Turner, they rebuilt their render farm. “We had to double the RAM in each one of our individual blades on that farm,” he says, noting also that they rendered on CPUs, since GPUs max out at 16K. After experimenting, they transcoded the files into 16-bit half-float EXRs to do all the work, and then delivered EXRs, without altering the color, to maintain the HDR. According to Turner, the system ingested upwards of 10 TB of raw footage, which resided on their RAID servers. A single frame took 15 minutes to render. “You could sleep a lot on render days,” jokes Vezzali.
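Those numbers are easier to appreciate with a quick sanity check. This sketch assumes a 20,480 x 1080 three-channel (RGB) EXR at 2 bytes per channel, plus the quoted 60 fps capture rate and 15 minutes per rendered frame; frame width and channel layout are assumptions, not confirmed by the article.

```python
# Back-of-envelope sizing for the 20K x 1080 half-float EXR pipeline.
# Assumed: 20,480 px wide, RGB, 16-bit half float per channel.
W, H = 20480, 1080
CHANNELS, BYTES_PER_CHANNEL = 3, 2

frame_bytes = W * H * CHANNELS * BYTES_PER_CHANNEL  # uncompressed payload
frame_mib = frame_bytes / 2**20

FPS = 60
RENDER_MIN_PER_FRAME = 15
minutes_per_second = FPS * RENDER_MIN_PER_FRAME  # single-blade cost of 1 s of output

print(f"uncompressed frame: {frame_mib:.1f} MiB")
print(f"one second of 60 fps output: {minutes_per_second} render-minutes")
```

Even before EXR compression, each frame is well over 100 MiB, and a single second of 60 fps output costs 900 render-minutes on one machine; "sleeping a lot on render days" follows directly from the arithmetic unless the farm spreads frames across many blades.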

With a tight schedule, VR Playhouse had about six weeks to complete the project. “There were so many revisions and corrections that we delivered it in pieces,” adds Vezzali. “We did all the stitching and then it went to The-Artery for color. We broke up each file into two halves of 10K x 1080, so they could bring it into the Resolve color software. Finishing was done in Flame.”

What Turner thought would be the easiest segment – Victory Lap – ended up being the most demanding. “In a perfect world the car would stay the same distance from all the cameras, but as it got closer to the edge of the lens, the distance appeared to change because of the barrel distortion,” he explains. “So it was a challenge to smooth that out and make the car appear as if it seamlessly moved through each camera. That required a lot of tweaking and warping to get that to work.”

Also challenging were the segments Burnout and Blow By, which feature smoke. “You can’t really stitch smoke,” says Turner. “You can get close, but the way the light hits the smoke is different for each camera. It was a challenge to clean that up and make sure there were no visible stitch lines. There’s some magic in there.”

VR Playhouse used Dropbox and physical media to deliver the finished product at 20K. “It was a remarkable project,” says Sharabani. “We get a kick out of projects like this that involve a big production in 360 with technological innovation and a unique post pipeline. At the end of the day, it’s experiential.”

# # #


