Mercedes-Benz and Friends Make 360-Degree Stadium Halo Board
COW Library : RED Camera : Debra Kaufman
A visit to an Atlanta Falcons game at Mercedes-Benz Stadium this past football season provided an unexpected treat: the world’s largest 360-degree cylindrical LED video screen at a sports venue. At 58 feet tall, the screen, dubbed the “Halo Board” because it encircles the inner stadium, can display 20K resolution and gives new meaning to the term “immersive video.” Indeed, that’s what visitors saw: spectacular images of Mercedes-Benz’s latest automotive line-up.
The campaign was made possible through a partnership between creative companies The-Artery, The Astronauts Guild, and VR Playhouse, along with the latest technologies from RED Digital Cinema and Radiant Images.
The-Artery’s founder/creative director Vico Sharabani, who directed the campaign, says that the advertising agency, Merkley + Partners, originally brought him the project simply to brainstorm. He immediately gave them an important piece of advice: to regard the project as a panorama rather than virtual reality or 360-degree video. “The 360 components are definitely there,” he says. “But unlike 360-degree video, which would enable the viewer to look around at will, visitors to the stadium would only see a portion of the screen, based on where they are sitting or standing.”
One of the many considerations was framing the cars. He noted that if they were too close to the camera, they would be cropped out of the screen, and would probably cause lens distortion and stitching problems. But if they were too far away, their presence would be diminished and the image would lose impact. Sharabani got the job.
Getting Started
Sharabani’s first step was to bring in what he called “the best partners” for cameras and stitching: VR production company The Astronauts Guild and post/systems integrator VR Playhouse. Once Astronauts Guild technical producer Scott Connolly and cinematographer Evan Pesses were on board, the conversation turned to cameras. “We wanted to maintain the commercial-grade imagery we deliver on high-end broadcast commercials,” explains Sharabani. “We immediately eliminated a lot of lower-grade cameras. There were so many variables and, all together, it pointed us to one solution … RED.”
To end up with a 20K master, the team needed high-resolution cameras. “We knew the RED WEAPON with the HELIUM 8K S35 sensor would be the camera of choice,” says Pesses. “Resolution was key to this project, and RED is in the resolution game.”
There were other reasons to choose the RED. “It’s always been a camera company at the forefront of the needs of new technologies,” says VR supervisor Ian Spohr. Pesses adds that the ability to run a master-slave format was crucial for this production. “We had one camera that would sync all the other cameras,” he says. “When the sun was going down and we needed to change settings, it was very useful to have that facility.” The WEAPON also had the dynamic range to handle the reflective, specular cars. “Honestly, it was the only option out of any other camera out there,” says Connolly. “The compact size was second to none.”
RED WEAPON with the HELIUM 8K S35 sensor
When the team began to shoot the content for the Halo Board, the stadium was still a construction site. Sharabani says they planned every part of the job meticulously. The cars would be shot at the Willow Springs International Raceway near Lancaster, California. On a pre-production visit, Sharabani noticed a mountain to the west. “The sun disappeared behind that mountain an hour before sunset, which gave us an extended magic hour,” he recalls. “This is when we’d get perfect lighting in all directions.”
Pesses and Connolly, now working closely with VR Playhouse chief creative officer D.J. Turner, executive producer Leo Vezzali and Spohr, tested lenses for the RED cameras. The VR Playhouse team brought the existing previz into their Maya 3D software. “We would feed them specs and distances, and small test shots with different lenses,” says Connolly. “We’d hand them over to Leo and they’d get back to us with previz.” Turner reports that’s how they discovered that an 18mm lens under consideration wouldn’t get enough coverage or overlap for a good stitch.
After testing dozens of lenses, they decided on Cooke SR 14mm Cine Prime lenses. Pesses explains that because the screen was 20K x 1080, they only had to extract 1080 lines of vertical resolution. “But the cars had to fill the entire screen,” he adds. “We had to figure out the right distance for stitching, but also the right size lens that gave us the proper amount of coverage.”
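The lens choice comes down to simple geometry: each of six cameras must own 60 degrees of the circle, and anything beyond that is overlap available for stitching. A minimal sketch of that math, assuming an ideal rectilinear lens and a roughly 29.9mm-wide active sensor area for the HELIUM S35 (the exact width depends on the recording format, so treat the numbers as illustrative):

```python
import math

# Illustrative stitch-coverage math for a six-camera 360 rig.
# The 14mm and 18mm focal lengths and the six-camera count come from
# the production described above; the sensor width is an assumption.
SENSOR_WIDTH_MM = 29.9
NUM_CAMERAS = 6

def horizontal_fov_deg(focal_length_mm, sensor_width_mm=SENSOR_WIDTH_MM):
    """Horizontal field of view of an ideal rectilinear lens, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

slice_deg = 360 / NUM_CAMERAS            # each camera must cover 60 degrees
fov = horizontal_fov_deg(14.0)           # the 14mm primes they chose
overlap = fov - slice_deg                # shared with neighbors for the stitch

fov_18 = horizontal_fov_deg(18.0)        # the 18mm lens that was rejected
overlap_18 = fov_18 - slice_deg

print(f"14mm: {fov:.1f} deg FOV, {overlap:.1f} deg overlap per seam")
print(f"18mm: {fov_18:.1f} deg FOV, {overlap_18:.1f} deg overlap per seam")
```

Under these assumptions the 14mm yields around 34 degrees of overlap per seam versus around 19 degrees for the 18mm, which is consistent with the team’s finding that the 18mm wouldn’t leave enough coverage for a good stitch.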
Solution provider/rental house Radiant Images found matching lenses, no small feat for six cameras, and readied its patent-pending Sense 9 modular camera rig, billed as the world’s most adaptive and accurate 360-degree video capture system, with built-in patch panels that simplify power distribution, video stitch, genlock and timecode sync.
Sharabani and the team continued with their previsualizations and spent a jam-packed month creating a robust, error-free pipeline that would stand them in good stead during production.
Planning a Complex Workflow
Nailing down the camera and lenses was just the beginning of constructing the workflow, say Connolly and Spohr, who spearheaded that job. One crucial task they had to address before production began was how to monitor six RED cameras shooting 8K at 60 fps on set. As director, Sharabani had to be able to see the output of each individual camera on the monitor. “Vico needed to direct the stunt drivers, not just for the best performance, but for the best technical stitching,” says Pesses. “The closer you are to a lens in a 360 rig, the less overlap there is to work with. The further away, the better the overlap. If you want to do a close-up, the image has to be centered in the lens to avoid a stitch.”
Two feeds came out of each camera. One went to a Decimator multi-camera switcher that sent a single split-screen image of all six cameras to Sharabani’s monitor. The second took a byzantine path through converter boxes to HDMI and then into Teradek Spheres, which sent a stitched signal to another wireless transmitter, then to an iPad, from which it was broadcast over an iCon specialty wireless system. “Both of these signals were received by our video playback at base camp, on two 42-inch TVs,” says Spohr.
Connolly notes that the client had also seen the CG previz, which made the rough live-stitch more comprehensible. Still, he says, the production was “a data nightmare,” and he credits DIT Jamie Metzger, who used four 16-TB drives to transfer data. “It was very convoluted and difficult, but we did it and it worked,” says Pesses.
The campaign was made up of four segments: “Blow By,” “Victory Lap,” “Huddle” and “Burnout.” “Victory Lap shows the cars going around the camera, which ironically sounds easy but was actually extremely difficult, because no matter what, they pass through a stitch point,” says Pesses. “The simpler the car move, the harder it was to hide the transition of the cameras in post.” In Burnout, the cars drift the entire time, and for Blow By, the drivers weave in and out making S-turns. Huddle has eight cars drive straight into the camera.
In addition to making sure that close-ups took place in the center of the lens, and being careful about where the stitch lines fell on the cars, the production team had other challenges. “It’s still a car commercial and you have to light cars and shoot them in the right direction,” Pesses relays. “The best time is when the sun is just below or above the horizon, so we shot for three days at sunset. In 360 degrees, you have a good side and a bad side, and there were eight cars, all in different colors. We also needed the right performance.”
“We rehearsed and prepared on many levels,” says Sharabani. “We were so prepared we knew exactly what we’d be doing at what time. It was so important to rehearse ahead of time, so that when I decided to change something on set, everyone knew why we were doing it and what the implications were.” The segments were all shot so that every person in the stadium would have at least a couple of the cars in their view. He also gives credit to the drivers. “This was a real team effort and we had to practice this a lot,” he says. “But the drivers were so good; they were able to weave in and out, close to the center of the lens when they were at the closest point.” Not only did they have the pressure to perform, but also just a 20-minute window in which to capture it. The fourth piece, Huddle, required them to re-rig. “The trick was that they had eight cars that had to come straight into the camera and, just like in Blow By, they each had to be in the center of the lens.”
To achieve this, they created a nodal version of the rig, often used to shoot panoramas. “We used four cameras for four cars, with two just for the overlap crossover,” says Pesses. “Then we’d turn the cameras around and do it again with the other four cars. The light changes during sunset about every five minutes, so each time we had to change direction, we had to change settings and/or filters.” Shooting 180 degrees at a time had another upside. “You usually have to set up the camera and hide,” says Turner. “But it allowed us to review and tweak on the spot.”
Tackling Post Production
At VR Playhouse, post production proceeded frame by frame. “We used Nuke for stitching,” says Turner. “It was the only software that could handle the payload, which was massive. We didn’t work with proxies because we’d have doubled the workload. We were working in 35K and 42K.” To do so, says Turner, they rebuilt their render farm. “We had to double the RAM in each one of our individual blades on that farm,” he says, noting also that they rendered on CPUs, since their GPUs maxed out at 16K. After experimenting, they transcoded the files into 16-bit half-float EXRs to do all the work, and then delivered EXRs, without altering the color, to maintain the HDR. According to Turner, the system ingested upwards of 10 TB of raw footage, which resided on their RAID servers. A single frame took 15 minutes to render. “You could sleep a lot on render days,” jokes Vezzali.
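The memory pressure is easy to see with back-of-envelope frame sizes. A minimal sketch, assuming nominal pixel widths (“20K” taken as 20480 px, and so on), a 1080-line height, three RGB channels, and 16-bit half-float samples as in the EXR working files; none of these are confirmed production specs:

```python
# Uncompressed frame sizes for the formats mentioned in the article.
# Pixel widths are nominal; height, channel count and bit depth are
# assumptions based on the delivery format described above.
BYTES_PER_SAMPLE = 2  # 16-bit half-float
CHANNELS = 3          # RGB

def frame_mib(width_px, height_px=1080):
    """Uncompressed size of one frame, in MiB."""
    return width_px * height_px * CHANNELS * BYTES_PER_SAMPLE / 2**20

for label, width in [("20K delivery", 20480),
                     ("35K working", 35840),
                     ("42K working", 43008)]:
    print(f"{label}: {frame_mib(width):.0f} MiB per uncompressed frame")
```

With a stitch graph in Nuke potentially holding several copies of frames this size in memory, plus the six 8K source plates behind each output frame, doubling the RAM on every render blade is an unsurprising requirement.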
With a tight schedule, VR Playhouse had about six weeks to complete the project. “There were so many revisions and corrections that we delivered it in pieces,” adds Vezzali. “We did all the stitching and then it went to The-Artery for color. We broke up each file into two halves of 10K x 1080, so they could bring it into the Resolve color software.” Finishing was done in Flame.
What Turner thought would be the easiest segment – Victory Lap – ended up being the most demanding. “In a perfect world the car would stay the same distance from all the cameras, but as it got closer to the edge of the lens, the distance appeared to change because of the barrel distortion,” he explains. “So it was a challenge to smooth that out and make the car appear as if it seamlessly moved through each camera. That required a lot of tweaking and warping to get that to work.”
Also challenging were the segments Burnout and Blow By, which feature smoke. “You can’t really stitch smoke,” says Turner. “You can get close, but the way the light hits the smoke is different for each camera. It was a challenge to clean that up and make sure there were no visible stitch lines. There’s some magic in there.”
VR Playhouse used Dropbox and physical media to deliver the finished product at 20K. “It was a remarkable project,” says Sharabani. “We get a kick out of projects like this that involve a big production in 360 with technological innovation and a unique post pipeline. At the end of the day, it’s experiential.”
# # #