Santa Monica, California, USA
©2013 CreativeCOW.net. All rights reserved.
In San Jose, California, over 3,000 attendees from 50 countries attended 450 sessions and visited 83 exhibitors at NVIDIA's annual GPU Technology Conference (GTC). Although media and entertainment as a market segment is dwarfed by scientific and engineering uses of the company's GPU technology, GTC highlighted numerous exciting sessions in the M&E space, topped by a keynote from Douglas Trumbull.
I had never heard NVIDIA CEO and co-founder Jen-Hsun Huang speak, and he lived up to his reputation as an incredibly dynamic presenter. He spoke about the past year's breakthroughs in computer graphics, gave an update on GPU computing, and explained the road map to the next "click" of NVIDIA's technology. He also revisited the idea of remote and virtualized graphics introduced last year and unveiled new technology.
NVIDIA CEO/co-founder Jen-Hsun Huang
Much of what he discussed was spot-on relevant to the media & entertainment community. Huang first showed WaveWorks, a real-time, Beaufort-scale ocean simulation. "Wave simulation is where science, art and engineering meet," he said, noting that the ocean in Life of Pi was "state of the art." He played with WaveWorks, showing the audience in real time how the velocity of the wind, the waves, spray and foam all behaved as he dialed up the Beaufort scale from a calm 3 to a gale-level 10.
Huang also showed advanced real-time character performance with FaceWorks. Noting that creating a believable CG human face is extremely hard, Huang showed examples of more and less successful CG faces, citing the term "uncanny valley" for human-like CG faces that "get sufficiently real until they fall off a cliff and become creepy." "We've been working on rendering the human face for years, and it's an endeavor worthwhile."
Rendering the human face was one subject covered in Huang's keynote address. Photo by Raymond Yuen. ©NVIDIA Corporation
There's a fine line between an image that the human psyche accepts as reality and the steep drop-off into the "uncanny valley," where the image begins to look creepy.
He then showed the results of a partnership between NVIDIA and the Institute for Creative Technologies (ICT), itself a partnership between the University of Southern California and the U.S. Army. Digital Ira is a CG face that looks incredibly real. "This is one of the reasons we created Titan," he said, referring to NVIDIA's supercomputer-class graphics card. "Titan takes 8,000 instructions to articulate all the meshes. It's about 2 teraflops." He suggested that "every important person should have a scan done to create so much fidelity," and noted it was ideal for videoconferencing, gaming and other uses.
Huang reported on the success of GPU computing, saying that, among other data points, 430 million CUDA-capable GPUs have shipped. "It's clear that if we're not at the tipping point, we're racing towards it," he said. "Of the top 500 supercomputers being built for scientific applications, we represented 20 percent of these."
Detailing NVIDIA's roadmap for the future, Huang showed the evolution of the GPU from Tesla to Fermi, Kepler and, next up, Maxwell, with unified virtual memory and improved programmability. After that comes Volta - named after the inventor of the battery - which will be more energy-efficient and will introduce a new technology called stacked DRAM. "It will solve access to memory bandwidth, one of the biggest challenges with GPUs today," he said, noting Volta will offer 1 terabyte per second of bandwidth. Going over the Tegra roadmap, Huang drew applause when he introduced Logan, "the world's first mobile processor with CUDA," which will fit into a chip the size of a dime, require no fans and offer higher performance than Kayla. Logan is expected to appear later this year and to ship early next year. Beyond Logan is Parker. "We're hewing to Moore's Law," said Huang. "You should expect each generation of the Tegra processor to bring something enormously surprising."
Huang described the specific capabilities designed into NVIDIA's different cards and the evolution of the GPU.
Huang talks about how Life of Pi was made using NVIDIA GPU technology, at GTC 2013.
NVIDIA's big technology reveal of the show was GRID, "the world's first visual computing appliance." "GRID is 16 virtual machines that can be connected to whatever you like," he said. "Everyone can have a powerful workstation on their desk. The computer in the background runs the application, and does it so fast you think it's being generated by your own computer. They all think they have a super workstation on their desk. If you have 50 users on the network, you can see how the price/value is translated."
Jen-Hsun Huang discussing the GRID visual computing appliance. Photo by: Raymond Yuen. ©NVIDIA Corporation
He showed Autodesk 3ds Max operating on a Mac, and how easily he could switch to Adobe Premiere. Next, he showed RED 4K footage that NVIDIA shot as a demo, pointing out that the 4K footage was processed in real time. "You can jump between applications, as if you had multiple workstations, as if you had your own personal high-end PC under your desk," he said.
Huang brought up Jules Urbach, founder and CEO of OTOY, which worked with Autodesk to create a "next-generation cloud-rendering platform," and Josh Trank, director of Fantastic Four. They showed OTOY's Octane Render Cloud Edition, a rack of GRID appliances working together to create VFX shots. Urbach and Trank demonstrated it running from Los Angeles, with 120 GPUs connected to the window; the wireframe rendered in less than a second. "We can get final film quality in a few seconds," said Urbach, who also showed real-time responses to moving the camera and changing the lighting. "It's all there, completely controllable and integrated into every Autodesk product and many others. You can now tell a story in 3D very easily."
GRID starts at $24,900 for a base pack with 8 GPUs and 192GB of system memory; the Max version is $39,900 with 16 GPUs, 32 CPU threads and 384GB, plus a $4,800-a-year software license. Both support an unlimited number of devices.
Douglas Trumbull gave another keynote, giving attendees a glimpse into his production studio in the Berkshires and talking about filmmaking without locations or sets. He's already in post production on a 10-minute demo project that was shot at 120 fps in 3D, using Canon C500 cameras with Codex recorders, mounted on 3ality 3D rigs. Intel is helping with storage solutions. "We're putting this together on a shoestring," Trumbull admitted. "Many talented people are contributing their energy."
Douglas Trumbull at his Berkshires Studio
He gave credit to some of those people, including Timothy Huber at Theory Consulting, Steve Roberts at eyeon and Paul Lacombe at Unreel, all of whom were at GTC. "Eyeon offers an amazing suite of tools from soup to nuts," said Trumbull. "It allowed us to look at dailies at 120 fps, do our post pipeline and update shots, with a direct link to Avid MC. It's a systemic approach that is far superior to any solution out there." Stewart Filmscreen built the screen and Christie Digital provided the projectors in his Berkshires immersive screening room.
Trumbull also announced that he just received a patent on Showscan Digital, an unusual process he has described in the past: dynamically changing the frame rate on any scene, shot, sequence or pixel, similar to color grading. "Dynamically changing the frame rate would prevent material from looking like TV," he said, noting that it "may not be appropriate for some scenes."
The goal, he said, is to make movies that make viewers feel like they're part of the movie. "A first-person, not a third-person experience," he said. "Extremely bright and sharp with no blurring or strobing or flickering, and a wide field of view gives a window into reality. It's very unconventional; it's happening, and happening directly to you, not through another character." Part of the experience is a hemispheric screen, which Trumbull has also been experimenting with. He credits his work on 2001 as a source of continuing inspiration to him and others.
"We're pushing the envelope of virtual production," he continued, noting that Unreel's facilities are in his Berkshires studio. "We're heading to make feature films with no sets or locations; an extreme version that costs way less. Tentpoles cost $300M and I think that's crazy and dysfunctional. I'd like to do it for a lot less." He encouraged imitators: "If you can learn anything from what we're doing, go with it."
He also discussed the history of Cinerama and 70mm film - enthusing about the Cinerama documentary by David Strohmaier - and expressed his disappointment that multiplexes replaced the spectacle of large screens. With HFR 3D coming from both Peter Jackson and James Cameron, "history is repeating itself," said Trumbull. "Everyone else is going to have to understand if they want to be part of the future of entertainment, they have to look at games, which is a much larger business than movies," he said. "There's a lesson to be learned there about what people want. What I'm doing, you don't get a joystick, but you get to be inside it!"
Trumbull also has entrepreneurial plans. "My business plans are about making a new enterprise, from production to distribution to new, immersive theatres," he said, noting that he helped to take IMAX public. "We're looking for investors and we're probably coming to this town to look for them. Tablets and iPads are subsuming all the convenience offered by multiplexes. If we want people to go to theatres, they have to see spectacular experiences they can't get on an iPad or home theatre system. That's my agenda."
In addition to the keynotes, there were a large number of smaller sessions covering 70 topic areas, and media & entertainment was well represented. Theory Consulting's Huber described Trumbull's HFR near-time review and editorial pipeline, which enables him to play back footage at full resolution. The pipeline also enables Trumbull's team to quickly extract a matte from the greenscreen RAW image sequence, use the camera motion data to composite the foreground motion locked to the background plate, and render and play back in stereo, all within 5 to 10 minutes after the footage has been shot.
PhaseSpace's Tracy McSheery talked about "Super Resolution: High Frame Rate Stereo Capture Technology" and showed examples of his company's optical motion capture system. PhaseSpace created a 4-megapixel camera for the U.S. Air Force and U.S. Navy that captures 2D and 3D at 100 fps at 4-megapixel resolution, and at 200 fps for HD resolution.
Unreel CEO Lacombe described his company's work in virtual sets at Douglas Trumbull's headquarters in the Berkshires. "Six people represent over 60 man-years of focus in this very narrow area of real-time graphics with real-time camera tracking and keying," he said. He also showed Unreel's work with ESPN in augmented reality, which adds CGI to the real world, as well as the virtual set work the company did on the movie Oz the Great and Powerful, in scenes such as the dark forest and the top of the city. "It's a big experiment: how do you get the big look and hold down costs?" he asked. "The answer is there are lots of virtual sets in the movie."
Joachim Zell, Vice President of Imaging Sciences/Technical Director, presented "Post Production Facility in a Box," describing his company's services on location, including dailies and color. MTI Film Vice President Dave McClure talked about dailies and transcoding. "Transcoding has to be faster than real time," he said. "The rest can just be real time. None of this is possible without the GPU."
Read our coverage of Daniel Simon's gorgeous work here.
Nicholas Recaagno, post supervisor/online stereographer and SGO Mistika product specialist, talked about "HFR Post Production Workflows: From Digital Acquisition to Processing and Screening." He showed sample videos highlighting key considerations of frame-rate conversion and camera shutter angle, and looked at solutions and technology challenges across the post production workflow, from dailies and editing to finishing and delivery. That included how users view and QC 48 fps material. "There is no SMPTE standard or a valid HD-SDI video signal that can be audited," he said, and enumerated possible solutions.
NVIDIA's GTC had numerous other interesting media & entertainment presentations, by Zoic Studios' Head of Pipeline Mike Romey; Pixar Global Technology Software Developer Laurence Emms; Vizrt's Chief Engineering Officer Gerhard Lang; Avid Video Chief Architect Shailendra Mathur; ESPN Principal Engineer Mark Muench and Software Engineer Christopher Pond; and Cinnafilm Founder/CEO Lance Maurer.
Automotive Lead Designer Daniel Simon showed his Cosmic Motors designs; WETA Digital VFX Supervisor Wayne Stables spoke about real-time rendering; Rhythm & Hues Studios' Lead Software Engineer Nathan Cournia described his company's hybrid GPU/CPU platform; and Chaos Software CTO Vladimir Koylazov, Art and Animation Studio Founder/CEO Jan Tomanek, Lightdog Films VFX Artist Marc Leidy, Jawset Visual Computing Software Engineer Jascha Wetzel and Dolby Laboratories Senior Product Manager Amit Gulati also spoke.
There was truly something for everyone at NVIDIA's GTC, and a surprisingly rich array of thoughtful presentations for the media & entertainment crowd. GTC provided many opportunities to learn the basics and, for the truly nerdy, many opportunities to dig deep. As a mere semi-geek, I learned a lot and met many interesting people. I look forward to GTC 2014.