
NVIDIA's GPU Technology Conference M & E Roundup



Santa Monica, California, USA

©2013 CreativeCOW.net. All rights reserved.


There was truly something for everyone at NVIDIA's GTC, and a surprisingly rich array of thoughtful presentations for the Media & Entertainment crowd. GTC provided many opportunities to learn the basics and, for the truly nerdy, many opportunities to dig deep. As a mere semi-geek, Debra Kaufman learned a lot and met a lot of interesting people. Read her roundup of the new M&E technology made possible by super-fast GPU computing.



In San Jose, California, over 3,000 attendees from 50 countries attended 450 sessions and visited 83 exhibitors at NVIDIA's annual GPU Technology Conference (GTC). Although media and entertainment as a market segment is dwarfed by scientific and engineering uses of the company's GPU technology, GTC highlighted numerous exciting sessions in the M&E space, topped by a keynote with Douglas Trumbull.

I had never heard NVIDIA CEO/co-founder Jen-Hsun Huang speak, and he lived up to his reputation as an incredibly dynamic presenter. He spoke about the breakthroughs in computer graphics over the last year, gave an update on GPU computing and explained the roadmap to the next "click" of NVIDIA's technology. He also revisited the remote and virtualized graphics concept introduced last year and unveiled new technology.


NVIDIA CEO/co-founder Jen-Hsun Huang


Much of what he discussed was directly relevant to the media & entertainment community. Huang first showed WaveWorks, a real-time, Beaufort-scale ocean simulation. "Wave simulation is where science, art and engineering meet," he said, noting that the ocean in Life of Pi was "state of the art." He played with WaveWorks, showing the audience in real time how the velocity of the wind, the waves, the spray and the foam all behaved as he dialed the Beaufort scale up from a calm 3 to a gale-level 10.
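It can be hard to picture how a single "wind speed" dial translates into the behavior Huang demonstrated. What follows is only a minimal CUDA sketch of the general idea, not NVIDIA's WaveWorks: each GPU thread evaluates a small sum of sine waves for one point of an ocean height field, with amplitude scaled by wind speed, so turning the Beaufort dial up makes the sea grow rougher. The grid size, wave count and the Beaufort-to-wind-speed mapping are illustrative assumptions.

// Minimal sketch (not NVIDIA's WaveWorks): a wind-speed parameter drives
// an ocean height field, one thread per grid point.
#include <cuda_runtime.h>

#define GRID 512          // height-field resolution (assumption)
#define NWAVES 4          // number of summed wave components (assumption)

__global__ void oceanHeight(float* height, float windSpeed, float t)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= GRID || y >= GRID) return;

    // Rough Phillips-style scaling: amplitude grows with the square of
    // wind speed (illustrative, not a physically validated model).
    float amp = 0.0015f * windSpeed * windSpeed;

    float h = 0.0f;
    for (int i = 0; i < NWAVES; ++i) {
        float k = 0.02f * (i + 1);       // wavenumber of this component
        float w = sqrtf(9.81f * k);      // deep-water dispersion relation
        h += (amp / (i + 1)) * sinf(k * (x + y) - w * t);
    }
    height[y * GRID + x] = h;
}

int main()
{
    float* height;
    cudaMalloc(&height, GRID * GRID * sizeof(float));

    // Dial the sea state up as Huang did: Beaufort 3 (gentle breeze, ~4.5 m/s)
    // versus Beaufort 10 (storm, ~26 m/s). The mapping is approximate.
    dim3 block(16, 16), grid(GRID / 16, GRID / 16);
    oceanHeight<<<grid, block>>>(height, 4.5f, 0.0f);   // calm
    oceanHeight<<<grid, block>>>(height, 26.0f, 0.0f);  // gale
    cudaDeviceSynchronize();

    cudaFree(height);
    return 0;
}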

Huang also showed advanced real-time character performance with FaceWorks. Noting that creating a believable CG human face is extremely hard, he showed examples of successful and less successful CG faces, citing the term "uncanny valley" for human-like CG faces that "get sufficiently real until they fall off a cliff and become creepy." "We've been working on rendering the human face for years, and it's an endeavor worthwhile," he said.


Rendering the human face was one subject covered in Huang's keynote address. Photo by Raymond Yuen. ©NVIDIA Corporation





There's a fine line between an image that the human psyche accepts as reality and the steep drop-off into the "uncanny valley," where the image begins to look "creepy."


He then showed the results of a partnership between NVIDIA and ICT (the Institute for Creative Technologies), itself a partnership between the University of Southern California and the U.S. Army. Digital Ira is a CG face that looks incredibly real. "This is one of the reasons we created Titan," he said, referring to NVIDIA's supercomputer-class graphics card. "Titan takes 8,000 instructions to articulate all the meshes. It's about 2 teraflops." He suggested that "every important person should have a scan done to create so much fidelity," and noted it was ideal for videoconferencing, gaming and other uses.

Huang reported on the success of GPU computing, noting, among other data points, that 430 million CUDA-capable GPUs have shipped. "It's clear that if we're not at the tipping point, we're racing towards it," he said. "Of the top 500 supercomputers being built for scientific applications, we represented 20 percent."

Detailing NVIDIA's roadmap for the future, Huang showed the evolution of the GPU from Tesla to Fermi, Kepler and, next up, Maxwell, which will add unified virtual memory and improved programmability. After that comes Volta - named after the inventor of the battery - which will be more energy-efficient and will introduce a new technology called stacked DRAM. "It will solve access to memory bandwidth, one of the biggest challenges with GPUs today," he said, noting that Volta will offer 1 terabyte per second of bandwidth. Turning to the Tegra roadmap, Huang drew applause when he introduced Logan, "the world's first mobile processor with CUDA," which will fit into a chip the size of a dime, require no fans and offer higher performance than Kayla. Logan is expected to appear later this year and ship early next year. Beyond Logan is Parker. "We're hewing to Moore's Law," said Huang. "You should expect each generation of the Tegra processor to bring something enormously surprising compared to the past."
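For readers wondering what "unified virtual memory" buys a programmer, CUDA's existing managed-memory API gives a flavor of the idea: a single allocation is addressable from both the CPU and the GPU, so explicit copies disappear from the code. This is only a generic sketch of that concept, not a description of Maxwell itself.

// Sketch of unified/managed memory in CUDA: cudaMallocManaged (available
// since CUDA 6 on Kepler-class and later GPUs) returns one pointer valid
// on both host and device, so no explicit cudaMemcpy is needed.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    float *x, *y;

    // One allocation visible to both host and device.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));

    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }  // CPU writes

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);            // GPU reads/writes
    cudaDeviceSynchronize();                                   // wait before CPU reads

    printf("y[0] = %f (expect 5.0)\n", y[0]);                  // CPU reads result

    cudaFree(x);
    cudaFree(y);
    return 0;
}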



Huang described the specific capabilities designed into NVIDIA's different cards and the evolution of the GPU.



Huang talks about how Life of Pi was made using NVIDIA GPU technology at GTC 2013.


NVIDIA's big technology reveal of the show was GRID, "the world's first visual computing appliance." "GRID is 16 virtual machines that can be connected to whatever you like," he said. "Everyone can have a powerful workstation on their desk. The computer in the background runs the application, and does it so fast you think it's being generated by your own computer. They all think they have a super workstation on their desk. If you have 50 users on the network, you can see how the price/value is translated."


Jen-Hsun Huang discussing the GRID visual computing appliance. Photo by Raymond Yuen. ©NVIDIA Corporation


He showed Autodesk 3ds Max running on a Mac and how easily he could switch to Adobe Premiere. Next, he showed RED 4K footage that NVIDIA shot as a demo, pointing out that the 4K footage was being processed in real time. "You can jump between applications, as if you had multiple workstations, as if you had your own personal high-end PC under your desk," he said.

Huang brought up Jules Urbach, founder and CEO of OTOY, which worked with Autodesk to create a "next generation cloud-rendering platform," and Josh Trank, director of Fantastic Four. They showed OTOY's Octane Render Cloud Edition, a rack of GRID appliances working together to create VFX shots. Urbach and Trank demonstrated it running from Los Angeles, with 120 GPUs connected to the window; the wireframe rendered in less than a second. "We can get final film quality in a few seconds," said Urbach, who also showed real-time responses to moving the camera and changing the lighting. "It's all there, completely controllable and integrated into every Autodesk product and many others. You can now tell a story in 3D very easily."

GRID starts at $24,900 for a base pack with 8 GPUs and 192GB of system memory; the Max version is $39,900 with 16 GPUs, 32 CPU threads and 384GB, plus a $4,800-a-year software license. Both support unlimited devices.

Douglas Trumbull gave another keynote (available to watch on USTREAM), giving attendees a glimpse into his production studios in the Berkshires and talking about filmmaking without locations or sets. He's already in post production on a 10-minute demo project that was shot at 120 fps in 3D, using Canon C500 cameras with Codex recorders mounted on 3ality 3D rigs. Intel is helping with storage solutions. "We're putting this together on a shoestring," Trumbull admitted. "Many talented people are contributing their energy."



Douglas Trumbull at his Berkshires Studio


He gave credit to some of those people, including Timothy Huber at Theory Consulting, Steve Roberts at eyeon and Paul Lacombe at Unreel, all of whom were at GTC. "Eyeon offers an amazing suite of tools from soup to nuts," said Trumbull. "It allowed us to look at dailies at 120 fps, do our post pipeline and update shots, with a direct link to Avid MC. It's a systemic approach that is far superior to any solution out there." Stewart Filmscreen built the screen and Christie Digital provided the projectors in his Berkshires immersive screening room.

Trumbull also announced that he has just received a patent on Showscan Digital, an unusual process he has described in the past: dynamically changing the frame rate on any scene, shot, sequence or even pixel, much as one would apply a color grade. "Dynamically changing the frame rate would prevent material from looking like TV," he said, noting that it "may not be appropriate for some scenes."
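Trumbull did not disclose how Showscan Digital works internally, so the sketch below is purely an illustration of the general concept, not his patented process: starting from a high-frame-rate master, consecutive frames are averaged in groups, and the group size acts like a per-shot grading control, trading the crisp "video" look of 120 fps against the motion character of a lower effective rate.

// Illustrative sketch only (not Trumbull's patented Showscan Digital):
// derive a per-shot "effective frame rate" from a 120 fps master by
// temporally averaging groups of consecutive frames on the GPU.
#include <cuda_runtime.h>

__global__ void blendFrames(const float* frames,  // [group][pixel] luminance
                            float* out,           // one blended output frame
                            int numPixels, int groupSize)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPixels) return;

    float sum = 0.0f;
    for (int f = 0; f < groupSize; ++f)
        sum += frames[f * numPixels + i];
    out[i] = sum / groupSize;   // temporal average of the group
}

// Example: a 120 fps shot played back with 24 fps motion character
// (groupSize = 5) versus full crispness (groupSize = 1).
void launchBlend(const float* d_frames, float* d_out, int numPixels)
{
    int group = 5;  // 120 / 24; chosen per shot, like a grade
    blendFrames<<<(numPixels + 255) / 256, 256>>>(d_frames, d_out,
                                                  numPixels, group);
}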

The goal, he said, is to make movies that make the viewer feel like part of the movie. "A first person, not a third person experience," he said. "Extremely bright and sharp, with no blurring or strobing or flickering, and a wide field of view gives a window into reality. It's very unconventional; it's happening, and happening directly to you, not through another character." Part of the experience is a hemispheric screen, which Trumbull has also been experimenting with. He credits his work on 2001: A Space Odyssey as a source of continuing inspiration to him and others.

"We're pushing the envelope of virtual production," he continued, noting that Unreal's facilities are in his Berkshires studio. "We're heading to make feature films with no sets or locations; an extreme version that costs way less. Tentpoles cost $300M and I think that's crazy and dysfunctional. I'd like to do it for a lot less." He encouraged imitators: "If you can learn anything from what we're doing, go with it."

He also discussed the history of Cinerama and 70mm film - enthusing about the Cinerama documentary by Dave Strohmeier - and expressed his disappointment that multiplexes replaced the spectacle of large screens. With HFR 3D coming from both Peter Jackson and James Cameron, "history is repeating itself," said Trumbull. "Everyone else is going to have to understand that if they want to be part of the future of entertainment, they have to look at games, which is a much larger business than movies," he said. "There's a lesson to be learned there about what people want. What I'm doing, you don't get a joystick, but you get to be inside it!"

Trumbull also has entrepreneurial plans. "My business plans are about making a new enterprise, from production to distribution to new, immersive theatres," he said, noting that he helped to take IMAX public. "We're looking for investors and we're probably coming to this town to look for them. Tablets and iPads are subsuming all the convenience offered by multiplexes. If we want people to go to theatres, they have to see spectacular experiences they can't get on an iPad or home theatre system. That's my agenda."

In addition to the keynotes, there were a large number of smaller sessions covering 70 topic areas, and Media & Entertainment was well represented. Theory Consulting's Huber described Trumbull's HFR near-time review and editorial pipeline, which enables him to play back footage at full resolution. The pipeline also enables Trumbull's team to quickly extract a matte from the greenscreen RAW image sequence, use the camera motion data to composite the foreground, motion-locked, onto the background plate, and render and play back in stereo, all within 5 to 10 minutes of the footage being shot.
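As an illustration of the matte-extraction step Huber described, here is a minimal CUDA difference-key kernel; the keying formula and the threshold/gain constants are assumptions made for this sketch, not Theory Consulting's actual pipeline.

// Minimal sketch of pulling a matte (alpha) from a green-screen frame on
// the GPU; one thread per pixel.
#include <cuda_runtime.h>

__global__ void greenscreenMatte(const float3* rgb, float* alpha,
                                 int width, int height,
                                 float threshold, float gain)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    float3 p = rgb[y * width + x];

    // Simple difference key: how much does green dominate red and blue?
    float spill = p.y - fmaxf(p.x, p.z);

    // Map dominance to opacity: strong green => transparent (alpha 0),
    // little or no green dominance => opaque foreground (alpha 1).
    float a = 1.0f - (spill - threshold) * gain;
    alpha[y * width + x] = fminf(fmaxf(a, 0.0f), 1.0f);
}

// Host-side launch for a 4K frame (4096x2160 here, an assumption);
// buffer allocation and upload omitted for brevity.
void launchMatte(const float3* d_rgb, float* d_alpha)
{
    dim3 block(16, 16);
    dim3 grid((4096 + 15) / 16, (2160 + 15) / 16);
    greenscreenMatte<<<grid, block>>>(d_rgb, d_alpha, 4096, 2160, 0.05f, 4.0f);
}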

PhaseSpace's Tracy McSheery talked about "Super Resolution: High Frame Rate Stereo Capture Technology" and showed examples of his company's optical motion capture system. PhaseSpace created a 4-megapixel camera for the U.S. Air Force and U.S. Navy that allows capture of 2D and 3D at 100 fps at 4-megapixel resolution, or at 200 fps at HD resolution.

Unreel CEO Lacombe described his company's work on virtual sets at Douglas Trumbull's headquarters in the Berkshires. "Six people represent over 60 man-years of focus in this very narrow area of real-time graphics with real-time camera tracking and keying," he said. He also showed Unreel's augmented-reality work with ESPN, which adds CGI to the real world, as well as the virtual set work the company did on the movie Oz the Great and Powerful, in scenes such as the dark forest and the top of the city. "It's a big experiment: how do you get the big look and hold down costs?" he asked. "The answer is there are lots of virtual sets in the movie."

EFILM's Vice President of Imaging Sciences/Technical Director Joachim Zell presented "Post Production Facility in a Box," describing the company's services on location, including dailies and color. MTI Film Vice President Dave McClure talked about dailies and transcoding. "Transcoding has to be faster than real time," he said. "The rest can just be real time. None of this is possible without the GPU."


Nicholas Recaagno, post supervisor/online stereographer and SGO Mistika product specialist, talked about "HFR Post Production Workflows: From Digital Acquisition to Processing and Screening." He showed sample videos to highlight key considerations of frame-rate conversion and camera shutter angle, and also looked at solutions and technology challenges across the post production workflow, from dailies and editing to finishing and delivery. That included how users view and QC 48 fps material. "There is no SMPTE standard or a valid HD-SDI video signal that can be audited," he said, and he enumerated possible solutions.


NVIDIA's GTC had numerous other interesting media & entertainment presentations by Zoic Studios' Head of Pipeline Mike Romey; Pixar Global Technology Software Developer Laurence Emms; Vizrt Chief Engineering Officer Gerhard Lang; Avid Video Chief Architect Shailendra Mathur; ESPN Principal Engineer Mark Muench and Software Engineer Christopher Pond; and Cinnafilm Founder/CEO Lance Maurer.

Automotive Lead Designer Daniel Simon showed his Cosmic Motors designs; WETA Digital VFX Supervisor Wayne Stables spoke about real-time rendering; and Rhythm & Hues Studios Lead Software Engineer Nathan Cournia described his company's hybrid GPU/CPU platform. Chaos Software CTO Vladimir Koylazov, Art and Animation Studio Founder/CEO Jan Tomanek, Lightdog Films VFX Artist Marc Leidy, Jawset Visual Computing Software Engineer Jascha Wetzel and Dolby Laboratories Senior Product Manager Amit Gulati also spoke.

There was truly something for everyone at NVIDIA's GTC, and a surprisingly rich array of thoughtful presentations for the Media & Entertainment crowd. GTC provided many opportunities to learn the basics and, for the truly nerdy, many opportunities to dig deep. As a mere semi-geek, I learned a lot and met a lot of interesting people. I look forward to GTC 2014.






