Article Focus: This tutorial by Doug Bischoff describes a technique for reducing or eliminating vertical camera movement "jitter" or "bounce" using the Mokey software package from Imagineer Systems. By examining the simple techniques involved, readers will learn the basics of Mokey's powerful tracking and stabilizing features.
This tutorial begins with a source QuickTime movie of footage shot using a camera-stabilizing system. As sometimes happens, a slight "bounce" became noticeable whenever the operator repositioned their feet. This kind of movement can be disconcerting for viewers, so the goal here is to reduce or eliminate the bounce.
Getting Started - The first step in any Mokey work is to create a new project, either by clicking the "new project" icon in the toolbar or by selecting "New Project..." from the File menu. Select the MokeyInput.mov source file, then confirm the following settings as each screen appears, pressing the "Next>" button to advance:
Range to Import: First Frame: 0, Last Frame: 43
Frame Rate: 29.97 frames per second
Time: Frame Number
Film Type: NTSC
Camera Model: No distortion
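The import range above can be sanity-checked with a little arithmetic. A minimal Python sketch (using the exact NTSC rate of 30000/1001, which rounds to 29.97):

```python
# Sanity-check the project settings: frames 0-43 inclusive at NTSC 29.97 fps
first_frame, last_frame = 0, 43
fps = 30000 / 1001                     # exact NTSC frame rate, ~29.97
n_frames = last_frame - first_frame + 1
duration_s = n_frames / fps
print(n_frames, round(duration_s, 3))  # → 44 1.468
```

So the clip is 44 frames, just under a second and a half of footage.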
Layers - Most Mokey operations make use of "Layers" as defined in the Layers tab. Using the Selection Tool from the toolbar, draw a rectangle around the entire image. You may need to use the Zoom tool to zoom out in order to see everything you're working on. For the Stabilizing function to work properly, your selection should include enough "headroom" at both the top and bottom of the frame to allow for the maximum camera movement. If your footage includes any panning motion, as this one does, be sure that your selection allows enough room to cover the full extent of the pan: in this example, note the extra "space" allowed on the right side of the footage (the middle of the footage has been compacted for brevity). Name this layer "Background."
Tracking - Since we are interested only in the movement of the entire frame (as opposed to the movement of a particular object within it), we need only track this one selection. The Tracker is one of Mokey's most powerful features, and one that I will cover in more detail in future tutorials. Suffice it to say for now that the default settings for a Background such as this will be fine for most applications: Mokey will treat the entire background as one flat plane. This is appropriate because, in this case, that is exactly what we want: the footage stabilized as a single "plane." With the background still selected (as shown), simply click "Track Forwards."
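Mokey's planar tracker is proprietary, but the core idea of estimating one transform per frame for the whole background can be sketched in a few lines. The example below uses phase correlation (my own stand-in, not Mokey's actual algorithm) to recover a pure translation between two frames:

```python
import numpy as np

def phase_correlate(ref, cur):
    """Estimate the integer (dy, dx) shift of `cur` relative to `ref` via
    phase correlation: a simple stand-in for the idea of tracking the
    whole background as one flat plane."""
    cross = np.fft.fft2(cur) * np.conj(np.fft.fft2(ref))
    cross /= np.abs(cross) + 1e-9            # keep only the phase
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # shifts past the halfway point wrap around to negative offsets
    return tuple(p - n if p > n // 2 else p for p, n in zip(peak, corr.shape))

# Synthetic check: shift a random "frame" down 3 rows and right 5 columns
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
moved = np.roll(frame, (3, 5), axis=(0, 1))
dy, dx = phase_correlate(frame, moved)
print(dy, dx)  # → 3 5
```

A real tracker also handles rotation, scale, and perspective, but translation is all we need to understand the stabilization step that follows.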
Stabilizing - Now that Mokey has calculated the movement of the background, we can instruct it to counteract that movement. This is done by simply selecting the layer we wish to stabilize (it is already selected unless you clicked elsewhere in the work area), telling Mokey what sort of correction we wish applied, and clicking the "stabilize forward" button in the "Stabilize" pane. For our example, we want to eliminate vertical (Y axis) movement while preserving the pan in the X direction, so de-select "X Translation." If there were some rolling movement present in the shot as well, we could attempt to correct it with "rotation" stabilization.
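Conceptually, Y-only stabilization amounts to shifting each frame back by its tracked vertical offset while leaving the horizontal pan alone. A minimal sketch, assuming the per-frame offsets have already been tracked:

```python
import numpy as np

def stabilize_y(frames, dys):
    """Counteract only the vertical (Y) offsets, preserving the horizontal
    pan, analogous to de-selecting "X Translation" in Mokey's Stabilize
    pane. `dys[i]` is frame i's measured vertical offset from frame 0."""
    return [np.roll(f, -int(round(dy)), axis=0) for f, dy in zip(frames, dys)]

# Toy footage: a bright row drifting downward one pixel per frame
frames = []
for i in range(3):
    f = np.zeros((8, 8))
    f[2 + i, :] = 1.0            # the "scene" sits one row lower each frame
    frames.append(f)

stabilized = stabilize_y(frames, dys=[0, 1, 2])
rows = [int(np.argmax(f[:, 0])) for f in stabilized]
print(rows)  # → [2, 2, 2]
```

Note that a real implementation resamples subpixel offsets rather than rolling whole rows; the roll here is just to keep the idea visible.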
Here is where the true power of Mokey becomes apparent. If you wish to dampen rather than completely eliminate the motion, de-selecting "Maximum" in the Smoothing Level pane and setting a level in the box below will give you control over how much movement Mokey eliminates. Finally, in the "Borders" pane, the "Auto Fill" check box is the hidden gem. With this selected, Mokey will use data from other frames to re-construct the missing pixels after applying the stabilization. No more repeated-pixel borders or zoomed-in effects! Note in this finished clip how the woman's hair and the chainmail on the man's arm have been restored seamlessly!
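The Smoothing Level idea can be illustrated the same way: instead of locking the path, smooth it and correct each frame only by the residual jitter. A sketch using a simple moving average (Mokey's actual smoothing filter is not documented here, so this is an assumption about the general technique):

```python
import numpy as np

def smooth_path(ys, radius=2):
    """Dampen rather than eliminate motion: replace the tracked Y path with
    a moving average, then correct each frame only by the high-frequency
    residual. Conceptually, a partial Smoothing Level works this way,
    whereas "Maximum" locks the path completely."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    padded = np.pad(np.asarray(ys, dtype=float), radius, mode="edge")
    smoothed = np.convolve(padded, kernel, mode="valid")
    corrections = np.asarray(ys, dtype=float) - smoothed
    return smoothed, corrections

# A path with no jitter needs no correction at all
smoothed, corrections = smooth_path([5.0] * 10)
print(np.allclose(corrections, 0.0))  # → True
```

With real jittery data, the corrections carry only the high-frequency bounce, so intentional camera moves survive while the shake is damped.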
If you find that there are artifacts or color shifts in the reconstructed areas, try selecting "Model Illumination" (for color and brightness shifts) or "Dissolve" (for detail artifacting) and re-applying the stabilization.
Conclusion - All that remains is to save the results out of Mokey for use elsewhere. There is one slight "gotcha" to be aware of: the "RGB Channels" pop-up. If you leave it at the default of "Original," you will simply save your original clip back to disk! To be sure you get what you want, ensure that this pop-up is set to "Stabilized." I recommend exporting as an "Image Sequence" for later re-assembly in QuickTime Pro or another application.
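If you do export an image sequence, the frames arrive as numbered files that QuickTime Pro can re-assemble in order. A tiny sketch of a typical zero-padded naming scheme for our 44-frame clip (the MokeyOutput base name is hypothetical):

```python
# Zero-padded frame names sort correctly for image-sequence re-assembly
names = [f"MokeyOutput_{n:04d}.png" for n in range(0, 44)]
print(names[0], names[-1])  # → MokeyOutput_0000.png MokeyOutput_0043.png
```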
I hope this gives you a taste of the power of Mokey and remember, we've just scratched the surface!
TV workflow supervisor Kylee Peña (Jane the Virgin, Colony) visits Adobe's "Make It" talk show to chat with host Jason Levine about the evolution of motion picture workflows, from the days of film and tape to our modern digital world of crazy-high shooting ratios and constantly evolving technology. She also expounds on the upside to creative constraints and tight deadlines. And don’t miss the lightning round!!!
Following successful collaborations on The Matrix, Legends of the Guardians, and Happy Feet, Sydney's Animal Logic worked with Warner Bros on The LEGO Movie from pitch to proof of concept to post. Animal Logic has gone even further on the latest LEGO animated feature, The LEGO Batman Movie, where they were embedded with the production for over a year. The range of their work pushed every aspect of the Baselight system for editorial, VFX, and HDR not just for post, but for the entire production process.
Marco Solorio of OneRiver Media gives an introductory overview to the rebirth of the Fairlight digital audio workstation (DAW) that is now part of Blackmagic Design’s powerful DaVinci Resolve post-production system. Is this finally the solution to break away from the ProTools stronghold?
Longtime broadcast engineer, facilities designer, and workflow consultant Bob Zelin has been attending the NAB Show for years, and his legendary write-ups have gone to unparalleled lengths to bring you hidden gems from every corner of the floor. Bob is taking a slightly different approach this time, as he views the specifics through the lens of trends that have been emerging for quite some time, now bearing fruit in 2017. Sit back and enjoy another real-world, hype-free, anything-but-objective ride through the industry's biggest week of the year.
Anticipation for Guardians of the Galaxy Vol. 2 was already heated up when director James Gunn announced that it was to be the first feature film captured with the RED WEAPON camera using an 8K RED DRAGON VV sensor. In this marvelous behind the scenes featurette courtesy RED Digital Cinema, James discusses the decision to use the RED WEAPON, and how it played out with director of photography Henry Braham, BSC. Both of them found the combination of the massive sensor and small form factor incredibly compelling, providing them the technology to capture the epic scale of the action, in a package small enough to allow them to get exceptionally close to the scenes of genuine intimacy that are this series' secret weapon. (See what we did there?)
When you ask an editor what they DO in the edit suite, the answer is often something like, "Well, it's intuitive." To become better editors, though, we need to be more specific. Editor, author, and professor Dr. Karen Pearlman breaks down the process into five specific steps that editors must take in order to turn a mass of material into something coherent. You can learn to hone the specific skills of observation and self-awareness that distinguish editors from other observers, and make unexpected connections that move stories in compelling new directions. Sven Pape of "This Guy Edits" presents his conversation with Karen in the form of a powerful video essay that you will find illuminating and inspiring, and will be able to start using right away.
The April 2017 release of Adobe After Effects (version 14.2) is packed with new features, and Tobias Gleissenberger of Surfaced Studio is here to show you the latest and greatest. Highlights include the Essential Graphics panel and Motion Graphics templates (with Adobe Premiere Pro integration), the addition of Lumetri Scopes and Color Correction effects, new effects organization, the Camera Shake Deblur filter and more.
Immersive media is not new. Emerging technologies, such as VR and AR as we currently know them, are simply part of an evolutionary path making media more immersive. Many commentators and industry professionals became cynical after the short life-cycle of Stereoscopic 3D, and are hesitant to embrace VR, calling it another fad. International award-winning engineer, editor, colorist, VFX artist, stereoscopic 3D artist, and Head of Operations at Auckland's Department of Post, Katie Hinsen sees it differently. These technologies are simply steps within a much wider ecosystem, says Katie, where it's the combination of failures and successes that lead us towards what immersive media is to become.