Video Innovations: The pros and cons of motion capture

Since motion capture emerged as an exciting possibility for time-crunched animation projects, detractors and proponents of the process have seemed about equal in number. Using optical or magnetic systems to capture data from a live or robotic actor promises an accurate, nuanced translation of human movement and a time saving over key-frame animation, but many users have run into glitches with data dropouts, difficulties with non-human characters and the inflexibility of captured data.

The technology has remained a subject of debate while maintaining a high profile, hitting primetime U.S. network TV earlier this month with the debut of a motion-captured NBC peacock, created with L.A.-based MEDIALAB Studio’s realtime performance animation system.

Producers and animation companies are now looking at how to streamline the process and are working with software developers and motion-capture studios to fine-tune systems and software, as well as the way existing technology is used, to give animators greater control over the end product and to maximize the time savings while maintaining the true, smooth moves motion capture delivers.

A handful of Canadian technology and animation companies are involved in the process of realizing the potential benefits of motion capture.

Given the sometimes unwieldy nature of the process, motion capture has largely been the domain of major studios and effects houses, as well as specialty studios dedicated to the art.

In the case of Toronto’s Big Animation, a 15-person operation, delivering motion capture for a sports game through a U.S. developer meant working with the experts and becoming involved in R&D efforts from motion-capture manufacturers and animation software makers.

‘Generally motion capture isn’t a turnkey tool yet,’ says Big’s Jocelyne Meinert. ‘And the applications for it vary so much that often you would want your own people writing adapters and converters specifically for your software, so for the most part motion capture has belonged to the larger studios.’

Big was asked as a third party to produce animation for a multimedia game; the developer was keen to use motion capture, touted for delivering large volumes of character animation in a short period. Using the studios and system of San Francisco’s BioVision, Big captured about 120 scenes in two days and is currently processing the resulting data. The time saved on animation is being devoted to R&D efforts.

Big is beta testing Toronto-based Side Effects Software’s converter for the BioVision motion-capture format. The converter will allow Side Effects’ Prisms and Houdini 3D animation software to ship with a working conversion tool for clients using BioVision, while letting BioVision expand its service base.
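The BioVision format stores a skeleton hierarchy followed by a MOTION block of per-frame channel values, so the core of any converter is splitting those two sections apart. A minimal sketch in Python, assuming a well-formed file (an illustration only, not Side Effects’ actual converter):

```python
# Minimal sketch of reading the MOTION section of a BioVision (BVH) file.
# Assumes a well-formed file; a production converter would also build the
# joint hierarchy from the HIERARCHY section and map channels to joints.

def read_bvh_motion(text):
    """Return (frame_count, frame_time, frames) from BVH text."""
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    start = lines.index("MOTION")
    frame_count = int(lines[start + 1].split()[1])    # "Frames: N"
    frame_time = float(lines[start + 2].split()[2])   # "Frame Time: t"
    frames = [[float(v) for v in ln.split()] for ln in lines[start + 3:]]
    return frame_count, frame_time, frames

sample = """HIERARCHY
ROOT Hips
{
  OFFSET 0 0 0
  CHANNELS 3 Xposition Yposition Zposition
  End Site
  {
    OFFSET 0 10 0
  }
}
MOTION
Frames: 2
Frame Time: 0.033333
0.0 90.0 0.0
1.0 91.0 0.0
"""

count, dt, frames = read_bvh_motion(sample)
```

A full converter would also walk the HIERARCHY block to map each column of frame data onto a joint channel in the target animation package.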

For Big, it’s an opportunity for a small shop to gain expertise in the mechanics and the process of providing motion capture for animation projects. Meinert says Big is the first small studio in the city to undertake a major motion-capture project and the company plans to hold a workshop on the subject after work is completed later this summer.

‘Our interest is to get motion capture mainstreamed,’ says Meinert. ‘It would be helpful for a studio like us to say to people this is what works and this is what doesn’t.’

Side Effects CEO Greg Hermanovic says the Big/BioVision beta test is a step toward being able to edit motion-capture data, a primary concern among users.

‘The key thing in any motion-capture project is what you do with the moves once you get it into your own animation package; you want to be able to edit it,’ says Hermanovic.

Side Effects is working on editing tools that would allow motion-capture data to be tweaked: retiming a move (slowing some parts and speeding up others), mixing one take with another, and exaggerating motions.
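Those edits boil down to resampling and rescaling curves of captured values. A toy sketch in Python of retiming and exaggerating a single channel, using hypothetical joint-angle data (not Side Effects’ actual tools):

```python
# Toy sketch: two common motion-capture edits on a single channel of
# captured values (e.g. one joint angle sampled per frame).
# The data here is hypothetical, not from any real capture session.

def retime(channel, factor):
    """Resample a channel to roughly len(channel)*factor frames by
    linear interpolation, preserving the first and last values."""
    n_out = max(2, round(len(channel) * factor))
    out = []
    for i in range(n_out):
        t = i * (len(channel) - 1) / (n_out - 1)   # source frame position
        lo = int(t)
        hi = min(lo + 1, len(channel) - 1)
        frac = t - lo
        out.append(channel[lo] * (1 - frac) + channel[hi] * frac)
    return out

def exaggerate(channel, gain):
    """Scale each frame's deviation from the channel mean by `gain`."""
    mean = sum(channel) / len(channel)
    return [mean + (v - mean) * gain for v in channel]

angles = [0.0, 10.0, 20.0, 10.0, 0.0]
slowed = retime(angles, 2.0)        # twice as many frames, same arc
bigger = exaggerate(angles, 1.5)    # same timing, broader move
```

Retiming resamples by linear interpolation, so the endpoints of the move are preserved; exaggeration scales each frame’s deviation from the channel mean, broadening the move without shifting its timing.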

Hermanovic says increasingly complex animation projects will also demand managing multiple characters in a scene, currently a challenge with motion-capture methods.

‘Motion capture is kind of a black art,’ says Hermanovic. ‘It requires a lot of expertise to set up and finesse and calibrate and not many people are really equipped to do it.’

With the emergence of companies like L.A.’s House of Moves and New York-based Acclaim, which handles motion capture on a service basis as well as for its parent game development company, facilities have been able to exploit the technology and procedural expertise of studios devoted to motion capture.

Hermanovic says smaller shops undertaking projects would be well advised to avail themselves of that expertise rather than go it alone, at least for the first few projects.

Toronto post shop Command Post and Transfer purchased an Ascension motion-capture system over a year ago, and even for a major facility with wide equipment expertise, the process proved difficult. The shop provided motion capture in the form of dancing dollars for a Bermuda bank spot and Command Post technical producer Johnathan Gibson calls the technique a ‘sweet and sour thing.’

While providing excellent translations of human moves, he says data can often be unreliable and time must be spent in ‘clean-up’ mode. Gibson says motion capture is well suited to projects like difficult stunts for features and digital crowd replication.

Side Effects is also working with Santa Monica Studios arm VisionArt Design & Animation and Northern Digital of Waterloo, Ont. on a new Godzilla feature film from writer/director Roland Emmerich and writer/producer Dean Devlin, the team responsible for Independence Day.

Northern Digital’s optical OptoTrak system is a precalibrated active-marker system that can track up to 256 markers. The system wasn’t designed specifically for entertainment projects and has been used for various precise 3D measurement applications such as aeronautics, biomechanics, robotics and medical device tracking.

The system being used on the Godzilla project was developed by VisionArt sister company FutureLight, which had been working on a motion-capture solution for the past three years. The FutureLight system is able to play back a captured character in realtime, and FutureLight director of research and development Rob Bredow says it is the most advanced system now available.

Bredow says his company started with Northern Digital’s hardware ‘because it was the most advanced motion-capture hardware on the market,’ and developed a complete set of hardware and software that complements the Northern Digital system and integrates it into a production environment. The Godzilla feature will be primarily driven by motion capture, which will be used to animate the scaly superstar himself.

Given the stresses of coping with a full-blown motion-capture system, Burlington, Ont.’s Puppet Works, a division of TRF, has developed the first configurable digital desktop motion-capture system. The system is a set of joints, connected to build any 3D model, that interfaces with other animation software packages, allowing animators to build, map and animate any character with an ‘Erector set’ method.

The main Puppet Works J2000 system, released earlier this year, includes a 20-joint kit and sells for US$14,495. The company subsequently released two smaller systems: the J1000, which has 10 joints and sells for US$8,495, and the J300, which includes three joints and a joystick control and can be had for US$3,995.

Puppet Works VP of sales David Fleury says the company is targeting major feature film and TV studios, special effects facilities and game makers with the desktop system, and Toronto’s TOPIX Computer Graphics and Animation will be beta testing a Softimage plug-in with the system.

Fleury says the Puppet Works package gives users the flexibility to employ it as a full-capture system or as a posing tool for key-frame animation, and he says stop-motion animators, who have used armatures in their work, gravitate to the system because it can be used in a number of ways.

‘You can create a digital armature which then drives a CG character in an animation program,’ says Fleury. ‘Animators can pose it and capture key frames, or they can grab it and move it and use it as a motion-capture system and capture the motion in realtime.’

The system does not require an actor, provides clean digital information, and allows the creation and capture of quadrupeds and other non-human characters.

Puppet Works is also looking toward creating a motion-capture suit to provide users with a scalable motion-capture system for a broad range of applications.

For the intricacies of game graphics, motion capture, theoretically tailor-made for the application, has drawbacks. Alan Penford, head of research and development at Toronto game developer Grey Matter, says those drawbacks relate more to the way the technology is employed than to the systems themselves.

Grey Matter used motion capture extensively on its game version of The Crow for Acclaim Entertainment, and Penford says problems with the technology, as well as with how it was applied, limited Grey Matter’s efficiency in creating the game.

While the strength of motion capture is nailing the subtleties of human movement, Penford notes that humans never complete a move the same way twice. As a result, when moving from one animation sequence to another, the game’s creators encountered a ‘snapping’ problem on the first frame of a new animation cycle.

To counter that, a proprietary set of blending tools was used to smooth the transition. ‘But once you do that you’re modifying the motion capture, so it’s not motion capture anymore,’ says Penford. ‘You start to lose the characteristics you wanted out of the motion capture to begin with.’
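A blending pass of that kind typically crossfades the tail of one cycle into the head of the next over a short window. A minimal single-channel sketch in Python, with hypothetical clip data (Grey Matter’s actual tools were proprietary):

```python
# Minimal sketch of crossfade blending between two animation cycles to
# avoid the first-frame "snap" described above. Hypothetical single-channel
# clips of sampled values; real tools blend every joint channel at once.

def crossfade(clip_a, clip_b, window):
    """Join clip_a to clip_b, linearly blending the last `window` frames
    of clip_a with the first `window` frames of clip_b."""
    assert window <= len(clip_a) and window <= len(clip_b)
    blended = []
    for i in range(window):
        w = (i + 1) / (window + 1)   # weight ramps toward clip_b
        a = clip_a[len(clip_a) - window + i]
        b = clip_b[i]
        blended.append(a * (1 - w) + b * w)
    return clip_a[:-window] + blended + clip_b[window:]

walk = [0.0, 5.0, 10.0, 15.0, 20.0]
run = [40.0, 45.0, 50.0, 55.0]
joined = crossfade(walk, run, 2)
```

The weight ramps from one clip toward the other across the window, so the joined curve steps gradually between the two cycles instead of snapping on the first frame, at the cost Penford describes: the blended frames are no longer pure captured motion.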

Penford says the best way to employ the technology is to combine motion capture with traditional animation: capture the necessary moves, use specific key frames from the data, and let animators add elements between those frames. That approach also accommodates the ‘more human than human’ action often called for in a game.
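The hybrid workflow Penford describes can be sketched as reducing a captured channel to sparse key frames, then rebuilding the in-betweens an animator would go on to hand-adjust. A toy Python version, with hypothetical data:

```python
# Sketch of a hybrid mocap/key-frame workflow: decimate a captured channel
# to key frames, then rebuild in-betweens that an animator could replace
# or exaggerate by hand. Hypothetical data, not any studio's pipeline.

def pick_keys(channel, step):
    """Keep every `step`-th frame index (and the last) as key frames."""
    keys = list(range(0, len(channel), step))
    if keys[-1] != len(channel) - 1:
        keys.append(len(channel) - 1)
    return keys

def fill_between(channel, keys):
    """Rebuild a full channel by linearly interpolating between key frames."""
    out = [0.0] * len(channel)
    for k0, k1 in zip(keys, keys[1:]):
        for f in range(k0, k1 + 1):
            w = (f - k0) / (k1 - k0)
            out[f] = channel[k0] * (1 - w) + channel[k1] * w
    return out

captured = [0.0, 2.0, 8.0, 18.0, 32.0, 50.0, 72.0]
keys = pick_keys(captured, 3)            # key frames at indices 0, 3, 6
rebuilt = fill_between(captured, keys)   # in-betweens an animator would tweak
```

The captured values survive intact at the key frames, while everything between them becomes editable, which is where the ‘more human than human’ exaggeration would be added.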