Capturing Motion for the Met


Most people interested in the history of the NYU Polytechnic School of Engineering know the story of Distinguished Professors David and Gregory Chudnovsky, who in 2003 helped curators at the Metropolitan Museum of Art create a flawless multi-gigabyte image of the tapestry “The Unicorn in Captivity,” using a supercomputer that performed 7.7 quadrillion calculations in the process.

Now, the School of Engineering has lent a new set of skills and equipment to the Met, and the result is a digital performance piece called “The Return,” which is on view at the museum until August 2.

The tale behind “The Return” began in 2002, when the pedestal bearing a famed sculpture of Adam by Venetian Renaissance artist Tullio Lombardo collapsed, sending the marble masterpiece crashing to the ground. Restoring the statue, which had broken into 28 large pieces and hundreds of smaller shards, took more than a decade of painstaking work by conservators, engineers, and consultants who employed, among other things, 3D printing, laser mapping, fiberglass pins, and a newly developed adhesive.

To celebrate the project’s completion, the museum commissioned new-media artist Reid Farrington to conceptualize and create “The Return.” Farrington turned to NYU’s Media and Games Network (MAGNET), a multidisciplinary center housed at the School of Engineering, to help him build a digital avatar of Adam using the same motion-capture technology used in the films Planet of the Apes and Lord of the Rings. Visitors to the museum can now interact with three versions of Adam: one based on the 3D rendering that the museum’s conservators used to figure out how to repair the shattered statue, one depicting Tullio Lombardo’s original 15th-century work, and one portraying the biblical Adam, who answers questions about the Garden of Eden and other scriptural matters. The avatars are controlled in real time by a rotating roster of actors in motion-capture suits. (In addition to interacting with the digital representations, visitors can enter the Grace Rainey Rogers Auditorium at the Met and watch the performers in action.)

“Our motion-capture system is a real jewel,” says Javier Molina, who helped on the project while completing his master’s degree in Integrated Digital Media at the School of Engineering. “It was built using Unreal Engine, a software tool for video games, and adapted for live performance with the help of Athomas Goldberg, an animation consultant and mentor for the Motion Capture studio at MAGNET.”
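In broad strokes, a live setup like the one Molina describes streams the performer’s joint rotations many times per second and retargets them onto the avatar’s skeleton inside the engine. The short Python sketch below illustrates only that general idea; the bone names, frame format, and clamping rule are invented for the example, and the actual MAGNET system runs inside Unreal Engine rather than in standalone Python.

    import math

    # Hypothetical bone list for a simplified humanoid rig (invented names).
    BONES = ["hips", "spine", "head", "left_arm", "right_arm"]

    def apply_mocap_frame(pose, frame):
        """Retarget one streamed frame of joint rotations onto the avatar.

        pose  -- dict mapping bone name to rotation in degrees (avatar state)
        frame -- dict of bone rotations arriving from the capture suit
        """
        for bone, rotation in frame.items():
            if bone in pose:
                # Clamp so a glitchy sensor reading can't fold the avatar
                # into an impossible pose.
                pose[bone] = max(-180.0, min(180.0, rotation))
        return pose

    # Simulate a short burst of frames arriving in real time:
    # the performer waves the right arm.
    pose = {bone: 0.0 for bone in BONES}
    for t in range(4):
        streamed = {"right_arm": 45.0 * math.sin(t / 2.0)}  # stand-in for suit data
        pose = apply_mocap_frame(pose, streamed)
        print(f"frame {t}: right_arm = {pose['right_arm']:.1f} degrees")

In production, each retargeted pose is handed to the engine’s renderer every frame, which is what lets visitors see an actor’s movements on the avatar with no perceptible delay.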

The $70,000 system will be returned to MAGNET when the Met show closes, and Molina, now a faculty member, is looking forward to using it in a three-credit course that he’ll be co-teaching in the fall, Bodies in Motion. The course, whose other instructor is Todd Bryant of the Interactive Telecommunications Program (ITP) at Tisch, will be open to all NYU students seeking to learn about the possibilities of motion capture. “Motion capture now has very user-friendly interfaces, so it’s not intimidating for artists, actors, and dancers, who might not have technology backgrounds, to work with,” he says. “And it has even broader potential beyond art. For example, at MAGNET we hope to collaborate with faculty and students at the NYU Ability Lab to study the movements of patients with Parkinson’s disease, in hopes that the research will lead to novel treatments. Maybe we will help develop ways in which the visually impaired can better navigate through space. It’s exciting technology, and it will be an exciting class.”