Storytelling in Virtual Reality: Hamlet takes the (virtual) stage in new production

William Shakespeare’s famous play Hamlet has been adapted thousands of times: on the stage of the Globe Theatre, in high school productions, and even on the TV show The Simpsons. Now, in the 400th year since Shakespeare’s death, a team of artists, actors, engineers, and developers is transporting the canonical text into virtual reality in a new production titled To Be with Hamlet. Conceived by Javier Molina, a graduate and current adjunct assistant professor of Integrated Digital Media at NYU Tandon, the production is one of the first live theater performances staged in social virtual reality (VR). Molina also works in the Virtual Reality Lab at MAGNET (Media and Games Network), the NYU program that promotes interdisciplinary education and collaboration between technology and culture.

Though VR is a new experience for many, Molina said that familiarity with the Bard’s play would allow audiences to easily relate to the production. The performance focuses on the fifth scene of Act I, which features the encounter between Hamlet and his father’s ghost. The team has been tirelessly creating the virtual skeletons and avatars of the actors playing Hamlet and the Ghost — Zachary Koval and Roger Casey, respectively.

Roger Casey and Zachary Koval share a dramatic moment in their virtual reality remake of Shakespeare's Hamlet.

To Be with Hamlet was inspired by Janet Murray’s book Hamlet on the Holodeck: The Future of Narrative in Cyberspace. Both Molina and the production’s director, David Gochfeld, a graduate of NYU Tisch’s Interactive Telecommunications Program (ITP), credit Murray’s theory that storytelling evolves alongside new technology with prompting them to “stage [Hamlet] in virtual reality and see what kind of experience that is for us as creators, for the actors, [and] for the audience.” In addition to their tech-based training, Molina and Gochfeld also have acting and theater backgrounds: Molina studied acting for theater in Ecuador and New York, and Gochfeld trained in physical and devised theater in New York and abroad.

To build Hamlet’s 3D model, the team scanned Koval’s body with a Sense 3D scanner and photographed him in various lighting conditions to create a dynamic virtual character. Motion-capture cameras, body suits, and markers track the actor’s movements, which are then mapped onto the avatar within the virtual castle grounds. “What’s actually being transmitted is the movement data of the skeleton,” Gochfeld said. “The rest of the image is in the video game engine [Unreal Engine], which is running the experience.”
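To make that data flow concrete, here is a minimal, hypothetical sketch of the idea Gochfeld describes: only the skeleton’s joint poses are streamed each frame, while the avatar’s mesh, textures, and the castle environment already live inside the engine. None of the names below come from the production’s actual code or from Unreal Engine’s API; they are illustrative stand-ins.

```cpp
// A sketch (not the production's code) of streaming only skeleton data per frame.
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

// One joint's pose for a single frame: a position and an orientation (quaternion).
struct JointPose {
    float px, py, pz;      // position
    float qx, qy, qz, qw;  // orientation as a quaternion
};

// A full skeleton frame is an ordered list of joint poses plus a timestamp.
// This is the only data that would need to travel from the capture space each frame.
struct SkeletonFrame {
    double timestampSeconds = 0.0;
    std::vector<JointPose> joints;  // indexed in a fixed, agreed-upon bone order
};

// The avatar rig stands in for what lives inside the rendering engine
// (mesh, textures, bone hierarchy); here it is reduced to named bones.
class AvatarRig {
public:
    explicit AvatarRig(std::vector<std::string> boneNames)
        : boneNames_(std::move(boneNames)), poses_(boneNames_.size()) {}

    // Apply one incoming frame: copy each joint pose onto the matching bone.
    void applyFrame(const SkeletonFrame& frame) {
        const std::size_t n = std::min(frame.joints.size(), poses_.size());
        for (std::size_t i = 0; i < n; ++i) {
            poses_[i] = frame.joints[i];
        }
    }

    // Print the current bone positions, standing in for the engine's render step.
    void debugPrint() const {
        for (std::size_t i = 0; i < boneNames_.size(); ++i) {
            const JointPose& p = poses_[i];
            std::cout << boneNames_[i] << ": (" << p.px << ", " << p.py
                      << ", " << p.pz << ")\n";
        }
    }

private:
    std::vector<std::string> boneNames_;
    std::vector<JointPose> poses_;
};

int main() {
    // A toy three-bone rig standing in for the avatar's full skeleton.
    AvatarRig hamlet({"hips", "spine", "head"});

    // One fake motion-capture frame, as it might arrive from the tracking system.
    SkeletonFrame frame;
    frame.timestampSeconds = 0.033;
    frame.joints = {
        {0.0f, 100.0f, 0.0f, 0, 0, 0, 1},
        {0.0f, 130.0f, 2.0f, 0, 0, 0, 1},
        {0.0f, 160.0f, 4.0f, 0, 0, 0, 1},
    };

    hamlet.applyFrame(frame);
    hamlet.debugPrint();
    return 0;
}
```

The sketch only illustrates the division of labor the quote points to: the per-frame stream carries joint poses, while everything visible, from Hamlet’s scanned likeness to the castle grounds, is rendered locally by the game engine.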

MAGNET’s black-box theater acts as the production studio, housing both the motion-capture equipment and the actors as they perform the scene. To view the performance, audience members — who can be in the black-box space or in a remote location — wear HTC Vive headsets that allow them not only to see the virtual space but also to interact with it. “By immersing [the audience] in the setting — allowing them to virtually share [Hamlet’s] world for a short time — we hope that they might gain a more visceral understanding of Hamlet's experience,” Gochfeld said.

The team is currently working toward tracking the actors’ facial expressions, but the high cost of facial-tracking equipment and software has been a hurdle to this next step in creating a more lifelike Hamlet. Another challenge is that the actors currently do not wear VR headsets. “They can’t see the audience, even though the audience can see them in the VR space,” Gochfeld said. “That is a gap that we’d like to close eventually.”

Melding theater with VR technology is challenging not just for the engineers and developers, but also for the actors who are venturing into new, virtual territory. “It’s an interesting balance of technology, artistry, and performance,” Koval said, highlighting the difference between performing in a theater and performing for a remote audience in VR. “It’s not only learning how to use my body in a way that expresses what the character’s going through,” he said, “but also to bring a life to this skin, basically, of myself in the virtual world that broadcasts what the actor is feeling.”

During live performances of To Be with Hamlet, the software captures every gesture and movement from the actors.


The team would like to mount more live performances in a theater setting with larger audiences, but for now only private showings and special events are being held. Recent performances include a demo at the VR Days Conference in Amsterdam on November 6 and a showing at NYC MediaLab’s “Exploring Future Reality” conference on November 10. To Be with Hamlet has also partnered with NYU Shanghai and NYU Abu Dhabi, as well as with M3diate, which provides a multi-user platform that can host 15 audience members at once. (For more on this partnership, read this interview with Christian Grewell, Adjunct Assistant Arts Professor at NYU Shanghai and the creative lead of the M3diate production team.)

As with all live theater, To Be with Hamlet is an ever-evolving project, transforming alongside developments in VR technology. Despite the pressures, Molina said, “It [is] a milestone for us to do this production, [to] dive into this experiment with Hamlet.”

 


To Be with Hamlet's development team created regular video diaries to share the innovative project's technological challenges and breakthroughs. View the first installment below and find the rest on their Vimeo channel.

 

Camila Ryder
Graduate School of Arts and Science
Master of Arts in English Literature, Class of 2018