At SIGGRAPH’s Real-Time Live! event in Anaheim last month, Epic Games teamed up with Ninja Theory, Cubic Motion and 3Lateral to expand upon the technical showcase of live performance capture seen earlier this year at the Game Developers Conference (GDC) in San Francisco. The implications of this technology are far-reaching, and post-event commentary from Epic Games and Ninja Theory has made clear exactly what it could mean for the future of virtual reality (VR) development.

Attendees of the event witnessed a real-time cinematography demonstration that offered an insight into the creation of real-time performance capture technology and the process of implementing it within videogames, movies and VR experiences. With real-time cinematography, every nuance of the digital character’s facial expressions, lighting, visual effects and sets is visible in real-time at final render quality. Rather than capturing to film, everything is rendered as digital 3D data directly into Unreal Engine 4 using its recently launched Sequencer tool.

“This demonstration required the teams at Epic Games, Ninja Theory, Cubic Motion and 3Lateral to challenge ourselves both technically and creatively to think about the future for real-time cinematography and deliver a working example of what is now possible through the power of Unreal Engine 4 and the Sequencer tool,” said Epic Games CTO Kim Libreri. “Now, our hope is that artists and technicians are empowered to employ these technologies and techniques to usher in a whole new generation of real-time cinematography that will take interactive entertainment and storytelling to a level of efficiency and quality that has never been achieved before.”

Unlike traditional videogame and film previsualisation methods, the experimental techniques showcased will in time allow developers and directors to capture, edit, play back and export to offline 3D applications, and output to video at any resolution; a capability that will eventually prove important to VR production.

Epic Games & Ninja Theory Comment on Performance Capture for VR

Tameem Antoniades of Ninja Theory said, “Our end goal is to find ways to create fully interactive 3D experiences for future games and virtual reality experiences that feature incredible, immersive worlds and believable characters. This amazing collaboration between our teams has brought the dream a huge step forward to everyone’s benefit.”

House of Moves, IKinema, Technoprops and NVIDIA also provided support for the live performance-driven short film. This collaboration received the SIGGRAPH award for Best Real-Time Graphics and Interactivity.

It should be noted that the technology is still considered experimental at present; however, it has been made clear that the intention is to offer these examples of real-time performance capture as a stepping stone to a full rollout through Unreal Engine 4. Indeed, the Sequencer tool – the first step for development studios looking to bring real-time performance capture to their experiences – is already freely available as part of the Unreal Engine 4 binary. VRFocus will keep you updated with the latest efforts from Epic Games and Ninja Theory to bring these techniques to the wider development community and VR experiences.
