Ninja Theory and Epic Games Talk Hellblade and the Future of Real-Time

In July, SIGGRAPH 2016 attendees were treated to an epic Real-Time Live! show that featured some of the greatest achievements in real-time graphics to date, including this year’s award-winning “From Previs to Final in Five Minutes: A Breakthrough in Live Performance Capture.” The presenting team hailed from four different studios — 3Lateral Studio, Cubic Motion Ltd, Epic Games, Inc., and Ninja Theory Ltd. — and we sat down with Tameem Antoniades, Chief Creative Ninja and Creative Director for the game at Ninja Theory Ltd., and Kim Libreri, Chief Technology Officer (CTO) at Epic Games, Inc., to learn more about the process and find out what’s next after Hellblade: Senua’s Sacrifice.

SIGGRAPH: This July, you stunned the audience with your Real-Time Live! demonstration and won the “Best Real-Time Graphics and Interactivity Award” as a result. Tell us a bit about how the project started and what your role has been on Hellblade.

Tameem Antoniades (TA): My name is Tameem Antoniades and I’m the Creative Director of Hellblade and co-founder of Ninja Theory. For Hellblade, we wanted to create our most believable character yet and had been working with 3Lateral and Cubic Motion to create a virtual double of our actress, Melina Juergens. Melina is also based in-house as she is our video editor and we had built our own performance capture studio in one of our meeting rooms.

At the beginning of the year, [Epic Games’] Kim Libreri and Michael Gay visited our studio in Cambridge. When they saw our work so far, Kim asked if we would be interested in doing a live performance of Senua on stage at short notice for [Game Developers Conference] GDC. Since we had a lot of the pieces of the puzzle in place and full access to both the actress and a capture studio, it felt just about achievable, and so we said, “Yes!” It’s not every day you have an opportunity to do something groundbreaking. From there, our ambition grew for our next demo at SIGGRAPH, where we wanted to show how this technology could be used to shoot and edit a small but complex scene in real time.

Kim Libreri (KL): My name is Kim Libreri and I’m the CTO at Epic Games. Prior to working at Epic I had supervised visual effects and technology for movies for 20 years including all three “Matrix” movies.

SIGGRAPH: For those who missed the SIGGRAPH 2016 demonstration, can you talk about whose idea it was to marry the film and games worlds in this unique way, now dubbed “real-time cinematography,” and what technologies are involved?

TA: After doing the GDC demonstration, which we also repeated at FMX in Stuttgart, Kim said he would like to demonstrate the use of Sequencer, a new cinematic tool for capturing and editing scenes. So we came up with the idea that Senua would talk to her own mirror reflection and built a scene around that. The goal was to demonstrate that you could shoot and edit a computer graphics (CG) scene in real time. We coined the phrase “real-time cinematography,” rather than use existing terminology such as “previs,” because the idea is that all aspects of the scene are represented virtually on-set: lighting, facial, body, voice, camera, and VFX. What you see is what you get.

KL: Making any sort of cinematic computer graphics content can be quite an abstract process, in which you often don’t get to see the content until late in production. Nowadays, Unreal Engine can produce quite photo-realistic imagery, and we wanted to show what it, along with its new cinematic tool Sequencer, could bring to the world of virtual production. By combining Unreal Engine 4 (UE4) with the state-of-the-art performance capture capabilities of Cubic Motion and House of Moves, along with 3Lateral’s amazing facial animation systems, we were able to pull off a quite convincing digital version of Senua that could be driven by Melina’s live performance.

For this demonstration, we extended Sequencer’s recording capabilities by adding the ability to record facial and body animation, along with audio, into an animation track that then appears on the timeline as a virtual clip that can later be edited and layered against other performance clips. This allowed us to record Melina performing against herself as two separate animation tracks. We could also simultaneously record the camera and focus into the master track.
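The workflow Libreri describes — live takes captured onto named tracks as clips that can later be trimmed and layered — can be pictured with a small conceptual sketch. This is purely illustrative and does not reflect Unreal Engine’s actual Sequencer API; all class and track names here are hypothetical:

```python
# Conceptual sketch of a sequencer-style timeline (NOT Unreal's real API):
# live takes are recorded onto named tracks as clips, which can then be
# edited and layered against one another, as described above.
from dataclasses import dataclass, field

@dataclass
class Clip:
    name: str
    start: float      # position on the timeline, in seconds
    duration: float   # length of the recorded take, in seconds

@dataclass
class Track:
    name: str                                # e.g. "senua_face", "camera"
    clips: list = field(default_factory=list)

class Timeline:
    def __init__(self):
        self.tracks = {}

    def record(self, track_name, clip_name, start, duration):
        """Capture a live take onto a track as an editable clip."""
        track = self.tracks.setdefault(track_name, Track(track_name))
        clip = Clip(clip_name, start, duration)
        track.clips.append(clip)
        return clip

# Two passes of the same performer land on separate tracks, with the
# camera recorded simultaneously into its own track:
seq = Timeline()
seq.record("senua_take_a", "performance_01", start=0.0, duration=12.0)
seq.record("senua_take_b", "performance_02", start=0.0, duration=12.0)
seq.record("camera", "master_cam", start=0.0, duration=12.0)
```

Because each take lands on the timeline as an ordinary clip, layering one performance against another becomes a matter of placing clips on parallel tracks rather than re-capturing the scene.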

SIGGRAPH: What was it like collaborating across studios on Hellblade: Senua’s Sacrifice?

TA: Remarkably easy in terms of coordination and communication and I attribute this to Kim and his team for setting clear goals, responsibilities, and an impressive production ethos. I felt like we had a crash course in visual effects development that money could never buy. I’d do it again in a heartbeat.

KL: Every day we would do “dailies” in the same way I would on a movie, but with the [Ninja Theory] Ninjas joining in through Google Hangouts and cineSync.

SIGGRAPH: In a recent article from Inverse, you talked about the possibilities this technology has for other areas, such as film and virtual reality (VR). What do each of you find most exciting about where this could go?

TA: Perhaps others can speak to the potential of this technology to streamline CG filmmaking and eventually democratize it. My feeling is that virtual reality is where it will offer new forms of entertainment. Cinema has been defined by the grammar of the shot, but in VR you are the camera. So the grammar of the scene, and the virtual humans that inhabit it, will be the tools for directing your attention. We are now testing our theories by mixing performance capture with procedural animation to create fully interactive virtual humans and scenes. This is something that 360-degree video will never achieve, and it is where I think the future of a new form of entertainment lies. We’ll see.

KL: For filmmaking, I think we will see gaming technology start to play an important role in pre-visualization (“previs”) and on-set virtual production. Not only can we produce compelling images in real time, but the fact that worlds created in an engine are simulated has major advantages over the traditional method of key-framing all the action. Imagine how liberating designing an action scene such as a car chase will be when you can drive a car around a virtual set instead of having to pre-plan and animate every shot.

SIGGRAPH: Are you working on new real-time projects? If yes, can you share any details? If no, what are each of you planning as your next career step or how are your respective studios moving forward into new spaces?

TA: I believe that we are slowly shifting towards becoming a real-time technology company that happens to make great games. What this means is that the skills of our studio are well suited to a plethora of real-time applications. We could, for example, apply our skills to create live concerts with virtual characters that fully interact with the audience. We could conceivably create VR simulations for the treatment of phobias, anxiety, and other mental conditions. Or we could use our skills to help filmmakers make CG films in fully virtual sets. That is why we set up Senua Studio: so that we could explore non-gaming projects. I think what many people don’t appreciate is how adaptable and efficient game studios have had to become to make modern AAA games. I think we (meaning all high-end game developers) have some of the best talent in the world when it comes to programming, art, and design, and it will be exciting to see how we can break out into other real-world applications.

KL: Real-time is what Epic is about. The very nature of video games is that they are interactive and we are now bringing that magic into other industries along with strengthening our core capabilities in gaming as we head into the uncharted worlds of VR and augmented reality (AR).

SIGGRAPH: The game itself deals with the deeper topic of mental illness. How has that shaped work on the project?

TA: The goal was to create a believable character so that we could put you in her shoes and see and hear the world through her eyes. As she suffers from psychosis, we wanted to ensure that we are truthfully representing what it could be like for sufferers. From the very beginning of the project, we’ve been working with Dr. Paul Fletcher, a psychiatrist and professor of health neuroscience; the Wellcome Trust global charity; and many groups and individuals with direct experience of voice hearing and visions. Aspects of our findings and simulations were featured in both our GDC and SIGGRAPH demonstrations.

The studios have declined to share an official release date for Hellblade: Senua’s Sacrifice.

If you missed Real-Time Live! at SIGGRAPH 2016, you can watch the video below or on the ACM SIGGRAPH YouTube Channel.
