The Mill and Epic Games Push Real-time Boundaries With ‘The Human Race’

In August, SIGGRAPH attendees were treated to an epic Real-Time Live! presentation that featured some of the greatest achievements in real-time graphics from the previous year, including 2017 Best Real-Time Graphics and Interactivity award winner “The Human Race,” a project by The Mill and Epic Games. We sat down with the team behind the experience to learn about what’s next for real-time.

SIGGRAPH: You delighted the audience at SIGGRAPH 2017 with your Real-Time Live! demonstration. Tell us a bit about how the project started and what your role was in its development.

Joji Tsuruga (JT): “The Human Race” project stemmed from a mutual excitement and interest among The Mill, Epic Games, and Chevrolet in pushing the boundaries of real-time filmmaking by utilizing the Blackbird, The Mill Cyclops, and Unreal Engine. My main role was to lead The Mill team that crafted the film. I was the bridge between our traditional and real-time visual effects (VFX) teams. I also collaborated with Epic’s developers on a new compositing tool set built for Unreal Engine 4 specifically for this project.

Eric Renaud-Houde (ERH): I was responsible for the development of The Mill Cyclops in its latest form. On “The Human Race,” I also acted as a bridge between Arraiy and Epic Games, by integrating their bespoke tracking library into the Unreal Engine.

Michael Gay (MG): We wanted to do a project that tested how real-time could enhance both the production and the viewer’s experience. I acted as a sort of technical supervisor for the project on the Epic side and was involved from pre-production to final delivery.

SIGGRAPH: Talk about the origin of “The Human Race.” Whose idea was it to visualize a CG car model utilizing The Mill BLACKBIRD® within live action shots?

JT/ERH: Around the same time that The Mill Blackbird was being developed, we were also working on mobile AR applications and began to think about how they could extend into an on-set tool for VFX shoots. This ultimately led to the creation of The Mill Cyclops, our virtual production toolkit. By combining these two technologies with Unreal Engine, we were able to visualize a CG Camaro against live-action footage as it drove along the Angeles Crest Highway in real time.

SIGGRAPH: Combining Epic Games’ Unreal Engine with The Mill Cyclops, you were able to create a real-time vision of cars on set. How can this real-time technology improve the filming process and, ultimately, the final product?

JT/ERH: On-set visualization takes the guesswork out of framing, lighting, and performances of subjects that are not readily available during the filming process. The Mill Cyclops achieves this with precise object tracking and accurate environment lighting and reflections through the use of a 360-degree capture. Directors can thus spend less time being distracted by VFX and more time crafting beautiful shots.

MG: By pairing The Mill Cyclops with Epic Games’ Unreal Engine 4, we were able to let the directors visualize the Blackbird in a way never before possible. They were able to frame up shots and set lighting live during the shoot. This made for faster iteration times and better shots.

SIGGRAPH: Is this technology already influencing new projects within your company?

JT/ERH: The technological approach used on this project has not only influenced new projects, but has also led us to new clients. It has spawned several internal projects, including a virtually puppeteered, flame-breathing, confetti-spewing llama shot using The Mill Cyclops.

SIGGRAPH: This is the third consecutive year in which our Best Real-Time Graphics & Interactivity Award has been presented to an Unreal Engine-fueled project. Though game developers have been using Unreal for years, it hasn’t yet made a comparable splash in the broader entertainment industry. Do you envision that changing soon? Why or why not?

JT/ERH: With their growing enterprise business, it’s clear that Epic is rapidly broadening the scope of its engine and emphasizing its utility beyond games. Winning projects at SIGGRAPH are clear indicators of where this technology is headed. As the landscape of entertainment evolves into augmented reality, virtual reality, interactive, and experiential, game engines are simply better suited to handle these emerging media.

MG: Several media and entertainment companies are adopting the Unreal Engine for real-time workflows. The growing use of game engines in film previsualization is a trend I expect to continue until final pixels are rendered in real time.

SIGGRAPH: In your opinion, what does the future of real-time visual effects look like?

JT/ERH: The future of real-time visual effects will be indistinguishable in quality from what we would expect from traditional VFX today. We will also see a shift from post-heavy to front-loaded workflows as the line between production and post-production is blurred.

SIGGRAPH: What drove you to submit your project for SIGGRAPH review in 2017? Do you have any advice for newcomers?

JT/ERH: Every component of this project was fully dependent on real-time technologies. It was a shoo-in for SIGGRAPH’s Real-Time Live! [program] and a proof of concept for how we foresee the future of filmmaking.

MG: We feel the project pushed the bounds of real-time VFX. We hope others are inspired to push it even further, as others in the past have for us!


Michael Gay is a director of cinematic production with Epic Games. In this role, he uses more than 20 years of industry experience to lead the cinematic team, and is responsible for the design and development of Sequencer — the Unreal Engine 4 cinematic tool — as well as the replay system for “Paragon”, Epic’s MOBA for PlayStation 4 and PC.

Eric Renaud-Houde is a software developer with The Mill. 

Joji Tsuruga is a real-time supervisor with The Mill.
