As I sat down to come up with a list of the best real-time projects in SIGGRAPH history, I knew I wanted to accomplish two things: convey what kinds of submissions the SIGGRAPH 2017 Real-Time Live! program is looking for, and inspire fellow creators. First presented in 2009 as part of the Computer Animation Festival, real-time demonstrations have a relatively short history at SIGGRAPH. In that time, though, the work that has been showcased is truly tremendous. Without further ado, here are my top choices for the best, most innovative SIGGRAPH real-time presentations in Real-Time Live!’s eight-year history.
Digital Ira: High Resolution Facial Performance Playback
By USC Institute for Creative Technologies, Joe Alter, Inc., and Activision, Incorporated | SIGGRAPH 2013
The presentation showed how data captured from a video performance can drive the creation of a photoreal facial computer-graphic model. Equally impressive were the results, which showed how the rendered human face can be viewed from any angle, under any lighting, and still perform a wide range of facial expressions as if it had been put together by a team of visual effects modelers, riggers, and lighters.
Maturing the Virtual Production Workflow: Interactive Path Tracing for Filmmakers
By Blur Studio and Chaos Group | SIGGRAPH 2014
Ray tracing and motion capture (mo-cap) were brought together in a way that transformed the virtual production workflow for filmmakers — all thanks to this one presentation! Both ray tracing and mo-cap require considerable processing time, so being able to process both in real time, with mo-cap data captured live, and then render high-quality images was incredible. This introduced a fast, powerful production workflow for generating computer graphics (CG) scenes for film and video games from mo-cap data.
From Previs to Final in Five Minutes: A Breakthrough in Live Performance Capture
By Ninja Theory Ltd., Epic Games, Inc., Cubic Motion Ltd, and 3Lateral Studio | SIGGRAPH 2016
Live performance capture of both body animation and facial expressions to construct a scene under a filmmaker’s live direction is a very big thing. But to be able to edit it with a second data set, stitch both data sets together, and render it out for live playback? That made this presentation the definition of “mind-blowing.” The collective efforts of Epic Games, Ninja Theory, Cubic Motion, and 3Lateral show just how much the industry is pushing the research and development (R&D) of real-time technology through innovation and collaboration. (Watch the full demonstration, starting at 1:04:46.)
Think your project could be the next big thing in real-time technology? Submissions are now open for the SIGGRAPH 2017 Real-Time Live! program. Projects will be accepted through 4 April 2017.
Cristobal Cheng is an on-location lab services technician with Technicolor. He has spent his career focusing on operational management and logistics in production environments and has over 10 years of experience. Cheng is serving as the SIGGRAPH 2017 Real-Time Live! Chair, but started out as a student volunteer (SV) in 2001 and has been volunteering for the SIGGRAPH conference in various capacities ever since — including as part of the Student Volunteer program, S3, and the Computer Animation Festival.