“Anything can happen … and it probably will,” opened SIGGRAPH 2016’s Real-Time Live! Chair, Pol Jeremias of Pixar Animation Studios, preparing the audience for a program anticipated to go above and beyond expectations.
Jeremias welcomed attendees after the first performance of the night: an artist wearing an Oculus headset with hand controllers, skillfully moving her hands through the air to sketch an image of a fish that was rendered on screen in real time. In total, 13 demonstrations showcased a wide variety of real-time techniques and wowed a full house of computer graphics artists, researchers, and practitioners of interactive techniques.
Pol Jeremias, SIGGRAPH 2016 Real-Time Live! Chair
Curated pieces shared during Real-Time Live! included:
QUILL: VR DRAWING IN THE PRODUCTION OF OCULUS STORY STUDIO’S NEW MOVIE
With Oculus Story Studio now at work on its third movie, the team demonstrated the illustration capabilities of Quill. Walking attendees through the tool, a presenter drew a horse using the same Touch controllers featured in the session's opening performance. The controllers record position, velocity, pressure, and more, letting an artist build an entire narrative. Quill demonstrates the newly discovered potential of VR illustration to create entirely new 3D narratives and open up fresh storytelling possibilities.
ILMXLAB AUGMENTED REALITY EXPERIENCE: AR/VR THROUGH AN IPAD
Ant-Man made a special appearance during the ILMxLAB demonstration! As the presenter moved an iPad around the stage, its view was rendered onto the screen using positional tracking via HTC Vive hardware. Combined with motion-capture techniques, outside animation, such as Ant-Man, was rendered into the shared virtual space.
REAL-TIME SIMULATION OF SOLIDS WITH LARGE VISCOPLASTIC DEFORMATION
This ground-breaking presentation was the first demonstration of real-time simulation of solids undergoing large viscoplastic deformation. The presenter grabbed a simulated solid and folded it, rolled it, chopped it, sliced it, and eventually blew it all up. Large deformation usually reduces a simulation mesh to mush; this new algorithm makes it possible to deform an object continuously while keeping the simulation stable.
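The distinguishing feature of plasticity, as opposed to elasticity, is that deformation past a yield threshold becomes permanent. The following toy sketch (not the presented algorithm, just a generic one-spring illustration with made-up parameters) shows the idea: strain beyond the yield limit flows into the rest length, so the solid "remembers" being folded or stretched.

```python
# Toy plastic-yield model for a single spring (illustrative only,
# not the algorithm shown at Real-Time Live!).

def plastic_spring_step(length, rest_length, yield_strain=0.1, creep=0.5):
    """Return the updated rest length after one plasticity update.

    Strain within +/- yield_strain is purely elastic (rest length
    unchanged); excess strain flows into the rest length, making
    part of the deformation permanent.
    """
    strain = (length - rest_length) / rest_length
    if abs(strain) > yield_strain:
        excess = strain - yield_strain * (1 if strain > 0 else -1)
        rest_length += creep * excess * rest_length
    return rest_length

# Hold a spring stretched well past its yield strain: the rest length
# creeps toward the stretched length, i.e. the deformation is partly
# plastic rather than fully elastic.
r = 1.0
for _ in range(10):
    r = plastic_spring_step(length=1.5, rest_length=r)
```

A full simulator applies the same yield-and-flow rule per element of a volumetric mesh, which is why continuous restabilization of the mesh matters so much at real-time rates.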
REAL-TIME TECHNOLOGIES OF FINAL FANTASY XV BATTLES
This showcase of the upcoming Final Fantasy XV, scheduled for release on 30 September, gave attendees a look at the VFX and AI techniques used to create new levels of detail, ranging from weather effects to character motion. Characters caught in the rain move through fog in visibly wet clothing, and even the atmosphere subtly shifts lens effects and color, all enhanced in real time by VFX and AI.
IMMERSIVE AUGMENTED REALITY IN MINUTES WITH META 2
Attendees got a glimpse of the Meta 2 headset, which pairs an easy-to-use interface for connecting humans to machines with obstruction-free hand tracking and a 360-degree perspective.
ADAM: A REAL-TIME SHORT FILM
After showing a short clip of the film “Adam,” the presenters demonstrated how its physical simulations work. A visual effects artist played back a motion-capture performance and added a secondary layer of motion and animation on top of it. With their newly created tools, they could easily stack such layers, and they showed the real-time rendering that enables Adam to rip his own arm off.
They also demonstrated a tool the team created to add realistic detail in real time. Creating an effect such as falling rocks would typically take a designer hours to render; with this new tool, they can do it quickly and effectively without losing creative quality.
BOUND: MOTION-CAPTURED DANCE IN PROCEDURAL ENVIRONMENTS
Inspired by the works of the demoscene movement, the creators of the game Bound used motion capture of a real contemporary dancer, allowing the player to control the dancer's moves through dynamic procedural environments. Aspects of the game's environment design also provided solutions to some of the hardest problems in VR gaming.
REAL-TIME GRAPHICS IN PIXAR FILM PRODUCTION
Using examples from “Finding Dory” and “Cars,” the presenter demonstrated Universal Scene Description (USD), which Pixar has just made publicly available, along with full production rigs in Presto, the animation system Pixar uses for real-time displacement and rendering in its films.
AUGMENTED REALITY WITH GOOGLE TANGO
Attendees went back to the Jurassic period as the presenter brought dinosaurs onstage using a tablet and Google's Tango, a sensor suite that can measure the environment around it. Tango uses a fish-eye camera with responsive frame rates to make tracking feel as natural as possible, so the dinosaurs displayed on screen appeared at the exact scale of specimens from the American Museum of Natural History. The presenter also demonstrated a depth sensor with the Constructor app, which allows the taking of a 3D selfie: it captures both color and 3D geometry and builds a quick composite of the depth components, all on a single device.
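The core operation behind pairing color with 3D on such a device is lifting each depth pixel into a 3D point. The sketch below is a generic pinhole-camera back-projection, not Tango's actual API, and the intrinsics values (`FX`, `FY`, `CX`, `CY`) are made-up assumptions for a hypothetical 640x480 depth stream.

```python
# Generic pinhole back-projection (illustrative, not Tango's API):
# lift a depth sample at pixel (u, v) to a 3D point in camera space.

def backproject(u, v, depth, fx, fy, cx, cy):
    """Convert a pixel coordinate plus depth (meters) to a
    camera-space (x, y, z) point using pinhole intrinsics."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Hypothetical intrinsics for a 640x480 depth stream.
FX, FY, CX, CY = 520.0, 520.0, 320.0, 240.0

# The principal point back-projects straight down the optical axis;
# pairing each lifted point with the color at the same pixel yields
# a colored point cloud, i.e. the raw material of a "3D selfie".
pt = backproject(320, 240, 1.5, FX, FY, CX, CY)
```

A scanning app repeats this over every depth frame and fuses the resulting point clouds into one composite model as the device moves.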
AFTERPULSE: STATE-OF-THE-ART RENDERING ON MOBILE GPUs
Running live on a mobile phone, Afterpulse is a game for mobile devices that uses the latest version of the Karisma engine, which implements state-of-the-art graphics typically reserved for console and desktop games. The engine and tools were developed in-house with almost no external dependencies.
FROM PREVIS TO FINAL IN FIVE MINUTES: A BREAKTHROUGH IN LIVE PERFORMANCE CAPTURE
**Real-Time Live! Award Winner**
Imagine shooting fully rendered CG scenes in real time. That concept is no longer imaginary: Epic Games teamed up with Ninja Theory, Cubic Motion, and 3Lateral to create the world's first believable CG human driven by live performance capture of an actress inside an Unreal Engine game world. In this demonstration, body, face, and voice were all captured and recorded live to create a real-time scene. What used to take months of CG editing is now possible in seconds with this new real-time cinematography technique. The live performance-capture demonstration took home the big award of the evening: Best Real-Time Graphics and Interactivity.
GARY THE GULL
From Limitless Ltd. and Motional LLC, Gary the Gull is an interactive VR character that responds to voice, gestures, and distance. It was built by an ex-Pixar, ex-Bungie team to demonstrate a new form of storytelling in VR that bridges games and film.
CHARACTER SHADING OF UNCHARTED 4
Presenters from Naughty Dog, Inc. demonstrated the challenges of character shading in Uncharted 4: achieving a volumetric feel for hair, using shader packages across different surfaces, maintaining performance for both main characters and crowds, and rendering dynamic wear and tear.