‘Best of’ Real-Time Winner on Using iPhone X to Produce Stunning Mocap

9 October 2018 | Awards, Conferences, Gaming, Mobile, Real-Time, Software

Image from “Bebylon Battle Royale” © 2018 Kite & Lightning

In August, SIGGRAPH attendees were wowed by some of the industry’s greatest minds in real-time production during Real-Time Live! One project in particular stood out among the rest: “Democratising Mocap: Real-Time Full-Performance Motion Capture with an iPhone X, Xsens, IKINEMA, and Unreal Engine.” This demonstration took home the SIGGRAPH 2018 Best Real-Time Graphics and Interactivity Award for its impressive use of off-the-shelf technology to create a live, portable motion-capture (mocap) setup. We went behind the scenes with creator Cory Strassburger (co-founder, Kite & Lightning) to learn more about the process, his experience as a Real-Time Live! demonstrator, and what inspires him.

SIGGRAPH: Your Real-Time Live! demonstration blew the audience and judges away in Vancouver. Tell us a bit about how the project came to be and the team involved.

Cory Strassburger (CS): This DIY mocap project started as a weekend experiment to see if the (new at the time) iPhone X could generate decent facial capture data. At Kite & Lightning, we’re building a wild party brawler game called “Bebylon Battle Royale.” The game is filled with all kinds of crazy immortal “beby” characters. It’s been a big mission of mine to bring these characters to life, not only for the game, but also for all the cinematic content we’re planning around the game. Our challenge is that we’re a very small company, so I needed a mocap setup that was fast and easy to use, required little to no cleanup, and still hit our minimum quality bar.

SIGGRAPH: Walk us through the development process. How long did it take from ideation to final demonstration? What was the biggest challenge you encountered?

CS: The project started in November 2017 when the iPhone X was first released. That initial weekend experiment showed the iPhone X had real potential for capturing facial data. The actual game development was taking up all my time, so continuing this effort became my “Sunday project.” Over the subsequent three or four months, I continued to dial in the facial capture and combine it with our Xsens inertial full-body motion capture suit. To do that I had to strap the iPhone X to a paintball helmet using GoPro camera mounting hardware. The results were very exciting, so I began using the setup to generate actual game content. Up until that point, however, all the captured data for the face and body was assembled offline in Maya, and then exported into the Unreal Engine. In the spring of 2018, Chris Adamson of Xsens asked if I wanted to submit the setup to Real-Time Live!, and I figured if it got accepted then I’d try and get the setup more “real time” by streaming all the data live into the Unreal Engine. The results of this really surpassed my expectations.
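For readers curious how the facial half of a setup like this fits together, here is a minimal, hypothetical sketch of the general idea: ARKit face tracking on an iPhone X exposes per-frame blendshape weights, which can be packed up and streamed over the local network to a machine running the game engine. The class name, destination address, port, and JSON payload format below are assumptions made for illustration only, not Kite & Lightning’s actual pipeline (which also merges Xsens body data and retargets through IKINEMA inside Unreal Engine).

```swift
import ARKit
import Network

// Illustrative sketch: read ARKit face-tracking blendshape coefficients on an
// iPhone X and stream them as JSON over UDP to a host running the engine.
// The address, port, and payload format are assumptions for this example.
final class FaceCaptureStreamer: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let connection = NWConnection(
        host: "192.168.1.50",   // assumed capture-host address
        port: 11111,            // assumed port
        using: .udp
    )

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        connection.start(queue: .global())
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called every frame; forward the blendshape weights ARKit provides.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // Map ARKit's blendshape locations (e.g. jawOpen, eyeBlinkLeft) to name/weight pairs.
        let weights = face.blendShapes.reduce(into: [String: Float]()) {
            $0[$1.key.rawValue] = $1.value.floatValue
        }
        if let payload = try? JSONSerialization.data(withJSONObject: weights) {
            connection.send(content: payload, completion: .contentProcessed { _ in })
        }
    }
}
```

On the receiving end, the engine would deserialize these weights each frame and drive the character’s corresponding morph targets, which is roughly what streaming the face data “live into the Unreal Engine” amounts to in a setup like the one described above.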

SIGGRAPH: What was it like preparing for your Real-Time Live! presentation? Do you have any tips for future contributors?

CS: The lead-up to the presentation was intense! Getting the setup working in real time in Unreal Engine was just coming together, so I had to work around the clock in my hotel room for several days before the presentation to get everything ironed out. My rehearsal had a few hiccups, to say the least, so on the day of the presentation I honestly didn’t know how it was going to go. If I had one tip, it would be to iron things out as early as possible, and don’t do it alone if you can help it!

SIGGRAPH: In breaking down your demo, a lot of different technology was required to make it possible. Do you follow a similar approach with all of your projects? If so, can you share an example?

CS: I think it’s fair to say I’ll pull from anything available to me in order to accomplish the mission at hand. These days there is so much great technology available, and it’s getting cheaper, so almost anything you can imagine creating becomes tangible.

SIGGRAPH: You co-founded Kite & Lightning. What motivated you to pursue this line of work?

CS: Augmented and virtual reality are exciting new mediums in which to create, and my fellow co-founder Ikrima Elhassan not only shares that excitement, but his skill set and personality really expand the possibilities of what we can make together.

SIGGRAPH: The theme for SIGGRAPH 2019 is “thrive.” As a creator and innovator, what is one thing that inspires you to thrive?

CS: I really love the word “thrive.” It’s something I feel almost every day. I have to say it’s our small team trying to do big things that inspires me the most. Being surrounded by seriously talented people and seeing what they do and how passionately they do it — that’s a gift!


Meet the Kite & Lightning Team

Co-founder and visionary Creative Director of Kite & Lightning, Cory Strassburger is a two-time Emmy Award-winning visual artist whose recent work earned him a Gold Lion at Cannes for the innovative Viv Magazine motion title and interactive spread. Since establishing Kite & Lightning in 2013, Cory has been responsible for creatively directing the studio’s content, which focuses on immersive virtual storytelling across a variety of mediums. The studio has produced virtual experiences for NBC, GE, Lionsgate, and HBO, as well as its award-winning original virtual reality opera “Senza Peso.” During Cory’s two-decade career, his work has been featured in films like “Minority Report” and “Star Trek”; television shows such as “The X-Files,” “Deep Space Nine,” and “Smallville”; and broadcast visuals for ABC, ESPN, Fox, Discovery Channel, and Disney.

Ikrima Elhassan is the co-founder and co-director of Kite & Lightning, a virtual reality creative studio established in 2013. As co-director, he guides the development of the studio’s original content: immersive computer-generated (CG) worlds that meld cinematic storytelling with interactive gaming, including the studio’s much-anticipated VR satire party brawler “Bebylon Battle Royale,” which premiered at the 2017 Tribeca Film Festival. Before Kite & Lightning, Ikrima’s 14-year career spanned the tech industry at Microsoft, Intel, and Nvidia, as well as academia as a Turing Scholar at the University of Texas, where he researched real-time advanced image synthesis.

Alex Underhill is a visual effects wizard and game industry veteran with a decade of experience under his belt. After studying illustration, he entered game development as an environment artist. With a growing passion for creating real-time visual effects, Alex became a Senior VFX Artist and Senior Technical Artist, playing an integral role at Rocksteady on the “Batman: Arkham” series, which received multiple BAFTA Game Awards.

Jennifer Chavarria has been an executive producer, head of production, and producer for the past 15 years. She has overseen projects from development through delivery in the film, episodic, commercial, music video, live event, and experiential markets. She is proud to have led numerous VFX and post-production teams on projects that earned an Emmy Award nomination for Best Visual Effects, two VMA wins for Best Editing and Video of the Year, and an MVPA Award win for Best Editing. During her tenure as head of production at MOE Studios, she played an integral part in launching the Motor Trend VOD Network. Jennifer is thrilled to have been a member of the Kite & Lightning team since October 2016.
