Image credit: copyright © 2025 The Black Laboratory / puppix. Ogof design right: Chris Pirie.
“Puppet In The Room”, which premiered in the SIGGRAPH 2025 Spatial Storytelling program, showcases puppix — a groundbreaking capture system that transforms live, full-puppet performances into digital animation twins. Audiences experience the magic firsthand as a physical puppet character shares the room, responding to direction and interaction while its digital counterpart comes alive in real time. We sat down with Ben Mars, SIGGRAPH 2025 contributor, to explore his work and the possibilities it unlocks for immersive, interactive storytelling.
SIGGRAPH: What inspired the development of puppix, and how does it bridge the gap between physical puppetry and digital animation?
Ben Mars (BM): I started out building practical masks and creature suits, buying my latex and getting advice from Paul Barrett Brown while he was working on “Batman Returns”. I trained in stop-motion animation, then computer animation with Steve Roberts and Claude Lester at St. Martin’s London Animation School and later in prosthetics with Mike Stringer at Hybrid, so I’ve always kept hopping between the practical and digital worlds.
The idea for puppix came out of working as a previz animator on a film called “Pan”. Sarah Wright from the Curious School of Puppetry and a team of puppeteers created a creature called the Neverbird for movement and character tests.
In the film, it is a creature magically created from a collection of bones and feathers. And the creature they performed moved like bones and feathers held together by magic — because it was made from bones, feathers, and fishing line and was being moved by trained puppeteers.
It had an amazing, ethereal, realistic physicality perfect for the character. The puppet movement was referenced and maintained in the detailed previz done by Isabel Cody.
However, the final full-animation Neverbird, which was a beautiful piece of computer animation in its own right, didn’t use that puppet physicality, and I always felt that was a shame. It lost the magic for me.
At that point, I started telling people how we should be motion capturing puppets.
This was compounded when I worked on animation for “The Nice Guys”, where we had to create animation to replace a filmed puppet performance of a bee complaining about pollution as it’s being driven through Los Angeles by the main characters.
After quite a few years of talking about this, I realized if I wasn’t going to do it, someone else would — especially as around this time, reference puppets began to be used a lot on set in productions like the television series “His Dark Materials”, and theatrical puppets had a resurgence with “War Horse”.
And so we started doing it, with support from UK Research and Innovation, Digital Catapult, MyWorld, and NVIDIA.
This is not controller-driven puppeteering of digital characters, where you manipulate controllers to move an on-screen character. Nor is it reference puppetry, where you move a physical puppet stand-in for the character primarily to provide position, lighting, and timing reference (usually recorded as images) to be used later when keyframing the animation.
We’re bridging the gap between physical puppetry and digital animation by capturing the whole puppet performance in three-dimensional movement in real time, using a puppet built to directly reflect the physicality the digital character would have.
So the actual performance that happened on set — that your actors were performing and reacting to — along with all the character movement and secondary movement that resulted from the performance, interaction, and reaction to the environment is available as digital animation.
You can show it live, enhance it, edit, adjust, add keyframe animation, apply it to simulation systems or other characters, or otherwise use it as you wish.
We have the puppix studio, with build and capture spaces, at The Puppet Place in the U.K., which gives us additional access to an amazing range of talented and experienced puppet and animatronic fabricators, engineers, and performers, as well as all our contacts in the VFX and computer animation sectors.
It’s great to be able to work with an amazing team of people from multiple sections of the creative industries, combining all our skill sets and viewpoints to bring the creative team process and physical reality of live puppetry to the infinite possibilities offered by computer animation.
SIGGRAPH: How does having a live puppet character in the room transform the audience’s experience — what possibilities does this create for interactive digital performance?
BM: Live puppet characters allow directors and actors to work with nonhuman characters with the same flexibility, freedom and immediacy as human actors. All the characters in a scene are brought together into the same moment in time and space.
The character’s physical movement, weight, and timing are informed by its physicality and its interaction with its environment and with other characters. For the performers, actors, director, and live audience, there is a sense of connection to — and physical presence of — the character in the performance space. The character feels “real” because it is real. You’re not watching it through the window of a screen.
This real movement is transferred into the digital space, so you then have the performance and interaction from your puppet shared with characters or avatars. You can also use your puppet as a controller for interactions by its digital twin with other characters that exist only in that digital space.
You get the possibilities for multiple types of digital interaction, all moderated at a live human level.
We’ve been working with Hannah Southfield, a fellowship researcher with the University of Bristol and Watershed, to explore how the technology might be used in therapeutic communication aids, letting people communicate in physical and digital spaces through puppet-driven avatars.
SIGGRAPH: Walk us through how puppix captures and translates physical puppet movements into a digital twin. What makes this system different from other motion capture tools?
BM: The key difference between puppix and other motion capture tools driving digital characters is that the performance focus, as the title says, is in the room. Your director and fellow performers are working together with the character on a live set.
With armature systems like Dinosaur Input Device and Sil that approximate character skeletons, you’re concentrating on getting a good result on the digital screen. Likewise, with control rigs like Waldo, the focus is the screen.
We also have the physicality of the puppix puppets matched one-to-one with what the digital twins’ physicality would be — not just joint positions, but mass distribution and volumes. So your environmental and co-performer interaction and reaction, and your secondary movements, all come for free because they’re physically accurate to the creature.
Another thing we’re doing is using multiple types of sensing systems, both internal and external to the puppet, to compensate for the limitations you’ll always encounter when using any single type of sensing. The classic example is trying to put optical motion trackers on puppets — you quickly find your puppeteers and other characters getting in the way of your capture markers.
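As a rough illustration of that kind of multi-sensor compensation (this is not puppix’s actual pipeline; the joint layout, quaternion format, and blending weight here are hypothetical), a minimal sketch in Python might blend an internal IMU orientation estimate for a puppet joint with an external optical estimate, and fall back to the internal sensor whenever the optical markers are occluded:

```python
import numpy as np

def fuse_joint_rotation(imu_quat, optical_quat, optical_visible, alpha=0.8):
    """Blend an internal IMU orientation estimate with an external optical
    estimate for one puppet joint. Falls back to the IMU alone when the
    optical markers are occluded (e.g. by a puppeteer's hands).

    imu_quat, optical_quat: unit quaternions as [w, x, y, z] arrays.
    optical_visible: whether the optical system tracked this joint this frame.
    alpha: trust placed in the optical estimate when it is available.
    """
    if not optical_visible:
        # Occluded: keep the drift-prone but always-available internal estimate.
        return imu_quat
    # Keep the two quaternions in the same hemisphere before interpolating.
    if np.dot(imu_quat, optical_quat) < 0.0:
        optical_quat = -optical_quat
    # Simple normalized linear interpolation (nlerp) between the two estimates.
    fused = (1.0 - alpha) * imu_quat + alpha * optical_quat
    return fused / np.linalg.norm(fused)


# Toy frame: the markers on one joint are briefly hidden by a puppeteer.
imu = np.array([0.997, 0.07, 0.0, 0.0])
optical = np.array([0.999, 0.04, 0.01, 0.0])
print(fuse_joint_rotation(imu, optical, optical_visible=True))   # blended pose
print(fuse_joint_rotation(imu, optical, optical_visible=False))  # IMU fallback
```

A real system would of course do far more (per-sensor confidence, filtering over time, whole-skeleton solving), but the principle is the same: each sensing type covers the failure modes of the others.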
As motion capture from video using machine learning becomes more common — and, indeed, if you look at many of the algorithms used in processing more traditional motion capture — you’ll see that a large proportion of the training data is human motion. These systems are very good at predicting, finding, and creating human movement from their input data.
But at the moment, creatures, nonhuman-shaped characters, and stylized humanoids or stylized movement have far fewer captured training data sets available. We’re one of the small number of companies capturing them.
SIGGRAPH: Spatial Storytelling is a new program at SIGGRAPH 2025. How does Puppet In The Room push the boundaries of narrative and performance within the evolving landscape of computer graphics?
BM: “Puppet In The Room” is something of a “Return to the Future” (with fewer DeLoreans than the excellent “Back to the Future”) moment.
Just as VR learned techniques from in-the-round theater, we’re drawing on the lessons we’ve learned from developing narrative for live performance, particularly the live, multi-forking narratives of immersive theater and sandbox live role-playing games, alongside more traditional theater, film, animation, and performance storytelling.
We’re creating a tool that lets anyone performing with a puppix system use these live performance lessons in a digital space, or mix and match live physical and digital performances and interactions into layers of complexity.
We’ve already been amazed by what the people we’ve worked with are doing with the technology, and we’re looking forward to bringing more people into a technique that can blend multiple types of performance to help them play, explore, and create new sets of stories and experiences. At heart, you’re playing with a physical character in the real world and watching it bring that life into the digital realm.

Ben Mars works as a character animator and animation supervisor for features, television, games and immersive experiences, as well as a live action puppeteer. He’s been helping characters hit each other with frying pans since 1999, and very occasionally makes award-winning short animation films.
He founded The Black Laboratory in 2008 as an animation and development studio with a focus on character-based narrative work.
The puppix project came from a desire to combine the creative team process and physical reality of live puppetry with the possibilities of computer animation.