Photo courtesy of LAIKA Studios/Annapurna Pictures
It’s been well over a year since creators at LAIKA sat down with a SIGGRAPH audience for their Production Session on the making of “Missing Link,” the studio’s most recent release (and now Golden Globe-winning production). We sat down with two lead creatives to learn more about their roles in creating the film and LAIKA’s filmmaking process.
SIGGRAPH: How many people, in total, worked on this film and what were your roles specifically?
Steve Emerson (SE): For “Missing Link” I believe the number was somewhere around 400 artists total in the studio. The visual effects team was about 90 people. Half of the team was 2D-oriented — doing RotoPaint, cosmetic work, and compositing — and the other half was building and helping to integrate CG elements. My role on the film was visual effects supervisor. I was responsible for the front-end planning and for working through show and shot strategy with the other key creatives at the studio. Then, as we got into production, I’m the guy that’s on the hook to make sure that whatever ends up on the screen, in the end, is representative of the director’s vision.
“I’m the guy that’s on the hook to make sure that whatever ends up on the screen, in the end, is representative of the director’s vision.”
Eric Wachtman (EW): My turn? I was the CG look development lead. I basically oversee all of the asset creation and anything that gets built digitally. I make sure that all of that matches anything the teams do practically. I work with all of the other department heads, as well as artists, to ensure that we are following all of the design rules and building things in a way that makes sense and is practical.
SE: Yeah. To add to that, “ParaNorman” was our first film where we really started talking about making hybrid animation films. We decided then that we were going to embrace CG technology in order to open up the worlds of the movies we were going to make. We wanted to tell stories without any limitations. When we made that transition, Eric was the guy who figured out how we were going to go about it. What we learned was that it was going to take some pretty intense collaboration with the practical side of the studio. Hopefully, when you’re watching the films, you can’t tell what’s a physical puppet or physical material and what we’ve done here in visual effects.
SIGGRAPH: Talk a bit about how the VFX and puppet-making teams collaborate. What does that process look like? Is there a technology you employ that might surprise readers?
EW: It’s a pretty typical CG pipeline…
SE: Yeah. I would say a lot of people think of us as an animation studio — and rightfully so because we make animated films. But one thing that people don’t always understand is that what we’re really doing here is live-action, photorealistic visual effects. We have actors, we shoot on green screens and we create photoreal moments. I mean, the only real difference is that we’re doing it with 10-inch actors one frame at a time, over the course of weeks or months with some of these shots. But, a lot of the toolsets that we use are representative of live-action visual effects. Eric and the team have been able to create a workflow to do that with stop-motion filmmaking.
EW: Exactly. So, I mean, it always starts with the design process, and then we basically work back-and-forth with the character designers and the costume department to make sure that we’re hitting the style. When you’re dealing with puppets at that scale, seams become really big. So even on that level there are a lot of ways we need to work to hide or enhance cloth seams and weave to make sure they read correctly. Naturally, we’ve had to make some tools to deal with that. A lot of stuff we shoot under a microscope. We also do a lot of set scanning and puppet scanning. And we’ve had to write a lot of our own hair and fuzz procedurals. For example, our hair is built from individual strands that are grouped into strips we call “ribbons.”

We also have a lot of complex surfaces, due to the practically-made materials as well as details built up in many layers. And we have rapid prototyping surfaces that sometimes get sanded and sometimes don’t, which gives them a completely different look and feel. To sum it up, it’s really about working back-and-forth across departments and aligning through open communication.
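To make the “ribbon” idea a little more concrete, here is a minimal, hypothetical sketch in Python of grouping individual hair strands into ribbon strips. The data structures and names are purely illustrative assumptions for this article; they are not LAIKA’s actual grooming tools or code.

```python
# A minimal, hypothetical sketch of the "ribbon" idea described above: instead of
# treating every hair as an independent curve, strands are grouped into strips
# ("ribbons") that can be groomed and shaded together. Illustrative only.
import random
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float, float]

@dataclass
class Strand:
    points: List[Point]  # control points along one hair, root to tip

@dataclass
class Ribbon:
    strands: List[Strand] = field(default_factory=list)  # hairs groomed as one strip

def build_ribbons(num_ribbons: int = 10, strands_per_ribbon: int = 30,
                  segments: int = 8) -> List[Ribbon]:
    """Scatter strands into ribbon groups, with small per-strand jitter so each
    strip reads as individual hairs rather than a flat card."""
    ribbons = []
    for r in range(num_ribbons):
        base_x = r * 0.5  # spacing between ribbon strips
        ribbon = Ribbon()
        for _ in range(strands_per_ribbon):
            jitter = random.uniform(-0.05, 0.05)
            strand = Strand(points=[(base_x + jitter, seg * 0.1, jitter * seg * 0.02)
                                    for seg in range(segments)])
            ribbon.strands.append(strand)
        ribbons.append(ribbon)
    return ribbons

if __name__ == "__main__":
    ribbons = build_ribbons()
    total = sum(len(r.strands) for r in ribbons)
    print(f"{len(ribbons)} ribbons, {total} strands total")
```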
SE: Sure, yeah. A couple of things I would add. We’re pretty unique in that these films take so long to make. VFX is not simply a post-production step in this process. We are an ongoing piece of the puzzle and are heavily involved from the very beginning, during pre-production. When the teams are beginning to build the puppets and starting to figure out the types of materials they’re going to use, we have a seat at the table. We’re constantly chasing stop-motion characteristics, and I think that quality has a lot to do with the direct human influence. When human hands and human beings are in the mix, you get these beautiful, subtle imperfections. A computer will always want to give you something that’s perfect unless you tell it not to. We chase those imperfections and pull them into our work as well.
SIGGRAPH: What technology, specifically, is involved?
SE: We use a lot of different software packages that are fairly typical in live-action visual effects work. Maya for animation and rigging. We use Houdini for all of our effects work. We use Silhouette for a lot of the cosmetic and puppet paint work. We adopted Nuke at the very beginning of “ParaNorman” and along with that brought in almost all of the Foundry products. What else?
EW: We use Substance for textures, Katana for lighting, and RenderMan for rendering. We try to use off-the-shelf tools because we have a relatively small team, and, from there, our production technology team glues it all together with our own code.
SE: The puppet faces use rapid prototype (RP) printing and, outside of that, the costumes are all handmade — we have a costume designer for all the films. She designs and creates all of the costumes.
SIGGRAPH: Where do story and your actors fit into the pipeline?
SE: First we get a script and do an initial breakdown in order to figure out the scope of the film and what it’s going to mean in terms of resources and staffing. During that breakdown we start to get a really good idea of where visual effects will come into the mix and where there might be some big challenges. A classic example of this was “Kubo and the Two Strings.” When we sat down and read that script — and saw the initial pitch presentation — we knew that we were going to be doing huge water systems and had not done anything at that scale yet. So we immediately started planning for it.
The next thing that happens is the story team will start storyboarding and create animatics. As editorial starts to cut them together you start to get a real sense of the rhythm and scope of the film. We’ll record temp voices using local Portland talent and studio team members. Once the director is happy with the temp voices, we’ll go out and record the actors. From there, the animatics are adjusted accordingly to match the performances.
As soon as editorial feels really good about a part or section of the film, we’re off to the races.
SIGGRAPH: Do you remember exactly how long “Missing Link” itself took?
SE: We started on “Missing Link” while we were working on “Kubo.”
EW: Yep.
SE: So “Kubo” was what… 2016? Which means it was probably late 2015 that we started working on it. At that point it had already been in development for a long time — I believe Chris [Butler] said he’d been working on the film for almost 20 years. We were in the mix for about four.
SIGGRAPH: In recent interviews, different members of the team have discussed making ice or building the Himalayas. Share the shot or scene in “Missing Link” that each of you is most proud of and why.
SE: The idea behind the entire studio is that we’re realizing the potential of stop-motion — that we’re telling stories that are unlike anything anybody has ever seen before using the art form, and that we’re challenging ourselves in new and different ways.
I think one of the most groundbreaking elements of “Missing Link” is the ice bowl sequence. It is an action scene that is extremely cutty. The first iteration of that scene had more than 300 shots, some of which were as short as six frames. The final sequence landed somewhere around 200. You just don’t do that in stop-motion. You see, whether you have a shot that’s going to be 150 frames or six frames, it still takes the same amount of setup. You have to go through this long process where the set is dressed, the puppets are posed, the cameras are set up, the lighting, the rigging – everything is checked and double-checked. There are many layers of approval that we have to go through before the animator can get started. Best-case scenario, the shot setup may be two to three days. Taking on an action sequence like that was a bold choice. I am really proud of that.
EW: I liked working on that sequence; it was great. But I actually have a slightly different answer. I’m particularly proud of the Optimates Club characters. I thought they came out really well and integrated extremely well. I was a little nervous about them because they were in such a contrast-y environment, which doesn’t always lend itself to CG very well. We also had to wedge those characters in between practical characters, which is something we hadn’t done a whole lot of in the past. Having digital characters touch practical characters was well done by the team and exciting to see.
SIGGRAPH: Much of your SIGGRAPH 2018 Production Session focused on how LAIKA has adopted RenderMan’s RIS and is able to leverage new workflows to quadruple the output of photo-real, design-intensive background puppets, props, and environments. Can you share more information about these workflows? (Our readers love details!)
EW: RIS helped us out quite a bit. We actually started using it toward the end of “Kubo,” when we ran into some situations that the system we had at the time couldn’t render. RIS gave us more stability in lighting and helped us know what we were going to get. Plus, the faster iteration times with live rendering shot up our productivity.
SE: Like Eric touched on earlier, we have a small team and we do touch every shot in some capacity. With that, there’s an incredible amount of work. I think the biggest thing for me with the switch to RIS was that, on previous films, the lighting work would go off to the farm and we wouldn’t see the results until early morning the next day. RIS enabled us to work through issues desk-side, where formerly we’d have to resolve problems in compositing because there simply wasn’t time to go back to lighting.
EW: For sure. We also completely changed our lighting dailies. We were able to run low-sample renders, look at reels quickly, and make adjustments. Sometimes we could see results within a couple of hours, and between morning dailies and afternoon dailies we could realize a change quickly. It allowed for more stability in an environment where schedules and deadlines don’t always match up.
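As an illustration of the dailies workflow Eric describes, here is a small, hypothetical Python sketch of switching a shot between a low-sample “dailies” profile and a high-sample “final” profile before submitting it to the farm. The RenderProfile wrapper, the parameter names, and the submit_render helper are assumptions made for this example; they are not LAIKA’s pipeline or RenderMan’s API.

```python
# Hypothetical sketch of the low-sample dailies idea: the same shot is rendered
# with far fewer samples (and at reduced resolution) for quick review, then with
# full settings for finals. Everything here is illustrative, not production code.
from dataclasses import dataclass

@dataclass(frozen=True)
class RenderProfile:
    name: str
    max_samples: int         # cap on camera samples per pixel
    resolution_scale: float  # fraction of the final frame size

DAILIES = RenderProfile("dailies", max_samples=32, resolution_scale=0.5)
FINAL = RenderProfile("final", max_samples=512, resolution_scale=1.0)

def submit_render(shot: str, profile: RenderProfile) -> dict:
    """Build a render-job description (stand-in for a real farm submitter)."""
    return {
        "shot": shot,
        "profile": profile.name,
        "max_samples": profile.max_samples,
        "resolution_scale": profile.resolution_scale,
        "priority": "interactive" if profile is DAILIES else "batch",
    }

if __name__ == "__main__":
    # Morning dailies: fast, noisy, good enough to judge the lighting direction.
    print(submit_render("saloon_0420", DAILIES))
    # Final render once the look is approved.
    print(submit_render("saloon_0420", FINAL))
```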
SE: [Laughing] Another important thing you just reminded me of is that you can always make a shot better. You can always put more time into a shot. But deadlines dictate when you have to let it go. RIS enabled compositors to focus more on shot sweetening because they were no longer tasked with fixing lighting problems. The bulk of the issues had already been resolved upstream.
SIGGRAPH: The video embedded below showcases the beauty and complexity of stop-motion animation, which is such a massive achievement on its own, but we’re still curious: What is one detail you had a hand in creating that audiences might miss?
SE: Okay, I’ve got a couple. First, specific to the McVitie’s Saloon scene, the characters populating the bar — outside of the lead characters, the villain, his thugs, and the dog — were digital puppets. There is a great shot in that sequence — a top-down wide of Mr. Link — and the bar fight has broken out. He is just surrounded by chaos. In that particular shot there are about 20 characters, half of which are digital. On the practical side — and people might not realize this — scheduling the puppets is very difficult. There are rarely enough puppets to go around. At the height of production we’ll have around 50 active stages. They all need puppets. So sometimes we have to shoot the puppets in separate passes. We’ll get one puppet in May and the other two in July. Then, as time passes, subtle changes happen to the set. It shifts slightly, or a gel bleaches out a bit. By the time we get all of the different elements and start piecing them together, it ends up being a heck of a task for compositing — far more challenging than simply pulling keys, dropping in CG, and integrating.
EW: The other thing that goes along with that, which people might not realize, is that we actually have to digitally recreate the whole set. We have to put the characters in there and they have to be able to cast shadows, and so on, so we need to build fully textured versions of the whole bar, even though the set is built practically.
“We need to build fully textured versions of the whole bar, even though the set is built practically.”
SE: The other moment I’d mention is when our heroes arrive at the yeti temple. A practical curtain parts and you see down the temple to Emma Thompson’s character, The Elder. Both sides of the temple are lined with yeti guards. When that shot came together and we looked at it in the cut, we realized that because we were coming out of a close-up of Sir Lionel, it played like a POV. Chris wanted that shot to feel imposing and wanted to give The Elder a sense of power, so the camera was very low. But because of the shot we were cutting from, it felt like Lionel was six inches tall. The fix ended up being pretty simple; we placed a digital yeti arm in the foreground. It was just enough to eliminate the POV feel, and the shot suddenly worked.
SIGGRAPH: “Missing Link” is LAIKA’s first Golden Globe-winning feature. Congrats to both of you and your teams (for that and the Oscar nomination)! What was the reaction around the office?
SE: I don’t know if it’s changed much, but [laughs] I’ll say this. The box office was certainly disappointing, especially considering the amount of time and effort we put into the film. It’s the only thing we worked on for years and years, and we were really proud of the film and excited to share it. Then, when people didn’t show up at theaters, it was really disappointing. Now that we’re receiving this recognition – which, in turn, will hopefully motivate more people to see the film – I think morale here at the studio has really skyrocketed. After we won the Golden Globe, they brought the award back to the studio. Our producer read a speech from Chris Butler, we had a champagne toast, and the globe was passed around like the Lombardi Trophy after a Super Bowl victory.
EW: It definitely boosted morale and everyone was really excited.
SIGGRAPH: This year’s animated feature list includes a unique mashup of stories that are varied in how they are told as well as in content, such as with “I Lost My Body.” What are your thoughts about what that means for animation?
EW: I come from an anime background, and a lot of those stories lean toward being more adult. I think this year’s class of nominees is a great thing, and I like that films with many different contexts are being recognized.
SE: I agree. I don’t think it will necessarily affect the types of films we choose to make at LAIKA. For years now we’ve been very vocal about the types of films we want to make. We don’t want to build franchises; we want to make original films that are bold, that have deeper meanings, and that will hopefully move audiences. A film like “I Lost My Body” makes me so excited to see that these types of films are still being recognized.
SIGGRAPH: Share a favorite SIGGRAPH memory.
SE: Alright. [Laughs] I think one of the most memorable SIGGRAPHs for me was 2009 in New Orleans. We had just finished up production on “Coraline.” One specific memory, other than that it was fun because it was New Orleans, was the Nuke User Group meeting. You see, “Coraline” was a stereo film and we had done all of the compositing in Shake. It was a very complicated process and we dealt with so many problems because there was no stereo support. I remember really clearly being in the French Quarter at the Foundry’s presentation. The teams from “Benjamin Button” and “Ice Age” were there, and Foundry did a big Nuke demo — we were already committed to using it on the next show — but that was when I first became aware of Ocula. It was like the first time I saw my wife; I’ll never forget it.
EW: Keep in mind, I haven’t missed a SIGGRAPH in over 20 years. So I have a lot of stories, some of which are not interview appropriate. [Laughs] I’m gonna go with the first time I did a Production Session, in 2016, which was on “Kubo.” It was pretty exciting and my parents actually happened to be in town and were able to come and see it. I think that’s one of my favorite memories.
Steve Emerson is an Oscar®-nominated visual effects supervisor (for “Kubo and the Two Strings”). As a longtime LAIKA collaborator, he has contributed to five award-winning films: Golden Globe winner “Missing Link,” BAFTA winner “Kubo and the Two Strings,” “The Boxtrolls,” “ParaNorman,” and “Coraline.” Prior to joining LAIKA, Steve spent nearly 20 years working in visual effects as an artist and technical director. He has worked on many feature films and television series, including “The Matrix Reloaded,” “Transformers,” “The Dark Knight,” and “Sliders” for Universal Television. In 2017, he received the Visual Effects Society Award for Outstanding Visual Effects in an Animated Feature for “Kubo and the Two Strings.”
Eric Wachtman is a CG look development lead at LAIKA. He has been working professionally in film and television since 1995, and joined LAIKA in 2006. Prior to joining LAIKA, Eric spent eight years as an art director and director of CG for Cartoon Network and Adult Swim.