Exploring the VFX in ‘Guardians of the Galaxy Vol. 2’

22 February 2018 | Conferences, Film, Production, Visual Effects

photo by John Fujii © 2017 ACM SIGGRAPH

Last summer, SIGGRAPH 2017 attendees were treated to a presentation as part of the Production Sessions program titled “The Making of Marvel Studios’ ‘Guardians of the Galaxy Vol. 2’”. What follows is a recap of that session, featuring insights from [right to left above] Victoria Alonso (executive vice president – physical production, Marvel Studios), Damien Carr (VFX producer, Marvel Studios), Christopher Townsend (VFX supervisor, Marvel Studios), Guy Williams (VFX supervisor, Weta Digital), Simone Kraus (VFX supervisor, Trixter), and Nordin Rahhali (VFX supervisor, Method Studios).

In building the visual effects for Marvel Studios’ second installment of “Guardians of the Galaxy,” teams across the production were fortunate to have a deep frame of reference to start from. That said, with multiple non-Marvel vendors working on the project — Framestore, Weta Digital, Method Studios, and Trixter, to name a few — and, by Townsend’s count, VFX appearing in nearly 98% of the film (2,301 of its 2,360 total shots), there were some unique challenges and lessons that came out of producing the massive sequel. Here are some key insights across each production team, broken down by character:

Rocket

For Rocket, it all started with approach meetings, according to Carr, which allowed the team to break down the script line by line and count VFX shots. The character of Rocket in “Vol. 2” appeared in well over 1,000 shots alongside other characters, and Marvel needed the flexibility to adjust these in post-production. Because of this, and the film’s timeline, the Rocket asset had to be re-engineered for cross-vendor sharing.

“With a hairy character, the idea of sharing it among multiple vendors seems ridiculous and crazy. But there was no way that one company — in our time frame — could do it,” noted Carr.

Marvel challenged each team to take what was known and loved about Rocket and make him even better for the sequel. The first step in that process was improving the character’s facial animation in order to make him more emotive. Once that work was complete, Framestore compiled a shareable “Rocket Bible” of everything each team member needed to know in order to re-create Rocket.

With this bible, Weta Digital, according to Williams, followed a “Hair-to-Hair Parity” process so that the team could match the shading and detail of Framestore’s Rocket in Weta’s own build of the character. This was necessary because some things simply cannot be shared successfully between companies, owing to the different software each uses. Animation puppets, for example, are so integrated into a studio’s animation pipeline that they are hard to share.

From Trixter’s perspective, the entire “Vol. 2” team was lucky to have a director like James Gunn, because he knows his characters so well and treats the digital characters just like the live actors. To create a single, cohesive Rocket performance, the team at Trixter relied heavily on Gunn’s vision, drawing on on-set footage of actor Sean Gunn playing Rocket, recorded footage of actor Bradley Cooper reading Rocket’s lines, animal references, and a library of Rocket reactions built for the first film to isolate useful details in complex animated sequences. Nowhere in the film is this more apparent than in a unique distortion/warp scene, which led to the building of CG doubles for two of the human characters so that Trixter could warp the facial features of each while they traveled between dimensions.

In the words of Alonso, “If you don’t share, then you can’t work with us.” It is difficult to complete such a massive production without sharing. A mindset of “we” versus “me” goes a very long way.

Baby Groot

The biggest challenge for a character like Baby Groot, whom Framestore built, was conveying different emotions through the character’s limited vocabulary of just three spoken words: “I am Groot.”

For an entirely CG sequence known as “Don’t Push This Button,” Rahhali noted that his team began its production using previsualization and storyboarding. Each animator then shot his or her own interpretation of the scene, because there was no direct reference for this type of scenario in the first installment of the franchise. In finalizing the shot, the team matched its V-Ray renders against Framestore’s Arnold renders so that any difference in fidelity between the characters would be invisible.

Perhaps one of the most exciting challenges on the film was rendering a dancing Baby Groot in the opening sequence, which was based on actual footage of director Gunn. For more on the development of this scene, see coverage from the Chicago Tribune, Fast Company, and CinemaBlend.

Ego the Living Planet

For Ego, Townsend shared that the team wanted the physical planet to feel very alien. The challenge for the VFX teams, then, was to create an incredibly surreal film environment that would still feel real for the audience.

As Rahhali noted, the creative process has a lot of loops and turns that eventually take you somewhere fantastic, and the process for Ego was no different. Initial sketches from the art department were very illustrative and beautiful, so it was the team’s job to turn them into a tangible world. Original concepts featured arches instead of the eventual spires on the palace grounds and were also very red in color, with little green introduced.

Weta had to be similarly flexible when creating the inside of the planet, ultimately choosing to build it as a single set piece. The biggest challenge in building the environment as a single asset was that it was constructed from fractals. Williams pointed out two things about fractals:

  1. They have effectively infinite resolution, which means that as you view them from new angles, you see new detail. And,
  2. The complex mathematical structures that fractals yield do not lend themselves easily to film projects.
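Williams’s first point — that a fractal keeps revealing new structure at every scale — can be illustrated with a minimal escape-time sketch. This is a generic Mandelbrot sampler written for illustration only; the function and parameter names are assumptions and do not reflect Weta’s actual tools:

```python
def escape_time(c: complex, max_iter: int = 100) -> int:
    """Count iterations before z = z^2 + c escapes |z| > 2 (Mandelbrot test)."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return i
    return max_iter  # never escaped: treated as inside the set


def sample_window(center: complex, width: float, n: int = 8) -> set:
    """Sample an n x n grid around `center`; each distinct escape count
    corresponds to visibly different structure in that window."""
    counts = set()
    for ix in range(n):
        for iy in range(n):
            re = center.real + (ix / (n - 1) - 0.5) * width
            im = center.imag + (iy / (n - 1) - 0.5) * width
            counts.add(escape_time(complex(re, im)))
    return counts


# Zooming toward a boundary point keeps surfacing distinct detail,
# which is why a fractal environment never "runs out" of resolution.
for width in (1.0, 0.1, 0.01):
    detail = sample_window(complex(-0.743, 0.131), width)
    print(f"window width {width}: {len(detail)} distinct escape counts")
```

The same property that makes this attractive for an environment artist (free detail at any camera distance) is what made it awkward for the pipeline: there is no fixed polygon budget to plan around, since every new camera angle exposes geometry that did not previously need to exist.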

In the end, Weta made educated compromises across its in-house teams and used both Maya and photogrammetry software to eventually get the detail into the final scenes (which, Townsend noted, utilized just under half a trillion polygons!). Fun fact: the longest frame took nearly five days to render.

The final character and planet of Ego is a strong example of a point Alonso made: out of 1,000 ideas for a film, around 998 of them will be shot down or eventually changed in order to better suit the final cut.

“Throughout your careers, you’re going to be told ‘no’ a lot,” said Alonso. “But that doesn’t mean you stop creating. Keep throwing darts at the board and maybe one will stick.”

Good luck to the entire Marvel team at the Academy Awards next month!
