Creating the Killer Content in ‘Mortal Kombat 11’

5 March 2020 | Animation, Conferences, Design, Gaming, Visual Effects

Property of WB Games/”Mortal Kombat 11″

How does one go about creating the latest addition to one of the most successful game franchises of all time? NetherRealm Studios' Technical Artist Aren Voorhees, Lead Technical Artist Matt Battaglia, and Senior Software Engineer Jason Nadro rose to the challenge, developing the latest and greatest installment in the “Mortal Kombat” series, “Mortal Kombat 11.”

Expanding on their SIGGRAPH 2019 Talk, the team dives into the technology behind the blood FX for which the game is famous and their use of real-time methods.

SIGGRAPH: Talk a bit about the process of developing high fidelity cached simulations for “Mortal Kombat 11.” What inspired the team to take on this challenge, especially using real-time methods?

Jason Nadro (JN): The process started very organically with Aren and Matt showing me some look-dev work of fluid simulations they were working on in some new simulation software. They expressed interest in achieving similar fluid simulations in our game. I then began with some “napkin math” to see if the baked simulations were even in the right ballpark for games. It seemed feasible, and I started figuring out the best way to make their vision a reality. I was inspired to move forward by the quality of the artists’ initial vision. Realism is one of the core pillars of the “Mortal Kombat” franchise, and the thought of seeing similar, realistic fluid effects in our game motivated me to make it happen.

Aren Voorhees (AV): The initial goal for our “Mortal Kombat 11” blood FX was to achieve a more fluid and evolving look than what we thought could be achieved with traditional particles, though we didn’t initially know how we would do so. Around the same time, we were working on learning some new simulation software and integrating it into our pipeline. We decided to look-dev liquid splashes as a way to test the new fluid simulation tools. I went out on a limb and simulated blood from one of Scorpion’s sword attacks, focusing on the exaggerated, over-the-top style for which “Mortal Kombat” is known. This look-dev test provided a visual target for the type of assets we would need to implement. Over time, we refined the asset creation process, improved the initial pipeline, and ended up with our ultimate geometry cache solution.

Matt Battaglia (MB): My work all started as a big “what if” question, exploring the possibilities of using complicated offline simulations in our games. One of the most interesting challenges during the production of a “Mortal Kombat” game is the wide variety of art and visual effects (VFX) assets we need to generate. While real-time tools are getting better all the time, being able to import a whole new class of geometry from external digital content creation (DCC) tools blew the doors off of what we could do. It started as an initiative to simply get geometry caches into the game, but once Aren did his sword slash look-dev, it quickly became a hotly requested item. It didn’t take long for “mesh blood” to be a staple in almost every cinematic type, so it then became more than just a technical implementation challenge and we had to figure out how to solve the production, scope, and budget issues.

SIGGRAPH: Let’s get technical. How many people were involved? How long did it take? What was the biggest hurdle? Tell us more about the engineering process!

JN: As the senior software engineer, I was the only one involved on the engineering and programming side, while Matt and Aren were leading the charge on the artistic side. Overall, the task took roughly six months, but I took an incremental approach to building up the system and delivering features to the artists. It took roughly the first month to get a first-pass import pipeline into our toolset, a custom data format for homogeneous geometry caches (e.g., flags, cloth), and runtime playback. Next, I implemented heterogeneous geometry caches — the more difficult case — which were used for our fluid effects and became the topic of our SIGGRAPH 2019 Talk. It took an additional two months to get a working proof of concept for this in the game. Finally, there was on-and-off memory optimization, triaging, and bug fixing, totaling around another two months of work. The biggest hurdle on the engineering side was bringing the prohibitively high memory cost of geometry caches down to something workable while also staying within specific budget constraints. Most of my time was spent solving that problem, and it was the biggest portion of my section of our Talk.
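To make the homogeneous-versus-heterogeneous distinction concrete, here is a minimal C++ sketch of what the two kinds of cache data might look like. It is an assumption about the general shape of the data for illustration, not NetherRealm's actual format: a homogeneous cache can share one index buffer across every frame, while a heterogeneous (fluid) cache effectively stores a new mesh per frame, which is where the memory cost comes from.

```cpp
// Illustrative sketch only -- an assumption about the general shape of the data,
// not NetherRealm's actual cache format.
#include <cstdint>
#include <vector>

struct Float3 { float x, y, z; };

// Homogeneous cache (e.g., a flag or cloth): topology never changes, so a single index
// buffer is shared by every frame and only per-frame vertex positions are stored.
struct HomogeneousCache {
    std::vector<std::uint32_t>       indices;         // shared by all frames
    std::vector<std::vector<Float3>> framePositions;  // framePositions[f] has the same size each frame
};

// Heterogeneous cache (fluid effects): every frame is effectively a new mesh, so vertex
// data and indices are stored per frame -- this is what drives the memory cost up.
struct HeterogeneousFrame {
    std::vector<Float3>        positions;
    std::vector<Float3>        normals;
    std::vector<std::uint32_t> indices;
};

struct HeterogeneousCache {
    std::vector<HeterogeneousFrame> frames;  // e.g., 55 frames of a blood splash
};
```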

We decided to use material-driven geometry cache sampling because it placed the power of controlling the animation of the cache in the artists’ hands in a way that was most familiar to them. Unreal Engine already has world-position-offset materials, which allow for procedural animation of a mesh’s vertices. I realized we could extend this feature and let the artist sample the geometry cache buffers in the material and output a new vertex position instead of just a vertex offset. From an engineer’s implementation point of view, it allowed me to reuse an existing system, which saved me a lot of time.
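As an editor's illustration of the sampling idea Nadro describes, the sketch below shows, in CPU-side C++ rather than Unreal's actual material/HLSL code, how a world-position-offset style function could look up the cached position for a vertex at a given playback time and return the offset that moves the vertex onto it. The function name and the assumption of a fixed vertex correspondence between frames are simplifications for illustration only.

```cpp
// Illustrative CPU-side sketch only -- not Unreal's material API or NetherRealm's code.
// Instead of outputting a small offset, the "material" samples the geometry cache at the
// current playback time and returns the offset that moves the input vertex onto the
// cached position (offset = cachedPosition - inputPosition).
#include <algorithm>
#include <cstddef>
#include <vector>

struct Float3 { float x, y, z; };

static Float3 Lerp(const Float3& a, const Float3& b, float t) {
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

// positionsPerFrame[f][v] holds the cached position of vertex v at frame f.
// frameTime is driven by a material parameter (the artist-keyed playback position).
Float3 WorldPositionOffset(const std::vector<std::vector<Float3>>& positionsPerFrame,
                           float frameTime,
                           std::size_t vertexIndex,
                           const Float3& inputPosition) {
    const float maxFrame = static_cast<float>(positionsPerFrame.size() - 1);
    const float t = std::clamp(frameTime, 0.0f, maxFrame);
    const std::size_t f0 = static_cast<std::size_t>(t);
    const std::size_t f1 = std::min(f0 + 1, positionsPerFrame.size() - 1);
    const Float3 cached = Lerp(positionsPerFrame[f0][vertexIndex],
                               positionsPerFrame[f1][vertexIndex],
                               t - static_cast<float>(f0));
    // Returning (cached - input) as the world-position offset places the vertex on the cache.
    return { cached.x - inputPosition.x,
             cached.y - inputPosition.y,
             cached.z - inputPosition.z };
}
```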

SIGGRAPH: What did the process look like from an artistic standpoint? How big was your team? Any major hurdles on this end?

AV: On the artistic side, there were two artists involved in the initial effort to prove out the pipeline — myself and our Lead Technical Artist Matt Battaglia. We also received feedback and direction from our Art Development Director Joe Berger. Having a small group devoted to this effort made it quick and efficient to push the technology forward, quickly solving problems and fixing bugs. From my perspective, the biggest hurdle was learning how to get the fluid, evolving look that we were trying to achieve. I was very new to Houdini at the time and initially had to stumble my way through the process before finding methods that could consistently produce the desired visual results.

The path we naturally ended up moving toward was the idea of developing a library for the assets. This allowed us to meet the production needs of filling in effects for a huge volume of cinematic events for each character on our roster. We spent a lot of time up front developing each Alembic mesh for the library, but once that was done, it was relatively quick to put together a convincing VFX pass for each cinematic event.

MB: Another big challenge was solving the production realities of using these assets in hundreds of cinematics. Due to our game’s package structure and streaming needs, we were capped at 150 MB for our total budget game-wide, which isn’t a ton to work with. In addition to our limited memory budget, we knew up front that we wouldn’t have the production time to make unique simulations for every cinematic. Working within those constraints, we knew we had to build a varied enough library of reusable assets. The rest was just hard work and iteration; you can never predict which assets will work best, you just start building cinematics and re-evaluating as you go.

After we had a few completed cinematics under our belts, we counted the frequency that each asset was referenced and found replacements for the lower-use versions, giving us some more space to add variants of the types of shapes we were using at high frequency. Our ability to control the playback rate of each cache via a material parameter also really saved us. Each simulation was only 55 frames, and the slow-motion effect you see in the final result was baked into the simulation. Keyframing that playback rate with finely tuned curves allowed us to cheat the timing of the simulation to match the timing and choreography of the character animation and camera.
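A small, hypothetical sketch of the retiming trick Battaglia mentions: integrating a keyframed playback rate decides which of the 55 cache frames to sample at a given shot time, so a short simulation can be slowed down or sped up to match the choreography. The function and keyframe values below are invented for illustration, not taken from the game.

```cpp
// Hypothetical retiming sketch -- keyframe values are invented, not from the game.
// A keyframed playback rate (cache frames per second of shot time) is integrated
// piecewise to find which cache frame to sample, letting a 55-frame simulation be
// stretched or compressed to match the character animation and camera.
#include <cstddef>
#include <vector>

struct RateKey { float time; float rate; };  // shot time (s) -> playback rate (cache frames/s)

float CacheFrameAtTime(const std::vector<RateKey>& keys, float time, float cacheLength = 55.0f) {
    float frame = 0.0f;
    // Accumulate over each keyed segment that starts before 'time'.
    for (std::size_t i = 0; i + 1 < keys.size(); ++i) {
        if (time <= keys[i].time) break;
        const float segmentEnd = (keys[i + 1].time < time) ? keys[i + 1].time : time;
        frame += keys[i].rate * (segmentEnd - keys[i].time);
    }
    // Hold the final key's rate past the last keyframe.
    if (!keys.empty() && time > keys.back().time)
        frame += keys.back().rate * (time - keys.back().time);
    if (frame > cacheLength - 1.0f) frame = cacheLength - 1.0f;
    return frame;
}

// Example: play fast for 0.3 s, hold in near slow motion through the impact, then speed up.
// std::vector<RateKey> keys = { {0.0f, 60.0f}, {0.3f, 5.0f}, {0.8f, 60.0f} };
// float frame = CacheFrameAtTime(keys, shotTime);
```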

SIGGRAPH: What do you find most exciting about the final product you created and the results you presented in your SIGGRAPH 2019 Talk?

JN: As a graphics engineer, it is incredibly rewarding to see our talented artists take a system you built and flood it with creativity. It was also inspiring to receive all of the positive feedback and be able to share more ideas with the graphics community after the Talk. [The] SIGGRAPH [conference] provides an excellent opportunity to bridge gaps between artists and programmers and to facilitate the sharing of ideas among the graphics community, specifically those facing similar problems.

MB: Organizing my thoughts and getting them together to present to an audience was a great way to add clarity to the production process. While you are in the thick of things — iterating, failing, trying again, working against deadlines, and changing direction — you tend to become very myopic in thinking about your work. Stepping back and putting together our Talk was a post-mortem of sorts, which taught me some valuable lessons that will hopefully aid the next R&D (research & development) effort we start. Also, of course, seeing our initial “what if” question turn into really cool cinematic moments that our fans love makes all the hard work worth it.

SIGGRAPH: What’s next for NetherRealm’s use of real-time methods in gaming?

MB: We can’t share too much just yet, but we’ve learned a lot through working with the system and have continued to push what we can do even more. Since “Mortal Kombat 11” initially launched, we’ve released a number of downloadable content (DLC) characters that we’ve had a lot of fun making and adding to the roster. We also mentioned some future R&D ideas toward the end of our Talk that we hope to pursue in the near future!

SIGGRAPH: Share your all-time favorite SIGGRAPH memory. 

JN: My all-time favorite memory would be the SIGGRAPH 2015 Talk “Building Interstellar’s Black Hole: The Gravitational Renderer.” This was an exceptional Talk by Kip S. Thorne, a theoretical physicist, who discussed the science behind black holes and how they rendered them for the movie “Interstellar.”

Second place goes to SIGGRAPH 2012, where I spent the whole conference on crutches. I presented at SIGGRAPH Dailies!, and I almost face-planted hobbling up the steps to the podium in front of the whole conference room!

MB: I really enjoy Technical Papers Fast Forward every year. It’s such a great way to compress an overview of everybody’s work in an entertaining format. One particular stand-out memory I have is from SIGGRAPH 2017: I was walking out of a Production Session and was greeted by a live giraffe hanging out in the lobby.

SIGGRAPH: Real-time and ray tracing methods are becoming increasingly popular in computer graphics. How do you see one or both of these approaches influencing the future of game development in the industry?

JN: The recent developments in real-time ray and path tracing are certainly exciting, as was evident in the SIGGRAPH 2019 Talks we heard last year. I believe this will continue to be the case and is something to follow in the coming years.

MB: I’m personally excited to see what the industry does with hardware ray tracing outside of rendering. We typically think of the GPU for graphics only, but there could be some exciting uses for general compute tasks.

Want to share your games work at the upcoming SIGGRAPH 2020 conference? Submissions are still open for the Computer Animation Festival Electronic Theater, Real-Time Live!, and Posters!


Matt Battaglia is the lead technical artist at NetherRealm Studios. With over 14 years of professional experience, Matt has a background in AAA game development, feature film, and teaching. He enjoys developing asset pipelines and pushing the boundaries of what is possible in real-time rendering and visual effects. While not chipping away at his ever-growing list of new software or research projects, he enjoys tackling home improvement projects and attempting to entertain his wife and daughter with bad dad jokes.

Jason Nadro is a senior software engineer on the KoreTech team at NetherRealm Studios. For the past 11 years, he has worked on graphics for the “Mortal Kombat” and “Injustice” fighting game franchises. As part of the SIGGRAPH 2012 Computer Animation Festival Dailies, he shared — along with Jon Greenberg — the character morphing technology for “Mortal Kombat.” He received his B.S. and M.S. in computer science from Northern Illinois University.

Aren Voorhees got his start in the games industry in 2008, working primarily with character art. He gradually found himself drawn to technical art and began to focus on character rigging, material development, and writing scripts and tools. His latest pursuit has been learning to master high-end, offline simulations with particles, fluids, and volumetric effects. He is currently working on refining those skills and finding ways to apply them to real-time games pipelines.
