Standardizing Color on LED Walls: A Virtual Production Solution

Images from Fireworks (2021) © 2021 Wilder Films

So you’ve prepared an LED wall for your next shoot. How do you ensure the colors approved in pre-production aren’t washed out once they’ve been filmed? The team at DNEG solved this conundrum and presented their findings as part of the SIGGRAPH 2021 Talks program. Here, presenter Oliver James gives further insight on the Talk “Colour-managed LED Walls for Virtual Production,” sharing the inspiration behind the project and how the solution transforms filmmaking.

SIGGRAPH: Share some background about “Colour-managed LED Walls for Virtual Production.” What inspired this solution? 

Oliver James (OJ): We set out to solve the problem of preparing images for display on LED walls in virtual production so that, when filmed, the resulting footage retains the creative intent of the original images.

We had tackled the problem of color-managing LED walls previously — for example, on “First Man” — but knew further work was needed. In spring 2020, we had the opportunity to access an LED wall and movie camera at our offices in London, which gave us more direct access to the equipment and allowed us to perform a more thorough investigation. Previous conversations about color with LED wall suppliers showed us that their focus was typically on the appearance of the wall to the human eye, treating the wall as if it were the final stage in the imaging pipeline. Our color-science team realized this was the wrong approach for virtual production, so we jumped at the chance.

SIGGRAPH: Tell us about the process of creating “Colour-managed LED Walls for Virtual Production.” What was the biggest challenge you faced? How did you overcome it?

OJ: London was in the middle of its first COVID-19 lockdown when we started this new phase of work, but in some ways those constraints helped focus our development. Initially we had no access to the office, but we realized the problem of filming an LED wall was similar to photographing a computer monitor. The initial analysis and code development were therefore done at home. I photographed hundreds of color patches on my computer’s monitor. This meant that when we did eventually get access to an LED wall and movie camera, our methods were already in a good state.
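The patch-photography workflow described above can be sketched in code. This is a minimal, hypothetical illustration — not DNEG’s actual method — assuming the display-to-camera relationship can be approximated by a single 3×3 matrix fitted by least squares over many measured patches (real camera/display chains also involve nonlinear encoding and spectral effects):

```python
import numpy as np

# Hypothetical data: RGB values of patches sent to the display, and the
# corresponding scene-linear RGB values recorded by the camera. In practice
# these would come from hundreds of photographed color patches.
rng = np.random.default_rng(0)
displayed = rng.uniform(0.0, 1.0, size=(300, 3))

# A made-up "ground truth" display-to-camera matrix, used here only to
# synthesize measurements (plus a little sensor noise).
true_M = np.array([[0.90, 0.10, 0.02],
                   [0.05, 0.85, 0.10],
                   [0.02, 0.08, 0.95]])
captured = displayed @ true_M.T + rng.normal(0.0, 0.002, size=(300, 3))

# Least-squares fit of a 3x3 matrix so that captured ≈ displayed @ M.T.
M, *_ = np.linalg.lstsq(displayed, captured, rcond=None)
M = M.T

print(np.round(M, 2))
```

With enough patches spread across the gamut, the fitted matrix recovers the display-to-camera transform to within the measurement noise, which is what makes a home monitor a workable stand-in for the LED wall at this stage.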

Another constraint-turned-opportunity was that when we did access the LED wall, LED processor, and movie camera, we were left alone with the equipment, as social distancing limited the number of people allowed in the lab. Typically, we would only have indirect access, with camera operators and technicians assisting us. I’m much happier being more hands-on, and operating the equipment ourselves led us to experiment more and become very familiar with all aspects of its operation.

SIGGRAPH: How does this solution transform filmmaking? What problems does it solve?

OJ: From conversations with vendors, we realized that many people were taking an ad-hoc approach to color management when filming LED walls — they might send display-referred images to the wall and then tweak the color settings until they liked what they saw through the camera. This approach can give good results, but it can also take up a lot of time on set, and it is a subjective process. By setting a clear goal — that the images recorded on camera should match the original material — we were able to standardize the process of setting up the LED color, removing the uncertainty of the subjective approach. This allows us to complete the technical setup of the LED stage before a day’s shoot, freeing up valuable time with the director to focus on creative aspects.
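The stated goal — that the camera recording should match the original material — suggests pre-correcting the images sent to the wall with the inverse of the measured wall-to-camera transform. The sketch below is a hypothetical illustration under the same simplifying 3×3-matrix assumption as before (the matrix values are made up; a production pipeline would use full color management, not a single matrix):

```python
import numpy as np

# Assumed: a 3x3 matrix, measured from test patches, modeling how the camera
# records colors shown on the wall (captured ≈ M @ displayed, per channel).
M = np.array([[0.90, 0.10, 0.02],
              [0.05, 0.85, 0.10],
              [0.02, 0.08, 0.95]])

M_inv = np.linalg.inv(M)

def precorrect(image_rgb):
    """Apply the inverse wall-to-camera transform to an (H, W, 3) image,
    so that filming the wall reproduces something close to the original."""
    flat = image_rgb.reshape(-1, 3) @ M_inv.T
    return np.clip(flat, 0.0, 1.0).reshape(image_rgb.shape)

original = np.array([[[0.5, 0.25, 0.75]]])       # one original pixel
on_wall = precorrect(original)                   # what we send to the wall
recorded = on_wall.reshape(-1, 3) @ M.T          # what the camera captures
print(np.round(recorded.reshape(original.shape), 3))  # ≈ original
```

Note the clip: pre-corrected values can fall outside the wall’s gamut, in which case an exact match is impossible for those colors — one reason gamut handling is part of the real problem.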

SIGGRAPH: You presented an on-demand Talk and participated in a Q&A about real-time technology at SIGGRAPH 2021. What was it like to present virtually? What was your favorite part of this year’s conference?

OJ: Given the circumstances, a virtual conference was the only option, but I much prefer a live presentation to a pre-recorded one. For me, SIGGRAPH is as much about meeting people as it is listening to the presentations, and that aspect looked different in a virtual world. On the other hand, having extended access to all the recorded Talks is wonderful, and it would be great if that became standard in future conferences.

For a SIGGRAPH highlight, I’m going to pick out a paper that has very little to do with my day-to-day work but resonated with a younger, tennis-obsessed version of me — “Vid2Player: Controllable Video Sprites That Behave and Appear Like Professional Tennis Players.” The authors’ love of the game shines through the work.

SIGGRAPH: What advice do you have for someone looking to share their own innovations at a future SIGGRAPH conference?

OJ: Do it! I can’t imagine anyone regretting submitting their work to SIGGRAPH, but I know many people who’ve regretted not submitting. Preparing a submission forces you to take a step back from your work and re-evaluate what you’ve done, which is beneficial even if it doesn’t make it to the conference.


ICYMI: You can still register for SIGGRAPH 2021! Register now to catch this session and more on-demand.

Oliver James is chief scientist at DNEG. For the last 25 years, he has been involved in creating technology to realize some of the most demanding visual effects in film, from pioneering work on blue-screen compositing systems to relativistic ray tracing of black holes.
