Reinventing Recoloring With ‘Photo-Chromeleon’

14 October 2020 | Conferences, Design, Emerging Technologies

The representative image is originally from our paper which was published in UIST ’19: Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. Full copyright information can be found at the end of this blog entry.

SIGGRAPH 2020 Emerging Technologies selection “Photo-Chromeleon: Re-Programmable Multi-Color Textures Using Photochromic Dyes” showcases a new technology that allows the user to recolor single-material objects multiple times via re-programmable, multi-color textures. We caught up with the team from the MIT Computer Science and Artificial Intelligence Laboratory (MIT CSAIL) to explore their process for bringing the innovative idea to life.

SIGGRAPH: Tell us about the process of developing “Photo-Chromeleon: Re-Programmable Multi-Color Textures Using Photochromic Dyes.” What inspired your team to pursue the project?

Team: Our world produces a great deal of waste as people buy and quickly discard objects and materials such as clothing, shoes, and phone cases. At the same time, there is high demand for personalized items that reflect users’ individuality. We developed Photo-Chromeleon to satisfy both needs: it enables users to recolor their personal items, whether that be a favorite shirt or a phone case.

With this technology, instead of buying five phone cases with different designs, you just have to buy one and you can recolor it to your personal preferences. On a larger scale, this new process saves a lot of waste and enables people to express their individuality more frequently, at lower costs.

SIGGRAPH: Let’s get technical. How did you develop the technology?

Team: In designing Photo-Chromeleon, we took advantage of recent developments in material science that enable photo-chromic dyes to switch between a transparent and a colored state. These dyes, however, can only display one individual color. Our goal was to create a new ink formulation that enables multi-color textures at high resolutions.

We also took inspiration from traditional printing technologies. Ink-jet printers use a CMY color model to create a wide spectrum of colors from just three base colors. Thus, we investigated photo-chromic dyes that have similar colors to the CMY color space and ultimately mixed cyan, magenta, and yellow photo-chromic dyes together into one ink. This allowed us to generate multi-color textures.
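The subtractive CMY model borrowed from ink-jet printing can be sketched in a few lines. This is the standard textbook conversion, not code from the paper: each CMY channel is the complement of the corresponding RGB channel, with values normalized to [0, 1].

```python
# Standard subtractive CMY model (an illustrative sketch, not the
# paper's calibrated color model): each CMY channel is the complement
# of the corresponding RGB channel, normalized to [0, 1].
def rgb_to_cmy(r, g, b):
    """Convert an RGB color to its subtractive CMY complement."""
    return (1.0 - r, 1.0 - g, 1.0 - b)

def cmy_to_rgb(c, m, y):
    """Invert the conversion: CMY back to RGB."""
    return (1.0 - c, 1.0 - m, 1.0 - y)
```

For example, a light orange (RGB roughly (1, 0.5, 0)) maps to 0% cyan, 50% magenta, and 100% yellow, which is why three photochromic dyes in those base colors can span a wide gamut.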

We also needed to control the saturation of these materials. For example, to get a light orange, one sets cyan to 0%, magenta to 50%, and yellow to 100% saturation. We carefully analyzed the light-absorption spectra of the dyes and found that each dye’s saturation changes at specific wavelengths. We also found that an off-the-shelf RGB projector can effectively control the saturation of each dye and thereby generate a wide range of colors from the CMY color space.
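The idea of dialing each dye down to a target saturation can be illustrated with a simple first-order bleaching model. This is an assumption for illustration only, not the paper’s calibrated model: each dye’s saturation is assumed to decay exponentially under light of its control wavelength, and we solve for the exposure time that reaches the target.

```python
import math

# Hypothetical first-order bleaching model (an illustrative assumption,
# not the paper's measured behavior): under light of a dye's control
# wavelength, saturation decays as s(t) = s0 * exp(-k * t).
def exposure_time(s0, target, k):
    """Exposure time to bleach a dye from saturation s0 down to target at rate k."""
    if not 0.0 < target <= s0:
        raise ValueError("target must be in (0, s0]")
    return math.log(s0 / target) / k

# The light-orange example: magenta goes to 50% saturation; cyan to ~0%
# (approached asymptotically, so we stop at a small threshold); yellow stays.
t_magenta = exposure_time(1.0, 0.5, k=1.0)   # ~0.69 time units
t_cyan = exposure_time(1.0, 0.01, k=1.0)     # ~4.6 time units (near-zero saturation)
```

Because each dye responds at its own wavelengths, the projector can (in this simplified picture) address the three channels independently with per-pixel exposure times.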

SIGGRAPH: How many people were involved? How long did it take?

Team: We developed this technology at MIT with collaborators from the HCI Engineering Group, led by Professor Stefanie Mueller. The three lead authors — Yuhua Jin, Isabel Qamar, and Michael Wessely — are postdoctoral associates from different scientific fields, namely optics, material science, and computer science.

We developed the technology in just six months. This is a relatively short amount of time for scientific research and was made possible through the close collaboration of all involved.

SIGGRAPH: What was the biggest challenge?

Team: The biggest challenge most certainly lies in the diversity of the project. The research involved expert knowledge from optics, material science, engineering, and computer science to develop a system that can reprogram the appearance of objects with high-resolution, multi-color textures. The process was a rewarding experience for all parties.

SIGGRAPH: What do you find most exciting about the final product you presented to the virtual SIGGRAPH 2020 community?

Team: Changing a surface’s appearance or recoloring a physical object today involves either displays and projectors or laboriously repainting the object. Our system enables changing the visual appearance of an object without any external light sources or laborious paint jobs. The material itself has the ability to change color and be reprogrammed to the user’s needs.

This way of controlling the matter from which the world is built is a new and exciting step toward the fabrication of responsive and interactive objects and, ultimately, the internet of things.

SIGGRAPH: What’s next for “Photo-Chromeleon: Re-Programmable Multi-Color Textures Using Photochromic Dyes”? How do you envision the technology informing future fabrication techniques?

Team: Since this is the first work of its kind in the field, there are still many aspects to improve upon. We aim to speed up the recoloring process, possibly making it instant. We also want to develop more advanced ink formulations that enable a wider color space. Finally, our goal is to increase the stability of the ink so it can keep its texture for months or even permanently.

SIGGRAPH: What did you enjoy most about attending SIGGRAPH?

Team: We really enjoyed the interaction between the researchers presenting their work and an audience bringing perspectives from a wide range of fields. Conversations ranging from deep technical details to broader art and industry topics, combined with people sharing inspiring views on the world, make anything seem possible.

SIGGRAPH: What advice do you have for someone looking to submit to Emerging Technologies at a future SIGGRAPH conference?

Team: The most important aspect of a submission is that it is inspiring, novel, and enabling. People ultimately come to SIGGRAPH to hear the most recent advances in computer graphics, fabrication, and related fields. The Emerging Technologies program is on the forefront of these new ideas and possibilities, and your contribution should be as well.

Over 250 hours of SIGGRAPH 2020 content is available on-demand through 27 October. Not yet registered? Registration remains open until 19 October — register now.

Meet the Team

Yuhua Jin is a postdoctoral associate at MIT CSAIL. He conducts research at the intersection of HCI and optical engineering, and his current work focuses on developing novel optical methods for personal fabrication tools. His recent project, Photo-Chromeleon, received the Best Paper Award from ACM UIST 2019.

Isabel Qamar is a postdoctoral associate at MIT CSAIL and conducts research at the intersection of HCI and material science to develop re-programmable and interactive materials. She has received Best Paper and honorable mention awards for her work from ACM CHI and ACM UIST, and has hosted inter-disciplinary workshops aimed at bridging these fields.

Michael Wessely is a postdoctoral associate at MIT CSAIL whose research focuses on developing interaction-aware materials that can computationally control their material properties, such as shape and color, and scale from small prototypes to interactive architecture. He has published several papers at ACM CHI and ACM UIST, including two Best Paper Awards and one Best Paper nomination.

Stefanie Mueller is an assistant professor at MIT CSAIL who conducts research on how personal fabrication and advances in material science can be used to create personal physical objects that adapt themselves over time to better accommodate a user’s preferences and needs. Stefanie has also given a range of live demos and organized workshops, tutorials, and courses over the last few years for ACM CHI and ACM UIST.

Image Copyright Details

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org. UIST ’19, October 20–23, 2019, New Orleans, LA, USA. © 2019 Association for Computing Machinery. ACM ISBN 978-1-4503-6816-2/19/10…$15.00
