Twinkle, Twinkle, Little ‘Star-Stuff’

by | 11 November 2022 | Conferences, Virtual Reality

Top left and top right: Immersants splashing each other. Top center: Two immersants stand below a spiral galaxy created through their interaction; the form of their constellation bodies can be seen clearly here. Bottom left: Immersant standing calmly among the stars emitted from their body. Bottom center: The appearance of the hands as seen in VR. Bottom right: Top-down view showing stars rippling away as immersants dance and play with each other. CC BY-SA 4.0
Created by John Desnoyers-Stewart. Music by Dale Nichols.

There’s no need to look up at the sky: “Star-Stuff: A Way for the Universe to Know Itself” lets you transform into a galaxy of stars right before your eyes. In hopes of discovering our own constellation, SIGGRAPH caught up with creator John Desnoyers-Stewart to learn about the inspiration behind this immersive project and the challenges of working on this virtual reality experience. Plus, hear about the upcoming projects John is planning for the near future.

SIGGRAPH: Tell us about the process of developing “Star-Stuff: A Way for the Universe to Know Itself.” What inspired you to pursue the project?

John Desnoyers-Stewart (JDS): I am pursuing a PhD in interactive arts and technology at Simon Fraser University where I am investigating the creation of immersive experiences that encourage social connection through abstract representations. With the birth of my daughter, I took parental leave from my PhD to give myself the time to focus on family. Watching her discover the world around her rekindled the childlike wonder in me and brought fresh eyes to my work as well.

With “Star-Stuff,” I was inspired to create an experience that brought together two people to give them an opportunity to see the world and each other with fresh eyes. As a child, I was fascinated by space. In particular, I was inspired by Carl Sagan’s documentary series “Cosmos.” He spoke about how looking outward to understand the universe was a way to look inward and discover ourselves. “Star-Stuff” is an homage to Sagan and his idea of simultaneously looking inward by looking out at the stars. Transforming people into galaxies allows them to experience his metaphor literally, to see themselves as part of something much greater than we can comprehend, and to recognize our fundamental shared identity as a way for the universe to know itself.

SIGGRAPH: How did the idea of using hybrid VR artwork to turn bodies into constellations come about? What was your favorite aspect of creating this project?

JDS: The idea came from a moment of inspiration while working on the aesthetics for an abstract bodily representation for another project. I was experimenting with drawing triangles between random points on a human mesh, and it reminded me of constellations being drawn in the sky. It sparked this question of “What would it be like to be a galaxy? What would it be like to meet a galaxy?” Normally this experiment would have been left behind as it didn’t fit the aims of the project it originated from, but I decided to follow that intuition and see where it led me.
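The experiment described here, connecting random points on a human mesh into triangles to evoke a constellation, can be sketched roughly as follows. This is a hypothetical illustration only (the installation itself runs in VR, presumably in a game engine); the function name and parameters are my own:

```python
import random

def constellation_triangles(vertices, n_triangles, seed=0):
    """Pick random triples of mesh vertices to connect as constellation triangles.

    vertices    -- list of (x, y, z) points sampled from a body mesh
    n_triangles -- how many constellation triangles to draw
    """
    rng = random.Random(seed)
    triangles = []
    for _ in range(n_triangles):
        # Sample three distinct vertex indices so each triangle is degenerate-free
        a, b, c = rng.sample(range(len(vertices)), 3)
        triangles.append((vertices[a], vertices[b], vertices[c]))
    return triangles
```

A renderer would then draw the edges of each returned triangle as glowing lines over the tracked body, producing the constellation look described above.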

My favorite part of this project was being able to follow my intuition and listen to my inner child. Often projects, even creative ones, can be quite rigidly defined from the outset. With “Star-Stuff” there was just one objective: to have two people meet as galaxies. The rest unfolded by iteratively experimenting with different ideas.

SIGGRAPH: What specific challenges came about during the making of this VR experience?

JDS: While the design process was open-ended and intuitive, “Star-Stuff” is the product of technological constraints. Often when creating immersive experiences, I find we are fighting the technology, pushing against it to do something it can’t quite do yet. With “Star-Stuff,” I instead focused on leaning into those constraints. While technology for extrapolating full-body tracking from just the hands is progressing rapidly, it was quite limited when I was developing “Star-Stuff.” The whole concept of the abstract body design comes in part from hiding the imperfections of the inverse kinematics that estimates the body position from just the head and hands. Running on a mobile headset like the Meta Quest also meant finding compromises to run a complex star simulation while still maintaining 72 frames per second. As a result, I simulated gravity using only a single point for each body, which became a fundamental interaction mechanic: you can interact with the stars and change their orbit by moving your whole body. Beyond that, the question of how to interact with the stars was a challenge.
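The single-point-per-body compromise can be sketched as a minimal N-body step in which every star feels gravity from only one attractor per participant. This is an assumption-laden CPU sketch in Python (the actual Quest implementation is likely GPU-based, and all names and constants here are illustrative):

```python
def step_stars(positions, velocities, attractors, dt=1/72, g=1.0, eps=0.1):
    """Advance the star simulation one frame (72 fps target on Quest).

    Each tracked body contributes a single gravity point in `attractors`,
    so the per-star cost is O(bodies) rather than O(stars).
    """
    new_pos, new_vel = [], []
    for (x, y, z), (vx, vy, vz) in zip(positions, velocities):
        ax = ay = az = 0.0
        for (cx, cy, cz) in attractors:
            dx, dy, dz = cx - x, cy - y, cz - z
            r2 = dx * dx + dy * dy + dz * dz + eps  # softening avoids a singularity
            inv_r3 = g / (r2 ** 1.5)
            ax += dx * inv_r3
            ay += dy * inv_r3
            az += dz * inv_r3
        # Semi-implicit Euler: update velocity first, then position
        vx, vy, vz = vx + ax * dt, vy + ay * dt, vz + az * dt
        new_pos.append((x + vx * dt, y + vy * dt, z + vz * dt))
        new_vel.append((vx, vy, vz))
    return new_pos, new_vel
```

Because the attractor follows the body, moving your whole body drags the gravity point with you, which is what lets immersants reshape the stars’ orbits by dancing and playing, as described above.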

SIGGRAPH: What do you find most exciting about the final product you presented at SIGGRAPH 2022 in Vancouver?

JDS: Presenting at SIGGRAPH 2022 was a wonderful opportunity to show “Star-Stuff” and the design concepts embedded within it to a large and broad audience. It was exciting to see both academic audiences and industry professionals enjoy the experience. My hope is that showing it to people from Google, Meta, Snapchat, Disney, and more can allow my work to inspire others and shape the future direction of immersive technology and experience design.

SIGGRAPH: Do you have any upcoming projects in the works?

JDS: A projection-based version of “Star-Stuff” will be launching soon at the H.R. MacMillan Space Centre in Vancouver, where I hope it will inspire many to think creatively about our world. It is also available free on Meta’s App Lab. Beyond “Star-Stuff,” I am working on several projects that push the boundaries of what is possible with immersive technology. “Eve 3.0,” which I am working on with an international team of artists, is a performance that immerses six participants through touch and encourages them to dance. “Synedelica” is a synesthetic passthrough mixed reality experience that modulates your real surroundings based on the sounds around you. We hope to take this experience to music festivals in the near future and propose new ways that immersive experiences can encourage social connection in person.

SIGGRAPH: Do you have any advice for those looking to submit to Immersive Pavilion in the future?

JDS: Come to the Immersive Pavilion prepared for useful feedback from people with a wide variety of backgrounds in creative industries and academic research. Many of the people who tried “Star-Stuff” were open to providing constructive feedback and discussing improvements. In one instance, I received some feedback, integrated changes that reflected that feedback, and showed the updated version the next day. It’s a great opportunity to share your work and learn from colleagues.

Submissions for the SIGGRAPH 2023 Immersive Pavilion and more will be open before you know it! Continue to check our website for details regarding submission requirements and deadlines.

John Desnoyers-Stewart is an interdisciplinary artist-researcher pursuing his PhD in interactive arts and technology at Simon Fraser University in Vancouver, British Columbia, Canada. He is a professional engineer with a master of fine arts in interdisciplinary studies who seeks opportunities to bring knowledge from diverse disciplines together to promote creativity and innovation. His current focus is developing immersive installations to encourage new perspectives on immersive technology and to better understand its true potential and study its effects. He has exhibited his multi-user mixed reality installations, “Star-Stuff,” “Eve 3.0,” “Body RemiXer,” and more at art galleries and festivals around the world. Through these installations he hopes to encourage social connection and collaborative creativity, exploring positive social applications of virtual reality and better understanding the experience of embodying abstract bodies.
