Immersive Experiences for All Environments

11 September 2024 | Conferences

Image credit: MOFA from Reality Design Lab. Botao Amber Hu.

Hear from Botao Amber Hu, winner of the SIGGRAPH 2024 Immersive Pavilion Best in Show Award for the project “MOFA: Multiplayer Omnipresent Fighting Arena”. Get insight into the inspiration for the project, and learn how leveraging local Bluetooth and Wi-Fi within proximity range made MOFA possible.

SIGGRAPH: Tell us about the process of developing “MOFA: Multiplayer Omnipresent Fighting Arena”. What was your inspiration for creating this immersive experience?

Botao Amber Hu (BAH): The development of MOFA was truly a passion project born from my experiences as an AR game enthusiast. As a longtime Pokémon GO player, I always felt something was missing from existing mobile location-based games: the thrill of real-world, inter-bodily battles. We were also motivated by the increasing prevalence of mixed reality HMDs, which opened up new possibilities for embodied interaction. While many researchers were focusing on remote co-presence, we saw an opportunity to bring a more physical, somatic approach to collocated mixed reality experiences. We asked ourselves, “Why can’t we play mixed reality games as spontaneously as tossing a frisbee?” This led to the concept of “inter-bodily mixed reality street play” that could be enjoyed impromptu, much like street performances.

The fantastical fiction our generation grew up with, from Harry Potter to Marvel superheroes, also inspired us. We’ve all dreamed of engaging in wizard duels or superhero battles, and MOFA is our attempt to bring those dreams to life through HoloKit, my open-source mixed reality headset. The key feature of MOFA is its omnipresence: the magic field can expand anywhere, anytime, transforming ordinary streets into magical arenas. We wanted to create an experience where players could engage in wizard-like duels or superhero-style fights in their own neighborhoods, one that truly expands the fictional world into reality and lets players live out their fantasies of magical battles in real-world settings.

By the way, we chose the acronym “MOFA” because it means “magic” in Chinese, which felt like a perfect fit for our vision of bringing magical, immersive experiences to everyday environments.

SIGGRAPH: What was one specific challenge you faced during its creation, and how did you overcome it?

BAH: One of the most significant challenges we faced in developing MOFA was creating a seamless, low-latency, stable multiplayer connection that could support spontaneous collocated mixed reality sessions anywhere, anytime. This was crucial for our vision of inter-bodily street play, which requires ultra-low network latency so that augmented objects keep pace with players’ rapid body movements in real time. We quickly realized that traditional cellular networks weren’t up to the task: 4G had too much latency, and while 5G showed promise in theory, its limited availability and inconsistent signal strength in many areas made it impractical for our purposes. Local Wi-Fi connections, while suitable for indoor entertainment, weren’t feasible for spontaneous outdoor play because they require additional router setup.

Our breakthrough came when we decided to leverage Apple’s MultipeerConnectivity technology, which is the backbone of AirDrop. It uses local Bluetooth and Wi-Fi within proximity range, which was perfect for our needs. We created a custom networking transport protocol that wraps around Apple’s native frameworks, allowing game developers to implement multiplayer functionality seamlessly in game engines.

However, this was only part of the solution. We still needed to synchronize the coordinate systems of all AR devices simultaneously, with the ability to self-recover. To achieve this, we developed a customized coordinate registration system using QR codes that provides centimeter-level spatial synchronization and microsecond-precise temporal synchronization. We’re proud to say that this technology earned an Honorable Mention Demo award at ISMAR 2023, and in the spirit of advancing the field, we’ve open-sourced all of these technologies as a Unity package for the community to use and build upon. Overcoming this challenge not only made MOFA possible but also contributed significantly to the broader field of collocated mixed reality development. We’ve since applied the technology to other projects, including our dance improvisation piece “Cell Space,” which was accepted as a SIGGRAPH Art Paper this year.
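For readers curious how a proximity-based transport like this can be wired up, here is a minimal Swift sketch using Apple’s MultipeerConnectivity framework for peer discovery and low-latency, unreliable message delivery. It is an illustrative sketch, not the actual MOFA/HoloKit transport: the `ProximityTransport` class, the “mofa-arena” service name, and the accept-everyone invitation policy are assumptions made here for brevity.

```swift
import MultipeerConnectivity
import UIKit

/// Minimal proximity transport sketch: every device both advertises and browses
/// for the same Bonjour service type, so nearby players can form a session with
/// no router or server. Names and policies here are illustrative only.
final class ProximityTransport: NSObject {
    private let peerID = MCPeerID(displayName: UIDevice.current.name)
    private lazy var session = MCSession(peer: peerID,
                                         securityIdentity: nil,
                                         encryptionPreference: .required)
    private lazy var advertiser = MCNearbyServiceAdvertiser(peer: peerID,
                                                            discoveryInfo: nil,
                                                            serviceType: "mofa-arena")
    private lazy var browser = MCNearbyServiceBrowser(peer: peerID,
                                                      serviceType: "mofa-arena")

    /// Called whenever a game message arrives from another player.
    var onReceive: ((Data, MCPeerID) -> Void)?

    func start() {
        session.delegate = self
        advertiser.delegate = self
        browser.delegate = self
        advertiser.startAdvertisingPeer()   // be discoverable to nearby peers
        browser.startBrowsingForPeers()     // and discover them in return
    }

    /// Broadcasts a small, frequently refreshed payload (e.g. a head or wand pose).
    /// `.unreliable` avoids head-of-line blocking; for fast-moving state that is
    /// re-sent every frame, freshness matters more than guaranteed delivery.
    func broadcast(_ data: Data) {
        guard !session.connectedPeers.isEmpty else { return }
        try? session.send(data, toPeers: session.connectedPeers, with: .unreliable)
    }
}

extension ProximityTransport: MCNearbyServiceAdvertiserDelegate, MCNearbyServiceBrowserDelegate {
    // Accept every invitation. A real game would filter (e.g. by a session code)
    // and break ties so two peers do not invite each other simultaneously.
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser,
                    didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?,
                    invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)
    }

    // Invite every peer we find; the timeout keeps stale invitations from lingering.
    func browser(_ browser: MCNearbyServiceBrowser,
                 foundPeer peerID: MCPeerID,
                 withDiscoveryInfo info: [String: String]?) {
        browser.invitePeer(peerID, to: session, withContext: nil, timeout: 10)
    }

    func browser(_ browser: MCNearbyServiceBrowser, lostPeer peerID: MCPeerID) {}
}

extension ProximityTransport: MCSessionDelegate {
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}

    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        onReceive?(data, peerID)
    }

    // Stream and resource delivery are unused here but required by the protocol.
    func session(_ session: MCSession, didReceive stream: InputStream,
                 withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String,
                 fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String,
                 fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}
```

On recent iOS versions the app would also need the local-network permission and a matching Bonjour service entry in its Info.plist before peers can discover each other. The QR-code coordinate registration described above is a separate layer on top of the transport; the basic idea can be illustrated with ARKit’s public API, assuming the app has already estimated the shared code’s pose as a 4×4 transform in its own tracking frame (again a sketch, not the HoloKit implementation):

```swift
import ARKit
import simd

/// Re-anchor this device's ARKit world origin at the shared QR code.
/// `qrPoseInWorld` is the code's pose in the device's current world frame,
/// produced by whatever marker-detection step the app uses (assumed here).
func alignWorldOrigin(of arSession: ARSession, to qrPoseInWorld: simd_float4x4) {
    // After this call, the origin on every device coincides with the physical
    // code, so positions exchanged over the network can be used directly.
    arSession.setWorldOrigin(relativeTransform: qrPoseInWorld)
}
```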

SIGGRAPH: What was your reason for choosing “MOFA” game prototypes such as “The Ghost,” “The Dragon,” and “The Duel”?

BAH: Our primary goal is to explore the under-researched area of “inter-bodily mixed reality street play” by bringing fantastical fiction into reality. Rather than inventing entirely new narratives, we decided to create experiences based on magical or supernatural concepts people are already familiar with, drawing inspiration from popular fictional works like “Ghostbusters”, “Game of Thrones”, and “Harry Potter”. These supernatural themes are particularly well suited to mixed reality experiences, allowing players to quickly immerse themselves in narrative roles. Importantly, this narrative framework helps motivate and justify why players, as protagonists, engage in unusual behaviors in the real world.

We started with “The Duel” to experiment with competitive bodily gameplay. Inspired by the iconic wizard duels of Harry Potter, this prototype allowed us to explore one-on-one magical combat in a mixed reality setting. Next, we developed “The Dragon” to investigate cooperative bodily gameplay: HMD players team up to face a common threat controlled by a player using handheld AR, a scenario inspired by the epic dragon battles of “Game of Thrones”. We premiered this game at the Oculus, New York City’s World Trade Center Transportation Hub, bringing a dragon to life in an iconic urban setting. Finally, “The Ghost” was designed to leverage the inherent asymmetry of mixed reality as a medium, in which not all players necessarily see the same virtual objects: HMD players can only see “ghosts” when activating their psychic abilities, while the ghost puppeteer can see and control them constantly. This dynamic mimics the mechanics of “Ghostbusters”, creating an intriguing asymmetric multiplayer experience in the real world.

Together, these prototypes offer a diverse range of experiences that showcase the unique possibilities of inter-bodily mixed reality street play while tapping into familiar and beloved fictional concepts.

SIGGRAPH: How does the involvement of non-headset-wearers enhance social engagement in “MOFA”?

BAH: We’ve observed that inter-bodily street play is naturally a performance, much like street dance. People without headsets are often drawn to the spectacle of players chasing and interacting with invisible entities. To tap into that curiosity, we implemented a spectator view that can be displayed on smartphones or large screens. At SIGGRAPH, for instance, many attendees were captivated by the MR spectator view on our big screen.

We then asked ourselves, “What if spectators could interact with the HMD players?” This led to the creation of our “puppeteer” mode: using a smartphone or tablet, non-headset users can control elements like the dragon and interact with the headset-wearing players. Interestingly, we’ve found that the dragon-controller role has become even more popular than the headset-wearing wizard role; many people relish the opportunity to play as the game’s “boss” and enjoy watching players chase after their avatar. This feature has significantly enhanced the social engagement of MOFA, creating a bridge between headset wearers and non-wearers in the gameplay experience.

SIGGRAPH: Congratulations on receiving the SIGGRAPH 2024 Immersive Pavilion Best in Show Award. Tell us about your SIGGRAPH 2024 experience. What advice would you give to those contributing to a future SIGGRAPH conference?

BAH: As a creator who presented at SIGGRAPH 2024, I can confidently say it’s one of the most creator-friendly and supportive showcases for innovative media artworks. SIGGRAPH’s exhibition resources are impressively abundant. They’re incredibly accommodating — providing ample space and supplying large displays to meet your needs. Organizers actively engage in discussions about exhibition layouts and setups. In retrospect, I realize I could have dreamed even bigger. So, my first piece of advice to future contributors is simple: Think big.

Secondly, for those contributing to the Immersive Pavilion or Art Gallery, I recommend adopting a “passerby-inclusive” approach to your exhibits. Given the conference’s bustling nature, many attendees have limited time to queue or engage deeply with each exhibit. Designing your exhibition to be accessible and impactful for time-constrained participants can greatly enhance its overall reception. For instance, you could project a spectator view of your virtual reality or augmented reality content on a TV screen. This approach ensures that even those who can only spare a moment can still have a meaningful interaction with your work.

SIGGRAPH: What’s next for you, your lab, and your work?

BAH: I am a research-led experiential futures designer. The works from our Reality Design Lab lie at the intersection of speculative design, spatial computing, artificial life, and cryptographic human coordination. We create immersive mixed reality experiences that invite audiences to step into potential futures through embodied play, challenging their perceptions, sparking inspiration, and provoking dialogue about the paths we might take. Early in my career, I invented HoloKit — an open-source, smartphone-based MR headset that functions as an “Arduino for MR” in spatial computing design education. HoloKit democratizes access to mixed reality by offering an affordable yet powerful solution. This is crucial for enabling social MR projects like MOFA, as it allows multiple people to engage with MR simultaneously without the need for several expensive professional headsets. Looking ahead, I plan to create more experiential works using the HoloKit platform, expanding beyond games and plays into educational and media art realms. We welcome collaborations with researchers and artists who share our vision of realizing and challenging potential future realities through experience.

Can’t get enough immersive content? Check out this SIGGRAPH Spotlight podcast episode where SIGGRAPH 2024 Immersive Pavilion Chair Derek Ham and VR pioneer and educator Dr. Muhsinah L. Morris dive into a discussion on immersive technologies and education.


Botao Amber Hu is a researcher, designer, educator, and creative technologist. He founded and leads Reality Design Lab, an interdisciplinary research and design lab that focuses on the intersection of philosophy of technology, speculative design, spatial computing, and programmable cryptography. He also serves as a visiting lecturer at the China Academy of Art. His primary focus is on designing experiential futures within collocated mixed reality, democratizing education of mixed reality design, and exploring blockchain-based protocols for artificial life. His works have been featured at top conferences such as SIGGRAPH, CHI, UbiComp, CSCW, WWW, ALIFE, Halfway to the Future, VIS, CHI PLAY, Ars Electronica, SXSW, and TEDx, and have received accolades including the SIGGRAPH Best In Show, CHI Best Interactivity Award, Red Dot Design Award, iF Design Award, Webby Awards, A’ Design Awards, and Core77 Design Award. He holds a bachelor’s degree in computer science from Tsinghua University and a master’s degree in AI from Stanford University.
