“Tech Tales: Digital Storytelling With the Royal BC Museum” © 2025 Derek Jacoby, Kieran Smith, Nathan Inbar, Chloë Farr, Jiayue Wong, Sofiia Khutorna, Tianyu Fang, Zeyu Li, Jinda Zhang, Isadora Weathered, Saiph Savage, Yvonne Coady
After its appearance at the SIGGRAPH 2025 Appy Hour, the team behind “Tech Tales: Digital Storytelling With the Royal BC Museum” invited us behind the scenes to explore how creativity and technology intersect. We sat down with creator Nathan Inbar to learn how his team transformed museum archives, Indigenous storytelling, and immersive VR into a captivating, interactive experience that lets audiences shape the narrative themselves.
SIGGRAPH: What inspired the collaboration between the University of Victoria and the Royal BC Museum, and what story were you most excited to bring to life through this project?
Nathan Inbar (NI): We started this collaboration roughly 10 years ago by developing a narrative-based VR installation for the Royal BC Museum. Since then, the project has grown in many directions, including a “Tech Tales” experience created by a team of Indigenous youth from the Verna Kirkness Foundation, based on mapping and stories from their home communities. With support from Sony and AWS, we have recently been able to extend this work into projects with the Maritime Museums in both Vancouver and San Francisco!
SIGGRAPH: How does your Unity application use extended reality, maps, satellite data, and museum collections to transform a traditional exhibit into an interactive storytelling experience?
NI: We were graciously provided with a sample of the Royal BC Museum archives to use in our project. Drawing on the rich history of objects from the “100 Objects of Interest” collection, we can leverage an LLM to weave a historical narrative that the user directs. We are very excited about our next steps, working with Sony’s Spatial Reality Display technology to bring Avador into a new dimension!
SIGGRAPH: Conversational AI is a core part of the experience — how does it change the way visitors explore and engage with cultural and historical content?
NI: Our Unity application leverages OpenAI’s real-time voice-to-voice API for low-latency conversation with our avatar. Our goal going into the project was to bring an exhibit to life with an emphasis on immersive audience interaction, while preserving the educational nature of an exhibit. Users can interactively and conversationally explore the museum’s archives, choosing where they go next and asking our AI expert questions as they come up.
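For readers curious about the plumbing: a real-time voice session like this is typically driven by JSON client events sent over a streaming connection. The minimal sketch below builds two such events, assuming the event shapes in OpenAI’s Realtime API documentation at the time of writing; the docent instructions, voice name, and helper functions are illustrative, not the Tech Tales team’s actual code.

```python
import base64
import json


def build_session_update(instructions: str, voice: str = "alloy") -> str:
    """JSON for a session.update client event configuring the voice agent."""
    return json.dumps({
        "type": "session.update",
        "session": {
            "modalities": ["audio", "text"],
            "voice": voice,
            "instructions": instructions,
        },
    })


def build_audio_append(pcm_chunk: bytes) -> str:
    """JSON for streaming one chunk of microphone audio to the model."""
    return json.dumps({
        "type": "input_audio_buffer.append",
        "audio": base64.b64encode(pcm_chunk).decode("ascii"),
    })


msg = build_session_update("You are a friendly museum docent.")
print(json.loads(msg)["type"])  # session.update
```

In a Unity deployment the same event payloads would be assembled in C# and pushed over the API’s streaming transport, with the model’s audio responses decoded and played back on the avatar.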
SIGGRAPH: What was one of the biggest technical or creative challenges you faced while building this experience, and how did your team overcome it?
NI: We built a custom framework in Unity to bridge the gap between a game development engine and a live conversation with a frontier model, and we faced all kinds of challenges. We had to keep the connection to the LLM responsive so as not to leave the user in an awkward pause mid-conversation, and to allow users to interrupt the model mid-sentence. Creatively, we wanted to design an avatar that felt warm, welcoming, and exciting to talk to. Our team came up with an adorable seal design that we knew was an instant winner.
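Interruption (“barge-in”) is one of the trickier parts of any voice agent: when the user starts talking over the avatar, the client must cancel the in-flight response and tell the server how much of the generated audio was actually heard, so the conversation history matches reality. A minimal sketch of the two events involved, assuming the event names in OpenAI’s Realtime API documentation (`response.cancel`, `conversation.item.truncate`); this illustrates one common approach, not the team’s Unity framework itself:

```python
import json


def build_interrupt_events(item_id: str, audio_played_ms: int) -> list[str]:
    """Events to send when the user talks over the avatar:
    1. cancel the response still being generated;
    2. truncate the assistant's audio at the point playback stopped,
       so unheard audio is dropped from the conversation history."""
    return [
        json.dumps({"type": "response.cancel"}),
        json.dumps({
            "type": "conversation.item.truncate",
            "item_id": item_id,
            "content_index": 0,
            "audio_end_ms": audio_played_ms,
        }),
    ]
```

The `item_id` here is whatever identifier the server assigned to the assistant’s current audio item, and `audio_played_ms` is how many milliseconds the client actually rendered before the interruption.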
SIGGRAPH: For creators thinking about submitting to Appy Hour at SIGGRAPH 2026, what advice would you give about crafting a project that resonates with the SIGGRAPH community?
NI: The Appy Hour audience is very patient and kind! Some of the ways we tried to demo went wrong, but the reception from the audience was so warm, and the feedback so helpful, that we were able to apply it directly to these new projects with new museums.
It was also very helpful to meet and network with the other demo teams! Our advice would be to always have a backup video or slides for any part of the demo that might run into trouble, but to go big with your ideas and be ready to tell the story behind your project to a very receptive and supportive group.
This project fuses technology, storytelling, and culture, transforming museum archives into an immersive, interactive experience. With XR and conversational AI, visitors can explore history like never before — making learning engaging, personal, and truly memorable. Feeling inspired? Share your ideas and submit to the SIGGRAPH 2026 Appy Hour program.

Nathan Inbar is a Computer Science student at the University of Victoria. His research centers on knowledge graph architectures and evaluation for LLMs. His technical interests include graph systems, AWS architecture, Unity3D, and VR/XR.



