Image Credit: XRLab@NTUT
At SIGGRAPH 2024’s Immersive Pavilion, one standout experience blurs the boundary between virtual gameplay and real-world motion: “Metapunch X.” This innovative XR project combines multidisplay environments with exertion-based haptic interaction to reimagine how we watch and play combat sports in the digital age. We caught up with the creators behind this contribution to learn more about the inspirations, challenges, and ambitions driving this next-generation esports experience.
SIGGRAPH: Your project “Metapunch X: Combining Multidisplay and Exertion Interaction for Watching and Playing E-sports in Multiverse” is a fascinating and timely piece of work! Tell us what inspired your team to develop this work.
We began this project by developing a substitutional moving robot designed for professional boxing training[1]. During the exhibition, we received a lot of positive feedback, and everyone really enjoyed it. So we started thinking beyond training: what if we turned this into an esports experience? That is how we created a new XR sport called Metapunch for everyone to enjoy.
At the same time, another team in our lab was exploring how audience participation might influence performers[2]. That led us to think about combat sports spectators: what kind of participation makes a difference, being there in person, joining online, or engaging through a third-person broadcast? Would that make esports more exciting and competitive? Could it give spectators a stronger sense of involvement and connection with the players?
That’s how “Metapunch X” was born.
SIGGRAPH: What were the biggest technical challenges in integrating encountered-type haptic feedback with XR exertion gameplay, and how did you overcome them?
One of the biggest challenges we faced in this project was aligning the position of the virtual avatar with the physical punching bag. Since our encountered-type haptic feedback is passive and the punching bag is movable, maintaining accurate alignment during movement was especially difficult.
To enable immersive and exertion-based interaction in virtual environments, particularly in scenarios like substitutional reality, the system needs to accurately track the physical prop and represent it as a punchable object or avatar within the virtual space. Since the shapes of virtual avatars and physical objects often differ, precise alignment is essential to ensure effective haptic feedback.
We explored several strategies to address this challenge:
- Matching the visual outlines of the virtual and physical objects.
- Aligning only the punching surface to minimize the robot’s movement range.
- Displaying a single target point at a time to improve precision.
- Positioning the virtual target slightly deeper when the virtual avatar is larger, encouraging users to punch with the correct depth and angle, even if the surface is sloped or uneven.
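The last strategy, offsetting the rendered target when the virtual avatar is larger than the physical prop, can be sketched as a small geometric calculation. This is a minimal illustration, not the team's actual implementation; the function and parameter names (`target_point`, `bag_radius`, `avatar_radius`, `punch_dir`) are hypothetical, and both shapes are approximated as spheres for simplicity.

```python
def target_point(bag_center, bag_radius, avatar_radius, punch_dir):
    """Place the rendered punch target so the fist lands on the real bag.

    bag_center:    (x, y, z) tracked position of the physical punching bag
    bag_radius:    radius of the physical bag (approximated as a sphere)
    avatar_radius: radius of the virtual avatar (approximated as a sphere)
    punch_dir:     unit vector pointing from the player toward the bag

    All names and the spherical approximation are illustrative only.
    """
    # If the avatar is larger than the prop, its visible surface sits
    # closer to the player than the real bag surface; shift the target
    # deeper by the difference so the punch reaches the physical surface.
    depth_offset = max(0.0, avatar_radius - bag_radius)

    # Point on the avatar's surface facing the player.
    surface = [c - punch_dir[i] * avatar_radius
               for i, c in enumerate(bag_center)]

    # Push the target inward along the punch direction by the offset.
    return tuple(s + punch_dir[i] * depth_offset
                 for i, s in enumerate(surface))
```

With a bag of radius 0.2 m rendered as a 0.3 m avatar, the target ends up on the physical surface rather than the avatar's, so the user naturally punches through the oversized visual shell to the correct depth.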
SIGGRAPH: What role did user testing or playtesting play in shaping the final design of the game features?
We participated in a digital sports competition in Taiwan, ITSPORT[3]. Each year, we use insights from user testing during the event to refine and adjust the game’s scoring system, timing, and overall mechanics. Throughout this process, we’ve also collected a large amount of motion data for performance analysis.
SIGGRAPH: The world of XR is rapidly evolving, and we are seeing growing enthusiasm for XR in video games. What is next for your work? Where do you see your work five years from now?
We were also inspired by the work of Prof. Inami and Prof. Koike, who are among the founding members of the Superhuman Sports Society[4]. We are currently applying to join the society, with the hope that this will serve as a starting point to bring our product to the global stage. Our goal is to first increase visibility across Asia and eventually expand our reach worldwide.
SIGGRAPH: Do you envision “Metapunch X” or similar XR/haptic-integrated games becoming part of future esports tournaments or even the Olympics?
Of course, we hope to see more products and applications like this integrated into international tech sports and esports competitions in the future. For instance, in Japan, the Superhuman Sports Society has already organized competitions. In Taiwan, there are also tech-based sports events called ITSPORT that are open to public participation. Even the Olympics have held demonstration events for esports (Olympic Virtual Series, Olympic Esports Week), including a taekwondo competition incorporating VR technologies.
With esports on track to become an official Olympic discipline, we sincerely hope our work will contribute to or be showcased in such an esteemed event. Hey, Olympics! We are here!
There is plenty more exhilarating content like “Metapunch X” on offer at SIGGRAPH 2025. Join us in Vancouver or virtually 10–14 August to be a part of shaping our collective future of technology and computer graphics.
[1] Mendez S, L. A., Ng, H. Y., Lim, Z. Y., Lu, Y. J., & Han, P. H. (2022). MovableBag: Substitutional robot for enhancing immersive boxing training with encountered-type haptic. In SIGGRAPH Asia 2022 XR (pp. 1-2).
[2] Lin, K. F., Chou, Y. C., Weng, Y. H., Tsai Chen, Y., Lim, Z. Y., Lin, C. P., … & Pan, T. Y. (2023). Actualities: Seamless Live Performance with the Physical and Virtual Audiences in Multiverse. In ACM SIGGRAPH 2023 Immersive Pavilion (pp. 1-2).
[3] https://itsport.tw/
[4] https://superhuman-sports.org/

Kuan-Ning Chang received an M.F.A. degree from the Graduate Institute of Animation and Film Art at Tainan National University of the Arts. He is currently a doctoral student in the design program at National Taipei University of Technology. His research interests include Human-Computer Interaction (HCI) and Extended Reality (XR), with a focus on Multisensory Technology.

I am Steven Yu-Hsiang Weng, a researcher and designer specializing in Human-Computer Interaction (HCI), User Experience (UX), and Augmented and Virtual Reality (AR/VR). My academic work focuses on the intersection of interactive system design, user experience, and emerging technologies. I hold a Master’s degree in Interaction Design from National Taipei University of Technology, where I explored user-centered methodologies, usability testing, and experience design for immersive media.
My background combines technical proficiency with interdisciplinary collaboration. With experience in UI/UX design, front-end development, and data analysis, I approach problem-solving through both research and prototyping. I am particularly interested in how interactive technologies can enhance engagement and accessibility across digital experiences.

Ping-Hsuan Han: I am an Associate Professor in the Department of Interaction Design at the National Taipei University of Technology. My current research interests include HCI, XR, Multisensory Technology, and Immersive Storytelling.