What’s More Real Than Real? The Metaverse!

5 April 2023 | Conferences, Metaverse

Image credit: Photo composition by Shan-Yuan Teng, University of Chicago

Have you ever wanted to reach out and touch the metaverse? SIGGRAPH sat down with Pedro Lopes (professor at the University of Chicago) and Nicholas Colonnese (research science director at Meta Reality Labs), the speakers behind the SIGGRAPH 2022 Frontiers Talk “Haptics: A Touch of the Metaverse,” to discuss how haptics can make experiences in the metaverse more immersive, the difference between virtual reality and augmented reality, and what’s next for haptics.

SIGGRAPH: Share some background about this project. What inspired this initial research?

Pedro Lopes (PL): Nick and I have been working on haptics for the last eight years, and we are continuously inspired by the work in the computer graphics community. Our goal is to bring haptic realism to virtual experiences in the same way that SIGGRAPH has brought us photorealism. So I put together this joint Frontiers Talk by inviting Nick, who can speak from a really interesting perspective: not only as a researcher at a university, like myself, but as a researcher at one of the companies pushing VR and haptics forward (Meta). This was the inception of this Frontiers Talk!

Nicholas Colonnese (NC): The metaverse promises to revolutionize human-computer interaction, but current technologies allow for the reproduction of only two of the five human senses — sight and hearing. I believe that the next major disruptive technological step will be novel haptic hardware and software technologies that will unlock the sense of touch in virtual worlds.

SIGGRAPH: Why are haptics so important to making the metaverse experience more immersive?

PL: In past decades, advances in computer graphics have made it easy to impress our eyes when we put on a virtual reality headset. However, the moment we reach out with our hands to touch what we see, the illusion shatters. This is because we haven’t spent nearly as much effort exploring how to render touch as we have on how to render graphics. Research, including some from my lab at the University of Chicago, has shown countless times that being able to touch virtual objects allows you to manipulate them better (you can use them more accurately) and leads to an improved user experience (you enjoy the experience more than you would without touch). In our lab, we have engineered a number of haptic devices that simulate aspects of touch, from simple contact with an object to richer aspects such as temperature. In our experiments, when we let participants choose which they prefer, a baseline experience without haptics or one with our haptic devices, the overwhelming majority chooses haptics, despite the fact that our devices in the lab are still a million miles away from the precision we will one day have, probably sooner rather than later, in our hands and at home.

Moreover, in my lab, we don’t see VR as an escapist technology — we don’t think the metaverse is a place where people should go to escape the problems of life. We should solve and tackle those problems as a society, not run away to VR. Rather, we think the metaverse is a place where you can experiment safely, without the limitations of experimenting in physical reality. For instance, if you want to learn what to do in case a fire breaks out, you can practice in a simulator. It is safe (the fire is virtual and won’t hurt you), and you can manipulate the virtual environment to make it harder or easier (a small fire that you can put out easily vs. a larger fire that you must escape) to adapt it to your skill level. However, this VR fire safety training will only be useful in proportion to how immersive it is — if all you do is “see” fires, it will not teach you much. It’s like reading a book about swimming and then jumping into the deep end of the pool for your first attempt. But if we allow you to move your body and feel the heat of the virtual fire and of the objects you touch, such as the virtual fire extinguisher, then not only are your eyes learning but so is your body. This example comes from one of our recent papers at ACM CHI 2022, demonstrated at SIGGRAPH 2022 Emerging Technologies, where Ph.D. student Yudai Tanaka created a fire safety simulator in mixed reality, allowing a user to put out virtual fires in their own living room.

NC: One way to reason about the value of touch in the metaverse is to examine the value of touch in the physical world. In the real world, we use our sense of touch so often and for so many things that we hardly even think about it. In fact, the “sense of touch” is more a monolithic term than a careful description of human abilities. Touch encompasses a wide range of sensations including pressure, texture, vibration, the configuration of our bodies in space (which is called proprioception), force, temperature, and so on. Anyone who has dug their keys out of a bag using touch alone, or tried to tie their shoes when their hands are freezing, knows how important touch information is in the physical world. My hypothesis is that touch will be just as important in the virtual world as it is in the physical world. In other words, the metaverse needs touch.

SIGGRAPH: Where do you see your research heading in the next couple years? Will you continue to focus on haptics?

PL: Absolutely! What we are starting to see today, like companies at CES selling touch-gloves and force-feedback devices, is just the tip of the iceberg for haptics. In my lab, we are exploring haptics in two ways. The first is to use haptics to create more immersive experiences. One example by my Ph.D. student Jasmine Lu that I showed at SIGGRAPH was “Chemical Haptics: Rendering Haptic Sensations via Topical Stimulants,” where she pioneered the idea of applying chemicals to the user’s skin to generate rich haptic sensations, such as temperature or tingling vibrations, that otherwise require large mechanical contraptions with cooling elements, motors with weights, etc.

Now, we are exploring another way that we can use haptics: letting haptics be the way we control our devices. For example, in one of our upcoming papers, to be published in one month at ACM CHI 2023, we created LipIO: a lip-based haptic wearable that lets you control your computer using your lips. You can move a cursor by swiping your tongue across your lips, and you can sense the state of the interface by feeling your lips vibrate in a specific location. With this lip-based haptic device, you can control an interactive device without ever using your hands or eyes, allowing extreme use cases such as tuning a guitar without looking at a guitar tuner (your lips vibrate to indicate whether you are out of tune) or receiving directions from Google Maps while biking without looking at your phone!

NC: I framed my portion of the talk based on what I think are the biggest challenges and opportunities for haptics in the metaverse. These are: novel actuation to go beyond today’s state-of-the-art narrowband vibration actuators; soft materials that more closely match the mechanical properties of the human body; and clever design leveraging haptic and multisensory perception. I consider haptic technology for human-computer interaction so interesting and deep that I plan to spend the entirety of my career working on it.

SIGGRAPH: The metaverse has quickly grown in popularity. Do you think it will continue on this upward trend, or do you think technology will head in a different direction?

PL: I do think there’s a new direction coming for the metaverse. While VR is proving to be an exciting medium for gaming at home, with Sony PlayStation and other companies strongly advocating for it, there are even more use cases for augmented reality (also called mixed reality). The difference between virtual reality (VR) and augmented reality (AR) is that in AR you also see the real world, and the graphics are superimposed around you. Imagine seeing a virtual screen floating around you. This screen is not limited like the display you currently own: it can be scaled up or down, duplicated, mirrored, anchored to your couch or wall — anything is possible! As you can see, AR can be used for games (like VR), but it can also be used for productivity tools, remote work, and so on. However, you have probably already noticed a huge difference between virtual and augmented reality — everything around you in VR is rendered graphics, but in AR most objects around you are real: your couch, your keyboard, and so forth.

So, here we have a big problem: how will we do haptics for AR? The way companies are popularizing haptics right now is via haptic gloves. These are devices that you wear on your hand and that vibrate your finger pads when you touch a virtual object. This works well in VR, where everything is virtual, but I don’t think it will work well in AR. Asking users to remove their gloves every time they want to type on their keyboard or touch their desk is too much to ask. Instead, what we need are haptic devices that can vibrate your finger pads without anything attached to them — it sounds like a paradox, but it is possible. At our SIGGRAPH workshop, my Ph.D. student Shan-Yuan Teng demonstrated Touch&Fold, a haptic device that does precisely this. It has an actuator that folds away when you touch physical objects and only unfolds to touch your finger pad when you touch a virtual object.

But Touch&Fold is still a mechanical device and, as such, it is pretty large. We are trying to push this envelope even further. Imagine wearing nothing at all on your palm yet being able to feel when you grasp a virtual object. My Ph.D. student Yudai Tanaka has a paper coming out in a month at ACM CHI 2023 that achieves this by means of electrical stimulation. We can stimulate the sense of touch in your finger pad without putting anything on it. In fact, we place the electrodes away from your palm, on the back of the hand. This electrical technique, Full-Hand Electro-Tactile Feedback without Obstructing Palmar Side of Hand, allows us to tap into the nervous system and remotely send signals to your brain that mimic what a touch on the finger pad would feel like. This feels amazing in augmented reality!

NC: I began my portion of the talk by claiming, “We are in the midst of a human-computer interaction revolution.” I was purposely trying to be attention-grabbing and provocative, but I also believe it’s true. I think the metaverse will eventually deeply change how we communicate, work, and play. That said, I don’t think anyone has a high-confidence estimate of exactly when this will occur.

SIGGRAPH: What was your favorite part of presenting your research at SIGGRAPH 2022?

PL: It was such a warm reception. We presented our talk (a combined presentation by myself and Nick) at 8 am, which is early by SIGGRAPH standards. SIGGRAPH is a really fun conference, and folks understandably enjoy the evening program too, with satellite events, dinners, etc. Yet, despite the 8 am slot, our auditorium was packed. One of my Ph.D. students, Shan-Yuan Teng, who was also presenting at SIGGRAPH, told me they had to watch it from the overflow room outside. That kind of warm reception for our talk about haptics in the metaverse was amazing. Further proof of this interest in haptics for the metaverse was our workshop on the topic (also part of the Frontiers program): people really showed up, and at some point we ran out of chairs!

NC: The SIGGRAPH 2022 Frontiers Talk was my first-ever presentation at SIGGRAPH. I’ll admit that I was initially a bit nervous to speak to such a large and prestigious community. My favorite part was how warmly the SIGGRAPH community received me and how many new connections I was able to make.

SIGGRAPH: What advice do you have for those submitting to Frontiers in the future?

PL: I had exceptional interactions with the Frontiers chairs and got their input on how best to fit our panel to SIGGRAPH, so consult the amazing people at SIGGRAPH!

NC: I believe that some of the most interesting areas of research lie in between traditional academic domains, for example, between graphics and haptic rendering. Because of this, I encourage researchers who work in areas outside of computer graphics to consider SIGGRAPH, and I especially recommend Frontiers!


Want to feel the excitement of a packed session? Submit your innovative ideas to the SIGGRAPH 2023 Frontiers program by 1 May — submissions are accepted on a rolling basis.

Pedro Lopes is an assistant professor in computer science at the University of Chicago. Pedro focuses on integrating interfaces with the human body, exploring an interface paradigm that supersedes wearables. Examples include muscle-stimulation wearables that allow users to manipulate tools they have never seen before or that accelerate their reaction time, and a device that leverages smell to create an illusion of temperature. Pedro’s work has received several academic awards, such as five CHI/UIST Best Paper awards, a Sloan Fellowship, and an NSF CAREER award, and has captured the interest of the public (e.g., coverage in the New York Times and exhibitions at Ars Electronica; more: https://lab.plopes.org).

Nicholas Colonnese is a research science director at Meta Reality Labs Research, working on novel interfaces for augmented and virtual reality. He leads a multidisciplinary haptic displays program, which includes researchers with expertise in materials, actuation, rendering algorithms, perception, and interaction design. His mission is to bring the sense of touch to the metaverse. He is an active member of several academic and industrial communities, including Haptics Symposium, World Haptics, UIST, CHI, ISWC, SIGGRAPH, Smart Haptics, and others. He holds a Ph.D. in mechanical engineering from Stanford University.
