Carnegie Mellon University’s Future Interfaces Group is working on a fascinating VR haptics project that could revolutionize how – and what – we experience in virtual reality. The device, an add-on to a VR headset, uses haptic ultrasound transducers to create a broad spectrum of sensations around your mouth. According to VRScout,
. . . ultrasonic transducers can be used to simulate various real-world sensations by focusing acoustic energy on the user’s mouth. By combining subtle pulsations, researchers can replicate everything from drinking hot coffee to smoking a cigarette. The same technology can also be applied to your teeth, as shown in the tooth brushing demo . . . In addition to the above-mentioned use cases, researchers went into surprising detail explaining how the system could be used to recreate various spider-based interactions. Using a combination of effects including single impulse, pulse trains, swipes in the x, y, and z directions, and persistent vibrations, users feel sticky cobwebs, jumping spiders, spider goo, and venom interact with their mouths in noticeably different ways.
Having the sensation of a spider walking across your mouth . . . now that’s a virtual experience you’ll never forget.
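The effect primitives the researchers list – single impulses, pulse trains, swipes, and persistent vibrations – map naturally onto simple modulation patterns. A minimal sketch of what such primitives might look like in code (function names, the update rate, and all parameters are illustrative assumptions, not taken from the paper):

```python
import numpy as np

RATE = 1000  # Hz; assumed haptic control-loop update rate

def impulse(duration=0.05):
    """Single burst: full intensity for a short window, then off."""
    return np.ones(int(RATE * duration))

def pulse_train(pulses=5, on=0.02, off=0.08):
    """Repeated on/off bursts - e.g. a tapping 'jumping spider' pattern."""
    unit = np.concatenate([np.ones(int(RATE * on)),
                           np.zeros(int(RATE * off))])
    return np.tile(unit, pulses)

def swipe(start, end, duration=0.5):
    """Focal-point path for a swipe: move the ultrasonic focus
    linearly from start to end (positions in metres) over the duration."""
    steps = int(RATE * duration)
    t = np.linspace(0.0, 1.0, steps)[:, None]
    return (1 - t) * np.asarray(start, float) + t * np.asarray(end, float)
```

A persistent vibration would simply be a long-running periodic envelope, and richer effects come from layering these primitives, as the quote above suggests.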
VR Haptics
As we explored in previous articles, VR haptics has a long way to go – and can raise challenging ethical questions. Many haptic vests still have sizing issues, and the available gloves make you feel like you’re auditioning for a sequel to Tim Burton’s Edward Scissorhands (sorry, no sequel’s coming). Some of the best work has been done in advancing the sense of smell in VR – especially at OVR Technology – a different but very related area.
As the Future Interfaces Group notes, the mouth is a fascinating area to explore, given its sensitivity (second only to our fingertips). Their solution takes a Meta Quest 2 headset and adds an array of ultrasonic transducers focused on the wearer’s mouth. It works without the need for additional hardware that would physically contact the user’s lips or tongue:
Virtual and augmented reality (VR/AR) headsets continue to make impressive strides in immersion and realism, particularly in visual and audio content. However, the delivery of rich tactile sensations continues to be a significant and open challenge. Critically, consumers want robust and integrated solutions – ones that do not require any extra devices or limit freedom of movement. For this reason, vibration motors in handheld controllers are the current consumer state of the art. While more sophisticated approaches exist (e.g., exoskeletons, haptic vests, body-cantilevered accessories, in-room air cannons), they have yet to see even modest consumer adoption.
Simultaneously, the mouth has been largely overlooked as a haptic target in VR/AR, despite being second in terms of sensitivity and density of mechanoreceptors, only behind the fingertips. Equally important, the proximity of the mouth to the headset offers a significant opportunity to enable on- and in-mouth haptic effects, without needing to run wires or wear an extra accessory. However, consumers do not want to cover their entire face, let alone put something up against (or into) their mouth. For AR, the industry is trending towards glasses-like form factors, so as to preserve as much facial expression as possible for human-human communication. Even in VR, smaller headsets are the consumer trend, with the mouth exposed and unencumbered.
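The core trick behind focusing acoustic energy from a headset-mounted array is classic phased-array beamforming: each transducer’s phase is advanced to cancel its propagation delay so all waves arrive in phase at a chosen focal point. A minimal sketch of that calculation (the array geometry and focal point are hypothetical, not the paper’s actual hardware layout):

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C
FREQ = 40_000.0         # Hz; typical for airborne ultrasonic haptics

def focus_phases(transducer_positions, focal_point):
    """Per-transducer phase offset (radians) so emitted waves
    arrive in phase at the focal point (phased-array focusing)."""
    positions = np.asarray(transducer_positions, dtype=float)
    target = np.asarray(focal_point, dtype=float)
    # Straight-line distance from each element to the focus.
    distances = np.linalg.norm(positions - target, axis=1)
    wavelength = SPEED_OF_SOUND / FREQ  # about 8.6 mm at 40 kHz
    # Advance each element's phase by its path length in wavelengths.
    return (2 * np.pi * distances / wavelength) % (2 * np.pi)

# Hypothetical 8x8 grid with 10 mm pitch, focused 5 cm in front
# of the array's center - e.g. a point near the wearer's lips.
xs, ys = np.meshgrid(np.arange(8) * 0.01, np.arange(8) * 0.01)
grid = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(64)])
phases = focus_phases(grid, focal_point=[0.035, 0.035, 0.05])
```

Steering the focus then just means recomputing the phases for a new target point, which is how a fixed, contact-free array can place sensations at different spots on the lips, teeth, or tongue.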
If you’re interested in the technical details, there’s a paper from the recent CHI Conference on Human Factors in Computing Systems (CHI ’22), held April 29–May 5, 2022, in New Orleans, LA. For a quicker visual introduction, the following video covers the highlights.
Still in the Experimental Phase
As fascinating as this VR haptics project is, there are currently no plans for a commercial device. The work is still experimental and could face further challenges as VR headsets shrink: the device fits on the Quest, but there would be no room for it on something like the much smaller Vive Flow. Still, it’s a remarkable development, especially since it delivers haptics through ultrasound rather than anything that physically touches the user.
It is experiments like this at Carnegie Mellon University’s Future Interfaces Group that are paving the way to the future of XR – a future that will not only be visual but will incorporate scent and touch in truly immersive experiences.
Emory Craig is a writer, speaker, and consultant specializing in virtual reality (VR) and generative AI. With a rich background in art, new media, and higher education, he is a sought-after speaker at international conferences. Emory shares unique insights on innovation and collaborates with universities, nonprofits, businesses, and international organizations to develop transformative initiatives in XR, GenAI, and digital ethics. Passionate about harnessing the potential of cutting-edge technologies, he explores the ethical ramifications of blending the real with the virtual, sparking meaningful conversations about the future of human experience in an increasingly interconnected world.