For consumer VR headsets, Meta has been a leader in hand-tracking, and its newly released “First Hand” demo is an effort to push developers to embrace the technology. While Meta has offered a hand-tracking mode for years, most experiences still default to the ubiquitous hand controllers. With its aggressive goal of releasing four new headsets in the next two years, Meta is beginning to realize that XR won’t succeed on hardware alone. Virtual experiences need the realism that comes from using our actual hands.
Meta describes the hand-tracking experience as follows:
First Hand showcases some of the Hands interactions that we’ve found to be the most magical, robust, and easy to learn but that are also applicable to many categories of content. Notably, we rely heavily on direct interactions. With the advanced direct touch heuristics that come out of the box with Interaction SDK (like touch limiting, which prevents your finger from accidentally traversing buttons), interacting with 2D UIs and buttons in VR feels really natural.
We also showcase several of the grab techniques offered by the SDK. There’s something visceral about directly interacting with the virtual world with your hands, but we’ve found that these interactions also need careful tuning to really work. In the app, you can experiment by interacting with a variety of object classes (small, large, constrained, two-handed) and even crush a rock by squeezing it hard enough.
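The "touch limiting" heuristic Meta mentions can be illustrated with a generic sketch. This is not Meta's implementation or the Interaction SDK's API; it is a hypothetical, simplified model of the idea: once a fingertip crosses a virtual button's surface, its press depth is clamped so a sweeping finger cannot pass through the panel and trigger buttons behind it. All names and values here are invented for illustration.

```python
# Hypothetical sketch of "touch limiting" for a 2D UI panel in VR.
# Coordinates are simplified to one axis: the button surface sits at
# surface_z, and the fingertip approaches from the positive-z side.

MAX_PRESS_DEPTH = 0.01  # assumed max button travel, in metres

def press_depth(fingertip_z: float, surface_z: float) -> float:
    """How far the fingertip has pushed past the surface, clamped."""
    raw_depth = surface_z - fingertip_z  # positive once the finger crosses
    return max(0.0, min(raw_depth, MAX_PRESS_DEPTH))

def is_pressed(fingertip_z: float, surface_z: float) -> bool:
    # The press registers only within the clamped travel range, so even a
    # finger that visually overshoots the panel stays "pinned" to this
    # button rather than activating UI elements stacked behind it.
    return 0.0 < press_depth(fingertip_z, surface_z) <= MAX_PRESS_DEPTH
```

A real implementation would work in 3D, track velocity, and debounce input, but the clamping step is the essence of why direct touch on 2D UIs "feels natural" rather than glitchy.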
First Hand Demo
The hand-tracking demo targets developers with an SDK that should ease content creation. Until now, it has been easier to develop controller-based experiences, which work well for virtual gaming but far less so for XR learning and workforce training initiatives. Only when XR can fully mirror our real-world interactions will its full potential come into its own.
Serious challenges remain in developing virtual hand-tracking experiences. Interacting with objects is a significant breakthrough, but we also need to feel resistance and weight. That will require more than an HMD, and haptic gloves are still bulky and expensive. We need one-size-fits-all gloves that automatically pair with a Quest (or other HMD) and feel natural.
As Meta notes in its demo, even achieving natural interactions with virtual objects is challenging:
Building great Hands-driven experiences requires optimizing across multiple constraints: technical (like tracking capabilities), physiological (like comfort and fatigue), and perceptive (like hand-virtual object interaction representation).
Built on the Interaction SDK, First Hand should help developers integrate these dimensions into VR experiences. Meta may have much more planned in this area. With new headsets on the way, will it rely on third parties to supplement the Quest with haptic gloves? Is it still experimenting with wrist devices that could simultaneously enhance tracking accuracy and provide a semblance of haptic feedback? Meta isn’t saying, but we suspect there is more than meets the eye in this demo of a controller-less future for our XR experiences.
A generation from now, our XR hand controllers will be yet another tech curiosity, stashed away in the back of a closet with floppy disks, early game consoles, and other detritus of a never-ending tech revolution.
Emory Craig is a writer, speaker, and VR consultant with extensive experience in art, new media, and higher education. He speaks at global conferences on innovation, education, and ethical technology in the future. He has published widely and worked with the US Agency for International Development, the United Nations, and the Organization for Economic Co-operation and Development (OECD). Living at the intersection of learning, games, and immersive storytelling, he is fascinated by AI-based avatars, digital twins, and the ethical implications of blurring the boundaries between the real and the virtual.