When a company unexpectedly adds a major new feature to a device after you’ve purchased it, or downplays the feature as a “bonus” that isn’t core to the product’s functionality, it’s tempting to accept the feature — flaws and all — as better than not having it at all. That thought has been on my mind as I’ve tested Oculus Quest’s new hand tracking feature over the past few days: I’m thrilled that the feature is here, and I fully understand its potential, but I’m not actually finding it useful for anything. Yet.
Brief recap: Facebook rolled out the feature last week via its maddening staggered software update mechanism, which requires people to leave their Quests powered on and keep checking back for days after the firmware first hits Oculus' servers. Once it's installed, you need to dig into a menu to activate hand tracking as an Experimental Feature, then manually select "use hands" in the settings every time you want an alternative to the included Oculus Touch controllers.
Perhaps the single best part of the hand tracking feature is how it looks. Instead of representing your hands as skeletal lines and dots, Oculus uses shadowy gloves that so precisely mimic your actual movements that they might as well be real. Since Quest already nailed nearly photorealistic 3D controller representations, it’s not totally surprising that it did such justice to (currently genderless) human hands, but it’s amazing to see them move fluidly in space and respond with such low latency to pinches, pointing, and other gestures.
The key problem, and one I suspect will be solved at some point in the future, is that virtual finger gestures aren't yet a reliable alternative to physical controllers. You might expect that once your hands appear in a virtual space, such as Oculus' Home interface, you could just touch or tap anything and have it respond to your fingertips. As of now, that's not how it works. Your fingers instead vaguely control a floating pointer that can be used to select things, like a Quest controller's line-shaped selection tool, minus the line. You're supposed to pinch your thumb and index finger together to confirm a selection.

Above: A demo of the pinching gesture from Oculus Connect 6.
On average, I've found myself "pinching" to select something three or more times before the selection actually registers. I'll also occasionally hear a series of pinch confirmation sounds even when I'm not trying to select anything. If I watch carefully what the hand recognition system shows as I pinch, it sometimes seems to confuse another finger with my index finger, and sometimes simply fails to acknowledge the gesture. This happens regardless of where I'm using Quest, though it's possible the cameras would perform better if I were standing against a more neutral background.
Facebook rolled out the feature in two stages. Users gained access through the version 12 software update last week, and third-party developers are officially getting SDK support this week. While hand tracking is presently limited to the Quest’s own interface, developers are already itching to release updated apps with preliminary hand support, which if properly implemented could be very impressive. For now, trying to load any app without hand tracking support will force you to switch back to controller input.
Being pushed back and forth between hands and controllers led me to an unexpected conclusion: The haptics in Quest's controllers may give them a long-term advantage over direct hand control. Holding a controller lets you feel clicks, vibrations, and other sensory cues, and their absence quickly becomes noticeable when your hands are floating around in actually empty 3D space. Moreover, while Quest's hand tracking system is remarkably capable of recognizing multi-finger positional data at this stage, the sensors in Quest's controllers have been finely tuned for precision input. That makes the controllers easier to like, for now.
Given how Quest’s setup process currently works, and the direction Facebook has been going with its Oculus Quest Safety Video and recent updates to the Guardian system, I can very easily imagine a day when controllers become an afterthought rather than an integral part of Oculus onboarding. Open the box, put the headset on, turn on the power, and use your hands to move through the setup menus for everything; that clearly seems like the future of Quest (and VR headsets in general).
If and when that happens, the usage paradigm for VR will be even simpler than it is today: Turn it on, and you’re ready to interact in VR without the need to fumble for controllers. This will be ideal for social applications, where you’ll be able to wave to or high-five friends, and retail, where you’ll be able to point at a button to change the way a virtual car or sweater looks as you’re inspecting it.
But we’re not quite there yet, nor is the impact on games — Quest’s big selling point — totally clear. Anything requiring twitch-level precision isn’t going to work as reliably with hands as with a controller, and whether you’re swinging a sword or pointing a gun, you’re going to find that Oculus Touch is a better solution than using your forearm or a thumb-triggered index finger to fight off enemies.
Facebook and its developers may well change that over the next year. Regardless of how it performs today, I’m really looking forward to seeing where the hand tracking feature goes in 2020, and glad to have a chance to start playing with it now.