The Step Needed to Make Virtual Reality More Real

Eyefluence hopes it has an answer to a big question. Graphics and sound in VR have gotten great, as people will see when two highly anticipated headsets, Oculus’s Rift and HTC’s Vive, are released to consumers. But for all the progress, we still haven’t figured out how best to control the things we will see.
“People are now really starting to see that interaction in VR is far from a solved problem,” says Evan Suma, a research assistant professor at the University of Southern California’s Institute for Creative Technologies. “This is something the VR research community has been looking at for a number of years, going back decades.”
Game controllers are not always ideal because they don’t match the ways you use your body when, say, exploring the dark depths of a cave (no buttons to press there, in my experience). They could make virtual-reality exploration feel less immersive. They could also tire you out, especially if you’re waving your arms wildly while holding them.
That’s why companies like Eyefluence are working on other ways to interact with virtual reality. “You’re always looking at something. And you have the display right in front of your eyes. Why wouldn’t you want your eyes to be able to control that?” says David Stiehr, an Eyefluence cofounder.
Eyefluence grew out of technology originally developed by a company called Eye-Com that CEO Jim Marggraff purchased in 2013. The company has developed a flexible circuit that holds a camera, illumination sources, and other tiny bits of hardware. It is meant to fit around a small metal chassis that the company hopes to have embedded in future virtual-reality headsets.
A startup called Gest is trying to take advantage of another body part: your fingers. The San Francisco company is building a motion-sensor-filled device that wraps around your palm and includes rings that slide around four of your fingers. The company plans to roll out its gadget in November and will give developers tools to make Gest work with virtual reality and other applications.
Gest cofounder and CEO Mike Pfister sees it being useful not just for playing games in virtual reality but also for getting work done. A designer might want to use Gest to work on a computer-generated model, for instance, or you might want to type on a virtual keyboard simply by moving your fingers around.
While hand- or eye-tracking from Gest and Eyefluence could be a long way off, virtual reality can already be manipulated without wands or video-game controllers. Basic head-tracking technology, which uses sensors to monitor the position of your head and can translate that into actions, will be built into headsets like Rift and Vive. This kind of interaction is even possible with the sensor-laden smartphones that you can use with Samsung’s Gear VR mobile headset and Google Cardboard.
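The basic idea behind head-tracking as an input is simple: the headset’s sensors report which way your head is pointing, and software maps that orientation onto a spot in the virtual scene. A minimal sketch of that mapping, in Python, might look like the following; the function name, canvas dimensions, and angle conventions are illustrative assumptions, not any headset’s actual API.

```python
def gaze_point(yaw_deg, pitch_deg, width=3600, height=1800):
    """Map a head orientation to a pixel on a 360-degree canvas.

    yaw_deg:   left/right rotation, -180 to 180 degrees (0 = straight ahead)
    pitch_deg: up/down tilt, -90 to 90 degrees (0 = level)
    The canvas wraps horizontally around the viewer, like Metta's video map.
    """
    x = (yaw_deg + 180.0) / 360.0 * width    # yaw sweeps left-to-right
    y = (90.0 - pitch_deg) / 180.0 * height  # looking up moves the point up
    return int(x), int(y)
```

Looking straight ahead (`gaze_point(0, 0)`) lands in the center of the canvas; turning your head shifts the point accordingly, which is all an interface needs to treat your gaze direction as a cursor.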
This technology will be used by a number of companies making virtual-reality games and experiences, including a San Francisco startup called Metta. It’s employing head-tracking as the main way you traverse its service for sharing short, homemade virtual-reality videos on a Gear VR or Google Cardboard.
On the Gear device, for example, videos are arranged by point of origin on a giant 360-degree map that you navigate by moving your head slightly; to select a video or collection of videos, you simply keep your head steady on a specific spot. For now, the only thing you need to press is the “back” button on the Gear VR, though the company is considering ways to eliminate that step, too.
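Selecting by holding your head steady is a general technique known as dwell-based selection: the interface watches what your gaze rests on and triggers a selection once it has stayed there long enough. A rough sketch of the idea follows; the class, the 1.5-second threshold, and the method names are hypothetical, not Metta’s actual implementation.

```python
DWELL_SECONDS = 1.5  # assumed dwell threshold; real products tune this

class DwellSelector:
    """Fire a selection when the gaze rests on one item long enough."""

    def __init__(self):
        self.target = None  # item currently under the gaze
        self.since = 0.0    # time when the gaze first landed on it

    def update(self, item, now):
        """Call once per frame with the item under the gaze point
        (or None) and the current time in seconds. Returns the item
        exactly once when the dwell completes, otherwise None."""
        if item != self.target:
            # Gaze moved to something new: restart the dwell timer.
            self.target, self.since = item, now
            return None
        if item is not None and now - self.since >= DWELL_SECONDS:
            self.target = None  # reset so the selection fires only once
            return item
        return None
```

Each frame the app feeds in whatever video thumbnail sits under the gaze point; glancing away resets the timer, and holding steady for the threshold selects, with no button press required.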
Metta cofounder Jacob Trefethen says the idea is to cut down on interruptions that remind the viewer that the virtual world is, in fact, not real. “We’re very much trying,” he says, “to kill all of those moments where you have some disbelief.”