Go ahead, wake up your smartphone. No, don’t touch it! Just look at it. Wait for a second…and…yes! It recognizes you. You don’t even need to key in a passcode. Your phone identifies the unique way your eyes flicker. See? What app do you want to open? News? Okay, stare at the icon. Want to scroll through an article? Look down. Pause a video? Look away.


The day when eye tracking becomes a common feature in mobile gadgets may not be far off. The technology got a lot of buzz this March when Samsung demonstrated finger-free scrolling and video control on its flagship phone, the Galaxy S4, during the product’s launch in New York City. The same week, LG Electronics announced it would include similar capabilities in its newest smartphone, the Optimus G Pro.


But while the South Korean giants may be trailblazers in bringing gaze-based interfacing to the mobile market, analysts and researchers say their products only scratch the surface of what’s possible. The Galaxy S4, for instance, knows when to start or stop a video simply by discerning the presence of a user’s face through its front-facing camera. And it scrolls text by sensing the tilts of the user’s wrist. 


Other innovators are working to build systems that not only detect face orientation but also follow the subtle motions of the head and eyes, allowing for more sophisticated applications. Israeli start-up Umoove, for example, is developing head- and eye-tracking software that the company says will be able to work on any mobile platform. Last month, it shared its tool kit with a select group of app developers, including game and e-book designers. Umoove says it expects to release a commercial version of the kit for Apple’s iOS and Google’s Android later this year.


Eye-tracking systems for desktop computers have been around for decades—for instance, to conduct laboratory experiments and to help disabled people control their machines. But these systems often require bulky external hardware and complex algorithms that demand ample processing power. Adapting the technology for mobile devices therefore poses huge challenges.

The first hurdle is overcoming instability. “You have to separate between the movements of the device and the movements of the user,” says Umoove CTO Yitzi Kempinski. His company’s software solves this problem, he says, by pulling data from a smartphone’s various sensors, including its gyroscope, accelerometer, and compass. The system then combines the results with image data to filter out unwanted information.
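Umoove hasn’t published the details of its fusion algorithm, but the core idea can be sketched in a few lines of Python. The snippet below is illustrative only (the function name, the small-angle model, and all parameters are assumptions, not Umoove’s code): rotating the phone shifts every point in the image by roughly the focal length times the rotation angle, so that portion of a tracked feature’s motion can be credited to the device and subtracted away.

```python
import numpy as np

def compensate_device_motion(feature_shift_px, gyro_rate_rad_s, dt, focal_length_px):
    """Subtract the apparent image motion caused by the phone's own rotation.

    feature_shift_px : (dx, dy) pixel shift of a tracked facial feature
                       between two consecutive frames.
    gyro_rate_rad_s  : (wx, wy) angular rates from the gyroscope about the
                       camera's horizontal and vertical axes.
    dt               : time between the two frames, in seconds.
    focal_length_px  : camera focal length expressed in pixels.

    Under a small-angle approximation, a device rotation of theta radians
    shifts every image point by about focal_length * theta pixels; whatever
    motion remains after removing that term belongs to the user.
    """
    device_shift = np.array(gyro_rate_rad_s) * dt * focal_length_px
    user_shift = np.array(feature_shift_px) - device_shift
    return user_shift

# Example: the eye appears to move 12 px right between frames, but the
# gyroscope says the phone itself rotated during the 33 ms interval.
print(compensate_device_motion((12.0, -3.0), (0.5, -0.1), 0.033, 600.0))
```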


Then there’s the problem of computing resources. “Eye tracking needs to be something that can run almost invisibly in the background of any application,” Kempinski says. Three years ago, when Umoove began pursuing mobile eye tracking, its engineers thought they could borrow many of the algorithms used in PC-based systems. They ran their first program on a Nokia smartphone, Kempinski says, because it was one of the few models with a front-facing camera. “Within about 3 seconds, the phone crashed,” he recalls. “That’s when we realized, okay, we’re really going to have to start from scratch.”


In traditional image tracking, a system searches a large portion of each frame to identify relevant features, such as irises or eyelids. It then models how those features shift between frames. To save processing power, Umoove’s software follows a different set of rules. For instance, it uses information extracted from previous frames, such as the angle of the user’s head or the acceleration of a blink, to predict where to look for facial targets in the next frame. This anticipation minimizes the amount of computation needed to scan each image. When run on a Galaxy S III, Umoove’s system uses less than 2 percent of the phone’s CPU power, Kempinski says.
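A bare-bones version of that prediction step, written in Python with hypothetical names and a simple constant-velocity model (Umoove’s actual rules are proprietary), might look like this:

```python
import numpy as np

def predict_search_window(prev_pos, prev_vel, dt, half_size=16):
    """Predict where a facial feature will appear in the next frame and
    return a small search window centered on that estimate.

    prev_pos  : (x, y) pixel position of the feature in the last frame.
    prev_vel  : (vx, vy) pixel velocity estimated from earlier frames.
    dt        : time until the next frame, in seconds.
    half_size : half the width of the square search window, in pixels.

    Scanning only this window, instead of the whole frame, is what keeps
    the per-frame computation low.
    """
    predicted = np.array(prev_pos) + np.array(prev_vel) * dt
    x, y = predicted
    return (int(x - half_size), int(y - half_size),
            int(x + half_size), int(y + half_size))

# Example: the left iris was at (310, 240) and drifting right at 90 px/s,
# so only a 32 x 32 px patch of the next frame needs to be examined.
print(predict_search_window((310, 240), (90, 0), 1 / 30))
```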


Other researchers are skeptical that today’s smartphone cameras alone can support refined eye tracking. “I’d be really surprised if someone could do accurate pointing or perform similar actions reliably for a wide range of users and conditions,” says Ralf Biedert, a senior interaction researcher at Tobii Technology. The Swedish company is commercializing a consumer-grade eye tracker that would plug into a computer’s USB port. The candy-bar-size device projects infrared light and follows a user’s gaze by capturing the eyes’ reflections with a pair of cameras.
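Tobii hasn’t detailed its device’s internals, but trackers of this kind commonly rely on the pupil-center corneal-reflection technique: the vector from the pupil’s center to the infrared glint on the cornea is mapped to a point on the screen through a calibration fitted while the user looks at known targets. Here is a bare-bones Python sketch of that mapping, assuming a simple linear fit (real systems typically use higher-order polynomials and combine the two cameras’ views):

```python
import numpy as np

def fit_gaze_map(pupil_glint_vectors, screen_points):
    """Fit a linear calibration mapping pupil-to-glint vectors (measured
    from the infrared reflection on the cornea) to on-screen coordinates.

    pupil_glint_vectors : N x 2 array of (dx, dy) image-space vectors.
    screen_points       : N x 2 array of known calibration targets.
    """
    V = np.asarray(pupil_glint_vectors, dtype=float)
    # Augment with a constant term so that screen = [dx, dy, 1] @ A.
    X = np.hstack([V, np.ones((len(V), 1))])
    A, *_ = np.linalg.lstsq(X, np.asarray(screen_points, dtype=float), rcond=None)
    return A

def gaze_point(A, pupil_glint_vector):
    """Estimate the on-screen gaze point for one new measurement."""
    dx, dy = pupil_glint_vector
    return np.array([dx, dy, 1.0]) @ A

# Toy calibration: the user fixates three known targets, then gaze is
# estimated for a fresh pupil-to-glint measurement.
A = fit_gaze_map([(0, 0), (5, 0), (0, 4)],
                 [(960, 540), (1700, 540), (960, 100)])
print(gaze_point(A, (2.5, 2.0)))
```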


Biedert and others agree that whatever the underlying system, eye tracking has the potential to transform the way consumers interact with their devices. The technology is already used in some cars to warn drivers when they are dozing off. It could also enable chefs to browse recipes while cooking and inform authors where readers lose interest in a text. Even simple tasks such as reading or browsing “will be more natural and more convenient,” says Robert Jacob, an expert on computer interfaces at Tufts University, in Medford, Mass. “You don’t have to interrupt your flow of thought,” he explains, which often happens when you must manually point to something your eye has already found.


Widespread adoption of eye-tracking technology could also invite some unwanted consequences, cautions John Villasenor, a technology and policy expert at the Brookings Institution and the University of California, Los Angeles. “The big concern is privacy,” he says. Your phone could collect data on your eye movements, such as which Google results you skimmed, which advertisements you lingered on, and whether your pupils dilated when you read about certain subjects. That information could easily pass into the hands of advertisers or law enforcement. While you are intently watching your device, Villasenor says, “your device, perhaps unbeknownst to you, could be watching you.”


This article originally appeared in print as "Rise of the Eye Phones."