We've written about eye-tracking hardware from Tobii a couple of times over the years. The company builds PC peripherals that can follow your gaze and determine where on screen you're looking. Until now, Tobii's end-user products have tended to focus on gamers. They've used eye tracking either to track how a game is being played (for example, to better understand how often you look at the minimap) or to add another style of input to a game—letting you aim where you're looking, for example.
Microsoft and Tobii announced a new feature that will expand eye control to work across Windows. Available in the latest Insider Preview builds, Eye Control lets you use tracking hardware to control a mouse cursor and an on-screen keyboard using the power of your gaze.
Currently, this requires a specific piece of Tobii hardware, though the company plans to expand support to other devices in its range; in principle, compatible hardware from other vendors should also work with Windows.
Microsoft is positioning Eye Control as an accessibility feature, and it says that the work originated with a hackathon request in 2014. Former NFL player Steve Gleason, who has amyotrophic lateral sclerosis (aka Lou Gehrig's disease or, for UK readers, motor neurone disease), wanted the company to help him address the challenges that come with the loss of muscle movement that his disease causes. The muscles controlling eye movement are often unaffected by the disease, making eye control a robust option. The hackathon produced a gaze-based way of controlling a wheelchair, and it inspired Microsoft to investigate the scope of eye control more extensively.
The new Windows 10 features are the first fruits of that work. The keyboard is a familiar variant of the existing Windows on-screen keyboard, with letters picked by dwelling on them for a moment. It even supports swipe typing: a longer gaze selects the first letter, and subsequent letters are chosen just by glancing at the appropriate place.