One of the less obvious but most useful announcements at WWDC 2018 was ARKit 2, the updated augmented reality platform. After the release of the first iOS 12 beta, developers began actively testing the new features of the augmented reality engine.
One of the discoveries is the ability to control iOS with the eyes. A short video published by developer Matt Moss demonstrates how easily the interface can be navigated with the new ARKit 2.
— Matt Moss (@thefuturematt) 7 June 2018
The developer’s head stays fixed in one position; only his eyes move.
The front camera and the TrueDepth sensor track the position of the eyes, and a small cursor moves across the screen accordingly. To select an item or simulate a tap, it is enough to blink.
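The logic described above can be sketched in plain Swift. In a real app the input values would come from ARKit 2's `ARFaceAnchor` (the `lookAtPoint` property and the `.eyeBlinkLeft`/`.eyeBlinkRight` blend-shape coefficients); the type names, mapping, and blink threshold below are illustrative assumptions, not Matt Moss's actual implementation.

```swift
import Foundation

// Hypothetical per-frame gaze data. In ARKit 2 this would be derived
// from ARFaceAnchor.lookAtPoint and ARFaceAnchor.blendShapes.
struct GazeFrame {
    let lookAtX: Float    // normalized horizontal gaze offset, roughly -1...1
    let lookAtY: Float    // normalized vertical gaze offset, roughly -1...1
    let blinkLeft: Float  // blend-shape coefficient, 0 (open) ... 1 (closed)
    let blinkRight: Float
}

/// Maps a normalized gaze offset onto screen coordinates,
/// with the screen center as the neutral gaze position.
func cursorPosition(for frame: GazeFrame,
                    screenWidth: Float,
                    screenHeight: Float) -> (x: Float, y: Float) {
    let x = screenWidth  / 2 + frame.lookAtX * screenWidth  / 2
    let y = screenHeight / 2 - frame.lookAtY * screenHeight / 2
    // Clamp the cursor to the screen bounds.
    return (min(max(x, 0), screenWidth), min(max(y, 0), screenHeight))
}

/// Treats a simultaneous closure of both eyes as a "tap".
/// The 0.8 threshold is an assumed value, not from the demo.
func isBlinkTap(_ frame: GazeFrame, threshold: Float = 0.8) -> Bool {
    return frame.blinkLeft > threshold && frame.blinkRight > threshold
}
```

For example, a neutral gaze on a 375×812-point screen maps the cursor to the center (187.5, 406), and a frame where both blink coefficients exceed the threshold registers as a tap.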
Such contactless control could not only diversify the software available in the App Store, but also significantly improve accessibility for people with disabilities. [ Twitter ]