As developer Matt Moss discovered, ARKit 2.0, which Apple introduced with iOS 12, can potentially support eye-based device control thanks to its precise eye-tracking capabilities.
ARKit 2.0 was one of several new technologies Apple introduced alongside iOS 12. Many saw it simply as a way to improve augmented reality experiences, but for Matt Moss it represented much more.
Recognizing that ARKit 2.0 had remarkably accurate eye tracking, Moss thought the capability could serve more useful purposes than Snapchat-style face effects, and developed a small demo application.
The application tracked where the device's user was looking in real time and could detect blinking.
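A minimal sketch of how such gaze and blink detection can be built on ARKit's face tracking (this is an illustrative reconstruction, not Moss's actual code; it assumes an iPhone with a TrueDepth camera):

```swift
import ARKit
import UIKit

// Sketch: track gaze and detect blinks with ARKit 2.0 face tracking.
class EyeTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking requires a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // lookAtPoint is the estimated gaze target in face-anchor
        // coordinates; projecting it into screen space can drive a cursor.
        let gaze = face.lookAtPoint
        print("Gaze target: \(gaze)")

        // Blend shape coefficients range from 0 (eye open) to 1 (closed).
        let leftBlink = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        let rightBlink = face.blendShapes[.eyeBlinkRight]?.floatValue ?? 0
        if leftBlink > 0.9 && rightBlink > 0.9 {
            // Treat a simultaneous blink of both eyes as a "tap".
            print("Blink detected")
        }
    }
}
```

The `lookAtPoint` property and the `eyeBlinkLeft`/`eyeBlinkRight` blend shapes are part of `ARFaceAnchor` in ARKit; the 0.9 threshold for counting a blink is an arbitrary choice for this sketch.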
Moss had simply set out to explore ARKit 2.0's full capabilities, and in doing so noticed that the technology could support eye-based control of the device.
Advertisers already record how many seconds a user spends looking at an image. ARKit 2.0's eye tracking would make it possible for them to monitor exactly where users are looking at every moment.
At the moment, there is no shipping application controlled by eye movements, nor any advertisement misusing this data. Matt Moss's discovery, however, shows that Apple could offer more useful accessibility features to people with disabilities in future iOS releases.
https://mashable.com/2018/06/08/iphone-ios-12-beta-eye-tracking/?utm_campaign=mash-prod-rss-feedburner-all-partial&utm_cid=mash-prod-rss-feedburner-all-partial#8ajfsqytgpqf