Apple’s new eye-tracking feature, arriving with iOS 18, lets users control their device using only their eyes. It looks like an exceptionally useful step forward for accessibility on iPhones and iPads, but you might be wondering how it actually works.
Let’s chat eye tracking.
How does Apple’s iOS 18 eye tracking work?
Apple’s eye-tracking controls work by utilising the selfie camera, tracking your eye movements with it after some quick setup. To turn your eyes into a pointer, Apple says the processing is handled by on-device machine learning.
“These features combine the power of Apple hardware and software, harnessing Apple silicon, artificial intelligence and machine learning to further Apple’s decades-long commitment to designing products for everyone,” Apple claims.
The company claims that calibration data never leaves the device, and that no additional hardware is required, whether you’re using an iPad or an iPhone. That’s seriously impressive; I originally assumed Apple’s eye tracking relied on the 3D depth sensors that sit beside the front-facing camera on some devices, but it’s actually a combination of sophisticated software and the selfie camera itself.
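Apple hasn’t detailed how the feature works under the hood beyond the quotes above, so any code here is strictly illustrative rather than Apple’s actual implementation. The closest thing developers can experiment with today is ARKit’s face tracking, which on supported devices produces a rough gaze estimate (lookAtPoint) from the front-facing camera alone. A minimal Swift sketch of reading that estimate:

```swift
import ARKit

// Illustrative only: this is NOT how the iOS 18 accessibility feature is built,
// but ARKit face tracking is the closest public API that estimates gaze from
// the selfie camera. Requires a camera usage description in Info.plist.
final class GazeSketch: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Bail out on devices where ARKit face tracking isn't supported.
        guard ARFaceTrackingConfiguration.isSupported else {
            print("ARKit face tracking isn't supported on this device")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever ARKit updates its anchors, including the face anchor.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint is ARKit's estimate of where the eyes converge,
        // expressed in face-anchor coordinates (metres).
        let gaze = face.lookAtPoint
        print("Estimated gaze point: \(gaze.x), \(gaze.y), \(gaze.z)")
    }
}
```

That raw estimate lives in face-anchor space and jitters around; the accessibility feature presumably layers calibration, smoothing and dwell control on top of a signal like this to make it usable as a pointer.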
That said, while the feature arrives with iOS 18, it’s not coming to every device that can run iOS 18.
When you turn the feature on in Settings, you’ll go through a short calibration process. After that, you can navigate your phone using only your eyes.
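Apple also hasn’t said how the calibration step works, but conceptually its job is to map the camera’s raw gaze estimates onto the known positions of the on-screen targets you look at during setup. Here’s a hypothetical, minimal sketch of that idea in Swift, fitting a simple per-axis linear mapping by least squares (the type and approach are illustrative, not Apple’s):

```swift
import simd

// Hypothetical sketch only: Apple hasn't published its calibration method.
// This shows the general idea of calibration, mapping raw gaze estimates onto
// known on-screen target positions with an independent scale/offset per axis.
struct GazeCalibration {
    private(set) var scale = SIMD2<Double>(1, 1)
    private(set) var offset = SIMD2<Double>(0, 0)

    /// - Parameters:
    ///   - rawSamples: gaze estimates recorded while the user looked at each target.
    ///   - targets: the known screen positions of those calibration targets.
    mutating func fit(rawSamples: [SIMD2<Double>], targets: [SIMD2<Double>]) {
        precondition(rawSamples.count == targets.count && rawSamples.count >= 2)
        for axis in 0..<2 {
            let xs = rawSamples.map { $0[axis] }
            let ys = targets.map { $0[axis] }
            let n = Double(xs.count)
            let meanX = xs.reduce(0, +) / n
            let meanY = ys.reduce(0, +) / n
            // Least-squares slope and intercept for this axis.
            var covXY = 0.0
            var varX = 0.0
            for i in 0..<xs.count {
                covXY += (xs[i] - meanX) * (ys[i] - meanY)
                varX += (xs[i] - meanX) * (xs[i] - meanX)
            }
            scale[axis] = varX == 0 ? 1 : covXY / varX
            offset[axis] = meanY - scale[axis] * meanX
        }
    }

    /// Map a fresh raw gaze estimate to an estimated screen point.
    func screenPoint(for raw: SIMD2<Double>) -> SIMD2<Double> {
        raw * scale + offset
    }
}
```

Apple’s pipeline is certainly more sophisticated than a linear fit, but this is the basic reason a calibration step exists at all: it ties the camera’s raw gaze signal to real points on your screen.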
What devices are getting Apple’s eye-tracking tech?
iOS 18 is coming to devices as old as the iPhone XR, but eye tracking isn’t coming to every one of them. Of the mainline models, the iPhone 12 and later (along with the 2nd and 3rd generation iPhone SE) will get Apple’s eye-tracking technology, which means the iPhone XR and iPhone 11 models miss out.
Source: Gizmodo (AU) – https://gizmodo.com.au/2024/06/apples-ios-18-eye-tracking-explained/