You'll soon be able to control iPhone, iPad with your eyes, Apple says. Here's how
Apple's eye-tracking feature has been designed especially for people with physical disabilities, and the company says setting it up will be quick and easy.
Apple announced new accessibility features that give a sneak peek into the tech giant's AI plans. These include Eye Tracking, which uses artificial intelligence (AI) to help users control their iPhone and iPad using only their eyes. The feature has been designed especially for people with physical disabilities, and setting it up will be quick and easy: users just need to look at the front-facing camera for a few seconds to calibrate it, Apple said.
The feature works on both iPadOS and iOS and doesn't need any extra hardware or accessories, Apple said. Using Eye Tracking, users can navigate apps, activate on-screen elements with Dwell Control, press buttons, swipe, and perform other gestures.
Apple also announced another feature, Listen for Atypical Speech, which uses on-device machine learning to help Siri understand a wider range of voices.
Apple said, “These features combine the power of Apple hardware and software, harnessing Apple silicon, artificial intelligence, and machine learning to further Apple’s decades-long commitment to designing products for everyone.” The features will be available “later this year,” most likely with the iOS 18 and iPadOS 18 fall updates, the company said.