
Apple unveils new accessibility features including eye-tracking

In a positive step forward for disability awareness, Apple’s latest updates will make navigating its devices easier via eye-tracking and open its software to a wider range of users.

Coinciding with Global Accessibility Awareness Day, Apple has announced several new updates specifically designed for people with physical disabilities.

These will include eye-tracking, music haptics, and vocal shortcuts.

‘We believe deeply in the transformative power of innovation to enrich lives,’ said CEO Tim Cook.

‘That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software.’

‘We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.’

Powered by artificial intelligence, eye-tracking will offer users a built-in option to control their iPhones and iPads with just their eyes.

No additional hardware or accessories are required, eliminating the need to purchase pricey add-on gadgets.

The front-facing camera and on-device machine learning will interpret eye movements, letting users carry out commands and trigger functions such as buttons, swipes, and other gestures with their eyes alone.

With this enabled, people can look at their screen to move through elements like apps and menus, then linger on an item to select it.
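The ‘linger to select’ mechanic is essentially a dwell timer. As a rough illustration of the idea – not Apple’s implementation, which is built into the operating system and exposes no gaze API to apps – a minimal Swift sketch might track whether an estimated gaze point stays within a small region long enough to count as a tap:

```swift
import UIKit

/// A minimal sketch of dwell-to-select logic: if the gaze stays within a small
/// region for long enough, treat it as a tap. The gaze source is hypothetical;
/// in practice the system supplies it from the front camera and on-device ML.
final class DwellSelector {
    private let dwellDuration: TimeInterval = 1.0   // seconds to linger before selecting
    private let tolerance: CGFloat = 30             // max drift (points) still counted as lingering

    private var anchorPoint: CGPoint?
    private var dwellStart: Date?

    /// Feed in the latest estimated gaze location in screen coordinates.
    /// Returns the point to "tap" once the dwell threshold is met.
    func update(gazePoint: CGPoint, at time: Date = Date()) -> CGPoint? {
        if let anchor = anchorPoint,
           hypot(gazePoint.x - anchor.x, gazePoint.y - anchor.y) <= tolerance {
            // Still lingering near the same spot; check whether the timer has elapsed.
            if let start = dwellStart, time.timeIntervalSince(start) >= dwellDuration {
                anchorPoint = nil   // reset so one dwell fires one selection
                dwellStart = nil
                return anchor
            }
        } else {
            // Gaze moved to a new region; restart the dwell timer there.
            anchorPoint = gazePoint
            dwellStart = time
        }
        return nil
    }
}
```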

For users who are deaf or hard of hearing, music haptics will use the iPhone’s Taptic Engine vibration system (which enables haptic feedback for actions like clicking and typing) to ‘play taps, textures and refined vibrations to the audio of the music’ for supported Apple tracks.
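For a sense of how apps drive the Taptic Engine, Apple’s public Core Haptics framework can play timed ‘taps’ of varying intensity and sharpness. The snippet below is only illustrative – Music Haptics itself is a system-level setting, not something built from this API – but it shows the kind of transient events the engine renders:

```swift
import CoreHaptics

// Play three transient "taps" on the Taptic Engine, each softer than the last.
// Illustrative only: the Music Haptics feature is a system setting, not this code.
func playTapPattern() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Three taps spaced 0.25 s apart, fading in intensity.
    let events = (0..<3).map { i in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0 - Float(i) * 0.3),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5),
            ],
            relativeTime: Double(i) * 0.25
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```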

And, finally, vocal shortcuts will use built-in speech recognition so users can set a ‘custom utterance’ – it doesn’t need to be an understandable voice command or statement – to launch various actions through Siri, such as setting a timer, calling someone, getting directions, taking a screenshot, and scrolling.
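Vocal shortcuts is likewise a system feature, but the underlying idea can be sketched with Apple’s public Speech framework: stream microphone audio into an on-device recognizer and fire an action whenever the running transcript contains the user’s chosen phrase. Everything here is an approximation – in particular, the real feature can match utterances that aren’t transcribable words:

```swift
import AVFoundation
import Speech

/// A rough sketch of the vocal-shortcut idea using the public Speech framework:
/// listen to the microphone and run `action` when the live transcript contains a
/// user-chosen phrase. (Apple's actual feature runs at the system level and can
/// match non-word sounds; this is only an approximation.)
final class UtteranceTrigger {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func listen(for utterance: String, action: @escaping () -> Void) throws {
        request.requiresOnDeviceRecognition = true  // keep recognition on-device

        // Stream microphone buffers into the recognition request.
        let input = audioEngine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024, format: input.outputFormat(forBus: 0)) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            // Fire once the custom phrase shows up anywhere in the transcript so far.
            if let text = result?.bestTranscription.formattedString,
               text.localizedCaseInsensitiveContains(utterance) {
                action()
            }
        }
    }
}
```

In practice an app would also need microphone access and speech-recognition permission, requested via SFSpeechRecognizer.requestAuthorization, before starting the audio engine.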

Siri will also gain a feature called ‘listen for atypical speech,’ which will use on-device machine learning to recognise a wider range of speech patterns.

‘Each year, we break new ground when it comes to accessibility,’ Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said in a statement.

‘These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.’
