OpenAI, then Google, and now, finally, Apple. Yes, Apple has finally "officially" announced some new AI-based accessibility features for the iPhone, iPad, and Vision Pro. It has been a while since Apple announced any AI features, although it did unveil the M4 chipset during its 'Let Loose' event. Let's take a quick look at the new accessibility features Apple has announced.
Eye Tracking is one of the most interesting features, and one we previously saw on the Vision Pro; it has now made its way to the iPhone and iPad. It lets users control their devices with just their eyes: gaze at a button to highlight it, then hold that gaze for a few seconds to select it. Powered by the front-facing camera and on-device machine learning, the feature is designed especially for users with physical disabilities.
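Apple hasn't published how this dwell-based selection works internally, but the basic idea is easy to sketch. Here is a hypothetical, minimal version in Swift; every name below is an assumption for illustration, not Apple's API:

```swift
import Foundation

/// Hypothetical dwell-to-select logic: highlight whatever the user is
/// looking at, and "click" it once the gaze has rested long enough.
final class DwellSelector {
    private let dwellDuration: TimeInterval
    private var gazeStart: Date?
    private var currentTarget: String?

    init(dwellDuration: TimeInterval = 1.5) {
        self.dwellDuration = dwellDuration
    }

    /// Feed in the ID of whatever the user is looking at (nil if nothing).
    /// Returns a target ID once the gaze has dwelled on it long enough.
    func update(lookingAt target: String?, at now: Date = Date()) -> String? {
        guard let target = target else {
            // Gaze left all targets: reset the dwell timer.
            currentTarget = nil
            gazeStart = nil
            return nil
        }
        if target != currentTarget {
            // Gaze moved to a new target: restart the dwell timer.
            currentTarget = target
            gazeStart = now
            return nil
        }
        if let start = gazeStart, now.timeIntervalSince(start) >= dwellDuration {
            gazeStart = nil // fire once; a fresh dwell is needed to select again
            return target
        }
        return nil
    }
}
```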
Music Haptics is created with deaf and hard-of-hearing users in mind, letting them experience music along with everyone else. It uses the iPhone's Taptic Engine to play taps, textures, and refined vibrations that follow the audio of a song, and Apple says it will also make Music Haptics available as an API for developers.
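Apple hasn't detailed how Music Haptics maps audio to vibration, but its existing Core Haptics framework shows how an app can drive the Taptic Engine. A minimal sketch, which simply plays one tap whose strength follows an amplitude value (the mapping itself is an assumption):

```swift
import CoreHaptics

// Plays a single haptic tap scaled to an audio amplitude in 0...1.
// Real code would create the engine once and reuse it across taps.
func playTap(matching amplitude: Float) throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Map louder audio to a stronger, sharper tap.
    let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: amplitude)
    let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: amplitude)
    let tap = CHHapticEvent(eventType: .hapticTransient,
                            parameters: [intensity, sharpness],
                            relativeTime: 0)

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```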
Vocal Shortcuts, meanwhile, are basically custom phrases you record that Siri can recognise to launch shortcuts and complete tasks.
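Vocal Shortcuts itself is a system feature with no public implementation details, but the shortcuts it triggers can be exposed by apps. As a rough illustration, here is a minimal sketch using Apple's App Intents framework; the intent, title, and phrase are all hypothetical:

```swift
import AppIntents

// Hypothetical intent: the name and behavior are illustrative only.
struct StartFocusTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Focus Timer"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would start its timer here.
        return .result(dialog: "Focus timer started.")
    }
}

// Registers spoken phrases that Siri associates with the intent.
struct TimerShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartFocusTimerIntent(),
            phrases: ["Start my \(.applicationName) focus timer"],
            shortTitle: "Focus Timer",
            systemImageName: "timer"
        )
    }
}
```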
Then there's Vehicle Motion Cues, which Apple claims will help reduce motion sickness when using an iPhone on the go: animated dots at the edges of the screen represent the vehicle's motion, reducing the sensory conflict between what you see and what you feel.
The tech giant has also introduced a few features for CarPlay. There's Voice Control, which lets you navigate CarPlay and control apps with just your voice. There's also Sound Recognition, which deaf or hard-of-hearing users can turn on to be notified of car horns and sirens. And colourblind users can enable Colour Filters, which make the CarPlay interface visually easier to use.
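Apple hasn't said how CarPlay's Sound Recognition works internally, but its SoundAnalysis framework already ships a built-in classifier that can label sounds such as car horns, which suggests the general approach. A minimal sketch under that assumption (the audio buffer source and confidence threshold are illustrative):

```swift
import AVFoundation
import SoundAnalysis

// Receives classification results from the stream analyzer.
final class HornObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult else { return }
        // "car_horn" is one of the labels in Apple's built-in classifier.
        if let horn = result.classification(forIdentifier: "car_horn"),
           horn.confidence > 0.8 {
            print("Horn detected (confidence \(horn.confidence))")
        }
    }
}

func makeHornAnalyzer(format: AVAudioFormat) throws -> (SNAudioStreamAnalyzer, HornObserver) {
    // Apple's built-in sound classifier (iOS 15+).
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    let analyzer = SNAudioStreamAnalyzer(format: format)
    let observer = HornObserver() // keep a strong reference while analyzing
    try analyzer.add(request, withObserver: observer)
    // Feed microphone buffers via analyzer.analyze(_:atAudioFramePosition:).
    return (analyzer, observer)
}
```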
Apple will introduce some accessibility features on the Vision Pro too, namely Live Captions, Reduce Transparency, Dim Flashing Lights, and Smart Invert.
These features are expected to arrive with Apple's upcoming OS updates, namely iOS 18, iPadOS 18, and visionOS 2, which the company will likely preview at WWDC on June 10.