Apple's New AI-Based Accessibility Features Are Pretty Wild

Adding AI to accessibility offers so many new options

  • Apple has announced the accessibility features for the upcoming iOS 18.
  • You'll be able to control your iPhone and iPad using only your eyes.
  • Plus, you may no longer get sick when reading on the bus.
[Image: a person in a wheelchair using iPad eye control at a standing desk. Caption: Control your iPad with just your eyes in iPadOS 18. Credit: Apple]

iOS 18 will let you control the iPhone and iPad with just your eyes and can even stop you from getting sick when you read on the bus or in a car.

Apple has previewed the new accessibility features in iOS 18 and iPadOS 18, and it's an impressive lineup, relying on AI and clever design. Apple isn't the only one, either: to coincide with Global Accessibility Awareness Day, Google has also outlined new AI-based accessibility features, although those are higher-concept and possibly less useful in everyday life than Apple's. And one feature has gotten the biggest reaction when I've discussed this topic with friends: the car-sickness mode.

"AI will have a tremendous impact on accessibility. It represents a generational opportunity to create equity in ways both big and small. AI-driven technologies can adapt to individual needs in real time, offering personalized solutions that were previously unimaginable. For example, eye-tracking features in iOS 18 are a game-changer for individuals with mobility impairments, allowing them to control their devices with just their eyes. This level of customization and responsiveness is only possible through advanced AI," Mike Nellis, founder of Quiller.ai and CEO of Authentic, who describes himself as a "disabled worker," told Lifewire via email.

AI Accessibility

iOS 18's accessibility features aren't all about AI, but AI is notable by its presence, sitting there in the typically understated way that Apple prefers. The most impressive is the eye-tracking feature, which lets you control your devices with just your eyes.

Presumably Apple learned a lot about tracking eyes from developing the Vision Pro. It has not yet detailed how you will actually interact—the Vision Pro also watches your hands so you can use gestures—but I'm excited to see how it works. Perhaps it will also be useful for general hands-free use. You could control a music app while using your hands to play an instrument, for example.
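
Apple hasn't said what's under the hood here. But for a taste of the raw signal involved, ARKit already exposes a rough per-frame gaze estimate on devices with Face ID. Here's a minimal, purely illustrative sketch; the GazeTracker class is my own invention, not anything Apple has announced:

```swift
import ARKit

// Illustrative only: Apple hasn't detailed how iOS 18's Eye Tracking works.
// ARKit already reports a per-frame gaze estimate on Face ID devices, which
// gives a sense of the raw signal involved. GazeTracker is a made-up name.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is the estimated gaze target in face-anchor space.
            let gaze = face.lookAtPoint
            print("Gaze estimate:", gaze.x, gaze.y, gaze.z)
        }
    }
}
```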

Another neat addition is haptic feedback for music, turning your iPhone into a kind of rumble-pack for Apple Music (it looks like it only works with Apple's Music app so far). This is also pretty neat for musicians, giving the kind of physical feedback you usually only experience from acoustic instruments.
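
Apple hasn't said how Music Haptics is built, but the iPhone's Taptic Engine is already scriptable through the Core Haptics framework, which gives a sense of what's possible. As a hedged illustration, here's how an app can fire a single sharp "beat" tap today:

```swift
import CoreHaptics

// Illustrative only: Apple hasn't detailed how Music Haptics is implemented.
// This just shows Core Haptics, the existing framework that drives the
// iPhone's Taptic Engine, playing one sharp "beat" tap.
func playBeatTap() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6),
        ],
        relativeTime: 0
    )

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```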

Then, your iPhone (and Siri) will be able to learn how you speak, better understanding atypical speech patterns, and use that understanding to trigger Vocal Shortcuts. Apple says the new speech-recognition features use AI to get the job done.
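
Apple hasn't shared how that improved recognition is trained. For context, though, the existing Speech framework already supports on-device recognition, and a rough sketch of that plumbing looks like this (audioURL is a placeholder, and a real app needs the user's permission first):

```swift
import Speech

// Illustrative only: a sketch of today's on-device speech recognition via
// the Speech framework. The new atypical-speech training presumably goes
// well beyond this. `audioURL` is a placeholder; a real app must also call
// SFSpeechRecognizer.requestAuthorization first.
func transcribe(audioURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else { return }

    let request = SFSpeechURLRecognitionRequest(url: audioURL)
    request.requiresOnDeviceRecognition = true  // keep the audio on the phone

    _ = recognizer.recognitionTask(with: request) { result, _ in
        if let result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```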

Accessibility in Motion

Let's talk about motion sickness. Research, says Apple, shows that motion sickness can arise when what we see does not match what we feel. One example is being inside a ship that's tossed by the waves: your stomach lurches, but the ship's interior offers no visual cues about what's happening outside. Another is reading in the car or on the bus, where you're trying to focus on a non-moving page or screen while your body sways.

[Image: dots on an iPhone screen representing the motion of a vehicle. Caption: Motion Cues will help those who suffer from motion sickness. Credit: Apple]

Vehicle Motion Cues puts dots on the screen to represent the movement sensed by the iPhone or iPad and can be set to show automatically on the iPhone. I know a ton of people who cannot read in the car or on the bus, and they are extremely happy about this news.
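
Apple hasn't published how Vehicle Motion Cues works internally. As a back-of-the-envelope guess at the idea, here's how an app could read motion from Core Motion and nudge a dot against it; Apple's version presumably filters out hand tremor and much else that this sketch ignores:

```swift
import CoreMotion
import UIKit

// Guesswork, not Apple's implementation: a sketch of the basic idea behind
// motion cues. Core Motion reports the sensed acceleration, and the dot is
// nudged against it so it appears to move with the outside world.
final class MotionCueDot {
    private let motion = CMMotionManager()
    let dot = UIView(frame: CGRect(x: 0, y: 0, width: 12, height: 12))

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self, let a = data?.userAcceleration else { return }
            // Translate the dot opposite to the sensed acceleration.
            self.dot.transform = CGAffineTransform(
                translationX: CGFloat(-a.x * 40),
                y: CGFloat(a.y * 40)
            )
        }
    }
}
```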

"Addressing Apple's focus on conditions not traditionally viewed as disabilities, such as motion sickness, underscores their commitment to broader inclusive design. It shows an understanding that accessibility isn't just about addressing disabilities in the conventional sense but also improving the everyday experiences of all users," David Pumphrey, the CEO of Riveraxe LLC, told Lifewire via email.

Apple also hints at the future. The Vision Pro headset will offer live captions, so deaf and hard-of-hearing people will get real-time subtitles for real-life conversations. That's great, and if Apple ever makes a pair of Vision Pro glasses, it could be life-changing.

[Image: a person on a sofa, talking, with subtitles floating next to them. Caption: Subtitles for the real world. Credit: Apple]

"As a hard of hearing person living in a country where they don't speak my first language, it would change my world if Apple could get this to work on regular glasses," says designer and Apple user Graham Bower on Mastodon.

There's definitely a different vibe to this year's accessibility updates from Apple. The features seem bolder and more ambitious. That may be down to AI, in which case, we're here for it.
