Apple Debuts Eye Tracking, Music Haptics, and Vocal Shortcuts Among New Accessibility Features

Today, Apple unveiled accessibility enhancements set to launch later this year. These include Eye Tracking, which lets individuals with physical disabilities navigate their iPad or iPhone using only their eyes, and Music Haptics, which gives users who are deaf or hard of hearing a new way to experience music through the Taptic Engine in iPhone.

Vocal Shortcuts will let users perform tasks with custom spoken commands, while Vehicle Motion Cues can help reduce motion sickness when using an iPhone or iPad in a moving vehicle. Additional accessibility features are also coming to visionOS.

These features draw on Apple hardware and software, including Apple silicon, artificial intelligence, and machine learning, and reflect Apple's longstanding dedication to making products for everyone.

“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

Eye Tracking Comes to iPad and iPhone

Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the device's front-facing camera to set up and calibrate in seconds, and relies on on-device machine learning so that the data used to set up and control the feature stays on the device.

This feature seamlessly operates across both iPadOS and iOS applications without the need for extra hardware or attachments. 

With Eye Tracking, users can navigate the elements of an app and use Dwell Control to activate each one, accessing additional functions such as physical buttons, swipes, and other gestures with their eyes alone.
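Eye Tracking is built into the operating system and requires no app-level code, but for readers curious how gaze estimation with the front-facing camera works in principle, the following Swift sketch reads an estimated gaze point using ARKit's existing face tracking on supported devices. It is purely illustrative and is not Apple's Eye Tracking implementation; the class name and the print statement are assumptions for the example.

```swift
import ARKit

// Minimal sketch: estimating gaze with the front-facing TrueDepth camera via
// ARKit face tracking. Illustrative only; not Apple's Eye Tracking feature,
// which is built into the OS and needs no app-level code.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // `lookAtPoint` is the estimated gaze target in face-anchor space;
            // all processing happens on device.
            print("Gaze point:", face.lookAtPoint)
        }
    }
}
```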

Music Haptics Enhances Accessibility of Songs

A new feature called Music Haptics gives people who are deaf or hard of hearing a new way to experience music on iPhone. When turned on, it uses the Taptic Engine to create rhythms, textures, and subtle vibrations that follow the audio of the music.

Music Haptics works with many songs in the Apple Music catalog, and it will also be available as an API so developers can make the music in their own apps more accessible to people who are deaf or hard of hearing.
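Apple has not detailed the developer-facing Music Haptics API in this announcement, but as a rough illustration of how haptic playback already works on iPhone, the following Swift sketch uses the existing Core Haptics framework to play a short rhythmic pattern. The pattern values are invented for illustration; this is not the Music Haptics API itself.

```swift
import CoreHaptics

// Minimal Core Haptics sketch: plays a short rhythmic pattern on devices
// that support haptics. Illustrative only; not the Music Haptics API.
func playRhythmPattern() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Four transient "beats" of decreasing intensity, spaced 0.25 s apart.
    let events: [CHHapticEvent] = (0..<4).map { beat in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity,
                                       value: 1.0 - Float(beat) * 0.2),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: TimeInterval(beat) * 0.25
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```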

New Features for a Broader Range of Speech Needs

With Vocal Shortcuts, iPhone and iPad users will be able to create custom phrases that Siri understands to launch shortcuts and complete complex tasks. Another new feature, Listen for Atypical Speech, improves speech recognition for a wider range of speech patterns.

Listen for Atypical Speech uses on-device machine learning to identify a user's unique speech characteristics. Designed for people with conditions that affect speech, these features provide more control and customization, building on the speech features introduced in iOS 17 for users with speech challenges.
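Apple has not published the machinery behind Vocal Shortcuts or Listen for Atypical Speech, but as a rough sketch of what on-device speech recognition already looks like on iOS, the following Swift example uses the existing Speech framework with recognition constrained to the device. The class name, function name, and locale are assumptions for the example.

```swift
import Speech

// Minimal sketch of on-device speech recognition using the existing Speech
// framework. Illustrative only: this is not the implementation behind
// Vocal Shortcuts or Listen for Atypical Speech.
final class OnDeviceTranscriber {
    // Hypothetical helper; the locale is an assumption for the example.
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))

    func transcribe(fileAt url: URL) {
        SFSpeechRecognizer.requestAuthorization { [weak self] status in
            guard status == .authorized,
                  let recognizer = self?.recognizer,
                  recognizer.supportsOnDeviceRecognition else { return }

            let request = SFSpeechURLRecognitionRequest(url: url)
            request.requiresOnDeviceRecognition = true  // keep audio and results on device

            _ = recognizer.recognitionTask(with: request) { result, error in
                if let result, result.isFinal {
                    print("Transcript:", result.bestTranscription.formattedString)
                } else if let error {
                    print("Recognition error:", error.localizedDescription)
                }
            }
        }
    }
}
```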

“Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers,” said Mark Hasegawa-Johnson, principal investigator of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign.

“The Speech Accessibility Project was designed as a broad-based, community-supported effort to help companies and universities make speech recognition more robust and effective, and Apple is among the accessibility advocates who made the Speech Accessibility Project possible.”

Vehicle Motion Cues Can Help Reduce Motion Sickness

Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles.

Studies indicate that sensory conflicts between visual perception and physical sensations often trigger motion sickness, hindering some users from comfortably using their iPhone or iPad while in transit. 

With animated dots at the edges of the screen, Vehicle Motion Cues represents changes in vehicle motion, helping reduce sensory conflict without interfering with the main content.

Using the built-in sensors in iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to appear automatically on iPhone, or turned on and off in Control Center.
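Apple has not said how Vehicle Motion Cues detects movement beyond "built-in sensors," but as an illustration of how software can already tell that a device is likely in a moving vehicle, the following Swift sketch uses Core Motion's activity classification. The class name and confidence threshold are assumptions for the example.

```swift
import CoreMotion

// Minimal sketch: detecting that the device is likely in a moving vehicle
// using Core Motion's activity classification. Illustrative only; Apple has
// not said this is how Vehicle Motion Cues works.
final class VehicleDetector {
    private let activityManager = CMMotionActivityManager()

    func start(onChange: @escaping (Bool) -> Void) {
        guard CMMotionActivityManager.isActivityAvailable() else { return }

        activityManager.startActivityUpdates(to: .main) { activity in
            guard let activity else { return }
            // `automotive` is true when the classifier believes the user
            // is in a car, bus, or similar vehicle.
            onChange(activity.automotive && activity.confidence != .low)
        }
    }

    func stop() {
        activityManager.stopActivityUpdates()
    }
}
```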

CarPlay Introduces Voice Control and Additional Accessibility Enhancements

CarPlay will soon incorporate new accessibility functionalities, including Voice Control, Color Filters, and Sound Recognition. Voice Control enables users to navigate CarPlay and manage applications solely through voice commands. Sound Recognition allows drivers or passengers who are deaf or hard of hearing to be alerted to car horns and sirens.

Additionally, Color Filters enhance the CarPlay interface's visual accessibility, providing a more user-friendly experience for those with color blindness. Other visual accessibility features such as Bold Text will also be available.

Upcoming Accessibility Features for visionOS

This year, visionOS will gain new accessibility features, including systemwide Live Captions, so users, including those who are deaf or hard of hearing, can follow spoken dialogue in live conversations and in audio from apps.

With Live Captions for FaceTime, more users can effortlessly engage and collaborate using their Persona. Apple Vision Pro will also offer the ability to reposition captions via the window bar during Apple Immersive Video, alongside extended support for Made for iPhone hearing devices and cochlear hearing processors. 

Updates for vision accessibility will include Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision, or those who want to avoid bright lights and frequent flashing.

These additions complement the numerous accessibility features already present in Apple Vision Pro, which boasts a versatile input system and user-friendly interface catering to a diverse audience. 

Functions like VoiceOver, Zoom, and Color Filters facilitate spatial computing access for individuals who are blind or have low vision, while features like Guided Access offer support for users with cognitive impairments. 

Vision Pro can be operated using a combination of eyes, hands, or voice, with accessibility tools such as Switch Control, Sound Actions, and Dwell Control aiding individuals with physical disabilities.

“Apple Vision Pro is without a doubt the most accessible technology I’ve ever used,” said Ryan Hudson-Peralta, a Detroit-based product designer, accessibility consultant, and co-founder of Equal Accessibility LLC.

“As someone born without hands and unable to walk, I know the world was not designed with me in mind, so it’s been incredible to see that visionOS just works. It’s a testament to the power and importance of accessible and inclusive design.”

Further Enhancements

  • For individuals who are blind or have low vision, VoiceOver will introduce new voices, a flexible Voice Rotor, customizable volume control, and the ability to personalize VoiceOver keyboard shortcuts on Mac devices.
  • Magnifier will incorporate a new Reader Mode and provide the convenience of launching Detection Mode quickly with the Action button.
  • Braille users will benefit from a streamlined method to initiate and remain in Braille Screen Input for quicker control and text editing. Additionally, Japanese language support will be available for Braille Screen Input, along with compatibility for multi-line braille with Dot Pad and the option to select different input and output tables.
  • Hover Typing will cater to low-vision individuals by displaying larger text in their preferred font and color while typing in a text field.
  • Personal Voice will now support Mandarin Chinese for those at risk of losing their ability to speak. Users with difficulty pronouncing or reading complete sentences can create a Personal Voice using abbreviated phrases.
  • For users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions.
  • Virtual Trackpad for AssistiveTouch will empower users with physical disabilities to control their devices using a resizable trackpad located within a small screen region.
  • Switch Control will include the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.
  • Voice Control will enhance its capabilities with support for custom vocabularies and complex words.

Join Apple in Commemorating Global Accessibility Awareness Day

This week, Apple will unveil new features, handpicked collections, and more in honor of Global Accessibility Awareness Day:

  • In May, specific Apple Store locations will offer complimentary sessions to assist customers in exploring and accessing built-in accessibility features across their favorite products. Apple Piazza Liberty in Milan will showcase the creators of “Assume that I Can,” a viral campaign for World Down Syndrome Day. Additionally, Apple Store locations will continue to provide Today at Apple group reservations year-round, offering opportunities for friends, families, schools, and community groups to learn about accessibility features together.
  • Shortcuts adds Calming Sounds, which plays ambient soundscapes to minimize distractions and help users focus or rest.
  • The App Store will spotlight remarkable apps and games that promote access and inclusion for all, including the App Store Award-winning game Unpacking, AAC tools, and more.
  • The Apple TV app will spotlight groundbreaking creators, performers, and activists sharing the experiences of individuals with disabilities under this year's theme, Remaking the World, encouraging viewers to envision a world where everyone can contribute to the broader human narrative.
  • Apple Books will feature curated collections of firsthand narratives by disabled writers, available in ebook and audiobook formats, highlighting lived experiences of disability.
  • Apple Fitness+ will offer workouts, meditations, and trainer tips incorporating American Sign Language for users who are deaf or hard of hearing. Time to Walk now includes transcripts in the Apple Podcasts app. Additionally, Fitness+ workouts will include Audio Hints for users who are blind or have low vision, along with modifiers to ensure inclusivity across all fitness levels.
  • Explore Apple Support to learn about customizing Apple devices using built-in accessibility features, including gesture adaptation and screen information presentation customization. The Apple Accessibility playlist offers guidance on personalizing Apple Vision Pro, iPhone, iPad, Apple Watch, and Mac to suit individual needs.