Apple today announced a series of innovative accessibility features designed to empower users with disabilities and make technology more inclusive. These features, set to roll out later this year, include Eye Tracking, Music Haptics, Vocal Shortcuts, Vehicle Motion Cues, and more enhancements for visionOS.
Eye Tracking, powered by artificial intelligence, allows users with physical disabilities to control their iPad or iPhone using only their eyes. This groundbreaking feature uses the front-facing camera for quick setup and calibration, ensuring data privacy with on-device machine learning. Eye Tracking enables seamless navigation and interaction with various apps, providing users with greater independence and control.
Music Haptics offers a new way for users who are deaf or hard of hearing to experience music through vibrations and taps on their iPhone’s Taptic Engine. This feature works with millions of songs in the Apple Music catalog and will be available as an API for developers to integrate into their apps.
Vocal Shortcuts empowers users to create custom voice commands for Siri to launch shortcuts and complete complex tasks, while Listen for Atypical Speech enhances speech recognition for a wider range of speech patterns. These features, designed for users with speech-related conditions, offer greater customization and control.
Vehicle Motion Cues is a new feature that aims to reduce motion sickness for passengers in moving vehicles by providing visual cues on the screen that represent changes in vehicle motion. This helps to minimize sensory conflict and enhance comfort while using iPhone or iPad in a car.
CarPlay is also getting accessibility upgrades, including Voice Control, Color Filters, and Sound Recognition. Voice Control allows users to navigate and control CarPlay using only their voice, while Sound Recognition alerts drivers or passengers who are deaf or hard of hearing to car horns and sirens. Color Filters make the interface easier to see for users who are colorblind, and additional visual options include Bold Text and Large Text.
Apple Vision Pro, the company’s new spatial computing platform, will also receive several accessibility features, including systemwide Live Captions for live conversations and audio from apps. Live Captions will be available in FaceTime, along with the ability to move captions during Apple Immersive Video. Updates for vision accessibility will include Reduce Transparency, Smart Invert, and Dim Flashing Lights.
Additionally, Apple is introducing new initiatives to celebrate Global Accessibility Awareness Day, such as free sessions at select Apple Store locations to explore accessibility features, curated collections on Apple Books and the Apple TV app highlighting stories of people with disabilities, and more.
These new accessibility features, along with the existing ones, demonstrate Apple’s unwavering commitment to creating products that are accessible to everyone. The company continues to push the boundaries of technology to enhance the lives of users with disabilities, fostering a more inclusive and accessible digital world.
