
Apple unveiled new accessibility features coming to the iPhone, iPad, Mac, Apple Watch and Vision Pro this fall with iOS 19 and other software updates.

Every year ahead of its Worldwide Developers Conference (WWDC), Apple previews a slew of accessibility features coming to its software platforms in the fall. It did so in 2023 and 2024, and this year is no exception: ahead of Global Accessibility Awareness Day 2025 on May 15, the company today announced new accessibility features that will be publicly available to all customers this fall alongside the iOS 19, iPadOS 19, macOS 16, watchOS 12 and visionOS 3 software updates.
The announcement mentions a new systemwide reading mode, dubbed Accessibility Reader, an expansion of the Live Captions feature to the Apple Watch, an Enhanced View for the Vision Pro headset and a bunch of other updates.
Other new accessibility features announced today include accessibility nutrition labels on the App Store, a new Magnifier app for the Mac, the new Braille Access feature, the systemwide Accessibility Reader and more, in addition to a bunch of updates to existing features such as Live Listen, Background Sounds, Personal Voice, Vehicle Motion Cues and others.
Apple says the new features take advantage of the latest advances in on-device machine learning and artificial intelligence. “Powered by the Apple ecosystem, these features work seamlessly together to bring users new ways to engage with the things they care about most,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives.
Accessibility nutrition labels
In December 2020, Apple brought privacy nutrition labels to the App Store to highlight how apps use your data, and the concept is now expanding to accessibility. Accessibility nutrition labels will add a dedicated section to App Store product pages listing the accessibility features an app or game supports, such as VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduced Motion, captions and more. The labels will help people decide whether an app will be accessible to them before they download it.
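These labels describe support that developers build with Apple’s existing accessibility APIs. For context, here’s a minimal SwiftUI sketch of the kind of VoiceOver support a label would advertise; the playback control itself is hypothetical, but the modifiers are standard SwiftUI:

```swift
import SwiftUI

// A hypothetical playback control showing the kind of VoiceOver
// support an accessibility nutrition label would advertise.
struct PlaybackControls: View {
    @State private var isPlaying = false

    var body: some View {
        Button {
            isPlaying.toggle()
        } label: {
            Image(systemName: isPlaying ? "pause.fill" : "play.fill")
        }
        // VoiceOver speaks this label instead of describing the icon.
        .accessibilityLabel(isPlaying ? "Pause" : "Play")
        // An optional hint explains what activating the control does.
        .accessibilityHint("Plays or pauses the current track.")
    }
}
```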
Magnifier comes to the Mac
Magnifier, a versatile accessibility tool that turns your iPhone or iPad into a magnifying glass for zooming in on and detecting objects around you, has been available on those devices since 2016, and it’s now coming to the Mac. Arriving as a new app in macOS 16, Magnifier for Mac leverages Continuity Camera, the existing feature that lets you use your iPhone as a webcam for your Mac; it requires either Continuity Camera or an external USB camera, as the built-in webcam is unsupported.
Like on your iPhone, Magnifier lets you zoom in on objects like a whiteboard or a screen. Interestingly, Magnifier supports Desk View, a lesser-known macOS feature that uses Continuity Camera to show a top-down view of your desk, mimicking an overhead camera. With Desk View, Magnifier allows you to read documents and books on your desk with automatic perspective correction.
As Apple explains: “With multiple live session windows, users can multitask by viewing a presentation with a webcam while simultaneously following along in a book using Desk View.”
Magnifier for the Mac supports custom views that you can capture, group and save to return to later, along with image controls such as brightness, contrast and color filters. Magnifier also works with Accessibility Reader, another new accessibility feature that transforms text from the physical world into a custom legible format.
Say hello to Accessibility Reader
Accessibility Reader is a new systemwide reading mode in iOS 19, iPadOS 19, macOS 16 and visionOS 3 that makes it easier to read text in apps and in the real world. The latter is possible thanks to Accessibility Reader’s integration with the Magnifier feature, which lets you interact with text in physical books, on dining menus, on receipts and so forth. Accessibility Reader offers extensive options to customize text, including font, color and spacing. It also supports the Spoken Content feature, so you can have your device speak the text aloud.
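Spoken Content is a system feature, but apps can offer the same speak-the-text capability through AVFoundation’s speech synthesis. Here’s a minimal sketch; the wrapper class and hard-coded language are illustrative, not part of Apple’s announcement:

```swift
import AVFoundation

// Minimal sketch of speaking arbitrary text aloud, the capability the
// Spoken Content feature exposes systemwide.
final class TextSpeaker {
    // Keep a strong reference for as long as speech should continue.
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        // Placeholder voice; pick one that matches the text’s language.
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }
}
```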
Live Captions on the Apple Watch
Live Captions, an iPhone, iPad and Mac feature that shows real-time transcriptions of your voice chats on FaceTime and elsewhere, is coming to your wrist. Real-time Live Captions in watchOS 12 work with Live Listen, an existing feature that turns your iPhone into a remote microphone that streams audio directly to your AirPods.
Live Captions will automatically transcribe what your iPhone hears and display it on your wrist, and your Apple Watch becomes a remote control for Live Listen, with options to start, stop or resume sessions. Live Listen also supports Apple’s hearing health features available on the AirPods Pro 2, including Hearing Aid.
Introducing Braille Access
The iPhone, iPad, Mac and Vision Pro already support braille entry directly on the screen, as well as Bluetooth braille displays, to help you read and navigate. With the new Braille Access feature in iOS 19, iPadOS 19, macOS 16 and visionOS 3, your device doubles as a full-featured braille note taker and an app launcher that lets you open apps by typing with Braille Screen Input or a connected braille device.
With Braille Access, you can also perform calculations using Nemeth Braille, a braille code commonly used in classrooms for math and science, and open Braille Ready Format (BRF) files to read books and documents created on a braille note-taking device. Apple also mentions integration with Live Captions, letting you transcribe conversations in real time directly on braille displays.
Enhanced vision accessibility on Vision Pro
Vision Pro owners who are blind or have low vision will love an improved systemwide Zoom feature in visionOS 3, which lets you magnify anything in your real-world view. The Live Recognition capability in VoiceOver now uses on-device machine learning to describe your surroundings, find objects, read documents and more. Apple is also introducing a new API that gives developers access to the headset’s main camera for live person-to-person assistance, which the company says will be great for visual interpretation in apps like Be My Eyes.
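Apple hasn’t documented the new camera API yet. As a rough idea of what main-camera access looks like on visionOS today, here’s a sketch modeled on ARKit’s existing enterprise-only CameraFrameProvider; treat the entitlements and exact signatures as assumptions until the new API ships:

```swift
import ARKit

// Sketch of streaming main-camera frames on visionOS, modeled on the
// existing enterprise CameraFrameProvider. The newly announced API may
// differ; this is an assumption, not the documented interface.
func streamMainCameraFrames() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    // The user must grant camera access before frames are delivered.
    _ = await session.queryAuthorization(for: [.cameraAccess])
    try await session.run([provider])

    // Pick a supported video format for the left main camera.
    guard let format = CameraVideoFormat
            .supportedVideoFormats(for: .main, cameraPositions: [.left])
            .first,
          let updates = provider.cameraFrameUpdates(for: format) else {
        return
    }

    for await frame in updates {
        if let sample = frame.sample(for: .left) {
            // sample.pixelBuffer is the raw CVPixelBuffer an app like
            // Be My Eyes could forward to a remote assistant.
            _ = sample.pixelBuffer
        }
    }
}
```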
Accessibility improvements
In addition to the new accessibility features outlined above, iOS 19 and other updates launching this fall will bring a host of enhancements to the existing features.
Background Sounds: This feature, which plays calming sounds like crashing ocean waves or white noise in the background, is gaining EQ settings and the much-needed option to stop after a period of time. That will be great for those who use Background Sounds to fall asleep but don’t want the selected sound playing all night. Apple says some people find this feature helpful for their tinnitus symptoms.
Personal Voice: This feature is now faster and leverages AI to create “a smoother, more natural-sounding voice in less than a minute, using only ten recorded phrases,” with support for Spanish (Mexico) in the works.
Vehicle Motion Cues: This feature, which helps you use your iPhone in a moving car without getting motion sick, is coming to the Mac. It also gains options to customize the animated onscreen dots.
Eye Tracking: Eye Tracking on the iPhone and iPad will let you use a switch or dwell to make selections. Keyboard typing with Eye Tracking will be easier thanks to a new keyboard dwell timer, reduced steps when typing with switches, and support for QuickPath typing on the iPhone and Vision Pro.
Head Tracking: Apple has improved the accuracy and reliability of head tracking, saying you’ll be able to “more easily control” your iPhone and iPad with head movements.
Switch Control: Switch Control in iOS 19, iPadOS 19 and visionOS 3 now supports Brain Computer Interfaces (BCIs) to let you control your device using brain signals.
Assistive Access: tvOS 19 picks up a new custom Apple TV app with a simplified media player, and a new Assistive Access API lets developers build tailored experiences for the feature (see the sketch after this list).
Music Haptics: This iPhone-only feature that lets you experience music with haptics will bring new customization options in iOS 19, like turning on haptics for vocals only and changing the intensity of taps, textures and vibrations.
Sound Recognition: This feature can now recognize when someone is calling your name.
Voice Control: This feature will finally sync vocabularies across your devices via iCloud. Also, language support will expand to include Korean, Arabic (Saudi Arabia), Turkish, Italian, Spanish (Latin America), Mandarin Chinese (Taiwan), English (Singapore) and Russian.
Live Captions: This one’s gaining support for English (India, Australia, UK, Singapore), Mandarin Chinese (Mainland China), Cantonese (Mainland China, Hong Kong), Spanish (Latin America, Spain), French (France, Canada), Japanese, German (Germany) and Korean.
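Apple hasn’t published details of the Assistive Access API either. One plausible shape, based on the announcement, is a dedicated SwiftUI scene an app supplies for the simplified mode; the AssistiveAccess scene below is an assumption, not documented API:

```swift
import SwiftUI

// Hypothetical sketch of the announced Assistive Access API. The
// `AssistiveAccess` scene type is assumed from Apple’s announcement;
// the documented interface may differ.
@main
struct ExampleApp: App {
    var body: some Scene {
        // The app’s regular interface.
        WindowGroup {
            ContentView()
        }

        // A simplified interface for Assistive Access: fewer controls,
        // larger targets, one clear action per screen.
        AssistiveAccess {
            SimplifiedContentView()
        }
    }
}

struct ContentView: View {
    var body: some View { Text("Full experience") }
}

struct SimplifiedContentView: View {
    var body: some View {
        Text("Simplified experience").font(.largeTitle)
    }
}
```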
Large Text on CarPlay, accessibility settings sharing
Apple says CarPlay users will be able to turn on the new Large Text option to make onscreen text bigger and easier to read. Also, Sound Recognition in CarPlay will notify you of the sound of a crying baby and outside sounds like horns and sirens.
Last but not least, you will be able to temporarily share your accessibility settings with another iPhone or iPad, which should be useful when borrowing a friend’s device or using a public kiosk in a setting like a cafe.