The accessibility features Apple and Google include in their mobile software can help people of all abilities get more from their devices.
J.D. Biersdorfer writes the monthly Tech Tip column and the weekly interactive Book Review Quiz Bowl for The Times.
Smartphones have gradually become more useful for people with a range of physical abilities, thanks to tools like screen readers and adjustable text sizes.
With the recent release of Apple’s iOS 16 and Google’s Android 13 software, even more accessibility features have been introduced or upgraded, including improved live transcription tools and apps that use artificial intelligence to identify objects. When these tools are enabled, your phone can send you a visual alert when a baby is crying, for example, or an audio alert when you’re approaching a door.
And many accessibility tools, old and new, make using the phone easier for everyone. Here’s a tour.
On either an iOS- or Android-based phone, open the Settings app and select Accessibility to find all of the tools and features available. Take time to explore and experiment.
For full details, the websites of both Apple and Google have dedicated Accessibility sections, but note that the exact features you see will vary with your software version and phone model.
Swiping and tapping by hand to navigate a phone’s features doesn’t work for everyone, but iOS and Android provide several ways to move through the screens and menus, including quick-tap shortcuts and gestures to perform tasks.
These controls (like Apple’s AssistiveTouch tools and its Back Tap function, which performs assigned actions when you tap the back of the phone) are in the iOS Touch settings.
Android’s accessibility shortcuts offer similar options. One way to reach them is to open the main Settings icon, select System, then Gestures, and then System Navigation.
Both platforms support navigation through third-party adaptive devices like Bluetooth controllers or by using the camera to recognize facial expressions assigned to actions, like looking to the left to swipe left. These devices and actions can be configured in the iOS Switch Control and Head Tracking settings, or in Google’s Camera Switches and Project Activate apps for Android.
Apple and Google provide several tools for those who can’t see the screen. Apple’s iOS software offers the VoiceOver feature, and Android has a similar tool called TalkBack; both provide audio descriptions of what’s on the screen (like your battery level) as you move your finger around.
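For readers who build apps, these screen readers are reachable through public programming interfaces. Here is a minimal Swift sketch of the Apple side, assuming an iOS app using UIKit; the announcement text is just an example, and Android offers analogous TalkBack-facing APIs.

```swift
import UIKit

// Check whether VoiceOver is running, then hand the screen reader
// a phrase to speak aloud. The battery message is illustrative.
if UIAccessibility.isVoiceOverRunning {
    UIAccessibility.post(notification: .announcement,
                         argument: "Battery at 80 percent.")
}
```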
Turning on the iOS Voice Control or Android’s Voice Access option lets you control the phone with spoken commands. Enabling the iOS Spoken Content or Android’s Select to Speak setting has the phone read aloud what’s on the screen — and can be helpful for audio-based proofreading.
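Under the hood, read-aloud features rely on the system’s text-to-speech engine, which developers can call directly. A minimal Swift sketch using Apple’s AVSpeechSynthesizer; the sample sentence and voice are placeholders:

```swift
import AVFoundation

// Speak a string aloud with the system text-to-speech engine.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Here is the text on your screen.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")  // placeholder voice
utterance.rate = AVSpeechUtteranceDefaultSpeechRate
synthesizer.speak(utterance)
```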
Don’t forget a few classic methods of hands-free interaction with your phone. Apple’s Siri and the Google Assistant can open apps and perform actions with spoken commands. And Apple’s Dictation feature (in the iOS Keyboard settings) and Google’s Voice Typing function let you write text by speaking.
In their Accessibility settings, iOS and Android include shortcuts to zoom in on sections of the phone screen. But if you’d generally like bigger, bolder text and other display adjustments, open the iPhone’s Settings icon, choose Accessibility and select Display & Text Size. In Android, go to Settings, then Accessibility, and choose Display Size and Text.
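These text-size settings work best in apps that opt into them. On Apple’s side, that mechanism is called Dynamic Type; here is a minimal Swift sketch of how an app’s text can follow your chosen size (the label text is illustrative):

```swift
import UIKit

// A label that tracks the user's preferred text size via Dynamic Type.
let label = UILabel()
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true  // update when the setting changes
label.text = "This text scales with your settings."
```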
The Magnifier app, Apple’s digital magnifying glass for enlarging objects in the camera’s view, has been upgraded in iOS 16. The app’s new functions are designed to help people who are blind or have low vision use their iPhones to detect doors and people nearby, as well as to identify and describe objects and surroundings.
Magnifier’s results are spoken aloud or displayed in large type on the iPhone’s screen. The door-and-people detection uses the device’s LiDAR (light detection and ranging) scanner to calculate distance and requires an iPhone 12 or later.
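That same depth hardware is exposed to third-party apps through Apple’s ARKit framework, which offers one quick way to confirm whether a given iPhone can measure distance at all. A minimal Swift sketch, assuming an app that links ARKit:

```swift
import ARKit

// Devices with a LiDAR scanner support per-pixel scene depth.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    print("Depth sensing available (LiDAR on board).")
} else {
    print("No LiDAR scanner on this device.")
}
```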
To set up your preferences, open the Magnifier app and select the Settings icon in the lower left corner; if you can’t find the app on your phone, it’s a free download in the App Store. The Magnifier is just one of many vision tools in iOS, and Apple’s site has a guide to setting up the app on the iPhone and iPad.
Google’s recently updated Lookout assisted-vision app (a free download in the Play Store) can identify currency, text, food labels, objects and more. Google introduced Lookout in 2018, and it works on Android 6 and later.
Both platforms offer controls to amplify speech around you through your headphones. In iOS, go to the Audio/Visual section for Headphone Accommodations. In Android, visit the Sound Amplifier setting.
With the iOS 16 update, Apple includes Live Captions, a real-time transcription feature that converts audible dialogue around you into text onscreen. Android’s Accessibility toolbox includes the Live Caption setting that automatically captions videos, podcasts, video calls and other audio media playing on your phone.
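The same speech-to-text machinery is available to app makers through Apple’s Speech framework. Here is a minimal Swift sketch of live, on-device transcription from the microphone; it assumes the user has already granted microphone and speech-recognition permission:

```swift
import AVFoundation
import Speech

// Stream microphone audio into an on-device speech recognizer and
// print the rolling transcription, caption-style.
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))!
let request = SFSpeechAudioBufferRecognitionRequest()
request.requiresOnDeviceRecognition = true  // keep audio on the phone

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)
input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    request.append(buffer)
}

recognizer.recognitionTask(with: request) { result, _ in
    if let result = result {
        print(result.bestTranscription.formattedString)
    }
}
engine.prepare()
try engine.start()
```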
Google’s free Live Transcribe & Notification Android app converts nearby speech to onscreen text, and can also provide visual alerts when the phone recognizes sounds like doorbells or smoke alarms. The Sound Recognition tool in the iPhone’s Hearing section of the Accessibility settings does the same. And check your phone’s settings for multisensory notifications, like LED flash alerts or vibrating alerts, so you don’t miss a thing.
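Sound-recognition features like these are built on on-device audio classifiers, and Apple exposes one through its SoundAnalysis framework. A minimal Swift sketch, again assuming microphone permission; the confidence threshold shown is an arbitrary choice:

```swift
import AVFoundation
import SoundAnalysis

// Receives classification results as audio streams in.
class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }  // arbitrary threshold
        print("Heard: \(top.identifier)")
    }
}

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)
let analyzer = SNAudioStreamAnalyzer(format: format)
let observer = SoundObserver()

// The built-in classifier recognizes a few hundred everyday sounds.
let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
try analyzer.add(request, withObserver: observer)

input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
    analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
}
engine.prepare()
try engine.start()
```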