While it's not usually the first thing everyone looks at after installing a new iOS software update, I'd give the new accessibility features priority in iOS 16 because there are some highly valuable tools that even users without disabilities can enjoy.
Of course, Apple created the new accessibility features with disabled people in mind, but more and more users rely on them because they can vastly improve the overall experience. What started as a project to make iPhones accessible to all users has become one of the iPhone's biggest advantages over other smartphones.
Accessibility features allow you to customize your iPhone and get the most out of it. You can force your iPhone to read for you, detect sounds, respond to voice commands, or take photos hands-free. And now, you can do even more with iOS 16's new assistive features.
Magnifier has a new Door Detection option, which helps blind and low-vision users locate entryways when they arrive at their destination. The tool can tell you how far away the door is, whether it's open or closed, how to open it (push it, turn the knob, pull the handle, etc.), what any signs say (like room numbers), what any symbols mean (like people icons for restrooms), and more.
Door Detection relies on the lidar scanner (light detection and ranging), which is only available on the iPhone 12 Pro and Pro Max, iPhone 13 Pro and Pro Max, iPad Pro 11-inch (2nd and 3rd generation), and iPad Pro 12.9-inch (4th and 5th generation).
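If you're curious how an app can tell whether a device has that hardware, here's a minimal Swift sketch (our illustration, not anything Apple ships in Magnifier) that checks for the lidar-backed scene depth that ARKit exposes:

```swift
import ARKit

// Sketch only: features like Door Detection are gated to lidar-equipped
// devices, and ARKit lets an app check for the scene-depth data the
// lidar scanner provides.
func deviceHasLidarDepth() -> Bool {
    // .sceneDepth frame semantics are only supported on devices with a
    // lidar scanner (the Pro-model iPhones and iPad Pros listed above).
    ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth)
}
```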
Another new feature in the Magnifier app is Image Descriptions. When you point your camera at something, it will show (or read aloud) detailed descriptions of what it sees. Unlike Door Detection, this feature is available to all iOS 16 users. It's not always accurate, but it should improve as development continues.
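There's no public API for Magnifier's Image Descriptions, but the Vision framework's image classifier gives a rough feel for how on-device descriptions can work. A hedged sketch (the 0.3 confidence cutoff is an arbitrary choice of ours):

```swift
import Vision

// Illustrative only: classify an image on-device and print likely contents,
// loosely analogous to what Image Descriptions does with the camera feed.
func describe(cgImage: CGImage) throws {
    let request = VNClassifyImageRequest()
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
    let labels = (request.results ?? [])
        .filter { $0.confidence > 0.3 } // keep only reasonably confident guesses
        .map { $0.identifier }
    print("Possible contents:", labels.joined(separator: ", "))
}
```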
Now that there are two more detection tools in Magnifier, a new Detection Mode menu is available that houses Door Detection, People Detection, and Image Descriptions.
If you don't have one of the iPhone or iPad models that support Door Detection (see above), which are also the only models that support People Detection, you can add Image Descriptions to your controls but not use the Detection Mode menu.
Magnifier also supports activities, which let you save your current Magnifier configuration, including the controls panel, camera, brightness, contrast, filters, and detection modes. That way, you can use specialized setups for a particular recurring task or situation. To save your current layout, use "Save New Activity" from the Settings cog. You can switch between layouts via the cog, too. In the Activities settings, you can delete or duplicate customized options.
One of the biggest new accessibility features is Live Captions, which helps people with hearing loss and anyone who can't hear audio on their iPhone for any reason. It works in phone and FaceTime calls, video calls in social media apps, streaming shows and other media, and even teleconferencing apps.
It's also possible to customize the font size, color, and background color for easier reading. You can even move the captions around like you can with the Picture in Picture player, and set their idle opacity. And if you use a Mac for calls, you can reply by typing and have your words read aloud in real time.
For now, Live Captions is available in the U.S. and Canada on iPhone 11 and later, iPads with the A12 Bionic and later, and Macs with Apple silicon. If you're worried about privacy, Apple promises that user information stays private since Live Captions are generated directly on the device, and captions won't show up when you take a screenshot.
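Live Captions itself isn't exposed to developers, but the Speech framework illustrates the same on-device privacy model: apps can demand that transcription never leaves the device. A minimal sketch, assuming an app that already has speech-recognition permission:

```swift
import Speech

// Illustrative sketch: force speech recognition to run entirely on-device,
// the same privacy idea behind Live Captions.
func makeOnDeviceRequest() -> SFSpeechAudioBufferRecognitionRequest? {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US")),
          recognizer.supportsOnDeviceRecognition else {
        return nil // on-device recognition isn't available for this locale/device
    }
    let request = SFSpeechAudioBufferRecognitionRequest()
    request.requiresOnDeviceRecognition = true // audio never leaves the device
    // Feed microphone buffers into `request` and start a task on `recognizer`
    // to receive live transcriptions.
    return request
}
```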
If you have an Apple Watch, you can use most of your paired iPhone's accessibility features to control it remotely, thanks to Apple Watch Mirroring.
With Apple Watch Mirroring, users can control their Apple Watch using the iPhone's assistive features, like Voice Control and Switch Control, and use inputs including voice commands, sound actions, head tracking, or external Made for iPhone switches as alternatives to tapping the Apple Watch display. Apple Watch Mirroring uses hardware and software integration, including advances built on AirPlay, to help ensure users who rely on these mobility features can benefit from unique Apple Watch apps like Blood Oxygen, Heart Rate, Mindfulness, and more.
Apple Watch Mirroring is available on Apple Watch Series 6 and later. To enable it, go to Settings –> Accessibility –> Apple Watch Mirroring, then toggle on the switch. Once connected, you can control your Watch completely through your iPhone.
You can now enable even more languages for VoiceOver, Speak Selection, and Speak Screen.
Dozens of new voices are also available for VoiceOver, Speak Selection, and Speak Screen, all optimized for assistive features. For English, new voices include Agnes, Bruce, Eloquence, Evan, Joelle, Junior, Kathy, Nathan, Noelle, Ralph, Vicki, and Zoe.
There are also novelty voices, including Albert, Bad News, Bahh, Bells, Boing, Bubbles, Cellos, Good News, Jester, Organ, Superstar, Trinoids, Whisper, Wobble, and Zarvox.
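If you're a developer wondering how apps tap into these voices, AVFoundation's speech synthesizer can list and use whatever voices are installed. A short sketch (which voices are present varies by device and Settings downloads, and the English fallback is our assumption):

```swift
import AVFoundation

// Sketch: list the installed English voices, then speak with one of them.
let voices = AVSpeechSynthesisVoice.speechVoices()
for voice in voices where voice.language.hasPrefix("en") {
    print(voice.name, voice.language)
}

let utterance = AVSpeechUtterance(string: "Hello from iOS 16.")
// "Zarvox" is one of the novelty voices named above; fall back to a default
// English voice if it isn't installed on this device.
utterance.voice = voices.first { $0.name == "Zarvox" }
    ?? AVSpeechSynthesisVoice(language: "en-US")

let synthesizer = AVSpeechSynthesizer() // keep a strong reference while speaking
synthesizer.speak(utterance)
```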
There are a few new options to work with in Settings –> Accessibility –> VoiceOver –> Activities –> Programming, the menu that lets you create groups of preferences for specific uses.
First is Typing Style, which lets you choose between Default, Standard, Touch, and Direct Touch. The second is Navigation Style, with Default, Flat, and Grouped choices. And the third is Braille Alert Messages, where you can pick Default, On, or Off. These options existed before, just not as per-activity settings.
When using VoiceOver in Apple Maps, you'll get automatic sound and haptic feedback to help you identify the starting point for walking directions.
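Apps can offer similar cues themselves. The sketch below is our illustration (not how Apple Maps actually implements it) of pairing a VoiceOver check with a haptic and a spoken announcement:

```swift
import UIKit

// Sketch: when VoiceOver is running, confirm a location with a haptic tap
// and a spoken announcement.
func signalStartingPointReached() {
    guard UIAccessibility.isVoiceOverRunning else { return }
    let generator = UINotificationFeedbackGenerator()
    generator.notificationOccurred(.success) // haptic "you're here" cue
    UIAccessibility.post(notification: .announcement,
                         argument: "Starting point for walking directions")
}
```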
If you have difficulty using a game controller, the new Buddy Controller feature lets a friend or care provider help you play a game. It works by combining two game controllers into one, so you can effectively play together as a single player. If this sounds familiar, that's because Xbox consoles offer a similar feature called Co-pilot.
In Settings –> Accessibility –> Siri, you'll find a new section called Siri Pause Time, which lets you set how long Siri waits for you to finish speaking. You can leave the default setting or choose Longer or Longest. This setting is perfect if Siri always seems to cut you off before you've finished.
Sound Recognition has been available since iOS 14, but in iOS 16, you can train your iPhone to recognize specific sounds from your environment. Go to Settings –> Accessibility –> Sound Recognition –> Sounds, and choose "Custom Alarm" or "Custom Appliance or Doorbell."
To delete custom alarms and sounds, swipe left on them from the Sounds menu. You can also tap "Edit," then the delete icon (red circle with a white line in the middle), and confirm with "Delete."
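The custom sounds you train in Settings aren't available to third-party apps, but Apple's SoundAnalysis framework shows the underlying idea of on-device sound classification. A hedged sketch using the built-in classifier:

```swift
import SoundAnalysis

// Illustrative only: receive classification results from an analyzer.
final class SoundListener: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard \(top.identifier) (confidence \(top.confidence))")
    }
}

func makeClassifyRequest() throws -> SNClassifySoundRequest {
    // .version1 is Apple's built-in classifier covering hundreds of common
    // sounds; attach the request to an SNAudioStreamAnalyzer fed with
    // microphone buffers to get live results.
    try SNClassifySoundRequest(classifierIdentifier: .version1)
}
```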
You're probably already used to the iPhone's dictation feature, but now you can use Spelling Mode in Voice Control to spell out a word letter by letter so there are no misunderstandings. Use it to dictate names, addresses, acronyms, and more. The feature is currently only available in US English.
Aside from Spelling Mode, Voice Control also gains several new commands.
In iOS 16, the Apple Books app comes with new themes and accessibility options. The app has been redesigned with a simplified interface, which also makes it more accessible. You can bold text and customize spacing for easier reading, and there are a few new themes to make the app easier on the eyes.
Previously, you couldn't ask Siri to end a phone or FaceTime call for you, but now you can by saying, "Hey Siri, hang up" while you're talking to someone. The downside is that the person on the other end will hear you say the command, but it's a great hands-free way to end a call. You can enable it in the Siri & Search settings or the Siri accessibility settings.
Apple's website says this feature is available on iPhones with the A12 Bionic and later, but the accessibility settings say it works with the iPhone 11 and newer. The former seems accurate since we could end a call on our iPhone XR, which has an A12 Bionic chip. The settings also say it's available on older iPhone models when using AirPods or Siri-enabled Beats headphones.
The auto-answer calls option is a great help to some users with disabilities. Still, there was one catch: it had to be turned on manually via Settings –> Accessibility –> Touch –> Call Audio Routing –> Auto-Answer Calls. Now, you can just say, "Hey Siri, turn on auto-answer" or "Hey Siri, turn off auto-answer." Besides iOS 16, it also works on watchOS 9.
Your iPhone can read incoming messages and notifications aloud, but the feature used to work only when paired with AirPods or Beats headphones. In iOS 16, it also works through your iPhone's speaker and with Made for iPhone hearing aids. It's an essential tool for anyone who can't pick up their iPhone to read the latest text or notification.
When you have Siri read out notifications, it avoids interrupting you and listens after reading each one so you can respond or take action without saying "Hey Siri." Siri will announce notifications from new apps that send Time Sensitive notifications or direct messages.
You can also set Siri to send a reply in supported apps without confirming it first.
If you use the Health app, you can now import your audiograms into it on your iPhone. Go to Browse –> Hearing –> Audiogram, then tap "Add Data." You can take a picture of your audiogram with the camera, choose an audiogram image from your Photos app, or upload an audiogram document from Files.
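On the developer side, HealthKit has modeled audiograms since iOS 13, which is what makes imports like this possible. A minimal sketch, assuming an app with HealthKit write access (the frequency and sensitivity values here are made up for illustration):

```swift
import HealthKit

// Sketch: build a HealthKit audiogram sample like the one the Health app
// creates when you import an audiogram.
func makeAudiogramSample() throws -> HKAudiogramSample {
    // One data point: hearing threshold at 1 kHz for each ear,
    // measured in decibels of hearing level (dB HL).
    let point = try HKAudiogramSensitivityPoint(
        frequency: HKQuantity(unit: .hertz(), doubleValue: 1_000),
        leftEarSensitivity: HKQuantity(unit: .decibelHearingLevel(), doubleValue: 15),
        rightEarSensitivity: HKQuantity(unit: .decibelHearingLevel(), doubleValue: 20)
    )
    let now = Date()
    return HKAudiogramSample(sensitivityPoints: [point],
                             startDate: now, endDate: now, metadata: nil)
}
```

A real import would include points across the full range of tested frequencies and then save the sample with an HKHealthStore.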