Apple details new iPhone features like door detection and live captions


Global Accessibility Awareness Day is Thursday, so Apple took to its newsroom blog this week to announce several major new accessibility features for the iPhone, Apple Watch, iPad, and Mac.

One of the most widely used will likely be Live Captions, which is coming to iPhone, Mac, and iPad. The feature displays AI-driven captions that update live for speech from any audio source on the device, whether the user is “on a phone or FaceTime call, using a video conferencing or social media app, streaming media content, or having a conversation with someone next to them.”

Text (which users can resize at will) appears at the top of the screen and scrolls as the subject speaks. Additionally, Mac users will be able to type responses and have them read aloud to other call participants. Live Captions will enter public beta on supported devices (“iPhone 11 and later, iPad models with A12 Bionic and later, and Mac with Apple Silicon”) later this year.

There is also door detection. This will unfortunately only work on iPhones and iPads with a LiDAR sensor (so the iPhone 12 Pro, iPhone 13 Pro, or recent iPad Pro models), but it seems useful for those who are blind or have low vision. It uses the device’s camera and AR sensors, in tandem with machine learning, to identify doors and audibly tell users where a door is, whether it’s open or closed, how it can be opened, and what signs or labels it might have.

Door detection will join person detection and image descriptions in a new “detection mode” for blind or visually impaired users in iOS and iPadOS. However, Apple’s blog post did not specify when this feature would launch.

Other accessibility additions that Apple says are imminent include 20 new VoiceOver languages, new hand gestures on Apple Watch, and a feature that lets players receive help from a “buddy” using another game controller without disconnecting their own. There are also new Siri and Apple Books customizations to expand accessibility for people with disabilities, sound recognition customizations, and Apple Watch screen mirroring to the iPhone, which lets Watch users access many accessibility features that are available on the iPhone but not on the Watch.

Tech enthusiasts often lament that smartphones (and personal tech in general) have become stagnant, without many exciting new developments. But that couldn’t be further from the truth for many people with disabilities. Google, Apple, and many researchers and startups have made significant progress, bringing powerful new accessibility features to mobile devices.

Announcement image by Apple

