Apple on Tuesday morning announced a new wave of accessibility features for its various computing platforms, which are set to roll out later this year as software updates for the iPhone, iPad, Mac and Apple Watch.
Apple said it will test a Live Captions feature that can transcribe any audio content — FaceTime calls, video conferencing apps (with automatic attribution to identify speakers), streaming video or in-person conversations — in English across the iPhone, iPad and Mac. Google's push for Live Caption started around the release of Android 10; it is now available in English on Pixel 2 and later devices, on "select" other Android phones, and in additional languages on the Pixel 6 and Pixel 6 Pro. So it's good to see the Apple ecosystem catching up and bringing the capability to more people.
As on Android, Apple says the captions will be generated on the user's device, keeping the information private. The beta will launch later this year in the US and Canada for the iPhone 11 and later, iPads with the A12 Bionic chip and later, and Macs with Apple Silicon.
Apple Watch will expand the AssistiveTouch gesture controls it added last year with quick actions that recognize a double pinch to end a call, dismiss notifications, take a photo, pause media, or start a workout. To learn more about what the gesture controls actually do, we explain how to use your Apple Watch hands-free here.
Apple Watch will also become easier to use for people with physical and motor disabilities thanks to a new mirroring feature that adds remote control from a paired iPhone. Apple Watch Mirroring incorporates technology drawn from AirPlay, making it easier to access the watch's unique features without relying on the ability to tap its small screen or on what voice controls alone can do.
Apple introduced Sound Recognition in iOS 14 to detect certain sounds, such as smoke alarms or sirens, and alert users who are deaf or hard of hearing. Soon, Sound Recognition will support tuning, allowing custom sounds to be recognized: it can listen for repeated alerts and learn to flag sounds specific to the user's environment, such as an unusual doorbell or an appliance's chime.
New improvements to the VoiceOver screen reader and the Speak Selection and Speak Screen features will add support for more than 20 new languages and locales, covering Arabic (World), Basque, Bengali (India), Bhojpuri (India), Bulgarian, Catalan, Croatian, Farsi, French (Belgium), Galician, Kannada, Malay, Mandarin (Liaoning, Shaanxi, Sichuan), Marathi, Shanghainese (China), Spanish (Chile), Slovenian, Tamil, Telugu, Ukrainian, Valencian, and Vietnamese. On the Mac, VoiceOver's new Text Checker will scan for formatting issues like extra spaces or stray capitalization, while in Apple Maps, VoiceOver users can expect new sound and haptic feedback indicating where to begin walking directions.
At Apple, we design for accessibility from the ground up and continually innovate on behalf of our users. The cutting-edge features we’re sharing today will provide new ways for people with disabilities to move around, communicate, and more. https://t.co/Zrhcng95QA
— Tim Cook (@tim_cook) May 17, 2022
Apple also says on-device processing will use the lidar sensor and cameras on an iPhone or iPad to detect doors. The new iOS feature will help users locate a door on arriving at a new destination, tell them how far away it is, and describe whether it opens with a knob or a handle, as well as whether it is open or closed.
This will all be part of a Detection Mode that Apple is adding to Magnifier in iOS, which also gathers existing features that let the camera zoom in and describe nearby objects, or identify people nearby and alert the user with sounds, speech, or haptic feedback. Because it depends on the lidar sensor, Door Detection will require an iPhone Pro or iPad Pro model that includes one.
Another new feature on the way is Buddy Controller, which combines two game controllers into one unit so a friend can help someone play a game, similar to the Copilot feature on Xbox.
Finally, other tweaks include a Voice Control spelling mode with letter-by-letter input, controls to adjust how long Siri waits before responding to requests, and new visual customizations for Apple Books that can bold text, change themes, or adjust line, character, and word spacing for easier reading.
The announcements come as part of Apple's recognition of Global Accessibility Awareness Day on May 19 this week. The company notes that Apple Store locations will offer live sessions to help people learn more about existing features, and a new Accessibility Assistant shortcut will come to Mac and Apple Watch this week to recommend specific features based on a user's preferences.