
[Software] Apple packs iOS 14 with new accessibility features, like AirPods Pro audio tweaks



It's added headphone sound customizations, a quick launch feature called Back Tap and improvements to its popular Magnifier.


With iOS 14, Apple brings tons of new accessibility features to its devices, including some that people without disabilities could also find helpful. The list ranges from customizing Transparency mode in AirPods Pro to capturing multiple frames with the iPhone's Magnifier function. And the new Back Tap feature lets you tap the back of your iPhone to do things like take a screenshot.

Many of the new enhancements will likely appeal to people who are deaf or have hearing loss, while other features will benefit users who are blind or have low vision, expanding on Apple's efforts over the years to make its devices and software more accessible.

The improvements aren't just for iPhones and iPads. Apple Watch users now will have the option to configure accessibility features as they go through the process of setting up a watch, as well as turn on an extra-large watch face for bigger and bolder "complications" -- glanceable bits of info about things like the weather -- to help people with low vision see them better. 

On Monday, Apple unveiled iOS 14, iPadOS 14 and its other updated software during its annual Worldwide Developers Conference. The company uses WWDC to show off the biggest updates to its operating systems before making them available to all Apple device users later in the year. Right now, developers and other beta testers have access to early versions of the software so they can ready their apps and help Apple catch bugs before the updates roll out broadly. That includes the accessibility features.

The US Centers for Disease Control and Prevention estimates that a quarter of Americans live with some sort of disability. In the past, people with special needs had to shell out thousands of dollars for technology that magnified their computer screens, spoke navigation directions, identified their money and recognized the color of their clothes. Today, users only need smartphones, computers and a handful of apps and accessories to help them get through their physical and online worlds.

Apple has built accessibility features into its products for years. It offers technology to do things like help people with low vision navigate the iPhone's touchscreen or allow those with motor impairments to virtually tap on interface icons. It has a Made for iPhone program that certifies hearing aids that work with its devices, and two years ago, Apple gave users the ability to turn their iPhones and AirPods into remote microphones through its Live Listen feature. 

iOS 14, iPadOS 14, WatchOS 7 and its other upcoming software expand those offerings. 

 


 

Hearing features

Headphone Accommodations lets users adjust the frequencies of audio streamed through their AirPods Pro, second-generation AirPods, select Beats headphones and EarPods. Each person can customize the settings to what's right for them, either dampening or amplifying particular sounds. Users can set up to nine unique profiles (like one setting for movies and another for calls) that tap into three amplification tunings and three varying strengths.
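Apple doesn't expose Headphone Accommodations to developers, but the underlying idea of boosting or cutting particular frequency ranges is what a parametric equalizer does. Here's a minimal Swift sketch of that idea using AVFoundation's AVAudioUnitEQ; the file name and gain values are illustrative, and this is not Apple's implementation:

    import AVFoundation

    // A sketch of per-band amplification, the same idea Headphone
    // Accommodations applies system-wide. Band values are illustrative.
    func playWithCustomEQ() throws {
        let engine = AVAudioEngine()
        let player = AVAudioPlayerNode()
        let eq = AVAudioUnitEQ(numberOfBands: 2)

        // Boost soft high frequencies, where speech consonants live...
        eq.bands[0].filterType = .highShelf
        eq.bands[0].frequency = 4_000   // Hz
        eq.bands[0].gain = 6.0          // dB
        eq.bands[0].bypass = false

        // ...and gently cut low-frequency rumble.
        eq.bands[1].filterType = .lowShelf
        eq.bands[1].frequency = 120
        eq.bands[1].gain = -4.0
        eq.bands[1].bypass = false

        engine.attach(player)
        engine.attach(eq)

        // "podcast.m4a" is a hypothetical local file.
        let file = try AVAudioFile(forReading: URL(fileURLWithPath: "podcast.m4a"))
        engine.connect(player, to: eq, format: file.processingFormat)
        engine.connect(eq, to: engine.mainMixerNode, format: file.processingFormat)

        try engine.start()
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    }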

AirPods Pro Transparency mode gets its own unique benefit from Headphone Accommodations: the ability to customize how much of the surrounding environment you hear. Quiet voices can become crisper, and outside environmental sounds can become more detailed.

Sound Recognition makes it easier for people who are deaf to be aware of sound-based alerts, alarms and notifications. When an iPhone, iPad or iPod Touch picks up a particular type of sound or alert, it will send a notification to the user's device, including an Apple Watch. The system can detect alarms such as sirens, home smoke detectors and building fire alarms, as well as household noises like doorbell chimes, car horns, appliance beeps and running water. Apple also is working on detecting sounds from people or animals.
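Sound Recognition itself is a system feature with no public API, but Apple's SoundAnalysis framework gives developers the same kind of on-device sound classification. A hedged sketch follows; "DoorbellClassifier" is a hypothetical Core ML model standing in for whatever model a real app would ship:

    import AVFoundation
    import CoreML
    import SoundAnalysis

    // Streams microphone audio into an on-device sound classifier --
    // the developer-facing analogue of the system Sound Recognition
    // feature. "DoorbellClassifier" is a hypothetical Core ML model.
    final class SoundObserver: NSObject, SNResultsObserving {
        func request(_ request: SNRequest, didProduce result: SNResult) {
            guard let result = result as? SNClassificationResult,
                  let top = result.classifications.first,
                  top.confidence > 0.8 else { return }
            print("Heard \(top.identifier) (confidence \(top.confidence))")
            // A real app would post a local notification here.
        }
    }

    func startListening() throws {
        let engine = AVAudioEngine()
        let format = engine.inputNode.outputFormat(forBus: 0)

        let analyzer = SNAudioStreamAnalyzer(format: format)
        let model = try DoorbellClassifier(configuration: MLModelConfiguration()).model
        let observer = SoundObserver()
        try analyzer.add(SNClassifySoundRequest(mlModel: model), withObserver: observer)

        // Feed each microphone buffer to the analyzer as it arrives.
        // (A real app must keep strong references to analyzer and observer.)
        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        try engine.start()
    }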

Group FaceTime calls will now accommodate people who use sign language instead of speaking. Typically, in a group call, the person speaking appears more prominently to the other participants, with that person's video box becoming larger. With iOS 14, FaceTime will be able to detect when someone is signing and make that person's video window prominent.

The Noise app, introduced in last year's WatchOS 6, measures ambient sound levels to give users a sense of how loud their surroundings are. With WatchOS 7, customers will be able to see how loudly they're listening to audio through their headphones via their iPhone, iPod or Apple Watch. A hearing control panel shows in real time whether the audio is playing above the World Health Organization's recommended limit of 80 decibels for no more than about 40 hours a week. When the wearer reaches that safe weekly listening amount, the Apple Watch sends a notification.
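Those numbers imply a simple rule of thumb. Hearing-safety guidelines generally rest on an equal-energy assumption: every 3 dB increase in level halves the safe listening time. A small sketch of that arithmetic, anchored to the 80 dB / 40-hour figure cited above (the 3 dB exchange rate is a standard assumption, not a published Apple formula):

    import Foundation

    // Safe weekly listening time under the WHO guideline cited above
    // (80 dB for about 40 hours a week), assuming the standard
    // equal-energy rule: each +3 dB halves the allowable duration.
    func safeWeeklyHours(atDecibels level: Double) -> Double {
        let referenceLevel = 80.0   // dB
        let referenceHours = 40.0   // hours per week
        return referenceHours / pow(2.0, (level - referenceLevel) / 3.0)
    }

    safeWeeklyHours(atDecibels: 80)   // 40.0 hours
    safeWeeklyHours(atDecibels: 89)   // 5.0 hours (+9 dB = 2^3 times shorter)
    safeWeeklyHours(atDecibels: 95)   // 1.25 hours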

Real-Time Text lets people who have hearing difficulties or speech disabilities communicate using two-way text in real time while on a phone call. The iPhone has had RTT since 2017, but Apple has now made it simpler for users to multitask while interacting with calls and incoming RTT messages. They'll get notifications even when they're not in the phone app and don't have RTT conversation view enabled. 

 

Vision features

VoiceOver, Apple's technology that translates on-screen text into speech, gets some updates with iOS 14. It now taps into Apple's on-device machine learning and Neural Engine to recognize and audibly describe more of what's happening on screen -- even when third-party developers haven't enabled the ability in their apps. An iPhone or iPad will now automatically provide better recognition of more objects, images, text and controls displayed on a screen, and VoiceOver gives more natural and contextual feedback. When it comes to images or photos, VoiceOver now reads complete sentence descriptions to detail what's on the screen. And it automatically detects user interface controls like buttons, labels, toggles, sliders and indicators.
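That automatic recognition is a fallback; developers can still describe their interfaces explicitly through UIKit's UIAccessibility properties, which VoiceOver reads directly. A minimal sketch (the button and its strings are invented for illustration):

    import UIKit

    // Explicitly labeling a control for VoiceOver. When developers skip
    // this step, iOS 14's on-device recognition now fills in the gaps.
    let playButton = UIButton(type: .system)
    playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)

    playButton.isAccessibilityElement = true
    playButton.accessibilityLabel = "Play"                 // what it is
    playButton.accessibilityHint = "Plays the episode."    // what it does
    playButton.accessibilityTraits = .button               // how VoiceOver treats it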

Rotor, a gesture-based way to customize the VoiceOver experience, now can do more than before. The system already lets users make tweaks like adjust the speaking rate and volume, select special types of input such as braille or adjust how VoiceOver moves from one item to the next on the screen. WatchOS 7 brings the technology to Apple Watches, letting users customize characters, words, lines, headings and links. And with MacOS Big Sur, users can configure Rotors with preferred braille tables and access more options to adjust code while developing apps in Xcode. 
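Developers can also define their own rotor entries with UIKit's UIAccessibilityCustomRotor API. The sketch below adds a hypothetical "Headings" rotor that flicks between heading views; the headingViews array is an assumption for illustration, and real apps often get this behavior for free by marking views with the header trait:

    import UIKit

    // A custom VoiceOver rotor that jumps between heading views.
    // Users pick it with the rotor twist gesture, then flick up or down.
    func makeHeadingsRotor(for headingViews: [UIView]) -> UIAccessibilityCustomRotor {
        UIAccessibilityCustomRotor(name: "Headings") { predicate in
            guard let current = predicate.currentItem.targetElement as? UIView,
                  let index = headingViews.firstIndex(of: current) else {
                // No current position yet: start at the first heading.
                return headingViews.first.map {
                    UIAccessibilityCustomRotorItemResult(targetElement: $0, targetRange: nil)
                }
            }
            let next = predicate.searchDirection == .next ? index + 1 : index - 1
            guard headingViews.indices.contains(next) else { return nil }
            return UIAccessibilityCustomRotorItemResult(targetElement: headingViews[next],
                                                        targetRange: nil)
        }
    }

    // containerView.accessibilityCustomRotors = [makeHeadingsRotor(for: headings)]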

Apple's Magnifier technology, one of its most-used accessibility features, gets an upgrade with iOS 14 and iPadOS 14. It now lets users magnify more of the area they're pointing at and capture multi-shot freeze frames. They also can filter or brighten images for better clarity, and capture multiple images at once to make it simpler to review multipage documents or longer content. Magnifier also works with multitasking on the iPad.

Apple's new software expands support for Braille with Braille AutoPanning. It lets users pan across larger amounts of Braille text without needing to press a physical pan button on their external refreshable displays.

 

Back Tap

One accessibility feature that many people could end up using is Back Tap. The feature, found in iOS 14, lets iPhone users perform a variety of quick actions by double- or triple-tapping on the back of an iPhone. Users can turn on specific accessibility features or take a screenshot. They also can scroll, open Control Center, go to the home screen or open the app switcher.

One thing Back Tap doesn't easily do is launch the camera or take a photo. Users can configure those actions by first making a Siri Shortcut. The Shortcuts app, introduced two years ago, automates common and routine tasks. With Shortcuts, people have been able to create customized commands, like setting up a request that brings together a surf report, current weather, travel time to the beach and a sunscreen reminder, all by just saying, "Hey Siri, surf time." Those shortcuts can then be mapped to Back Tap in settings.
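There's no public API for Back Tap itself, but apps can make their actions available to the Shortcuts app by donating an NSUserActivity; users can then wrap the action in a shortcut and map it to a double or triple tap. A minimal sketch, where the activity type and strings are hypothetical:

    import UIKit
    import Intents

    // Donating an activity so it surfaces in the Shortcuts app (and can
    // then be mapped to Back Tap). The activity type must also be listed
    // under NSUserActivityTypes in Info.plist; all names are hypothetical.
    func donateSurfReportActivity(to viewController: UIViewController) {
        let activity = NSUserActivity(activityType: "com.example.surf-report")
        activity.title = "Check the surf report"
        activity.isEligibleForSearch = true
        activity.isEligibleForPrediction = true
        activity.suggestedInvocationPhrase = "Surf time"

        // Attaching the activity to an on-screen view controller donates it.
        viewController.userActivity = activity
        activity.becomeCurrent()
    }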
