
Apple reveals new Accessibility features: AssistiveTouch for Apple Watch, eye-tracking on iPad, more

Apple has announced a variety of new software features coming soon to iPhone, iPad, and Apple Watch. All of these features focus on Accessibility, including AssistiveTouch for Apple Watch, eye-tracking support for iPad, and much more.

Apple says that AssistiveTouch for watchOS allows users to control their Apple Watch without ever having to touch the display. The company explains that this feature will be especially useful for users with limited mobility. Here are the details:

To support users with limited mobility, Apple is introducing a revolutionary new accessibility feature for Apple Watch. AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls.

Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench. AssistiveTouch on Apple Watch enables customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more.

Furthermore, iPadOS will soon add support for eye-tracking devices, which will allow users to control their iPad using their eyes.

iPadOS will support third-party eye-tracking devices, making it possible for people to control iPad using just their eyes. Later this year, compatible MFi devices will track where a person is looking onscreen and the pointer will move to follow the person’s gaze, while extended eye contact performs an action, like a tap.

VoiceOver, one of the most powerful Accessibility features for iPhone, is getting a major upgrade. VoiceOver can now give users more details on images, including people, text, table data, and other objects within images. VoiceOver can also now describe a person’s position in an image.

Apple is introducing new features for VoiceOver, an industry‑leading screen reader for blind and low vision communities. Building on recent updates that brought Image Descriptions to VoiceOver, users can now explore even more details about the people, text, table data, and other objects within images. Users can navigate a photo of a receipt like a table: by row and column, complete with table headers. VoiceOver can also describe a person’s position along with other objects within images — so people can relive memories in detail, and with Markup, users can add their own image descriptions to personalize family photos.

Finally, Apple has announced support for bidirectional hearing aids and support for recognizing audiograms:

In a significant update to the MFi hearing devices program, Apple is adding support for new bidirectional hearing aids. The microphones in these new hearing aids enable those who are deaf or hard of hearing to have hands-free phone and FaceTime conversations. The next-generation models from MFi partners will be available later this year.

Apple is also bringing support for recognizing audiograms — charts that show the results of a hearing test — to Headphone Accommodations. Users can quickly customize their audio with their latest hearing test results imported from a paper or PDF audiogram. Headphone Accommodations amplify soft sounds and adjust certain frequencies to suit a user’s hearing.

Apple does not provide a specific release date for these new features. The announcement comes ahead of Global Accessibility Awareness Day, which is tomorrow, May 20. The company also announced a new Background Sounds feature for iOS and a new Apple Store initiative called SignTime.

Apple says that additional features coming later this year include:

  • Sound Actions for Switch Control replaces physical buttons and switches with mouth sounds — such as a click, pop, or “ee” sound — for users who are non-speaking and have limited mobility.
  • Display and Text Size settings can be customized on an app-by-app basis for all supported apps, making the screen easier to see for users with colorblindness or other vision challenges.
  • New Memoji customizations better represent users with oxygen tubes, cochlear implants, and a soft helmet for headwear.




Author

Chance Miller

Chance is an editor for the entire 9to5 network and covers the latest Apple news for 9to5Mac.

Tips, questions, typos to chance@9to5mac.com