(Photo : Unsplash/William Hook) iOS

The VP and managing director of Apple Greater China, Ge Yue, talked about Apple's new machine learning project at the 2022 World Artificial Intelligence Conference in Shanghai. 

Yue highlighted the technology's benefits for accessibility and health, illustrating it with features on the Apple Watch, AirPods Pro, and other Apple products.

Apple's Machine Learning Projects

According to NPR, Yue said that machine learning plays an important role in Apple's goal of building products that help people innovate, create, and get the support they need daily.

Yue added that accessibility is one of Apple's core values and that the company wants to make products suitable for everyone.

Apple's machine learning can provide users with disabilities greater independence and convenience, including people who are visually impaired, hearing impaired, living with physical or motor disabilities, or cognitively impaired.

The most recent machine learning features added to Apple products are AssistiveTouch, eye-tracking, and VoiceOver.

Also Read: Apple's Ex-AI Exec, Who Disagreed with its Return-to-Work Rules, Heads Back to Google 

Apple Watch's AssistiveTouch

In 2021, Apple introduced the AssistiveTouch feature on the Apple Watch alongside eye-tracking support on the iPad.

These features support users with limited mobility, allowing those with upper-body limb differences to enjoy the Apple Watch's benefits without touching the display or controls.

AssistiveTouch uses built-in motion sensors, along with the optical heart rate sensor and on-device machine learning.

The device can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures.

AssistiveTouch also enables users with disabilities to easily answer incoming calls, control an onscreen motion pointer, access Notification Center, and more.

Yue said that AssistiveTouch combines on-device machine learning with data from the sensors to detect subtle differences in muscle movement and tendon activity, replacing display tapping, according to Patently Apple.
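At a high level, this is a pipeline that pairs raw motion-sensor samples with an on-device classifier. The Swift sketch below illustrates the idea using Apple's public Core Motion API; the 50 Hz sampling rate, the six-value feature vector, and the empty classifier stub are illustrative assumptions, not Apple's actual AssistiveTouch implementation.

```swift
import CoreMotion

// Minimal sketch: stream wrist motion samples and hand them to a
// (hypothetical) on-device gesture classifier.
final class GestureDetector {
    private let motionManager = CMMotionManager()

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // assumed 50 Hz sampling
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            // Acceleration and rotation rate are the kinds of signals a
            // pinch/clench classifier could consume as input features.
            let features = [
                motion.userAcceleration.x,
                motion.userAcceleration.y,
                motion.userAcceleration.z,
                motion.rotationRate.x,
                motion.rotationRate.y,
                motion.rotationRate.z,
            ]
            self.classify(features)
        }
    }

    private func classify(_ features: [Double]) {
        // A trained Core ML model would map a window of these features to
        // a gesture label (e.g., "pinch" or "clench") here. Stubbed out
        // because Apple's actual model is not public.
    }
}
```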

Eye-Tracking and VoiceOver

iPadOS will soon add support for eye-tracking devices, allowing users to control their iPad using only their eyes. 

In late 2022, compatible MFi devices will track where a person is looking on the screen; the pointer will move to follow the user's gaze, and extended eye contact will perform an action, such as a tap.
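That "extended eye contact" behavior is a dwell gesture: holding the gaze near one point long enough triggers a tap. The Swift sketch below shows one plausible way such dwell logic could work; the DwellSelector type, the one-second dwell time, and the 30-point tolerance are hypothetical values for illustration.

```swift
import CoreGraphics
import Foundation

// Minimal dwell-selection sketch: if the gaze point stays within a small
// radius for long enough, treat it as a tap.
final class DwellSelector {
    private let dwellDuration: TimeInterval = 1.0  // assumed dwell time
    private let tolerance: CGFloat = 30.0          // assumed allowed drift, in points
    private var anchor: CGPoint?
    private var anchorTime: Date?

    /// Call with each new gaze sample from the eye tracker.
    /// Returns true when a dwell completes and should act like a tap.
    func update(gaze: CGPoint, now: Date = Date()) -> Bool {
        if let anchor = anchor,
           let anchorTime = anchorTime,
           hypot(gaze.x - anchor.x, gaze.y - anchor.y) <= tolerance {
            if now.timeIntervalSince(anchorTime) >= dwellDuration {
                self.anchor = nil  // reset after firing
                return true        // dwell completed: perform the "tap"
            }
            return false           // still dwelling, not long enough yet
        }
        anchor = gaze              // gaze moved: restart the dwell timer
        anchorTime = now
        return false
    }
}
```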

Another machine learning feature is VoiceOver, which can be accessed on the iPhone. The feature can now give users more details about images, including text, table data, and other objects within them.

VoiceOver can also now describe a person's position in an image.

According to 9to5Mac, Apple will introduce new features for VoiceOver but did not specify a date. The tech giant plans to add Image Descriptions to VoiceOver for iPhone users with low vision or visual impairments.

With Image Descriptions, users can explore more details about the people, table data, text, and other objects seen within images.

Users can also navigate a photo of a receipt by row, column, and header. The feature can also describe a person's position relative to other objects within images, so users can relive memories in detail.
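Developers can approximate this kind of image exploration with Apple's public Vision framework, which offers on-device text recognition and image classification. The Swift sketch below combines the two into a simple description string; it illustrates the general technique, not how Apple's VoiceOver Image Descriptions are actually built, and the `describe(image:completion:)` helper and 0.8 confidence cutoff are assumptions.

```swift
import UIKit
import Vision

// Sketch: build a rough spoken-style description of an image by running
// on-device text recognition and image classification together.
func describe(image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage else { return }

    let textRequest = VNRecognizeTextRequest()      // finds text in the image
    let classifyRequest = VNClassifyImageRequest()  // labels objects and scenes

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([textRequest, classifyRequest])

        // Collect the best text candidate from each detected text region.
        let text = (textRequest.results ?? [])
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: " ")

        // Keep only high-confidence classification labels (assumed cutoff).
        let labels = (classifyRequest.results ?? [])
            .filter { $0.confidence > 0.8 }
            .map { $0.identifier }
            .joined(separator: ", ")

        completion("Contains: \(labels). Text: \(text)")
    }
}
```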

Related Article: Apple Spends $200M To Acquire Seattle-Based Turi, Boosts Machine Learning And Artificial Intelligence Capabilities 

This article is owned by Tech Times

Written by Sophie Webster
