Apple on Wednesday announced new accessibility features, including Eye Tracking, Music Haptics, and Vocal Shortcuts, for iPhone and iPad that will arrive later this year.
According to the company, the Eye Tracking feature will provide a way for users with physical disabilities to control their iPad or iPhone with their eyes.
“We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users,” said Tim Cook, Apple’s CEO.
Additionally, the company mentioned that the Music Haptics feature will offer a new way for users who are deaf or hard of hearing to experience music using the Taptic Engine in iPhone.
The Vocal Shortcuts feature will allow users to perform tasks by making a custom sound, while Vehicle Motion Cues can help reduce motion sickness when using iPhone or iPad in a moving vehicle.
“These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives.
In addition, the company said that more accessibility features will come to visionOS as well.
These visionOS additions will include systemwide Live Captions, helping everyone, including users who are deaf or hard of hearing, follow spoken dialogue in live conversations and in audio from apps.
With Live Captions for FaceTime in visionOS, more users will be able to easily enjoy the unique experience of connecting and collaborating using their Persona.
Apple Vision Pro will also add the capability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone hearing devices and cochlear hearing processors, the tech giant stated.
(Except for the headline, this story has not been edited by DNA staff and is published from IANS)