Apple is set to launch three new accessibility functions to enable greater usability for people with disabilities.
The Assistive Access, Personal Voice and Point and Speak functions will be operational later this year, the company announced earlier this week.
Other features to help deaf or hard-of-hearing users and those with low vision across Apple products including Macs and iPads are also being introduced.
It comes as the company welcomed several guide dogs from the charity Guide Dogs to its store in Birmingham as part of Global Accessibility Awareness Day on Thursday.
The Point and Speak function allows those who are blind or have vision issues to point a camera at text and hear back what can be seen, helping people to navigate their visual environment.
Tommy Dean, technology development lead at Guide Dogs, said: “In today’s digitally driven world, Apple devices offer users with vision impairments the freedom to live life on their own terms.
“With inclusive design and comprehensive training, these devices become essential tools for independence.
“Guide Dogs is dedicated to enhancing our service delivery and empowering our service users to embrace the opportunities that technology offers, enabling them to live life on their own terms.”
As part of the launch, Apple and Guide Dogs delivered bespoke training to 13 vision rehabilitation specialists in iOS accessibility settings and features for individuals with vision loss on Thursday, after Guide Dogs delivered similar training to 85 staff members in March.
Siobhan Meade, digital technology content officer at Guide Dogs, said: “Technology is a part of our everyday lives and plays such an important role in making the world a much more accessible place.
“I use the Maps app daily to navigate the world through my fingertips along with my guide dog Marty.
“It’s great to be able to mark Global Accessibility Awareness Day with the training with Apple that will allow our specialists to continue supporting people with a vision impairment to use technology with confidence to live the life they choose.”
The Assistive Access feature distils apps including Camera, Photos, Music, Calls and Messages to their essential features, reducing cognitive load for users with cognitive disabilities.
Users will have access to high-contrast buttons and large text labels, as well as an emoji-only keyboard for people who prefer to communicate visually.
The Live Speech function will allow users to type what they want to say so that it can be spoken out loud during phone conversations, to help those who are losing or have lost their speech.
The Personal Voice feature also allows those who are losing their speech to keep a voice that sounds like them, made by recording 15 minutes of audio on an iPhone or iPad.