Accessibility and UX/navigation for Voice devices/interfaces

I’ve just started a new role working with Voice devices/apps/skills, and as accessibility is close to my heart, I’m really fascinated by how we can make the apps/skills we build as accessible as possible. There’s some interesting work on letting Alexa read sign language: https://www.fastcompany.com/90202730/this-clever-app-lets-amazon-alexa-read-sign-language but I’m wondering what things people are doing to make voice interfaces accessible.
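One practical habit on the skill-building side, in case it's useful: always pair the spoken response with a reprompt and a card. Here's a minimal sketch using the ASK SDK v2 for Node.js; the skill name and wording are placeholders, not from a real project:

```ts
import * as Alexa from 'ask-sdk-core';

// Greets the user on launch. Accessibility touches:
// - speak(): the spoken response, kept short and plainly worded
// - reprompt(): re-asks if the user doesn't answer in time, which helps
//   people who need longer to respond
// - withSimpleCard(): mirrors the speech as text in the Alexa companion
//   app, where it can be read visually or with a screen reader
const LaunchRequestHandler: Alexa.RequestHandler = {
  canHandle(handlerInput) {
    return Alexa.getRequestType(handlerInput.requestEnvelope) === 'LaunchRequest';
  },
  handle(handlerInput) {
    const speech =
      "Welcome. You can ask me to read today's headlines. What would you like?";
    return handlerInput.responseBuilder
      .speak(speech)
      .reprompt('What would you like? For example, say: read the headlines.')
      .withSimpleCard('My Skill', speech) // placeholder skill name
      .getResponse();
  },
};

// Standard Lambda entry point for a custom skill.
export const handler = Alexa.SkillBuilders.custom()
  .addRequestHandlers(LaunchRequestHandler)
  .lambda();
```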

From playing about with Alexa on my Echo Dot, voice control is pretty impressive. It’s amazing how quickly the technology has come on in recent years.
I was trying to think of situations where it would be really useful, beyond asking Alexa to play a different song or playlist, which is mostly what I use it for.
Then I thought voice control in the car would be the ideal place for it, since the driver doesn’t have to use their hands to control anything and can use their voice instead.
As I don’t drive, I’m not sure what progress is being made with in-car voice control, but it seems the ideal setting for it.
Voice control must also be ideal for users who can’t easily use a standard input interface like a keyboard or touchscreen, which would include people with visual impairments, elderly users, and people with other disabilities.

The project I am currently working on requires the interface to be ADA (Americans with Disabilities Act) compliant. While there are many screen readers on the market, not all of them are easy to use or broadly compatible, so it’s worth knowing what each one can and can’t read in terms of fonts, colors, icons, etc. There is also overlap between visual and hearing impairments that limits which fonts, colors, and icons can be used, both for compatibility with the screen reader and for staying within the government guidelines.
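On the color side, the usual yardstick is the WCAG 2.x contrast-ratio formula, which the US government guidelines (Section 508) point to. A quick sketch in TypeScript; the function names are my own, but the luminance and ratio math comes straight from the WCAG definition:

```ts
// Relative luminance of an sRGB color per WCAG 2.x.
function relativeLuminance(r: number, g: number, b: number): number {
  // r, g, b are 0-255; linearize each channel before weighting
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colors: (lighter + 0.05) / (darker + 0.05).
// WCAG AA requires at least 4.5:1 for normal text, 3:1 for large text.
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number],
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: mid-gray (#777777) text on white comes out around 4.48:1,
// just under the 4.5:1 AA threshold for normal text.
console.log(contrastRatio([119, 119, 119], [255, 255, 255]).toFixed(2));
```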