I just heard that touchscreen devices like the iPhone are really useful for blind users. What do I need to know?
Sherpa Derek Featherstone answers:
This really is an exciting time for people with disabilities. Touchscreen interfaces on iOS, Android, and even Windows Phone continue to get better in terms of accessibility. And it isn’t just blind people who benefit from those interfaces. For example, a touchscreen doesn’t take a lot of strength to operate, so people with limited hand or arm strength do well with these interfaces too. (Of course, there’s a lot more to these mobile operating systems than the touchscreen, and more people benefit than blind users alone, but that was your question, so I’ll limit it to that for now.)
Here’s how I’d love for you to think of it: a person who is blind is just trying to accomplish something using that touchscreen. They have a goal. They want to buy a movie or a sweater, or they want to read an article about their favourite video game. The touchscreen with VoiceOver on their iPhone is really just a tool to achieve that goal.
With user goals in mind, here are a few key things to know and understand:
The iPhone and other touchscreen devices use gestures instead of traditional keystrokes.
On a desktop computer with keyboard and screen reader, we might press the H key to move from one heading to the next on a page. Or we might pull up a list of all the headings that exist in a page.
By comparison, on touchscreens we use gestures instead of a traditional keyboard. We might use the rotor in iOS to switch to headings mode and then flick down to move to the next heading, or flick up to move to the previous heading.
(Incidentally, an iPhone user could connect an external Bluetooth keyboard and use more traditional keystrokes, but for simplicity, think of it more like the above.)
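If you build web content, this is why real heading markup matters: the desktop H key and the iOS rotor’s headings mode both walk the same list of headings in the page. As a rough, illustrative sketch in TypeScript (not any screen reader’s actual API), that list is essentially:

```typescript
// Illustrative only: heading navigation depends on real heading elements.
// A screen reader's heading list is roughly equivalent to this query.
const headings = Array.from(
  document.querySelectorAll<HTMLHeadingElement>("h1, h2, h3, h4, h5, h6")
);

for (const heading of headings) {
  // e.g. "H2: Customer reviews"
  console.log(`${heading.tagName}: ${heading.textContent?.trim()}`);
}
```

If your visual headings are just styled paragraphs, there’s nothing in that list to flick to.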
The iPhone’s VoiceOver screen reader is highly configurable.
There’s a decent amount of personalization and customization within VoiceOver. Options let you select how verbose the screen reader should be, whether or not you want it to speak tooltips or other help text hidden in the page, and which types of items VoiceOver should use when you’re navigating through a page.
You can even set up VoiceOver’s rotor to include ARIA landmarks as one of the items you can use for navigation. But keep in mind that people “out there in the real world” may not have a clue what landmarks are.
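For that rotor option to be useful, the page has to expose landmarks in the first place. Here’s a minimal, hypothetical sketch in TypeScript (the element IDs are made up for illustration); in practice, native HTML5 elements such as header, nav, and main expose the same landmarks automatically:

```typescript
// Hypothetical example: exposing ARIA landmarks that VoiceOver's rotor can list.
// The IDs below are placeholders for whatever containers your page uses.
const nav = document.querySelector<HTMLElement>("#site-nav");
if (nav) {
  nav.setAttribute("role", "navigation");      // announced as a "navigation" landmark
  nav.setAttribute("aria-label", "Main menu"); // label distinguishes it from other nav landmarks
}

const content = document.querySelector<HTMLElement>("#content");
if (content) {
  content.setAttribute("role", "main");        // the page's primary content landmark
}
```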
As developers, it’s important to remember that the experience you get when testing is different from what you’d get working with real people: their settings may differ from those in your testing setup. What’s the best way to learn about the impact of those settings? Test with real people.
And, speaking of ARIA …
ARIA support continues to evolve and improve.
This isn’t specific to the iPhone, but it’s an important aspect of today’s technology: ARIA (Accessible Rich Internet Applications) support in mobile browsers and screen readers is becoming more robust, and it’s increasingly something a VoiceOver user can rely on.
That doesn’t mean ARIA is without its shortcomings. For example, it is still not supported by desktop voice recognition software. So yes, go ahead and use ARIA, but be aware of where it works well, where it doesn’t, and what impact your choices have on the person who is trying to buy that sweater or read that article.
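As one concrete example of ARIA working well, here’s a hedged sketch of a simple disclosure (“show details”) control in TypeScript, with made-up element IDs. The ARIA state gives VoiceOver something meaningful to announce, while the native button element keeps things working for keyboard and voice users:

```typescript
// Hypothetical disclosure widget: aria-expanded tells VoiceOver whether the
// panel is open; using a real <button> keeps keyboard and voice input working.
const toggle = document.querySelector<HTMLButtonElement>("#details-toggle");
const panel = document.querySelector<HTMLElement>("#details-panel");

if (toggle && panel) {
  toggle.setAttribute("aria-controls", panel.id); // associates the button with the panel
  toggle.setAttribute("aria-expanded", "false");
  panel.hidden = true;

  toggle.addEventListener("click", () => {
    const isOpen = toggle.getAttribute("aria-expanded") === "true";
    toggle.setAttribute("aria-expanded", String(!isOpen)); // announced as "expanded"/"collapsed"
    panel.hidden = isOpen;                                 // hide when it was open, show otherwise
  });
}
```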
Where the iPhone has VoiceOver, Android has TalkBack, and Windows Phone 8 has Narrator and/or Mobile Accessibility.
I know you only mentioned the iPhone in your question, but it’s actually a good thing that users have many options. All platforms continue to improve their accessibility offerings, and this competition is good: useful features that first appear on one platform tend to make their way onto the others, which means more options for the people who actually need the software. That iOS, Android, and Windows Phone all include these tools at no additional cost makes it even better.