I posted recently about how images shared through services like Twitter are often inaccessible to users with visual impairments because they lack meaningful alt text. In its March 28 release (version 6.50 for iOS), Twitter now gives some mobile users the option of including alt text. (No word on when the feature will be available for website users.)
To enable adding alt text to pictures you post from within the iOS Twitter app:
Go to your profile page
Tap the settings (gear) icon
Choose the “Accessibility” option
Turn on the “Compose image description” option
Save your settings
The first time you insert an image into a tweet, you will be prompted to “Describe this image for the visually impaired”.
Tap the “Add description” button to provide a meaningful explanation of the contents of the image.
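The same principle applies anywhere images are published on the web: a description only helps if it actually ships as the image's text alternative. As a rough sketch (the helper names are mine, not Twitter's API), a site storing user-supplied descriptions might render them like this:

```typescript
// Escape characters that are unsafe inside an HTML attribute.
function escapeHtml(text: string): string {
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

// Hypothetical helper: render an image with a user-supplied description
// as its alt text. An empty alt="" tells screen readers to skip the
// image; omitting the attribute entirely makes many of them announce
// the raw filename instead, which is worse than silence.
function renderImage(src: string, description?: string): string {
  const alt = description ? escapeHtml(description) : "";
  return `<img src="${escapeHtml(src)}" alt="${alt}">`;
}

console.log(renderImage("photo.jpg", "A golden retriever catching a frisbee"));
// <img src="photo.jpg" alt="A golden retriever catching a frisbee">
```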
I’ve been using Waze for a few months now, and every now and then, a bar shows up on the left side. I’d glance down and see that it showed “something” was estimated to happen (or last?) for a few minutes, but I could not figure out what. We have a hands-free law here, so I could not legally take my phone off its holder and look at it more closely.
The screen is probably 18″ from my eyes, and I wear polarized sunglasses, which make the app even harder to interpret when glancing down for fractions of a second.
It took using Waze as a passenger to see that the bar’s label also had the word “Jam” (traffic jam?) in a light blue font. However, other times I’ve been driving and the bar has no label, so I’m still not sure what it’s for!
This is an easy one. Change the font color to white and bold it so that the word “Jam” is just as visible as the time estimate. And always include a label to indicate why the bar is there.
Many of us encounter situations daily where we are not able to fully utilize our sense of hearing. Some people work in noisy warehouses or factories where headphones are mandatory; others work in office cubicles on computers that do not have speakers. Now imagine that were the case all the time, every day—no speech, no music, no warning sirens—no sound. That is the reality for an estimated 360 million people worldwide—over 5% of the population—who experience disabling hearing loss according to the World Health Organization’s 2015 fact sheet on deafness and hearing loss, including 2-4 of every 1000 people in the United States. This translates to more than 1,000,000 Americans over the age of five who are considered functionally deaf or “those identified as either unable to hear normal conversation at all, even with the use of a hearing aid, or as deaf.” The following are some accessibility points to consider in order to better support these users.
Use of Sounds
Relying on sound alone isolates and potentially endangers users who are functionally deaf. Providing no visual indicators leads to confusion: these users receive no feedback on errors or successes in a process. It is also necessary to give the visual indicator of an auditory event some context, not merely signal that an event has occurred; without context, a visual indicator is just as useless as if none had been provided at all.
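One way to enforce this pairing in code is to route every auditory cue through a single notification helper that refuses to fire without a contextual message. A minimal sketch, with illustrative names not taken from any particular framework:

```typescript
type Severity = "info" | "error" | "success";

interface Cue {
  sound: string;    // audio file the UI will play
  message: string;  // visual text shown alongside it
  severity: Severity;
}

// Hypothetical notifier: every audio cue must carry a visual message
// explaining *what* happened, not merely that something happened.
function notify(sound: string, message: string, severity: Severity): Cue {
  if (!message.trim()) {
    // Refuse context-free cues: a bare beep (or a bare flashing icon)
    // tells a deaf user nothing actionable.
    throw new Error("Every cue needs a visual message with context");
  }
  return { sound, message, severity };
}

// The caller shows cue.message on screen and plays cue.sound; users
// who cannot hear the sound still learn what went wrong and why.
const cue = notify("error.wav", "Upload failed: file exceeds 10 MB limit", "error");
```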
Text Equivalents of Auditory Material
Besides including contextual visual indicators alongside auditory ones, visual equivalents need to be included in any communication that contains auditory content, such as video and audio material, as outlined by the Web Content Accessibility Guidelines. No official, user-tested standards for captioning or subtitling exist; for an in-depth look at issues with captioning and suggested best practices, visit the Captioning Sucks! website of the Open & Closed Project. Providing text equivalents means providing subtitles for podcasts, streaming online video, and any other context in software or systems where auditory material is presented. Beyond simple transcripts, it is important to provide meaningful subtitles that facilitate comprehension.
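For HTML5 video, text equivalents are typically delivered as timed caption files in WebVTT, the format consumed by the `<track>` element. As a sketch (helper names are mine), generating a single cue looks roughly like this; note the caption text itself carries the speaker and non-speech sounds, not just a bare transcript:

```typescript
// Format a time in seconds as a WebVTT timestamp (HH:MM:SS.mmm).
function vttTime(seconds: number): string {
  const h = Math.floor(seconds / 3600);
  const m = Math.floor((seconds % 3600) / 60);
  const s = (seconds % 60).toFixed(3);
  const pad = (n: number) => String(n).padStart(2, "0");
  return `${pad(h)}:${pad(m)}:${s.padStart(6, "0")}`;
}

// Build one WebVTT cue. Meaningful captions identify speakers and
// relevant non-speech audio ("[door slams]"), which is what makes
// them more useful than a plain transcript.
function vttCue(start: number, end: number, text: string): string {
  return `${vttTime(start)} --> ${vttTime(end)}\n${text}`;
}

console.log(vttCue(1.5, 4.0, "[Narrator] Welcome to the tour."));
// 00:00:01.500 --> 00:00:04.000
// [Narrator] Welcome to the tour.
```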
Video and Sign Language
For those who use sign language—though not all people who are deaf or hearing impaired do—providing sign language interpretations of content in addition to subtitles can improve access. Keep in mind the same issues that apply to translated text equivalents, since sign languages vary across the globe. With increased broadband access and the rise of video-sharing sites, more and more people are able not only to consume content but also to create their own, using sign language to communicate online.
When it comes to providing avenues for deaf users of sign language to communicate with one another remotely, offering video options via webcams is crucial. One deaf ASL instructor couldn't emphasize enough how video calling on smartphones and through services like Skype has dramatically improved sign language users' ability to communicate easily, compared to relay services and TTY.
Writing with the Deaf in Mind
Lisa Herrod’s 2008 article for A List Apart, Deafness and the User Experience, is a must-read for anyone concerned about creating content for d/Deaf users. Her guidelines for writing for the web include:
Use headings and subheadings
Use plain language whenever possible
Avoid unnecessary jargon and slang
Provide a glossary of specialized vocabulary
Providing written content in clear and concise terms that leave little room for misinterpretation helps all people accessing content, not just those with disabilities.
An in-depth examination and understanding of the needs of deaf users is crucial to creating systems that allow fair and equal access. A lack of public awareness and familiarity with the needs of deaf people is still common, which can lead to oversights in serving these users. We should educate video creators about the tools available for embedding subtitles and encourage their integration into projects.
An awareness of the information poverty deaf users experience can help designers build systems that go beyond a reliance on sound—using textual equivalents for all auditory material as well as conveying meaning with clear, concise language—in order to provide improved access to this often overlooked group.