The Tink Tank » Understanding screen reader interaction modes
Léonie gives a great, clear description of how screen readers switch between interaction modes as they traverse their snapshot of the DOM.
A fascinating account of the history of JAWS and NVDA.
Personas are often toothless, but these accessibility personas from gov.uk are more practical and useful than most:
Each profile simulates a different persona’s condition and runs the assistive technology that persona relies on.
You can use these profiles to experience the web from the perspective of the personas and gain more understanding of accessibility issues.
The street finds its own uses for things, and it may be that the use for Google Glass is assistive technology. Here’s Léonie’s in-depth hands-on review of the Envision Glasses, which are based on Google Glass.
The short wait whilst the image is processed is mitigated by the fact a double tap is all that’s needed to request another scene description, and being able to do it just by looking at what I’m interested in and tapping a couple of times on my glasses is nothing short of happiness in a pair of spectacles.
Some really interesting ideas here from Hidde on how browsers could provide optional settings for users to override developers when it comes to accessibility issues like colour contrast, focus styles, and autoplaying videos.
Some good advice from Hidde, based on his recent talk Six ways to make your site more accessible.