As you might guess, it’s not possible to live blog and present at the same time, and the network has been dropping frequently under the load of so many participants, so I’m a little late getting this one out ;).
My talk was foremost about Accessibility, but I also recommended using Inclusive Design to help the process. Since this is an Agile testing conference, that fits well with the idea of moving testing forward in the process. Rather than trying to shoe-horn Accessibility in after the fact, what about designing up front with the goal of making your products effective for the broadest range of users possible? Wouldn’t it be great if you didn’t have to make special accommodations, or build multiple paths, to make that happen? Inclusive Design makes that possible.
Accessibility deals with the absolutes on the disability scale. The fact is, a visual app will be unusable for a person who cannot see unless special accommodation is made to allow it to work with assistive technology. Many of us can see, but we may be developing challenges around the clarity of that vision. I fit that model right now. Readers are part of my reality, whether I like it or not. Anything three feet away from me? Crystal clear. Anything within three feet? I start to lose clarity. Text? Ugh!!! I’ve gotten to the point where I have to have readers to interact with my phone.
What I’ve described would be considered a secondary, or situational, disability. I’m OK in most cases, but for a particular range of interactions I need help. I don’t need an Accessibility level of accommodation, but there are some ways of displaying information or controls that would make my life a bit easier, so I don’t have to dig for my glasses every time I want to look at my phone or my computer.
I highlighted an app I am currently using, and that several people have heard me discuss on my blog in the past couple of months, called LoseIt. It’s been a terrific app for counting calories and tracking exercise, and it’s helped me meet a key goal: losing weight and getting into better physical shape. Additionally, I appreciate the decisions LoseIt has made, intentional or not, to make many parts of their app usable by a broad group of people, even those of us who struggle to read small screens and even smaller type. The main screens are reachable with a side swipe, and the main displays are based on dials, sliders, pie charts, and bar graphs. What does this mean? A lot of information can be communicated in that small space without my having to hunt for my glasses. Yes, I still need them in the areas where I have to interact intensively, but for a quick glance at overall stats, the displays are such that I can look at them and get meaningful feedback even without my glasses on. That’s a good example of Inclusive Design in action. I have no idea whether LoseIt did this on purpose (and truth be told, if they did, they could use some bolder contrast options), but it’s appreciated nonetheless.
I had some fun questions after my talk, and one participant made a really good point: she told me she works with a company that develops games for young children, and that try as they might, they were not able to test the games at a level that was effective for their target audience. She likened this to the limitations of testing with personas for other disabilities: we can imagine that we understand the issues, we can empathize with them, but we can’t really be stand-ins for those individuals, the challenges they may have, or the developmental level they are at. I agree completely. At the end of the day, I will likely never be able to listen to a screen reader at the speed that a sightless friend of mine can listen to it and make sense of it (literally ten times faster than the default setting). We can be surrogates, but we are ultimately poor surrogates. Our experiences in their shoes can help, but they can’t replace the feedback those users can give.