Many people take devices like touchscreens for granted, but not all devices are universally accessible. For users with disabilities, the increasing ubiquity of technology in day-to-day life can present challenges, as well as opportunities. For researchers in the HCIL, the question is how to implement these emergent technologies to enhance the experiences of users with disabilities.
This question is of particular interest to Dr. Leah Findlater, an Assistant Professor in the iSchool and a recipient of the prestigious NSF CAREER Award. The project recognized by the National Science Foundation focuses on designing new touchscreen interfaces that are accessible to individuals with motor impairments. As she notes in her project abstract:
Touchscreen interfaces are becoming increasingly prevalent as the interface with which people interact with computers and yet, for people with motor impairments, many touchscreen commands are difficult or impossible to execute. With the increased deployment of touchscreen interfaces, it becomes critically important for hardware and software developers to ensure that such devices are accessible to a broad range of users.
Findlater is working to address these issues by studying how users’ motor abilities affect their touchscreen interactions, in order to personalize those interactions to better support individual users’ abilities.
Findlater is also developing technology for users with limited hearing or vision. In 2014, Findlater, along with Dr. Jon Froehlich (Computer Science) and Dr. Rama Chellappa (Electrical and Computer Engineering), received a $1 million grant from the Department of Defense to develop a wearable device to assist visually impaired veterans. The system, called HandSight, senses non-tactile information and provides real-time feedback to the user. HandSight can help visually impaired users read printed text and can even offer color coordination suggestions for clothing.
Elsewhere in the HCIL, other members are exploring ways to use existing technologies to assist users with disabilities. PhD student Kotaro Hara has been leveraging crowdsourced analysis of Google Street View images to improve the availability of information about curb ramps. Mapping and evaluating curb ramps can provide much-needed details for people with disabilities, making it significantly easier for those with limited mobility to navigate city streets. Under the guidance of his advisor, Dr. Jon Froehlich, Hara has developed Tohme, the first smart system to combine machine learning, computer vision, and custom crowd interfaces to find curb ramps remotely using Google Street View. With Tohme, Hara hopes to decrease the amount of human effort required to evaluate curb ramps, making it faster and easier to assess their accessibility.
Whether they are crowdsourcing accessibility information or developing new assistive technologies, HCIL members understand that technology can improve the day-to-day lives of users, as long as that technology is designed with all users in mind.