BBL Speaker Series: Embodied Non-visual Interaction: Tangible, Haptic, and Robotic Mediation of Graphical User Interface for Blind People
Talk Title: Embodied Non-visual Interaction: Tangible, Haptic, and Robotic Mediation of Graphical User Interface for Blind People
Speaker: Jiasheng Li, Ph.D. Student in Computer Science, University of Maryland (UMD)
Location: HBK 2105 and Zoom
Abstract: The widespread use of graphical user interfaces (GUIs) shapes how people perceive, manipulate, and create digital content. Yet most GUIs rely on the visual channel for both input and output, which creates steep barriers for blind and low-vision (BLV) people. Screen readers and keyboards provide basic access to text-based content and input, but they offer limited support for creating, editing, and understanding graphical content. In this talk, I will present three projects—Toucha11y, TangibleGrid, and Hand Haptic for VR—that help BLV people understand and manipulate graphical content across diverse GUIs, from online web pages to offline public touchscreens, and that also enable the creation of custom graphical layouts through non-visual interactions. Building on these prototypes, I will sketch a path toward conveying not only the text and graphical content of GUIs, but also their aesthetics and style, through non-visual modalities for BLV people.
Bio: Jiasheng Li is a Ph.D. student in Computer Science at the University of Maryland (UMD). He conducts interdisciplinary human-computer interaction (HCI) research spanning accessibility, robotics, and human-AI interaction. He is passionate about creating an inclusive environment for blind individuals by designing accessible interfaces and building systems that leverage robotics, multimodal interaction techniques, and artificial intelligence. His work has been published and recognized at top-tier HCI venues such as CHI, UIST, and IEEE VR, contributing to accessible technologies for blind and low-vision people.


