Join Assistant Professor Syed Billah for his talk, "Reinterpreting Fitts' Law for Non-Visual Interaction." He will discuss his current research projects, new findings, and opportunities for collaboration. All talks are free and open to the public.
Abstract
Fitts' Law has been a cornerstone of user interface design since the advent of graphical user interfaces. However, its traditional interpretation assumes certain user abilities, potentially limiting its applicability for diverse user groups, particularly those who are blind or have low vision. This talk presents a critical reexamination and reinterpretation of Fitts' Law to overcome these initial assumptions and extend its utility to non-visual interaction paradigms. We will present recent research that builds upon this reinterpretation, showcasing novel interaction techniques such as multi-linear and multi-wheel-based interactions, grid-based interfaces, and abacus-inspired mid-air gestures. We conclude by illustrating how this reinterpretation of Fitts' Law serves as a bridge between traditional HCI research and other disciplines, particularly control theory and reinforcement learning from human feedback, opening new avenues for research and design in accessible human-computer interaction.
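For readers unfamiliar with it, Fitts' Law (in its commonly used Shannon formulation) predicts the time to acquire a target from its distance and size:

```latex
% Fitts' Law, Shannon formulation (MacKenzie):
%   MT = movement time to reach the target
%   D  = distance (amplitude) to the target center
%   W  = target width along the axis of motion
%   a, b = empirically fitted constants for a given device and user population
MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)
```

The talk's reinterpretation concerns the assumptions behind terms like $D$ and $W$, which traditionally presume a user can visually locate the target before moving toward it.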
About the Speaker
Syed Billah’s research focuses on human-computer interaction with a strong emphasis on accessible computing, a field broadly concerned with making computing devices and digital information accessible to people with special needs or in special situations. Billah investigates low-level accessibility issues in computer systems and designs efficient, robust, and extensible accessibility support in modern operating systems. He also develops assistive technologies to make non-visual interaction fast, cross-platform, and ubiquitous. His research promotes equality for people with vision impairments and expands their opportunities in education and employment. More recently, he has been studying the impact and opportunities of AI, 3D fabrication, augmented reality, data sonification, and smart sensing technologies in accessibility and intelligent interactive system research.