Designing Multimodal Touchscreen Interactions for Accessible Data Visualization Supporting Blind Users

Bibliographic Details
Published in: ProQuest Dissertations and Theses (2025)
Main Author: Chundury, Venkata Sai Pramod
Published:
ProQuest Dissertations & Theses
Description
Abstract: Data visualization can be a democratizing force, providing advanced data analysis tools and capabilities to everyday users. However, data visualization also creates barriers for blind and low-vision (BLV) individuals, a fact long recognized in the accessibility research community. Assistive technologies such as tactile graphics, data sonification (using audio to convey data), and refreshable tactile displays (RTDs) can lower the accessibility barriers to data visualization, yet only recently has visualization research recognized this. Since then, the visualization community has pursued various efforts, such as data-centric alt-text, accessible tables, and richer screen reader experiences. However, the visualization community, which is arguably best poised to tackle these challenges, has so far only scratched the surface of creating rich human-data interactions for blind individuals.

Commercial touchscreen devices such as smartphones and tablets now have built-in accessibility features and are thus increasingly being adopted by blind individuals. These devices are also well suited to supporting direct data manipulation through touch interactions. I adopted a mixed-methods approach, conducting four studies to design multimodal (audio and haptic) chart representations and interactions that leverage such touchscreen devices.

The first study involved semi-structured interviews with ten Orientation and Mobility (O&M) experts who train BLV individuals in non-visual navigation and spatial understanding. The goal was to derive design principles for effective non-visual data interaction. Findings emphasized the usefulness of crossmodal sensory substitution (CMSS), a strategy in which tactile interactions are paired with sonification to enhance spatial awareness. Participants highlighted that BLV individuals have diverse preferences for sensory modalities, necessitating personalized multimodal experiences that cater to different skill levels and cognitive strategies. These insights informed the design of an accessible data visualization system.

The second study explored the lived experiences of BLV professionals in data-related fields through a two-step online survey. Responses from BLV individuals engaged in data analysis revealed persistent accessibility barriers at multiple stages of the data workflow, including data loading, transformation, analysis, and visualization authoring. Despite expertise in programming (e.g., Python, R, and SAS) and GUI-based tools (e.g., Excel), participants reported substantial reliance on assistive technologies, often requiring sighted colleagues' assistance to interpret visualizations. These findings highlight the need for "born accessible" tools that allow independent and efficient data exploration without external support.

The third study introduces TactualPlot, a multimodal data interaction system that leverages CMSS principles to enable blind users to explore data through touch and sound on touchscreen devices. TactualPlot was developed through an iterative participatory design process involving a blind collaborator who provided feedback on early prototypes. The system supports scatterplots, bar charts, line graphs, and pie charts, allowing users to explore data through multi-finger touch gestures combined with audio cues and spatial feedback. Unlike traditional sonification approaches, TactualPlot employs direct touch interactions (similar to tactile exploration) to guide users through high-level data trends before enabling deeper exploration.

The final study presents an empirical evaluation comparing TactualPlot to other accessibility solutions, including screen readers (Olli) and refreshable tactile displays (Monarch). Ten blind participants, recruited from blind individuals working in data-intensive fields, performed data analysis tasks of varying complexity across multiple visualization types. The study assessed task correctness, completion times, and user preferences, revealing that hybrid approaches combining touch and sound were preferred over unimodal (audio-only or tactile-only) solutions. Novel multi-line braille displays such as the Monarch offer features that can combine touchscreen interactions with haptic feedback. To better understand how blind individuals can create charts for and use RTDs in the future, I also conducted a three-hour co-design session with a blind participant, yielding insights into how blind users conceptualize and create tactile charts.

This dissertation contributes to accessible data visualization research by demonstrating the effectiveness of multimodal (touch-audio) interactions and highlighting new design opportunities for refreshable tactile displays. The findings provide practical guidelines for creating "born accessible" data tools for BLV individuals in data-intensive fields. By integrating touch, sound, and personalized interaction techniques, this work supports the creation of next-generation accessible visualization systems, empowering BLV individuals to engage with data independently and effectively.
ISBN: 9798286436811
Source: ProQuest Dissertations & Theses Global