A multi-platform electronic travel aid integrating proxemic sensing for the visually impaired


Publisher

Multidisciplinary Digital Publishing Institute (MDPI)

Abstract

Visual impairment (VI) affects over two billion people globally, with prevalence increasing due to preventable conditions. To address mobility and navigation challenges, this study presents a multi-platform, multi-sensor Electronic Travel Aid (ETA) that integrates ultrasonic, LiDAR, and vision-based sensing across head-, torso-, and cane-mounted nodes. Grounded in orientation and mobility (OM) principles, the system delivers context-aware haptic and auditory feedback to enhance perception and independence for users with VI. The ETA employs a hardware–software co-design approach guided by proxemic theory, comprising three autonomous components (Glasses, Belt, and Cane nodes), each optimized for a distinct spatial zone while maintaining overlap for redundancy. Embedded ESP32 microcontrollers enable low-latency sensor fusion, providing real-time multi-modal feedback to the user. Static and dynamic experiments using a custom-built motion rig evaluated detection accuracy and feedback latency under repeatable laboratory conditions. Results demonstrate millimetre-level accuracy and sub-30 ms proximity-to-feedback latency across all nodes. The Cane node's dual LiDAR achieved a coefficient of variation of at most 0.04%, while the Belt and Glasses nodes maintained mean detection errors below 1%. The validated tri-modal ETA architecture establishes a scalable, resilient framework for safe, real-time navigation, advancing sensory augmentation for individuals with VI.
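The proxemic mapping the abstract describes (each node covering a spatial zone and converting proximity into graded feedback) can be sketched minimally as follows. This is an illustrative assumption, not the paper's implementation: the zone thresholds, the 4 m maximum range, and the function names are hypothetical.

```python
# Illustrative sketch of proxemic-zone feedback as summarized in the abstract.
# Zone thresholds (0.45 m / 1.2 m / 3.6 m) follow common proxemic-theory
# conventions and are NOT values taken from the paper.

def classify_zone(distance_m: float) -> str:
    """Map a distance reading to a proxemic zone (hypothetical thresholds)."""
    if distance_m < 0.45:
        return "intimate"
    if distance_m < 1.2:
        return "personal"
    if distance_m < 3.6:
        return "social"
    return "public"

def feedback_intensity(distance_m: float, max_range_m: float = 4.0) -> float:
    """Scale feedback intensity inversely with distance, clamped to [0, 1]."""
    d = min(max(distance_m, 0.0), max_range_m)
    return 1.0 - d / max_range_m

print(classify_zone(0.8), round(feedback_intensity(0.8), 2))  # personal 0.8
```

In the actual system, a mapping of this kind would run on each node's ESP32 and drive the haptic or auditory actuator for that node's zone; the overlap between zones mentioned in the abstract provides redundancy when one sensor's reading is degraded.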

Citation

Naidoo, N. and Ghaziasgar, M., 2025. A Multi-Platform Electronic Travel Aid Integrating Proxemic Sensing for the Visually Impaired. Technologies, 13(12), p.550.