EMG-Based Adaptive Mobile Interaction System for Visually Impaired Users: Gesture Recognition and AI-Driven Control


Ms. Tasnim Ferdous (TNMF)

Senior Lecturer

tasnim.ferdous@bracu.ac.bd

Synopsis

This project proposes an EMG-based adaptive mobile interaction system that lets visually impaired users control devices through muscle gestures. Wearable EMG sensors detect the gestures, and the system confirms recognized commands with haptic and audio feedback. An AI-driven model performs real-time intent recognition. Deliverables include a wearable prototype, gesture-recognition software, and user trials. The goal is to improve accessibility, reduce reliance on voice assistants, and enable independent device control. Key challenges are EMG signal noise and inter-user variability, while opportunities arise from the growing demand for assistive technology.
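
The sketch below illustrates the kind of real-time pipeline such a system might use: windowing a sampled EMG stream, extracting classic time-domain features, and classifying each window into a gesture. The sampling rate, window length, feature set, and the toy nearest-centroid classifier are illustrative assumptions only; they stand in for the project's AI-driven intent-recognition model, which is not specified here.

```python
# Minimal sketch of a real-time EMG gesture-recognition loop.
# Assumptions (not from the project brief): a single-channel EMG stream
# sampled at 1 kHz, 200 ms sliding windows, and a toy nearest-centroid
# classifier standing in for the AI-driven intent-recognition model.
import numpy as np

FS = 1000   # sampling rate (Hz), assumed
WIN = 200   # window length in samples (200 ms)

def extract_features(window: np.ndarray) -> np.ndarray:
    """Classic time-domain EMG features for one window."""
    mav = np.mean(np.abs(window))                # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))          # root mean square
    zc = np.sum(np.diff(np.sign(window)) != 0)   # zero crossings
    wl = np.sum(np.abs(np.diff(window)))         # waveform length
    return np.array([mav, rms, zc, wl])

class NearestCentroidGestureClassifier:
    """Placeholder for the AI-driven model: one centroid per gesture."""
    def __init__(self):
        self.centroids = {}  # gesture label -> mean feature vector

    def fit(self, features: np.ndarray, labels: list) -> None:
        for label in set(labels):
            mask = np.array(labels) == label
            self.centroids[label] = features[mask].mean(axis=0)

    def predict(self, feat: np.ndarray) -> str:
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(feat - self.centroids[g]))

def stream_windows(signal: np.ndarray, step: int = WIN // 2):
    """Yield overlapping windows, emulating a real-time buffer."""
    for start in range(0, len(signal) - WIN + 1, step):
        yield signal[start:start + WIN]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic training data: 'rest' is low-amplitude noise, 'clench' is stronger.
    rest = [rng.normal(0, 0.05, WIN) for _ in range(20)]
    clench = [rng.normal(0, 0.4, WIN) for _ in range(20)]
    X = np.array([extract_features(w) for w in rest + clench])
    y = ["rest"] * 20 + ["clench"] * 20

    clf = NearestCentroidGestureClassifier()
    clf.fit(X, y)

    # Simulate a live stream: 1 s of rest followed by 1 s of a clench gesture.
    live = np.concatenate([rng.normal(0, 0.05, FS), rng.normal(0, 0.4, FS)])
    for i, window in enumerate(stream_windows(live)):
        gesture = clf.predict(extract_features(window))
        print(f"window {i:2d}: {gesture}")  # would trigger haptic/audio feedback
```

In a deployed prototype, the synthetic signals would be replaced by data from the wearable EMG sensors, and the predicted gesture would drive device commands and the haptic/audio feedback described above.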

Skills Learned

Deep Learning; Wearable EMG Sensors; Biomedical Signal Processing; Real-Time EMG Data Processing



©2025 BracU CSE Department