An EMG-based adaptive mobile interaction system for visually impaired users to control devices using muscle gestures. Wearable EMG sensors detect gestures, providing haptic and audio feedback. An AI-driven model enhances real-time intent recognition. The project includes a wearable prototype, gesture recognition software, and user trials. It aims to improve accessibility, reduce reliance on voice assistants, and enable independent device control. Challenges include EMG signal noise and user variability, while opportunities arise from the growing demand for assistive tech.
Keywords: Deep Learning; Wearable EMG Sensors; Biomedical Signal Processing; Real-time EMG Data Processing
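To make the real-time processing pipeline concrete, the sketch below shows one common way such a system is structured: band-pass filtering to suppress motion artifacts and sensor noise, a sliding-window RMS envelope as an activation feature, and a simple threshold-based gesture onset detector. All parameters here (1 kHz sampling rate, 20–450 Hz pass band, window sizes, activation threshold) are illustrative assumptions, not values from this project, and the threshold stage stands in for the AI-driven intent-recognition model.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # assumed sampling rate (Hz); typical for surface EMG

def bandpass(signal, low=20.0, high=450.0, fs=FS, order=4):
    """Band-pass filter: suppresses motion artifacts (<20 Hz) and high-frequency noise."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def window_rms(signal, win=200, step=100):
    """Sliding-window RMS envelope, a standard real-time EMG activation feature."""
    return np.array([
        np.sqrt(np.mean(signal[i:i + win] ** 2))
        for i in range(0, len(signal) - win + 1, step)
    ])

def detect_gesture(rms, threshold=0.1):
    """Naive onset detector: flags windows whose RMS exceeds a calibration threshold.
    A trained classifier would replace this stage in a full system."""
    return rms > threshold

# Synthetic demo: 1 s of resting baseline followed by 1 s of a muscle burst.
rng = np.random.default_rng(0)
rest = 0.02 * rng.standard_normal(FS)   # low-amplitude baseline noise
burst = 0.5 * rng.standard_normal(FS)   # high-amplitude contraction
raw = np.concatenate([rest, burst])

filtered = bandpass(raw)
rms = window_rms(filtered)
active = detect_gesture(rms)  # False during rest, True during the burst
```

In a deployed wearable, the same filter-envelope-classify loop would run incrementally on streaming samples, with the detector's output triggering haptic or audio feedback; per-user threshold calibration is one simple way to address the user-variability challenge noted above.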