BrainBraille: Towards Passive Training in Brain-Computer Interfaces Using fNIRS
Abstract
Amyotrophic lateral sclerosis (ALS) is a debilitating neurodegenerative disease that causes patients to gradually lose voluntary control of their muscles. In some cases, patients become "locked in," unable to move any muscles and left with no means of communicating with their caregivers. Brain-computer interfaces (BCIs) attempt to create a means of communication directly through brain activity, removing the need for movement. BrainBraille is a novel interaction method for BCIs that enables complex text-based communication using attempted movements with a six-region pseudo-binary encoding. In this dissertation, I explore a wearable BCI based on functional near-infrared spectroscopy (fNIRS) to make BrainBraille mobile. In an early study, I show that transitional gestures based on executed movements of the two hands can be classified in two participants with up to 93% accuracy. I then examine how transitional gestures can benefit BrainBraille by expanding its vocabulary and enabling faster responses. Finally, I evaluate future paths for integrating passive haptic training into BrainBraille to reduce the physical exertion required for ALS patients to learn a BCI.