Auditory and tactile code-modulated BCI

Context
A brain-computer interface (BCI) can use a multitude of control signals that are decodable from measured EEG. An example of such a control signal is the code-modulated visual evoked potential (c-VEP), the EEG response to a pseudo-random sequence of visual flashes. The c-VEP BCI is among the fastest and most accurate BCIs in the literature to date. It thus provides a good means for communication and control, for instance for ALS patients. However, in later stages of ALS, patients may lose the ability to keep their eyes open and fixated, or their sight may deteriorate. It would therefore be valuable to investigate other sensory modalities as well, for instance the auditory and tactile domains. Neither of these has so far been tested with code-modulated paradigms.
Image credit: Gao, S., Wang, Y., Gao, X., & Hong, B. (2014). Visual and auditory brain–computer interfaces. IEEE Transactions on Biomedical Engineering, 61(5), 1436-1447.
Research question
In this project, it shall be investigated whether code-modulated auditory evoked potentials (c-AEP) and/or code-modulated somatosensory evoked potentials (c-SEP), as measured by EEG, provide a reliable (i.e., better than chance) control signal for a BCI. In the auditory case, two audio streams could be presented, one to each ear, while in the tactile domain both hands could be stimulated with a vibrotactile device. Because the task setup is lateralised (left versus right), the scalp topography of the responses is expected to differ between the two conditions as well, and this lateralisation can be incorporated into the machine learning methods to improve decoding performance (see the illustrative sketch below).
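
To make the intended analysis a bit more concrete, the following is a minimal simulation sketch in Python, not part of the project itself: it generates two pseudo-random stimulus codes, simulates noisy EEG with a lateralised response to the attended code, and classifies the attended stream by template correlation. All parameters (sampling rate, channel count, code type, response kernel, the channel-averaging "spatial filter") are illustrative assumptions; an actual pipeline would work on recorded EEG and would typically learn spatial filters and response templates from training data, for example with CCA.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

fs = 120              # assumed stimulus/EEG sampling rate in Hz (hypothetical)
n_samples = 2 * fs    # one 2-second trial
n_channels = 8        # assumed number of EEG channels (hypothetical)

# Two pseudo-random binary modulation codes, one per stream
# (left/right ear, or left/right hand for vibrotactile stimulation).
# Real experiments typically use structured codes such as m-sequences
# or Gold codes; plain random bits are enough for this sketch.
codes = rng.integers(0, 2, size=(2, n_samples)).astype(float)

# A crude transient-response kernel standing in for the evoked potential.
kernel = np.hanning(int(0.3 * fs))


def simulate_trial(attended, snr=0.5):
    """Simulate one multichannel EEG trial where `attended` (0 or 1) is the target stream."""
    evoked = np.convolve(codes[attended], kernel, mode="full")[:n_samples]
    # Lateralised topography: stream 0 projects more strongly onto
    # "left" channels, stream 1 onto "right" channels.
    topo = np.linspace(1.0, 0.2, n_channels) if attended == 0 else np.linspace(0.2, 1.0, n_channels)
    eeg = snr * topo[:, None] * evoked[None, :]
    eeg += rng.standard_normal((n_channels, n_samples))  # additive sensor noise
    return eeg


def decode(eeg):
    """Classify the attended stream by correlating the EEG with code-derived templates."""
    # A single virtual sensor via channel averaging; a learned spatial
    # filter (e.g., CCA, as common in c-VEP decoding) would replace this.
    virtual = eeg.mean(axis=0)
    scores = []
    for code in codes:
        template = np.convolve(code, kernel, mode="full")[:n_samples]
        scores.append(np.corrcoef(virtual, template)[0, 1])
    return int(np.argmax(scores))


n_trials = 200
labels = rng.integers(0, 2, size=n_trials)
predictions = np.array([decode(simulate_trial(int(y))) for y in labels])
print(f"Simulated single-trial accuracy: {(predictions == labels).mean():.2f} (chance = 0.50)")
```

The same structure would carry over to real data: the simulated trials are replaced by epoched EEG, and the channel average by a spatial filter learned from training data.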
Skills / background required
- Very proficient in Python
- Proficient in machine learning
- Knowledge of auditory/tactile neuroscience