2014 - 2017
Signal Processing, Machine Learning
Have you ever watched any X-Men movies? The series has a character named Prof. Xavier who has the ability to control things with his mind, including his wheelchair. This project aims to develop a brain-computer interface that connects to and translates signals from your brain (EEG) into actions which -- in this case -- are used to control a wheelchair, like Prof. Xavier.
We use a visual stimulus called Steady State Visually Evoked Potentials (SSVEP) in this system: four boxes flickering at different frequencies, placed in four positions corresponding to the wheelchair's directions -- UP for *Go straight*, RIGHT for *Turn right*, LEFT for *Turn left*, and DOWN for *Reverse*. The subject (wheelchair user) concentrates on the flickering box that matches the direction they intend to go.
In theory, when a subject looks at a flickering box in an SSVEP stimulus, the brain produces a signal at the same frequency (or a harmonic of it). After some signal processing to remove artefacts, noise, etc., we convert the signal to the frequency domain. This frequency is what we detect and translate into a steering command. In practice, though, the detected frequency does not always exactly match the stimulus frequency -- and this is where machine learning helps.
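The detection step described above can be sketched roughly as follows. This is not the project's actual code; the stimulus frequencies, sampling rate, and window length here are assumptions for illustration, and the "classifier" is just a nearest spectral peak rule rather than a trained model.

```python
import numpy as np

# Hypothetical stimulus frequencies (Hz) for the four flickering boxes;
# the project's real values are not specified, so these are assumptions.
COMMANDS = {7.0: "Go straight", 9.0: "Turn right",
            11.0: "Turn left", 13.0: "Reverse"}

def detect_command(eeg, fs):
    """Return the command whose stimulus frequency has the strongest
    spectral peak in a (pre-cleaned) single-channel EEG segment."""
    # Window the segment and take the magnitude spectrum
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg))))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Magnitude at the FFT bin nearest each candidate frequency
    powers = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in COMMANDS}
    return COMMANDS[max(powers, key=powers.get)]

# Synthetic demo: a 9 Hz sine buried in noise, i.e. the brain's response
# while the subject stares at the RIGHT box
fs = 250                       # assumed EEG sampling rate (Hz)
t = np.arange(0, 4, 1.0 / fs)  # 4-second analysis window
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 9.0 * t) + 0.5 * rng.standard_normal(t.size)
print(detect_command(eeg, fs))  # → Turn right
```

A real pipeline would replace the nearest-peak rule with a trained classifier precisely because, as noted above, the measured response frequency can drift from the stimulus frequency.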