Engineers Create Brain-Machine Interface for Controlling Robot Exoskeleton

The system uses an electroencephalogram (EEG) cap and five LEDs, each flickering at a different frequency. By staring at one of the LEDs, users can make the exoskeleton move forwards, turn left or right, sit, or stand: when the user focusses attention on a specific LED, that LED’s flicker frequency is reflected in the EEG readout.
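
In broad terms, each command is tied to a distinct flicker frequency, and the decoder looks for which of those frequencies dominates the recorded EEG. The Python sketch below illustrates that general idea only; the LED frequencies, sampling rate, and command mapping are assumptions for illustration, not the team’s published parameters.

```python
import numpy as np

# Hypothetical LED flicker frequencies (Hz) mapped to exoskeleton commands.
LED_COMMANDS = {9.0: "forward", 11.0: "turn_left", 13.0: "turn_right",
                15.0: "sit", 17.0: "stand"}

def classify_ssvep(eeg: np.ndarray, fs: float) -> str:
    """Return the command whose flicker frequency carries the most spectral
    power in a single-channel EEG window (shape: [n_samples])."""
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(eeg.size)))
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / fs)
    scores = {}
    for f, command in LED_COMMANDS.items():
        # Sum power in a narrow band around each candidate flicker frequency.
        band = (freqs >= f - 0.3) & (freqs <= f + 0.3)
        scores[command] = spectrum[band].sum()
    return max(scores, key=scores.get)

# Synthetic example: 4 s of "EEG" at 250 Hz containing a 13 Hz component plus noise.
fs = 250.0
t = np.arange(0, 4, 1 / fs)
fake_eeg = np.sin(2 * np.pi * 13.0 * t) + 0.5 * np.random.randn(t.size)
print(classify_ssvep(fake_eeg, fs))  # expected output: "turn_right"
```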
 
This signal is identified and used to control the exoskeleton. A key problem has been separating these precise brain signals both from those associated with other brain activity and from the electrical interference generated by the exoskeleton itself. “Exoskeletons create lots of electrical ‘noise.’ The electroencephalogram (EEG) signal gets buried under all this noise, but our system is able to separate not only the EEG signal, but the frequency of the flickering LED within this signal,” said team member Dr Klaus-Robert Muller from TU Berlin and Korea University.
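
One common way to pull a weak flicker-locked signal out of noisy, multi-channel EEG is canonical correlation analysis (CCA), which compares the recording against clean sine and cosine references at each candidate frequency and its harmonics. The sketch below shows that general approach; it is illustrative only, is not the team’s own denoising pipeline, and its function names and parameters are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def cca_score(eeg: np.ndarray, freq: float, fs: float, n_harmonics: int = 2) -> float:
    """Canonical correlation between multi-channel EEG (shape: [n_samples, n_channels])
    and sine/cosine reference signals at `freq` and its harmonics."""
    t = np.arange(eeg.shape[0]) / fs
    refs = np.column_stack([fn(2 * np.pi * (h + 1) * freq * t)
                            for h in range(n_harmonics)
                            for fn in (np.sin, np.cos)])
    x_scores, y_scores = CCA(n_components=1).fit_transform(eeg, refs)
    return np.corrcoef(x_scores[:, 0], y_scores[:, 0])[0, 1]

def identify_flicker_frequency(eeg: np.ndarray, candidates: list[float], fs: float) -> float:
    """Pick the candidate LED frequency whose references correlate most strongly
    with the recording; this tolerates broadband noise better than raw spectral power."""
    return max(candidates, key=lambda f: cca_score(eeg, f, fs))

# Example (hypothetical): which of five candidate frequencies best explains
# an 8-channel EEG window recorded at 250 Hz?
# best = identify_flicker_frequency(eeg_window, [9.0, 11.0, 13.0, 15.0, 17.0], fs=250.0)
```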
 
Although the team reports tests on healthy individuals, the system has the potential to aid sick or disabled people. “People with amyotrophic lateral sclerosis or high spinal cord injuries face difficulties communicating or using their limbs. Decoding what they intend from their brain signals could offer means to communicate and walk again,” said Muller.