MIT CSAIL’s Daniela Rus has developed an EEG/EMG robot control system based on brain signals and finger gestures.
Building on the team’s previous brain-controlled robot work, the new system detects in real time whether a person has noticed a robot’s error. Measuring muscle activity then allows the person to use hand gestures to select the correct option.
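The two-signal supervision loop described above can be sketched as follows. This is an illustrative approximation only, not the researchers' actual pipeline: the function names, gesture labels, and the mapping from gestures to targets are hypothetical. The idea is that an EEG error-related potential (ErrP) flags that the human noticed a mistake, and a decoded EMG gesture then redirects the robot to the correct target.

```python
from typing import List, Optional

def supervise(errp_detected: bool,
              emg_gesture: Optional[str],
              current_target: str,
              targets: List[str]) -> str:
    """Return the target the robot should act on after human supervision.

    errp_detected  -- whether the EEG classifier flagged an error-related potential
    emg_gesture    -- gesture label decoded from forearm EMG (or None)
    current_target -- the robot's own choice of target
    targets        -- the available targets, in a fixed spatial order
    """
    if not errp_detected:
        # No error signal from the brain: keep the robot's own choice.
        return current_target

    # ErrP detected: let a hand gesture (decoded from EMG) pick the
    # correct target. The gesture names here are made up for illustration.
    gesture_to_target = {
        "flex_left": targets[0],
        "flex_right": targets[-1],
    }
    if emg_gesture in gesture_to_target:
        return gesture_to_target[emg_gesture]

    # No usable gesture: fall back to the robot's original choice.
    return current_target
```

In this sketch the EMG channel adds the spatial specificity Rus describes: EEG alone can only say "that was wrong," while the gesture says which option is right.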
According to Rus: “This work, combining EEG and EMG feedback, enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback. By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”
The researchers used a humanoid robot from Rethink Robotics, while a human supervisor wore electrodes on their head and arm.
Human supervision increased the robot’s rate of choosing the correct target from 70 to 97 per cent.
The goal is a system that can be used by people with limited mobility or language disorders.
Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab. Speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – Juan Enriquez – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda – Marom Bikson
REGISTRATION RATES INCREASE FRIDAY, JUNE 22nd