Implanted electrodes + algorithm allow thought-driven four-limb exoskeleton control

Alim Louis Benabid and Clinatec/University of Grenoble colleagues have developed a brain-computer interface-controlled exoskeleton that enabled a tetraplegic man to walk and move his arms. Two 64-electrode brain implants drove the system.

Benabid explained the benefits, stating that “previous brain-computer studies have used more invasive recording devices implanted beneath the outermost membrane of the brain, where they eventually stop working. They have also been connected to wires, limited to creating movement in just one limb, or have focused on restoring movement to patients’ own muscles.”

The exoskeleton can only be used in the lab at this point: it must remain connected to a ceiling harness because it cannot yet make the small adjustments needed to prevent falls.
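The published description is high-level, but the core idea (decode features from the two recording arrays into continuous limb commands) can be sketched in a few lines. The dimensions, features, and ridge-regression decoder below are illustrative assumptions, not details of the Clinatec system:

import numpy as np

# Toy decoder: map band-power features from two 64-electrode arrays
# to continuous limb-velocity commands for a four-limb exoskeleton.
# All dimensions and the linear model are illustrative assumptions.
rng = np.random.default_rng(0)

n_channels = 128          # two 64-electrode implants
n_outputs = 8             # e.g., 4 limbs x 2 degrees of freedom
n_train = 500             # calibration windows

# Simulated calibration data: neural features X and intended velocities Y.
true_W = rng.normal(size=(n_channels, n_outputs))
X = rng.normal(size=(n_train, n_channels))
Y = X @ true_W + 0.1 * rng.normal(size=(n_train, n_outputs))

# Ridge regression, closed form: W = (X'X + lambda*I)^(-1) X'Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ Y)

# Decode a new window of neural features into exoskeleton commands.
x_now = rng.normal(size=(1, n_channels))
limb_commands = x_now @ W
print(limb_commands.round(2))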


Join ApplySci at the 12th Wearable Tech + Digital Health + Neurotech Boston conference on November 14, 2019 at Harvard Medical School featuring talks by Brad Ringeisen, DARPA – Joe Wang, UCSD – Carlos Pena, FDA – George Church, Harvard – Diane Chan, MIT – Giovanni Traverso, Harvard | Brigham & Women's – Anupam Goel, UnitedHealthcare – Nathan Intrator, Tel Aviv University | Neurosteer – Arto Nurmikko, Brown – Constance Lehman, Harvard | MGH – Mikael Eliasson, Roche – Nicola Neretti, Brown

Join ApplySci at the 13th Wearable Tech + Neurotech + Digital Health Silicon Valley conference on February 11-12, 2020 on Sand Hill Road featuring talks by Zhenan Bao, Stanford – Rudy Tanzi, Harvard – Shahin Farshchi, Lux Capital – Sheng Xu, UCSD – Carla Pugh, Stanford – Nathan Intrator, Tel Aviv University | Neurosteer – Wei Gao, Caltech

CTRL-Labs acquired by Facebook for a reported $500M – $1B

Congratulations to CTRL-Labs and Lux Capital on Facebook's acquisition of the four-year-old neurotech startup. The company, whose technology helps decode neural activity and intention, will join Facebook's AR/VR team.

CTRL-Labs participated in a recent ApplySci panel of startups at Stanford led by Lux Capital’s Shahin Farshchi. Facebook presented its Brain Computer Interface work at the ApplySci conference at the MIT Media Lab in 2017.

ApplySci’s next conference, at Harvard Medical School, will take place on November 14, 2019. It will again include a panel of startups — perhaps the next unicorns — and a series of talks by leading brain and body health scientists.

I hope that you’ll join us.


Join ApplySci at the 12th Wearable Tech + Digital Health + Neurotech Boston conference on November 14, 2019 at Harvard Medical School featuring talks by Brad Ringeisen, DARPA – Joe Wang, UCSD – Carlos Pena, FDA – George Church, Harvard – Diane Chan, MIT – Giovanni Traverso, Harvard | Brigham & Women's – Anupam Goel, UnitedHealthcare – Nathan Intrator, Tel Aviv University | Neurosteer – Arto Nurmikko, Brown – Constance Lehman, Harvard | MGH – Mikael Eliasson, Roche – Nicola Neretti, Brown

BCI reads whole words from thoughts; no virtual keyboard necessary

Edward Chang at UCSF, Mark Chevillet at Facebook, and colleagues have published a study in which implanted electrodes were used to "read" whole words from thoughts. Previous technology required users to spell words with a virtual keyboard.

Subjects listened to multiple-choice questions and spoke answers aloud. An electrode array recorded activity in parts of the brain associated with understanding and producing speech, and the system searched in real time for patterns matching words and phrases.

Participants responded to each question with one of several options while their brain activity was recorded. The system detected when they were hearing a question and when they were answering it, and then decoded the content of both speech events. The predictions were shaped by prior context: the decoded question constrained the set of likely answers. Results were 61 to 76 percent accurate, compared with the 7 to 20 percent accuracy expected by chance.
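One way to picture the role of context is as a Bayesian update: the decoded question sets a prior over plausible answers, which is then combined with the answer classifier's likelihoods. The words, priors, and likelihoods below are invented for illustration; only the mechanism is drawn from the study:

import numpy as np

# Toy Bayesian view of context-weighted answer decoding.
# Words, priors, and likelihoods are invented for illustration.
answers = ["bright", "dark", "hot", "cold"]

# Prior over answers implied by the decoded question
# (e.g., a question about the room's lighting).
prior_given_question = np.array([0.40, 0.40, 0.10, 0.10])

# Likelihoods from a hypothetical neural answer classifier.
likelihood_from_neural = np.array([0.30, 0.15, 0.35, 0.20])

# Posterior is the normalized product of prior and likelihood.
posterior = prior_given_question * likelihood_from_neural
posterior /= posterior.sum()

for word, p in zip(answers, posterior):
    print(f"{word}: {p:.2f}")
print("decoded answer:", answers[int(np.argmax(posterior))])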

This builds on Facebook technology described by Mark Chevillet at the ApplySci conference at the MIT Media Lab in September, 2017, and could result in the ability for the speech-impaired to freely communicate.


Join ApplySci at the 12th Wearable Tech + Digital Health + Neurotech Boston conference on November 14, 2019 at Harvard Medical School featuring talks by Brad Ringeisen, DARPA – Joe Wang, UCSD – Carlos Pena, FDA – George Church, Harvard – Diane Chan, MIT – Giovanni Traverso, Harvard | Brigham & Women's – Anupam Goel, UnitedHealthcare – Nathan Intrator, Tel Aviv University | Neurosteer – Arto Nurmikko, Brown – Constance Lehman, Harvard | MGH – Mikael Eliasson, Roche – David Rhew, Samsung

Join ApplySci at the 13th Wearable Tech + Neurotech + Digital Health Silicon Valley conference on February 11-12, 2020 at Stanford University featuring talks by Zhenan Bao, Stanford – Rudy Tanzi, Harvard – David Rhew, Samsung – Carla Pugh, Stanford – Nathan Intrator, Tel Aviv University | Neurosteer

Study: Noninvasive BCI improves function in paraplegia

Miguel Nicolelis and colleagues have developed a noninvasive system for lower-limb neurorehabilitation.

Study subjects wore an EEG headset to record brain activity and detect movement intention. Eight electrodes were attached to each leg, stimulating muscles involved in walking. After training, patients used their own brain activity to send electrical impulses to their leg muscles, imposing a physiological gait. With a walker and a support harness, they learned to walk again and improved their sensorimotor skills. A wearable haptic display delivered tactile feedback to the forearms, providing continuous proprioceptive feedback during walking.
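A minimal sketch of that closed loop, detect movement intention from EEG and then trigger stimulation of the corresponding leg, might look like the following. The threshold-based detector and the stimulate() placeholder are assumptions standing in for the actual trained classifier and stimulation hardware:

from typing import Optional

import numpy as np

rng = np.random.default_rng(1)

def detect_intention(eeg_window: np.ndarray) -> Optional[str]:
    # Toy intention detector: compares mean power over two groups of
    # channels. A real system would use a trained classifier.
    front_power = float(np.mean(eeg_window[:4] ** 2))
    back_power = float(np.mean(eeg_window[4:] ** 2))
    if front_power > 1.5 * back_power:
        return "left_leg"
    if back_power > 1.5 * front_power:
        return "right_leg"
    return None

def stimulate(leg: str) -> None:
    # Placeholder for driving the eight surface electrodes on each leg.
    print(f"FES pulses -> {leg}")

# Simulated stream of 8-channel EEG windows (8 channels x 250 samples).
for step in range(5):
    window = rng.normal(size=(8, 250))
    if step % 2 == 0:
        window[:4] *= 2.0  # simulate stronger activity (attempted step)
    intention = detect_intention(window)
    if intention:
        stimulate(intention)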

The system was tested on two patients with chronic paraplegia. Both were able to move with less dependency on walking assistance, and one displayed motor improvement. Cardiovascular capacity and muscle volume also improved.

Click to view EPFL video


Join ApplySci at the 12th Wearable Tech + Digital Health + Neurotech Boston conference on November 14, 2019 at Harvard Medical School and the 13th Wearable Tech + Neurotech + Digital Health Silicon Valley conference on February 11-12, 2020 at Stanford University

Thought generated speech

Edward Chang and UCSF colleagues are developing technology to translate signals from the brain into synthetic speech. The research team believes that the sounds would be nearly as sharp and normal as a real person's voice. The system would simulate the sounds produced by the human lips, jaw, tongue, and larynx.

The goal is a communication method for those with disease and paralysis.

According to Chang: “For the first time, this study demonstrates that we can generate entire spoken sentences based on an individual’s brain activity.”

Berkeley's Bob Knight has developed related technology, using high-frequency broadband (HFB) activity to decode imagined speech, with the goal of a BCI for treating disabling language deficits. He described this work at the 2018 ApplySci conference at Stanford.


Join ApplySci at the 12th Wearable Tech + Digital Health + Neurotech Boston conference on November 14, 2019 at Harvard Medical School and the 13th Wearable Tech + Neurotech + Digital Health Silicon Valley conference on February 11-12, 2020 at Stanford University

Thought controlled tablets

The BrainGate/Brown/Stanford/MGH/VA consortium has published a study describing three tetraplegic patients who were able to control an off-the-shelf tablet with their thoughts. They surfed the web, checked the weather, and shopped online. A musician played part of Beethoven's "Ode to Joy" on a digital piano interface.

The BrainGate BCI included a small implant that detected and recorded signals associated with intended movements produced in the motor cortex. Neural signals were routed to a Bluetooth interface that worked like a wireless mouse, which was paired to an unmodified tablet.

Participants made up to 22 point-and-click selections per minute while using several apps. They typed up to 30 characters per minute with standard email and text interfaces.
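Because the implant presents itself to the tablet as an ordinary Bluetooth mouse, the decoding problem reduces to turning neural features into cursor velocities and click events. The linear velocity decoder, weights, and click rule below are simplified assumptions, not the BrainGate decoder itself:

import numpy as np

rng = np.random.default_rng(2)

n_units = 96                               # channels on the implanted array
W = rng.normal(size=(n_units, 2)) * 0.01   # assumed velocity-decoding weights
SCREEN = np.array([1024.0, 768.0])

cursor = SCREEN / 2                        # start at screen center

for t in range(10):
    rates = rng.poisson(lam=20, size=n_units).astype(float)  # firing rates
    velocity = rates @ W                   # decoded cursor velocity, px/step
    cursor = np.clip(cursor + velocity, 0, SCREEN)
    click = rates.mean() > 20.0            # toy click rule, invented threshold
    print(f"t={t} cursor=({cursor[0]:.0f}, {cursor[1]:.0f}) click={click}")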

The researchers believe that the technology can open new lines of communication between brain disorder patients and their caregivers.

Click to view BrainGate video


Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22 at Stanford University — Featuring: Zhenan Bao – Christof Koch – Vinod Khosla – Walter Greenleaf – Nathan Intrator – John Mattison – David Eagleman – Unity Stoakes – Shahin Farshchi – Emmanuel Mignot – Michael Snyder – Joe Wang – Josh Duyan – Aviad Hai – Anne Andrews – Tan Le – Anima Anandkumar – Hugo Mercier

Thought controlled television

Samsung and EPFL researchers, including Ricardo Chavarriaga, are developing Project Pontis, a BCI system meant to allow the disabled to control a TV with their thoughts.

The prototype uses a 64-sensor EEG headset plus eye tracking to determine when a user has selected a particular movie. Machine learning builds a profile of the videos a user is interested in, enabling future content suggestions. The user ultimately makes a selection using eye tracking. The team is now working on a system that relies on brain signals alone, for users who aren't able to control their eyes or other muscles reliably.
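A hedged sketch of how those three signals (a learned interest profile, gaze dwell time, and an EEG-derived confirmation) might be fused into a single selection score follows; the titles, weights, and scores are invented for illustration:

# Toy fusion of a learned interest profile, gaze dwell time, and an
# EEG confirmation signal to pick a movie tile. All values are invented.
profile_score = {"documentary": 0.8, "comedy": 0.3, "thriller": 0.5}
gaze_dwell_seconds = {"documentary": 2.5, "comedy": 0.4, "thriller": 1.0}
eeg_confirmation = {"documentary": 0.7, "comedy": 0.2, "thriller": 0.4}

def selection_score(title: str) -> float:
    # Weighted combination; weights are arbitrary for illustration.
    dwell = min(gaze_dwell_seconds[title] / 3.0, 1.0)
    return 0.4 * profile_score[title] + 0.4 * dwell + 0.2 * eeg_confirmation[title]

scores = {title: round(selection_score(title), 2) for title in profile_score}
print(scores)
print("selected:", max(scores, key=scores.get))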

Click to view Samsung video


Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22 at Stanford University — Featuring: Zhenan Bao – Christof Koch – Vinod Khosla – Walter Greenleaf – Nathan Intrator – John Mattison – David Eagleman – Unity Stoakes – Shahin Farshchi – Emmanuel Mignot – Michael Snyder – Joe Wang – Josh Duyan – Aviad Hai – Anne Andrews – Tan Le – Anima Anandkumar

Brain-to-brain communication interface

Rajesh Rao and University of Washington colleagues have developed BrainNet, a noninvasive direct brain-to-brain interface for multiple people. The goal is a social network of human brains for problem solving. The interface combines electroencephalography (EEG) to record brain signals and transcranial magnetic stimulation (TMS) to deliver information to the brain, enabling three people to collaborate via direct brain-to-brain communication.

In a recent study, two of the three subjects were "Senders." Their brain signals were decoded with real-time EEG analysis to extract decisions about whether to rotate a block in a Tetris-like game before it was dropped to fill a line. The Senders' decisions were sent via the Internet to the brain of a third subject, the "Receiver." Decisions were delivered to the Receiver's brain via magnetic stimulation of the occipital cortex. The Receiver integrated the information and made a decision, using an EEG interface, to either turn the block or keep it in the same position. A second round of the game gave the Senders a chance to validate the Receiver's action and provide feedback.
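Functionally, the Receiver has to combine two binary suggestions (phosphene seen means rotate, no phosphene means keep) with some sense of how reliable each Sender has been. A toy version of that integration step, with invented reliability weights, is sketched below:

# Toy model of the Receiver integrating two Senders' binary suggestions
# ("rotate" vs "keep") in a Tetris-like trial. Reliability weights are invented.
senders = {
    "sender_1": {"suggestion": "rotate", "reliability": 0.9},
    "sender_2": {"suggestion": "keep", "reliability": 0.6},
}

votes = {"rotate": 0.0, "keep": 0.0}
for info in senders.values():
    votes[info["suggestion"]] += info["reliability"]

receiver_decision = max(votes, key=votes.get)
print("weighted votes:", votes)
print("receiver decision:", receiver_decision)

The feedback round described in the study is what would let a Receiver adjust such weights over time; the fixed values above simply stand in for that learning.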


Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22 at Stanford University — Featuring: Zhenan Bao – Christof Koch – Vinod Khosla – Nathan Intrator – John Mattison – David Eagleman – Unity Stoakes – Shahin Farshchi

DARPA: Three aircraft virtually controlled with brain chip

Building on 2015 research that enabled a paralyzed person to virtually control an F-35 jet, DARPA’s Justin Sanchez has announced that the brain can be used to command and control three types of aircraft simultaneously.

Click to view Justin Sanchez’s talk at ApplySci’s 2018 conference at Stanford University


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab. Speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda – Marom Bikson – Ed Simcox – Sean Lane

Combined BCI + FES system could improve stroke recovery

Jose Millan and EPFL colleagues have combined a brain-computer interface (BCI) with functional electrical stimulation (FES) in a system that, in a recent study, enhanced the restoration of limb use after a stroke.

According to Millan: “The key is to stimulate the nerves of the paralyzed arm precisely when the stroke-affected part of the brain activates to move the limb, even if the patient can’t actually carry out the movement. That helps re-establish the link between the two nerve pathways where the signal comes in and goes out.”

Twenty-seven patients with similar lesions that resulted in moderate to severe arm paralysis following a stroke participated in the trial. Half were treated with the dual-therapy approach, and reported clinically significant improvements. A BCI enabled the researchers to pinpoint where electrical activity occurred in the brain when patients tried to extend their hands. Each time that activity was identified, the system stimulated the muscles controlling the corresponding wrist and finger movements.

The control group received FES only, and had their arm muscles stimulated randomly. This allowed the scientists to understand how much additional motor function improvement could be attributed to the BCI system.
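The difference between the two arms comes down to timing: stimulation locked to detected motor-cortex activation versus stimulation delivered at random. A minimal sketch of that contrast, with an invented intent-detection signal, is below:

import random

random.seed(3)

def attempted_movement(t: int) -> bool:
    # Stand-in for the patient attempting to extend the hand;
    # here, once every five time steps.
    return t % 5 == 0

def run_session(bci_triggered: bool, steps: int = 100) -> int:
    # Count stimulations that coincide with an attempted movement.
    paired = 0
    for t in range(steps):
        if bci_triggered:
            # Dual therapy: stimulate only when the BCI detects motor intent.
            stimulate = attempted_movement(t)
        else:
            # Control arm: stimulate at random times, independent of intent.
            stimulate = random.random() < 0.2
        if stimulate and attempted_movement(t):
            paired += 1
    return paired

print("BCI-triggered pairings:", run_session(True))
print("random-FES pairings:  ", run_session(False))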


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab. Speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – Juan Enriquez – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda – Marom Bikson
