Category Archives: BCI

TMS + VR for sensory, motor skill recovery after stroke


EPFL’s Michela Bassolino has used transcranial magnetic stimulation to create hand sensations when combined with VR.

Stimulating the motor cortex activated subjects’ hand muscles and induced brief involuntary movements.

In a recent study, when subjects observed a virtual hand moving at the same time and in a similar manner to their own hand during TMS, they perceived the virtual hand as a controllable body part.

Twenty-five of 32 participants experienced the effect within two minutes of stimulation. Bassolino believes that the effect may also be achievable through less immersive video.

The technology could help patients recover sensory and motor skills after a stroke — and could also be used as a gaming enhancement.


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 25, 2018 at the MIT Media Lab

Bone-conduction headset for voice-free communication


MIT’s Arnav Kapur has created a device that senses and interprets the neuromuscular signals created when we subvocalize. AlterEgo rests on the ear and extends across the jaw. One pad sticks beneath the lower lip, and another below the chin. The pads sense signals in the jaw and facial tissue that are undetectable to an observer.

Two bone-conduction headphones pick up inner-ear vibrations, and four electrodes detect neuromuscular signals. Algorithms determine what the wearer is subvocalizing and can respond silently through the headphones. This enables communication without speaking.

In studies, researchers interacted with a computer to solve problems; one participant asked a computer the time and received an accurate response; another played a game of chess with a colleague.
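AlterEgo’s actual decoding algorithms are not public; as a rough illustration of the pipeline described above (multi-channel neuromuscular signals in, a recognized word out), here is a minimal sketch using invented per-channel RMS feature templates and a nearest-centroid classifier — the word templates, channel count, and window size are all assumptions:

```python
import math
import random

N_CHANNELS = 4   # four electrodes, per the article
WINDOW = 50      # samples per decision window (assumed)

def rms_features(window):
    """Root-mean-square amplitude per electrode channel."""
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

def classify(features, templates):
    """Return the word whose stored feature template is closest (Euclidean)."""
    return min(templates, key=lambda w: math.dist(features, templates[w]))

# Toy templates: the RMS profile each subvocalized word supposedly produces
# (entirely invented values).
templates = {
    "time":  [0.8, 0.2, 0.5, 0.1],
    "chess": [0.1, 0.9, 0.3, 0.7],
}

# Simulate a noisy signal window resembling a subvocalized "time".
random.seed(0)
window = [[templates["time"][c] + random.gauss(0, 0.05) for _ in range(WINDOW)]
          for c in range(N_CHANNELS)]

print(classify(rms_features(window), templates))  # prints "time"
```

A real system would use far richer features and a trained model, but the shape of the loop — window, featurize, classify, respond — is the same.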

Click to view MIT Media Lab video



Software records, organizes, analyzes 1 million neurons in real-time


Martin Garwicz and Lund University colleagues have developed a novel method for recording, organizing, and analyzing enormous amounts of neurophysiological data from implanted brain-computer interfaces.

The technology simultaneously acquires data from 1 million neurons in real time, converting spike data and sending it for processing and storage on conventional systems. Feedback is provided to the subject within 25 milliseconds, stimulating up to 100,000 neurons.
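To get a feel for the scale involved, here is a back-of-the-envelope calculation. The neuron count and the 25 ms feedback window come from the article; the spike encoding (1 ms bins, one spike/no-spike bit per neuron per bin) is an assumption for illustration only:

```python
NEURONS = 1_000_000   # recorded neurons, per the article
BIN_MS = 1            # assumed spike-bin width
FEEDBACK_MS = 25      # feedback latency reported by the researchers

# One bit per neuron per bin, 1000 bins per second at 1 ms bins.
bits_per_bin = NEURONS * 1
bytes_per_second = bits_per_bin * (1000 // BIN_MS) // 8

bins_per_feedback_window = FEEDBACK_MS // BIN_MS

print(f"{bytes_per_second / 1e6:.0f} MB/s raw spike bitmask")        # 125 MB/s
print(f"{bins_per_feedback_window} bins per 25 ms feedback window")  # 25
```

Even under this minimal encoding, the raw stream is on the order of 125 MB/s — which is why converting spike data before sending it to conventional systems matters.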

This has implications for basic research, clinical diagnosis, and brain disease treatment. The method is built for implantable, bidirectional brain-computer interfaces, which communicate complex data between neurons and computers. Applications include monitoring the brains of paralyzed patients, early detection of epileptic seizures, and real-time feedback to control robotic prostheses.


Announcing ApplySci’s 9th Wearable Tech + Digital Health + Neurotech conference — September 25, 2018 at the MIT Media Lab

Lightweight, highly portable, brain-controlled exoskeleton


EPFL’s José Millán has developed a brain-controlled, highly portable exoskeleton that can be quickly secured around joints with Velcro. Metal cables act as soft tendons on the back of each finger, leaving the palm free to feel hand sensations. The motors that push and pull the cables are worn on the chest. Fingers are flexed when the cables are pushed and extended when they are pulled.

The control interface can be eye-movement monitoring, phone-based voice control, residual muscular activity, or EEG-based brainwave analysis. Hand motions induced by the device elicited brain patterns typical of healthy hand motions. Exoskeleton-induced hand motions combined with the brain interface led to distinctive neural patterns that could facilitate control of the device. Contralateral brain activity was observed in people whose hand motion was passively driven by the exoskeleton, while subjects asked to control the exoskeleton with their thoughts showed consistent same-side patterns.
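EPFL’s actual decoder is not described in the article, but the push/pull cable scheme suggests a simple command mapping. The sketch below is hypothetical: it maps a decoded “intend to grasp” probability to flex/extend motor commands, with two thresholds (hysteresis) so the hand does not oscillate when the decoder output hovers near a single cutoff. The threshold values are invented:

```python
FLEX_THRESHOLD = 0.65     # assumed confidence needed to start flexing
RELEASE_THRESHOLD = 0.45  # assumed lower bound before extending again

def next_command(p_flex, current):
    """Map a decoded grasp-intent probability to a cable-motor command."""
    if current == "extend" and p_flex > FLEX_THRESHOLD:
        return "flex"     # push cables: fingers close
    if current == "flex" and p_flex < RELEASE_THRESHOLD:
        return "extend"   # pull cables: fingers open
    return current        # hold state inside the dead band

# Walk a fluctuating decoder output through the controller.
state = "extend"
history = []
for p in [0.2, 0.5, 0.7, 0.6, 0.5, 0.4]:
    state = next_command(p, state)
    history.append(state)

print(history)  # ['extend', 'extend', 'flex', 'flex', 'flex', 'extend']
```

Note how the 0.5 readings are treated differently depending on the current state — that asymmetry is what keeps the grip stable despite a noisy decoder.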

Click to view EPFL video


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University. Speakers include: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg – Darin Okuda – Jason Heikenfeld – Bob Knight – Phillip Alvelda – Paul Nuyujukian – Peter Fischer – Tony Chahine – Shahin Farshchi – Ambar Bhattacharyya – Adam D’Augelli – Juan-Pablo Mas – Michael Eggleston – Walter Greenleaf – Jacobo Penide – David Sarno

Registration rates increase on January 26th

 

Closed loop EEG/BCI/VR/physical therapy system to control gait, prosthetics


Earlier this year, University of Houston’s Jose Luis Contreras-Vidal developed a closed-loop BCI/EEG/VR/physical therapy system to control gait as part of a stroke/spinal cord injury rehab program. The goal was to promote and enhance cortical involvement during walking.

In a study, 8 subjects walked on a treadmill while watching an avatar, wearing a 64-channel EEG headset and motion sensors at the hip, knee, and ankle.

The avatar was first driven by the motion sensors, allowing its movement to precisely mimic that of the test subject. It was then controlled by the brain-computer interface, though less precisely than with the motion sensors. Contreras-Vidal believes that as subjects learn how to use the interface, the result will approach that of the sensors. The researchers reported increased activity in the posterior parietal cortex and the inferior parietal lobe, along with increased involvement of the anterior cingulate cortex.
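The study’s two avatar-control modes can be illustrated with a toy comparison: motion sensors reproduce the subject’s joint angles almost exactly, while an EEG decoder tracks them with error. The gait signal and the decoder’s error model below are entirely invented — the point is only the precision gap the researchers describe:

```python
import math

# Invented knee-angle trace over one stretch of treadmill walking (degrees).
gait = [10 * math.sin(2 * math.pi * t / 50) for t in range(100)]

def sensor_decode(angle, step):
    return angle  # motion sensors: near-exact mimicry of the subject

def bci_decode(angle, step):
    # Assumed stand-in for less precise EEG decoding: correct trend,
    # attenuated gain plus a wandering offset.
    return 0.8 * angle + 3.0 * math.sin(0.3 * step)

def rmse(decoder, truth):
    """Root-mean-square tracking error of a decoder against the true gait."""
    errs = [(decoder(a, t) - a) ** 2 for t, a in enumerate(truth)]
    return math.sqrt(sum(errs) / len(errs))

sensor_err = rmse(sensor_decode, gait)
bci_err = rmse(bci_decode, gait)
print(sensor_err, bci_err)  # sensor error is 0.0; BCI error is larger
```

Contreras-Vidal’s expectation that learning will close the gap corresponds here to the BCI decoder’s error shrinking toward the sensor baseline over training sessions.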

The team built on this research to demonstrate how brain activity can be used to identify different terrains, with the goal of developing prosthetics that automatically adjust to changing ground conditions in real time.

Click to view University of Houston video



VR + neurofeedback for movement training after stroke


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University, featuring: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg

Direct brain path for sight, sound via implanted microscope


Rice University’s Jacob Robinson, with Yale and Columbia colleagues, is developing FlatScope, a flat, brain-implanted microscope that monitors and triggers neurons modified to fluoresce when active.

While capturing greater detail than current brain probes, the microscope also reaches deeper levels of the brain that illustrate sensory input processing, which the researchers hope eventually to control.

FlatScope is part of DARPA’s NESD program, which aims to produce a super-high-resolution neural interface. The program was founded by Phillip Alvelda and is now led by Brad Ringeisen.


Phillip Alvelda will be a featured speaker at ApplySci’s Wearable Tech + Digital Health + NeuroTech Boston conference on September 19, 2017 at the MIT Media Lab.  Other speakers include:  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar  – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen – Daniela Rus

Registration rates increase Friday, July 21st

BCI-controlled exoskeleton helps motor recovery in stroke


Ipsihand, developed by Eric Leuthardt and Washington University colleagues, is a brain-controlled glove that helps reroute hand control to an undamaged part of the brain. The system uses a glove or brace on the hand, an EEG cap, and an amplifier.

Each hand is controlled by the opposite side of the brain, so when one hemisphere is damaged, it becomes difficult to control the opposite hand.

According to Leuthardt, the idea of Ipsihand is that if one can “couple those motor signals that are associated with moving the same-sided limb with the actual movements of the hand, new connections will be made in your brain that allow the uninjured areas of your brain to take over control of the paralyzed hand.”

Ipsihand’s cap detects the intention signal to open or close the hand, and the computer amplifies it. The brace then opens or closes into a pincer-like grip with the hand inside, bending the fingers and thumb to meet.
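Ipsihand’s decoding method is not detailed in the article; as an illustrative sketch only, intent detection of this kind is often framed as a smoothed signal amplitude rising several standard deviations above its resting baseline. The signals, window size, and threshold multiplier below are all invented:

```python
import statistics

def detect_intent(signal, baseline, k=3.0, window=5):
    """Flag intent at each sample where the moving-average amplitude
    exceeds the resting mean by k standard deviations."""
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    flags = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - window + 1): i + 1]
        smoothed = sum(chunk) / len(chunk)
        flags.append(smoothed > mu + k * sd)
    return flags

# Invented amplitudes: quiet rest, then a burst when the wearer intends
# to close the hand.
rest = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1]
trial = [1.0, 1.0, 1.0, 4.0, 4.2, 4.1, 1.0, 1.0]

flags = detect_intent(trial, rest)
print(flags)  # [False, False, False, True, True, True, True, True]
```

The moving average means the flag lags a few samples behind the burst’s end — acceptable for driving a brace, since the grip command only needs to change state, not track every sample.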

