Bone-conduction headset for voice-free communication

MIT’s Arnav Kapur has created a device that senses and interprets the neuromuscular signals generated when we subvocalize. AlterEgo rests on the ear and extends across the jaw, with one pad sitting beneath the lower lip and another below the chin. It senses signals in the jaw and facial tissue that accompany internal speech but are undetectable by a human observer.

Four electrodes detect the neuromuscular signals, and algorithms determine what the wearer is subvocalizing. Two bone-conduction headphones transmit responses as vibrations through the facial bones to the inner ear, so the system can report back silently. This enables two-way communication without speaking.
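The post does not describe AlterEgo’s model, but a minimal sketch of the general approach (windowed electrode features fed to a classifier over a small vocabulary) might look like the following. The sampling rate, vocabulary, feature choice, and nearest-centroid classifier are illustrative assumptions, not the MIT implementation.

```python
import numpy as np

FS = 250                 # assumed sampling rate (Hz), hypothetical
N_CHANNELS = 4           # four electrodes on the jaw and face (from the article)
VOCAB = ["time", "yes", "no", "call"]   # toy vocabulary, hypothetical

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel for one (samples, channels) window."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def train_centroids(examples: dict) -> dict:
    """Average the feature vectors of labelled example windows for each word."""
    return {word: np.mean([rms_features(w) for w in wins], axis=0)
            for word, wins in examples.items()}

def classify(window: np.ndarray, centroids: dict) -> str:
    """Return the vocabulary word whose centroid is closest to this window."""
    feats = rms_features(window)
    return min(centroids, key=lambda word: np.linalg.norm(feats - centroids[word]))

# Toy usage with random data standing in for recorded electrode signals
rng = np.random.default_rng(0)
examples = {w: [rng.normal(scale=i + 1, size=(FS, N_CHANNELS)) for _ in range(5)]
            for i, w in enumerate(VOCAB)}
centroids = train_centroids(examples)
print(classify(rng.normal(scale=2.0, size=(FS, N_CHANNELS)), centroids))
```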

In studies, participants interacted silently with a computer to solve problems: one asked the computer for the time and received an accurate response, and another played a game of chess with a colleague.

Click to view MIT Media Lab video


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 25, 2018 at the MIT Media Lab

Software records, organizes, analyzes 1 million neurons in real-time

Martin Garwicz and Lund University colleagues have developed a novel method for recording, organizing, and analyzing enormous amounts of neurophysiological data from implanted brain-computer interfaces.

The technology simultaneously acquires data from 1 million neurons in real time, converts the spike data, and sends it for processing and storage on conventional systems. Feedback is provided to the subject within 25 milliseconds, by stimulating up to 100,000 neurons.
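As a rough illustration of the real-time loop described above (acquire spikes, convert them to a compact form, hand them off for storage, and compute a stimulation pattern within the latency budget), here is a minimal sketch. Only the 25-millisecond budget and the 1,000,000 / 100,000 neuron figures come from the article; every function name, data format, and rate is an assumption.

```python
import time
import numpy as np

LATENCY_BUDGET_S = 0.025        # feedback within 25 milliseconds (from the article)
N_NEURONS = 1_000_000           # channels acquired simultaneously (from the article)
MAX_STIM = 100_000              # up to 100,000 neurons stimulated (from the article)

def acquire_spikes(rng) -> np.ndarray:
    """Stand-in for hardware acquisition: boolean spike flags for one time bin."""
    return rng.random(N_NEURONS) < 0.001          # ~0.1% of neurons spike per bin

def to_compact_events(spikes: np.ndarray) -> np.ndarray:
    """Convert dense spike flags into a compact list of active neuron indices."""
    return np.flatnonzero(spikes).astype(np.uint32)

def choose_stimulation(events: np.ndarray) -> np.ndarray:
    """Toy feedback rule: stimulate at most the first 100,000 active neurons."""
    return events[:MAX_STIM]

rng = np.random.default_rng(0)
for _ in range(10):                               # a few iterations of the loop
    t0 = time.perf_counter()
    events = to_compact_events(acquire_spikes(rng))
    stim = choose_stimulation(events)             # would be sent to the stimulator
    # 'events' would also be queued for storage/analysis on conventional systems
    if time.perf_counter() - t0 > LATENCY_BUDGET_S:
        print("warning: this iteration missed the 25 ms budget")
```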

This has implications for basic research, clinical diagnosis, and the treatment of brain disease. The system is built for implantable, bidirectional brain-computer interfaces, which communicate complex data between neurons and computers: monitoring the brains of paralyzed patients, detecting epileptic seizures early, and providing real-time feedback to control robotic prostheses.


Announcing ApplySci’s 9th Wearable Tech + Digital Health + Neurotech conference — September 25, 2018 at the MIT Media Lab

Lightweight, highly portable, brain-controlled exoskeleton

EPFL’s José Millán has developed a brain-controlled, highly portable exoskeleton that can be quickly secured around the joints with Velcro. Metal cables act as soft tendons on the back of each finger, leaving the palm free to feel hand sensations. Motors that push and pull the cables are worn on the chest; the fingers are flexed when the cables are pushed and extended when they are pulled.

The control interface can be eye-movement monitoring, phone-based voice control, residual muscular activity, or EEG-based brainwave analysis. Hand motions induced by the device elicited brain patterns typical of healthy hand motions, and exoskeleton-induced hand motions combined with the brain interface led to unusual neural patterns that could facilitate control of the device. Contralateral brain activity was observed in people who passively received hand motion from the exoskeleton; when subjects were asked to control the exoskeleton with their thoughts, consistent same-side patterns appeared as well.
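A hedged sketch of the control layer implied by the first sentence above: several input modalities all reduce to the same push (flex) or pull (extend) command sent to the chest-worn motors. The interface names, thresholds, and dispatch logic are illustrative assumptions, not EPFL’s implementation.

```python
from enum import Enum

class HandCommand(Enum):
    FLEX = "push_cables"      # pushing the cables flexes the fingers
    EXTEND = "pull_cables"    # pulling the cables extends them
    HOLD = "hold"

def decode_eeg(motor_imagery_score: float) -> HandCommand:
    """Map a hypothetical decoded motor-imagery confidence (0..1) to a command."""
    if motor_imagery_score > 0.7:
        return HandCommand.FLEX
    if motor_imagery_score < 0.3:
        return HandCommand.EXTEND
    return HandCommand.HOLD

def decode_voice(utterance: str) -> HandCommand:
    """Map a phone-based voice command to the same motor command."""
    return {"close": HandCommand.FLEX, "open": HandCommand.EXTEND}.get(
        utterance.lower(), HandCommand.HOLD)

def send_to_motors(cmd: HandCommand) -> None:
    """Stand-in for driving the chest-worn motors."""
    print(f"motor action: {cmd.value}")

# The same motor command regardless of which interface produced it
send_to_motors(decode_eeg(0.82))      # EEG-driven brainwave analysis
send_to_motors(decode_voice("open"))  # phone-based voice control
```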

Click to view EPFL video


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University. Speakers include: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg – Darin Okuda – Jason Heikenfeld – Bob Knight – Phillip Alvelda – Paul Nuyujukian – Peter Fischer – Tony Chahine – Shahin Farshchi – Ambar Bhattacharyya – Adam D’Augelli – Juan-Pablo Mas – Michael Eggleston – Walter Greenleaf – Jacobo Penide – David Sarno

Registration rates increase on January 26th


Closed loop EEG/BCI/VR/physical therapy system to control gait, prosthetics

Earlier this year, University of Houston’s Jose Luis Contreras-Vidal developed a closed-loop BCI/EEG/VR/physical therapy system to control gait as part of a stroke/spinal cord injury rehab program. The goal was to promote and enhance cortical involvement during walking.

In a study, eight subjects walked on a treadmill while watching an avatar, wearing a 64-channel EEG headset and motion sensors at the hip, knee, and ankle.

The avatar was first activated by the motion sensors, allowing its movement to precisely mimic that of the test subject. It was then controlled by the brain-computer interface, although this was less precise than movement driven by the motion sensors. Contreras-Vidal believes that as subjects learn how to use the interface, the result will approach that of the sensors. The researchers reported increased activity in the posterior parietal cortex and the inferior parietal lobe, along with increased involvement of the anterior cingulate cortex.
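A minimal sketch of that closed loop, assuming a toy linear decoder: the avatar’s joint angles are driven first by the motion sensors and then by values decoded from the 64-channel EEG. The decoder, dimensions, and angles are illustrative; the study’s actual decoding method is not described in the post.

```python
import numpy as np

N_EEG_CHANNELS = 64          # 64-channel EEG headset (from the article)
JOINTS = ["hip", "knee", "ankle"]

def decode_joint_angles(eeg_window: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Toy linear decoder: mean power per channel mapped to three joint angles."""
    band_power = np.mean(eeg_window ** 2, axis=0)        # shape (64,)
    return band_power @ weights                          # shape (3,)

def update_avatar(joint_angles: np.ndarray) -> None:
    """Stand-in for driving the VR avatar."""
    print(dict(zip(JOINTS, np.round(joint_angles, 2))))

rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=(N_EEG_CHANNELS, len(JOINTS)))

# Phase 1: the avatar mirrors the hip/knee/ankle motion sensors directly
sensor_angles = np.array([12.0, 35.0, -5.0])             # degrees, hypothetical
update_avatar(sensor_angles)

# Phase 2: the avatar is driven by the EEG decoder (less precise, per the article)
eeg_window = rng.normal(size=(100, N_EEG_CHANNELS))      # 100 samples x 64 channels
update_avatar(decode_joint_angles(eeg_window, weights))
```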

The team built on this research to demonstrate how brain activity can be used to identify different terrains, with the goal of developing prosthetics that automatically adjust to changing ground conditions in real time.

Click to view University of Houston video


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University. Speakers include: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg – Darin Okuda – Jason Heikenfeld – Bob Knight – Phillip Alvelda – Paul Nuyujukian – Peter Fischer – Tony Chahine – Shahin Farshchi – Ambar Bhattacharyya – Adam D’Augelli – Juan-Pablo Mas – Michael Eggleston – Walter Greenleaf – Jacobo Penide – David Sarno

Registration rates increase on January 26th

VR + neurofeedback for movement training after stroke


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University, featuring: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg

Direct brain path for sight, sound via implanted microscope

Rice University’s Jacob Robinson, with Yale and Columbia colleagues, is developing FlatScope, a flat, brain-implanted microscope that monitors and triggers neurons that have been modified to fluoresce when active.

In addition to capturing greater detail than current brain probes, the microscope reaches deeper levels of the brain that illustrate how sensory input is processed, activity the team hopes eventually to be able to control.

FlatScope is part of DARPA’s NESD program, which aims to produce a super-high-resolution neural interface. The program was founded by Phillip Alvelda and is now led by Brad Ringeisen.


Phillip Alvelda will be a featured speaker at ApplySci’s Wearable Tech + Digital Health + NeuroTech Boston conference on September 19, 2017 at the MIT Media Lab.  Other speakers include:  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar  – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen – Daniela Rus

Registration rates increase Friday, July 21st

BCI-controlled exoskeleton helps motor recovery in stroke

Ipsihand, developed by Eric Leuthardt and Washington University colleagues, is a brain-controlled glove that helps reroute hand control to an undamaged part of the brain. The system uses a glove or brace on the hand, an EEG cap, and an amplifier.

Each hand is controlled by the opposite side of the brain, so when one hemisphere is damaged, it becomes difficult to control the opposite hand.

According to Leuthardt, the idea of Ipsihand is that if one can “couple those motor signals that are associated with moving the same-sided limb with the actual movements of the hand, new connections will be made in your brain that allow the uninjured areas of your brain to take over control of the paralyzed hand.”

Ipsihand’s EEG cap detects the intention signals to open or close the hand, which are then amplified and interpreted by the computer. The brace then opens or closes in a pincer-like grip around the hand, bending the fingers and thumb to meet.
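A minimal sketch of that detect-amplify-actuate loop, under the assumption of a single motor-intention score derived from the EEG and simple thresholds that trigger the brace. The scoring rule and thresholds are hypothetical, for illustration only.

```python
import numpy as np

OPEN_THRESHOLD = 1.5      # intention score above this -> open the pincer grip
CLOSE_THRESHOLD = -1.5    # below this -> close it (both thresholds hypothetical)

def intention_score(eeg_window: np.ndarray) -> float:
    """Toy score: mean amplitude over channels covering the uninjured hemisphere."""
    return float(np.mean(eeg_window))

def brace_action(score: float) -> str:
    """Decide what the brace should do for this window."""
    if score > OPEN_THRESHOLD:
        return "open"
    if score < CLOSE_THRESHOLD:
        return "close"
    return "hold"

rng = np.random.default_rng(0)
for _ in range(3):
    # random windows standing in for amplified EEG (250 samples x 8 channels)
    window = rng.normal(loc=rng.choice([-3.0, 0.0, 3.0]), size=(250, 8))
    print(brace_action(intention_score(window)))
```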


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary

Tetraplegic patient moves arm with thoughts via BCI/FES system

Bolu Ajiboye and Case Western colleagues used an implanted BrainGate2 brain-computer interface to allow a patient with tetraplegia to control arm movements via an implanted functional electrical stimulation (FES) system. A robotic arm, which was needed in previous BrainGate experiments, was no longer required.

Neural activity was recorded from two 96-channel microelectrode arrays implanted in the motor cortex. The implanted brain-computer interface translated the recorded brain signals into command signals that determined the amount of stimulation to be applied to each FES electrode in the hand, wrist, arm, elbow, and shoulder, and to a mobile arm support.
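A hedged sketch of that translation step: firing rates from the recording arrays are mapped to stimulation levels for each FES electrode and to a command for the mobile arm support. Apart from the 2 x 96 recording channels and the 36 electrodes, every value and the linear mapping itself are assumptions.

```python
import numpy as np

N_RECORDING_CHANNELS = 2 * 96     # two 96-channel microelectrode arrays
N_FES_ELECTRODES = 36             # implanted FES electrodes (from the article)

def decode_stimulation(firing_rates: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Map a vector of firing rates (Hz) to per-electrode stimulation levels (0..1)."""
    raw = firing_rates @ weights                      # shape (36,)
    return np.clip(raw, 0.0, 1.0)                     # clamp to a safe range

rng = np.random.default_rng(0)
weights = rng.normal(scale=0.01, size=(N_RECORDING_CHANNELS, N_FES_ELECTRODES))
firing_rates = rng.poisson(lam=10, size=N_RECORDING_CHANNELS).astype(float)

stim_levels = decode_stimulation(firing_rates, weights)    # one value per electrode
arm_support_cmd = float(np.mean(stim_levels))              # toy mobile-arm-support command
print(np.round(stim_levels[:5], 3), round(arm_support_cmd, 3))
```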

The researchers first exercised the patient’s arm and hand with cyclical electrical stimulation patterns. Over 45 weeks, his strength, range of motion, and endurance improved. He then learned how to use his own brain signals to move a virtual-reality arm on a computer screen. After the 36-electrode FES system was implanted, he was able to move each joint in his right arm individually, or, by thinking about a task such as feeding himself or getting a drink, activate the muscles in a coordinated fashion.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Nathan Intrator