Thought- and gesture-controlled robots

MIT CSAIL’s Daniela Rus has developed a robot control system based on EEG brain signals and EMG finger gestures.

Building on the team’s previous brain-controlled robot work, the new system detects in real time whether a person notices a robot’s error. Measuring muscle activity then lets the person use hand gestures to select the correct option.

According to Rus: “This work, combining EEG and EMG feedback, enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback. By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”
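A minimal sketch of how such a hybrid EEG/EMG decision loop might fit together. Every function name, score, and threshold below is an illustrative assumption, not the CSAIL implementation:

```python
# Hypothetical hybrid supervision loop: an EEG classifier score flags a robot
# error, and forearm EMG activity selects the corrected target. All names
# and constants here are illustrative assumptions.

def detect_error(eeg_errp_score, threshold=0.5):
    """Flag a robot error when the error-related-potential score is high."""
    return eeg_errp_score > threshold

def classify_gesture(emg_rms_left, emg_rms_right):
    """Choose a target from forearm muscle activity: the stronger side wins."""
    return "left" if emg_rms_left > emg_rms_right else "right"

def supervise(eeg_errp_score, emg_rms_left, emg_rms_right, robot_choice):
    """Override the robot's choice with the gestured target only when EEG flags an error."""
    if detect_error(eeg_errp_score):
        return classify_gesture(emg_rms_left, emg_rms_right)
    return robot_choice
```

In this toy loop, a high error-potential score (the person noticed a mistake) hands control to the gesture classifier; otherwise the robot’s own choice stands.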

The researchers used a humanoid robot from Rethink Robotics, while a human controller wore electrodes on his or her head and arm.

Human supervision increased correct target selection from 70 to 97 per cent.

The goal is a system that can be used by people with limited mobility or language disorders.

Click to view CSAIL video


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab. Speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – Juan Enriquez – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda – Marom Bikson

REGISTRATION RATES INCREASE FRIDAY, JUNE 22nd

Phillip Alvelda: More intelligent; less artificial | ApplySci @ Stanford

Phillip Alvelda discussed AI and the brain at ApplySci’s recent Wearable Tech + Digital Health + Neurotech Silicon Valley conference at Stanford:


Dr. Alvelda will join us again at Wearable Tech + Digital Health + Neurotech Boston, on September 24, 2018 at the MIT Media Lab.


Algorithm predicts low blood pressure during surgery

UCLA’s Maxime Cannesson has developed an algorithm that, in a recent study, predicted an intraoperative hypotensive event 15 minutes before it occurred in 84 percent of cases, 10 minutes before in 84 percent of cases, and five minutes before in 87 percent of cases.

The goal is early identification and treatment, to prevent complications, such as postoperative heart attack, acute kidney injury, or death.

The algorithm is based on recordings of the increase and decrease of blood pressure in the arteries during a heartbeat, including episodes of hypotension. For each heartbeat, the researchers were able to derive 3,022 individual features from the arterial pressure waveforms, producing more than 2.6 million bits of information. They then identified which combinations of features, occurring together, predict hypotension.
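To make the per-beat feature idea concrete, here is a toy extractor. The study derived 3,022 features per heartbeat; the handful below (peak, trough, pulse pressure, upslope) are simplified stand-ins, not the published feature set:

```python
import numpy as np

# Toy per-beat feature extraction from an arterial pressure waveform.
# These few features are illustrative stand-ins for the study's 3,022.

def beat_features(pressure):
    """pressure: 1-D array of arterial pressure samples for one heartbeat (mmHg)."""
    systolic = float(np.max(pressure))    # peak pressure of the beat
    diastolic = float(np.min(pressure))   # trough pressure of the beat
    return {
        "systolic": systolic,
        "diastolic": diastolic,
        "pulse_pressure": systolic - diastolic,          # rise/fall amplitude
        "mean_pressure": float(np.mean(pressure)),
        "max_upslope": float(np.max(np.diff(pressure))), # steepest rise per sample
    }

# A simulated single heartbeat's pressure samples (mmHg)
beat = np.array([80, 95, 115, 120, 110, 95, 85, 80], dtype=float)
feats = beat_features(beat)
```

In the study's framing, vectors like `feats`, computed for every heartbeat, would feed a machine-learning model trained to flag beats that precede hypotension.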

Cannesson said that the research “opens the door to the application of these techniques to many other physiological signals, such as EKG for cardiac arrhythmia prediction or EEG for brain function” and “could lead to a whole new field of investigation in clinical and physiological sciences and reshape our understanding of human physiology.”



REGISTRATION RATES INCREASE FRIDAY, JUNE 15TH

Nano-robots remove bacteria, toxins from blood

UCSD’s Joe Wang and Liangfang Zhang have developed tiny ultrasound-powered robots that can swim through blood, removing harmful bacteria and toxins.

Gold nanowires were coated with platelet and red blood cell membranes, allowing the nanorobots to perform the tasks of two different cells at once: platelets, which bind pathogens, and red blood cells, which absorb and neutralize toxins. The gold body responds to ultrasound, giving the nanorobots the ability to swim rapidly without chemical fuel, helping them mix quickly with the bacteria and toxins in blood and speed detoxification.

The robots are about 25 times smaller than the width of a human hair and can travel 35 micrometers per second in blood. They were tested on MRSA-contaminated blood samples, which had three times less bacteria and toxins than untreated samples after five minutes.

Click to view UCSD video




Bob Knight on decoding language from direct brain recordings | ApplySci @ Stanford


Berkeley’s Bob Knight discussed (and demonstrated) decoding language from direct brain recordings at ApplySci’s recent Wearable Tech + Digital Health + Neurotech Silicon Valley conference at Stanford:



Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22, 2019 at Stanford University.

“Artificial nerve” system for sensory prosthetics, robots

Stanford’s Zhenan Bao has developed an artificial sensory nerve system that can activate the twitch reflex in a cockroach and identify letters in the Braille alphabet. Bao describes it as “a step toward making skin-like sensory neural networks for all sorts of applications,” which would include artificial skin that creates a sense of touch in prosthetics.

The artificial nerve circuit integrates three components:

  • A touch sensor that can detect minuscule forces.
  • A flexible electronic neuron that receives signals from the touch sensor.
  • An artificial synaptic transistor, modeled after human synapses, that is stimulated by these sensory signals.
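The three stages above can be caricatured as a toy signal chain; the constants and transfer functions here are arbitrary assumptions for illustration, not a model of Bao’s device:

```python
# Toy three-stage chain: applied force -> sensor voltage -> spike frequency ->
# integrated synaptic signal. All constants are arbitrary assumptions.

def touch_sensor(force_mN):
    """Stage 1: map applied force (millinewtons) to a sensor voltage, saturating at 5 V."""
    return min(force_mN / 10.0, 5.0)

def electronic_neuron(voltage):
    """Stage 2: convert sensor voltage into a spike frequency (Hz), like an oscillator."""
    return int(voltage * 20)

def synaptic_transistor(spike_hz, window_s=0.5):
    """Stage 3: integrate spikes over a time window into a post-synaptic signal."""
    return spike_hz * window_s

# A 30 mN touch propagates through all three stages.
signal = synaptic_transistor(electronic_neuron(touch_sensor(30)))
```

The key property the chain mimics is that a stronger touch produces faster spiking and therefore a larger integrated signal downstream.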

The system was successfully tested, generating both reflexes and a sense of touch. The team also hopes to create low-power artificial sensor nets to cover robots, providing feedback that makes them more agile.

Click to view Science video



Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22, 2019 at Stanford University. Zhenan Bao will be the keynote speaker.

Body heat-powered, self-repairing health sensor system

Hossam Haick at Technion-Israel Institute of Technology has developed a body heat-powered, self-repairing system of sensors for disease detection and monitoring.

Unlike other wearables, the system derives energy from the wearer and repairs its own tears and scratches, so the device never needs to be turned off for repair or charging, allowing truly continuous tracking.



Nathan Intrator on epilepsy, AI, and digital signal processing | ApplySci @ Stanford

Nathan Intrator discussed epilepsy, AI and digital signal processing at ApplySci’s Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 26-27, 2018 at Stanford University:



Ingestible “bacteria on a chip” detects blood, inflammation

MIT’s Timothy Lu has developed an ingestible sensor with embedded genetically engineered bacteria to diagnose bleeding or other gastrointestinal issues.

The “bacteria-on-a-chip” approach combines living cell sensors with ultra-low-power electronics that convert the bacterial response into a signal read by a phone.
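One way to picture the readout, as a hedged sketch: assume the engineered bacteria luminesce in the presence of blood, and the chip flags a fold-change in light over its dark baseline. The names and numbers below are invented for illustration, not MIT’s firmware:

```python
# Hypothetical readout logic: engineered bacteria emit light when they sense
# blood; a photodetector reading well above the dark baseline sets a flag
# that the chip would transmit. Baseline and threshold are assumptions.

DARK_COUNTS = 12           # assumed baseline photodetector reading (no blood)
DETECTION_THRESHOLD = 3.0  # fold-change over baseline that counts as a hit

def blood_detected(photon_counts):
    """Return True when luminescence rises well above the dark baseline."""
    return photon_counts / DARK_COUNTS >= DETECTION_THRESHOLD

readings = [10, 14, 52, 90]  # simulated per-minute photodetector counts
flags = [blood_detected(c) for c in readings]
```

A fold-change test like this is robust to slow drift in the absolute light level, which matters for a sensor meant to report continuously for days.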

The technology has only been tested in pigs, but shows promise in detecting gastrointestinal blood and inflammation. The researchers believe that the sensor will be able to remain in the digestive tract for days or weeks, sending continuous signals.

Click to view MIT video



Cheap, noninvasive patch monitors glucose

UCSD’s Joe Wang‘s needle-free adhesive glucose monitor has begun a phase I clinical trial. The small patch measures glucose levels through sweat on the skin, eliminating the need for a skin prick. The paper tattoo is printed with two integrated electrodes that apply a small electrical current. Glucose molecules residing below the skin are forced to rise to the surface, allowing blood sugar to be measured.

Through its SENSOR study, the team is testing the tattoo-like sensor’s accuracy compared to a traditional glucometer. The trial is enrolling 50 adults, ages 18 to 75, with type 1 or type 2 diabetes, or diabetes due to other causes. Participants wear a sensor while fasting, and for up to two hours after eating.
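As an illustration of how a raw patch signal could be tied to a glucometer reference, here is a hypothetical one-point calibration. The study’s actual calibration method is not described here, so everything below is an assumption:

```python
# Hypothetical one-point calibration: a single paired glucometer reading
# fixes the scale from the patch's raw current to estimated blood glucose.

def calibrate(raw_current_uA, reference_mg_dl):
    """Return a converter from raw patch current (microamps) to glucose (mg/dL)."""
    scale = reference_mg_dl / raw_current_uA
    return lambda current_uA: current_uA * scale

# Pair one raw reading (2.0 uA) with a glucometer value (100 mg/dL) ...
to_glucose = calibrate(raw_current_uA=2.0, reference_mg_dl=100.0)
# ... then convert a later raw reading into an estimated glucose level.
estimate = to_glucose(2.6)
```

A single-point linear calibration like this is the simplest possible scheme; a real device would likely need per-wearer and per-session recalibration.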

The goal is a cheap, noninvasive, discreet, user-friendly glucose monitor that provides continuous measurement. The sensor currently provides only one readout.

