Cortical founder and former DARPA NESD program manager Phillip Alvelda discusses AI and the brain at ApplySci’s Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 26-27, 2018 at Stanford University:
Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference – September 25, 2018 at the MIT Media Lab
Earlier this year, University of Houston’s Jose Luis Contreras-Vidal developed a closed-loop BCI/EEG/VR/physical therapy system to control gait as part of a stroke/spinal cord injury rehab program. The goal was to promote and enhance cortical involvement during walking.
In a study, eight subjects walked on a treadmill while watching an avatar and wearing a 64-channel EEG headset and motion sensors at the hip, knee, and ankle.
The avatar was first activated by the motion sensors, allowing its movement to precisely mimic that of the test subject. It was then controlled by the brain-computer interface, which was less precise than the sensor-driven movement. Contreras-Vidal believes that as subjects learn to use the interface, BCI-driven movement will approach the precision of the motion sensors. The researchers reported increased activity in the posterior parietal cortex and the inferior parietal lobe, along with increased involvement of the anterior cingulate cortex.
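The two control modes described above can be sketched in a few lines. This is an illustrative stand-in, not the University of Houston implementation: the function names, the linear decoder, and all numbers are assumptions made for the example.

```python
# Mode 1: the avatar mirrors joint angles measured by the motion sensors.
def avatar_from_sensors(hip, knee, ankle):
    """Direct pass-through: avatar precisely mimics measured joint angles."""
    return {"hip": hip, "knee": knee, "ankle": ankle}

# Mode 2: a decoder (here a simple linear stand-in for the BCI) maps EEG
# features to the same joint angles; in practice this is less precise.
def avatar_from_eeg(eeg_features, weights):
    """Linear decoder stand-in: EEG feature vector -> joint angles."""
    return {joint: sum(f * wi for f, wi in zip(eeg_features, w))
            for joint, w in weights.items()}

# Sensor-driven pose (precise mimicry)
pose = avatar_from_sensors(hip=12.0, knee=35.0, ankle=-5.0)

# BCI-driven pose (invented weights; improves as the user learns the interface)
weights = {"hip": [0.5, 0.1], "knee": [1.2, -0.3], "ankle": [-0.2, 0.4]}
pose_bci = avatar_from_eeg([10.0, 2.0], weights)
```

The closed loop comes from the subject watching the avatar while walking, so errors in the decoded movement feed back into subsequent brain activity.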
The team built on this research to demonstrate how brain activity can be used to identify different terrains, with the goal of developing prosthetics that adjust automatically to changing ground conditions in real time.
Click to view University of Houston video
Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University. Speakers include: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg – Darin Okuda – Jason Heikenfeld – Bob Knight – Phillip Alvelda – Paul Nuyujukian – Peter Fischer – Tony Chahine – Shahin Farshchi – Ambar Bhattacharyya – Adam D’Augelli – Juan-Pablo Mas – Michael Eggleston – Walter Greenleaf – Jacobo Penide – David Sarno
Registration rates increase on January 26th
Jonathan Posner, with University of Washington and UCLA colleagues, has developed a flexible sensor “skin” that can be stretched over prostheses to determine force and vibration.
The skin mimics the way a human finger responds to tension and compression, as it slides along a surface or distinguishes among different textures. This could allow users to sense when something is slipping out of their grasp.
Tiny electrically conductive liquid-metal channels are placed on both sides of a prosthetic finger. As the finger slides across a surface, the channels on one side compress while those on the other side stretch, much as a natural fingertip deforms. As the channel geometry changes, so does the channels’ electrical resistance, and differences in resistance correlate with force and vibration.
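A minimal strain-gauge model makes the geometry-to-resistance relationship concrete. This sketch assumes an incompressible conductor (volume conserved), so stretching a channel by factor s multiplies its resistance by s squared; the resistivity and dimensions below are illustrative, not values from the UW/UCLA work.

```python
def channel_resistance(rho, length, area):
    """Resistance of a uniform conductive channel: R = rho * L / A."""
    return rho * length / area

def stretched_resistance(r0, stretch):
    """Volume-conserving stretch: L -> s*L, A -> A/s, so R -> s**2 * R."""
    return r0 * stretch ** 2

# Illustrative liquid-metal channel: rho ~ 29.4e-8 ohm*m, 1 cm long.
r0 = channel_resistance(rho=29.4e-8, length=0.01, area=1e-8)  # ~0.294 ohm

r_stretch = stretched_resistance(r0, 1.10)   # channel on the stretched side
r_compress = stretched_resistance(r0, 0.90)  # channel on the compressed side

# The difference between the two sides correlates with the applied force.
delta = r_stretch - r_compress
```

Reading both sides differentially, as sketched here, also helps reject changes (such as temperature drift) that affect both channels equally.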
The researchers believe the sensor skin will help users open a door, use a phone, shake hands, or lift packages.
Ghazal Ghazai and Newcastle University colleagues have developed a deep-learning-driven prosthetic hand and camera system that allows wearers to reach for objects automatically. Current prosthetic hands are controlled by a user’s myoelectric signals, which requires learning, practice, concentration, and time.
A convolutional neural network was trained with images of 500 graspable objects and taught to recognize the grip needed for each. Objects were grouped by size, shape, and orientation, and the hand was programmed to perform four grasps to accommodate them: palm wrist neutral (to pick up a cup); palm wrist pronated (to pick up a TV remote); tripod (thumb and two fingers); and pinch (thumb and first finger).
The hand’s camera takes a picture of the object in front of it, assesses its shape and size, picks the most appropriate grasp, and triggers a series of hand movements, within milliseconds.
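The camera-to-grasp pipeline above can be sketched as follows. To keep the example self-contained, the trained CNN is replaced by an invented size/orientation rule; the grasp names follow the four classes in the text, but the thresholds and function names are assumptions, not the Newcastle implementation.

```python
# The four grasp classes described in the text.
GRASPS = ["palm wrist neutral", "palm wrist pronated", "tripod", "pinch"]

def classify_grasp(width_cm, height_cm, lying_flat):
    """Stand-in for the CNN: map object size/orientation to a grasp class."""
    if width_cm < 2 and height_cm < 2:
        return "pinch"                  # e.g. a small object held thumb-to-finger
    if width_cm < 5 and height_cm < 5:
        return "tripod"                 # small object, thumb and two fingers
    if lying_flat:
        return "palm wrist pronated"    # e.g. a TV remote on a table
    return "palm wrist neutral"         # e.g. an upright cup

def trigger_hand(grasp):
    """Each grasp class triggers a preprogrammed sequence of hand movements."""
    return ["preshape:" + grasp, "close", "hold"]

# Camera sees an upright cup-sized object -> neutral wrist grasp.
grasp = classify_grasp(width_cm=8.0, height_cm=12.0, lying_flat=False)
plan = trigger_hand(grasp)
```

In the real system, the rule above is a learned classifier, which is what lets the hand choose a grasp within milliseconds without the wearer concentrating on myoelectric control.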
In a small study of the technology, subjects picked up and moved objects with an 88 percent success rate.
The work is part of an effort to develop a bionic hand that senses pressure and temperature, and transmits the information to the brain.
Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator – Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda
Professor Ravinder Dahiya, at the University of Glasgow, has created a robotic hand with solar-powered graphene “skin” that he claims is more sensitive than a human hand. The flexible, tactile, energy autonomous “skin” could be used in health monitoring wearables and in prosthetics, reducing the need for external chargers. (Dahiya is now developing a low-cost 3-D printed prosthetic hand incorporating the skin.)
Click to view University of Glasgow video
This week at the Pentagon, Johnny Matheny unveiled his DARPA-developed prosthetic arm. The mind-controlled prosthesis has the same size, weight, shape, and grip strength as a human arm and, according to Matheny, can do anything a natural arm can do.
It is, by all accounts, the most advanced prosthetic limb created to date.
The 100-sensor arm was developed as part of the Revolutionizing Prosthetics program of DARPA’s Biological Technologies Office, led by Dr. Justin Sanchez.
An implanted neural interface allows the wearer to control the arm with his thoughts. Sensors are also implanted in the fingertips, sending signals back to the brain, allowing users to feel sensations.
Click to view Johns Hopkins video.
Dr. Sanchez will be a keynote speaker at ApplySci’s Wearable Tech + Digital Health + NeuroTech NYC conference on June 7-8, 2016 at the New York Academy of Sciences.
Swiss Federal Institute of Technology and Scuola Superiore Sant’Anna researchers have developed a bionic fingertip that allows amputees to feel textures and differentiate between rough and smooth surfaces.
Electrodes were surgically implanted into the upper arm of a man whose arm had been amputated below the elbow. A machine moved an artificial finger, wired with electrodes, across smooth and rough lines on a plastic strip. The fingertip movement generated an electrical signal, which was translated into a series of electrical spikes sent to the brain. The spikes mimicked the language of the nervous system and created the sensation of touch.
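The encoding step above, signal to spike train, can be sketched with a simple threshold-crossing scheme. This is an illustrative model, not the EPFL/Sant’Anna stimulation protocol: the function, thresholds, and waveforms are invented to show why rough and smooth textures produce different spike rates.

```python
def encode_spikes(signal, threshold):
    """Emit a spike (sample index) at each upward threshold crossing."""
    spikes = []
    above = False
    for i, v in enumerate(signal):
        if v >= threshold and not above:
            spikes.append(i)
        above = v >= threshold
    return spikes

# Rough surface: large ridges -> frequent crossings -> high spike rate.
rough = [0, 1, 0, 1, 0, 1, 0, 1]
# Smooth surface: small undulations -> few crossings -> low spike rate.
smooth = [0, 0.2, 0.1, 0.2, 0, 0.9, 0.1, 0.2]

rough_spikes = encode_spikes(rough, 0.5)
smooth_spikes = encode_spikes(smooth, 0.5)
```

Because the nervous system reads texture partly as spike timing and rate, a denser spike train is felt as a coarser surface.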
The subject, Aabo Sørensen, said: “When the scientists stimulate my nerves I could feel the vibration and sense of touch in my phantom index finger. [It] is quite close to when you feel it with your normal finger: you can feel the coarseness of the plates and different gaps and ribs.”
Click to view EPFL video.
Wearable Tech + Digital Health San Francisco
– April 5, 2016 @ the Mission Bay Conference Center
NeuroTech San Francisco
– April 6, 2016 @ the Mission Bay Conference Center
Wearable Tech + Digital Health NYC
– June 7, 2016 @ the New York Academy of Sciences
– June 8, 2016 @ the New York Academy of Sciences
Johns Hopkins researchers have developed a proof-of-concept for a prosthetic arm with fingers that, for the first time, can be controlled with a wearer’s thoughts.
The technology was tested on a patient with epilepsy who was not missing any limbs. (The patient was already scheduled for a brain-mapping procedure.) Using brain-mapping technology, the researchers measured the electrical activity associated with each finger in the part of the brain that controls hand and arm movement.
This was an invasive procedure, requiring the implantation of an array of 128 electrode sensors, on a sheet of film, in the part of the brain that controls hand and arm movement. Each sensor measured a circle of brain tissue 1 millimeter in diameter.
After compiling the motor and sensory data, the arm was programmed to allow the patient to move individual fingers based on which part of his brain was active.
The team said that the prosthetic was initially 76 percent accurate; when the signals for the ring and pinkie fingers were combined, accuracy increased to 88 percent.
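The accuracy jump from merging the ring and pinkie signals can be illustrated with a small calculation: predictions that swap the two fingers count as errors under five classes but as hits once the labels are merged. The counts below are invented for the example, not the Johns Hopkins data; only the direction of the change matches the text.

```python
def accuracy(pairs, merge=None):
    """Fraction of (true, predicted) pairs that match, with optional
    label merging (e.g. ring and pinkie treated as one class)."""
    def canon(label):
        return merge.get(label, label) if merge else label
    hits = sum(canon(t) == canon(p) for t, p in pairs)
    return hits / len(pairs)

# Invented trial results: 2 of the 3 errors are ring<->pinkie confusions.
results = [("thumb", "thumb"), ("index", "index"), ("middle", "middle"),
           ("ring", "pinkie"), ("pinkie", "ring"), ("thumb", "index"),
           ("index", "index"), ("ring", "ring")]

acc_five = accuracy(results)                              # 5/8 correct
acc_merged = accuracy(results, merge={"pinkie": "ring"})  # 7/8 correct
```

Merging is a reasonable trade-off here because the ring and pinkie share overlapping cortical representations, so their signals are the hardest to separate.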
Click to view Johns Hopkins video.
Hossam Haick and Technion colleagues are developing materials to be integrated into flexible electronics that mimic the healing properties of human skin. The goal is to quickly repair incidental scratches or damaging cuts that might compromise device functionality. The synthetic polymer can “heal” electronic skin in one day, which can improve the materials used to achieve a sense of touch in prosthetics.
The new sensor is composed of a self-healing substrate, high-conductivity electrodes, and molecularly modified gold nanoparticles. The researchers noted that “the healing efficiency of this chemiresistor is so high that the sensor survived several cuttings at random positions.”