BCI-enabled 10-D prosthetic arm control

Jennifer Collinger and University of Pittsburgh colleagues have enabled a prosthetic arm wearer to reach, grasp, and place a variety of objects with 10-D control for the first time.

The trial participant had electrode grids with 96 contact points surgically implanted in her brain in 2012, allowing 3-D control of her arm. Each electrode point picked up signals from an individual neuron; these were relayed to a computer, which identified the firing patterns associated with observed or imagined movements, such as raising or lowering the arm or turning the wrist. These signals were used to direct the movements of a prosthetic arm developed by the Johns Hopkins Applied Physics Laboratory. Three months later, she could also flex the wrist back and forth, move it from side to side, and rotate it clockwise and counter-clockwise, as well as grip objects, adding up to 7-D control.

The new study, published yesterday, allowed the participant 10-D control — the ability to move the robot hand into different positions while also controlling the arm and wrist.

To bring the total of arm and hand movements to 10, the pincer grip was replaced by four hand shapes: finger abduction, in which the fingers are spread out; scoop, in which the last fingers curl in; thumb opposition, in which the thumb moves outward from the palm; and a pinch of the thumb, index and middle fingers. As before, the participant watched animations and imagined the movements while the team recorded her brain signals. These recordings were used to calibrate the decoder so that she could move the hand into the various positions by thought alone.
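The calibration procedure described above (recording neural firing patterns while the participant imagines movements, then mapping those patterns to movement commands) is often realized as a linear decoder fit by least squares. The sketch below is a minimal illustration of that general idea, not the Pittsburgh team's actual algorithm; all dimensions and data are simulated.

```python
import numpy as np

# Simulated calibration data: firing rates from 96 channels over T time bins,
# paired with the 10-D movement imagined at each bin. (Illustrative only.)
rng = np.random.default_rng(0)
T, n_channels, n_dims = 500, 96, 10
true_W = rng.normal(size=(n_channels, n_dims))
rates = rng.poisson(5.0, size=(T, n_channels)).astype(float)
movements = rates @ true_W + rng.normal(scale=0.1, size=(T, n_dims))

# Fit a linear decoder by least squares: movements ~ rates @ W
W, *_ = np.linalg.lstsq(rates, movements, rcond=None)

# At run time, each new vector of firing rates yields a 10-D command
# spanning arm, wrist, and hand-shape dimensions.
new_rates = rng.poisson(5.0, size=n_channels).astype(float)
command = new_rates @ W
print(command.shape)  # (10,)
```

In practice, BCI decoders are recalibrated regularly and use more elaborate state-space or adaptive methods, but the core step of regressing movement onto neural activity is the same.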

Wearable optical sensor controls prosthetic limbs

Ifor Samuel and Ashu Bansal at the University of St. Andrews have developed a wearable optical sensor that can be used to control the movement of artificial limbs.

Plastic-semiconductor-based sensors detect muscle contraction. Light is shone into the fibrous muscle and the scattering of the light is observed. When the muscle contracts, the light scatters less because the muscle fibers are further apart. The sensors detect the changed scattering signal and relay the information, as photocurrents, to a prosthetic limb, triggering movement. A robotic arm was controlled using this method in a recent study.
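The sensing chain described (less scattering during contraction, so a drop in photocurrent signals a contraction) amounts to a calibrated threshold detector. A minimal sketch, with entirely hypothetical readings and threshold values:

```python
# Hypothetical sketch of the described sensing chain: a scattered-light
# photocurrent falling well below a relaxed-muscle baseline is read as a
# contraction, which triggers the prosthesis.

def calibrate_baseline(photocurrents):
    """Average photocurrent (arbitrary units) while the muscle is relaxed."""
    return sum(photocurrents) / len(photocurrents)

def is_contraction(sample, baseline, drop_fraction=0.2):
    """Contraction is read as scattering dropping well below baseline."""
    return sample < baseline * (1.0 - drop_fraction)

relaxed_readings = [1.02, 0.98, 1.01, 0.99]   # relaxed muscle: strong scattering
baseline = calibrate_baseline(relaxed_readings)

stream = [1.00, 0.97, 0.72, 0.70, 0.99]       # mid-stream dip = contraction
commands = ["move" if is_contraction(s, baseline) else "hold" for s in stream]
print(commands)  # ['hold', 'hold', 'move', 'move', 'hold']
```

A real controller would filter the photocurrent and adapt the baseline over time, but the trigger logic reduces to this comparison.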

Using disposable wearable optical sensors could eliminate patient risks associated with electrical sensors, including electromagnetic interference, pain caused by sensing needles, and immune responses.

Applied robot control theory for more natural prosthetic leg movement

Prosthetics are lighter and more flexible than in the past, but fail to mimic human muscle power. Powered prosthesis motors generate force, but cannot respond with stability to disturbances or changing terrain.

Robert Gregg and colleagues at the University of Texas have applied robot control theory to allow powered prosthetics to respond dynamically to a wearer’s environment. This has enabled wearers of a robotic leg to walk on a moving treadmill at speeds similar to those of able-bodied people.

Gregg said that “the gait cycle is a complicated phenomenon with lots of joints and muscles working together. We used advanced mathematical theorems to simplify the entire gait cycle down to one variable. If you measure that variable, you know exactly where you are in the gait cycle and exactly what you should be doing.”

In a recent study, sensors and control algorithms measured the center of pressure on a powered prosthesis. Given a user’s height, weight, and the dimensions of the residual thigh, the prosthesis was configured for each subject in 15 minutes. Subjects walked on the ground, and on a treadmill, at increasing speeds. Their walking speeds were greater than 1 meter per second throughout the study; typical able-bodied walking speed is 1.3 meters per second. Participants also reported less energy exertion than with traditional prostheses.
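Gregg's idea of collapsing the gait cycle to one measured variable can be sketched simply: normalize a single measurement (here, the fore-aft center of pressure under the foot) into a phase fraction, then make each joint target a function of that phase. The ranges and the knee trajectory below are illustrative assumptions, not the study's actual parameters.

```python
import math

# Sketch of phase-variable control: one measured quantity is mapped to a
# phase in [0, 1], and each joint target is just a function of that phase.
# All numeric ranges here are hypothetical.

def gait_phase(cop, cop_min=-0.10, cop_max=0.15):
    """Normalize center of pressure (meters, heel to toe) into a phase fraction."""
    return min(max((cop - cop_min) / (cop_max - cop_min), 0.0), 1.0)

def knee_target_deg(phase):
    """Illustrative knee trajectory: flexes mid-cycle, extends at the ends."""
    return 5.0 + 55.0 * math.sin(math.pi * phase) ** 2

for cop in (-0.10, 0.0, 0.15):
    phase = gait_phase(cop)
    print(f"cop={cop:+.2f} m -> phase={phase:.2f}, knee={knee_target_deg(phase):.1f} deg")
```

Because the phase comes from a measurement rather than a clock, the prosthesis naturally speeds up, slows down, or recovers from disturbances along with the wearer.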

NIH “Bionic Man” with 14 sensor- and brain-controlled functions

The National Institute of Biomedical Imaging and Bioengineering recently launched the “NIBIB Bionic Man,” an interactive Web tool detailing 14 sensor-based technologies it is supporting. They include:

1. A robotic leg prosthesis that senses a person’s next move and provides powered assistance to achieve a more natural gait.

2. A light sensitive biogel and biological adhesive to help new cartilage grow and become functional.

3. A blood clot emulator used to optimize ventricular assist devices to reduce the risk of blood clots.

4. An artificial kidney that could be used in place of kidney dialysis for treatment of end-stage kidney disease.

5. A micro needle patch that delivers vaccines painlessly and doesn’t require refrigeration.

6. An interstitial pressure sensor to help doctors determine optimal times for delivering chemotherapy/radiation to cancer patients.

7. Glucose-sensing contact lenses to provide a non-invasive solution for continuous blood sugar monitoring.

8. A tongue drive system to help individuals with severe paralysis navigate their environment using only tongue movements.

9. A wireless brain-computer interface that records and transmits brain activity wirelessly and could allow people with paralysis to use their thoughts to control robotic arms or other devices.

10. Implantable myoelectric sensors that detect nerve signals above a missing limb and use these signals to move a prosthesis in a more natural way.

11. A synthetic glue modeled after an adhesive found in nature that could be used to repair tissues in the body.

12. Focused ultrasound used to temporarily open the blood brain barrier to let gene therapy treatments reach the brain.

13. Flexible electrode arrays that record brain activity from the surface of the brain and could be used to control robotic arms or provide real-time information about brain states.

14. Electrical stimulation of the spinal cord used in individuals with paralysis to help restore voluntary movement and other functions.

Virtual reality movement training for amputees

CAREN, developed at the University of South Florida, helps those with limb loss and prosthetics improve basic function, symmetry, and walking efficiency. It is also a tool for researchers to study ways to improve mobility and balance.

Wearing a safety harness and walking on a treadmill in the room-sized system, participants in a recent study engaged in audio-visual balance games, explored virtual environments, and used an avatar to simulate activities on a surround screen.

CAREN’s interactive games combine physical rehabilitation with cognitive tasks, such as requiring someone to dig for objects in a virtual world while still walking on a treadmill. Distraction gait training could help balance, mobility, and coordination in patients with PTSD, traumatic brain injury, or stroke.

Boat driving, walking in a combat environment, or mountain hiking can be simulated. Visual tracking technology allows researchers to evaluate a patient’s gait or performance in real time and immediately adjust the system to customize the rehab/training process.

Prosthetic arm moves after muscle contraction detected

DEKA is a robotic prosthetic arm that allows amputees to perform complex movements and tasks. It has just received FDA approval.

Electrodes attached to the arm detect muscle contractions close to the prosthesis, and a computer translates them into movement. Six “grip patterns” allow wearers to drink a cup of water, hold a cordless drill, or pick up a credit card or a grape, among other functions.

DARPA’s Justin Sanchez believes that DEKA “provides almost natural control of upper extremities for people who have required amputations.” He claims that “this arm system has the same size, weight, shape and grip strength as an adult’s arm would be able to produce.”

“Brain modeled” chip with prosthetic potential

Neurogrid is a “human brain based” microchip that is 9,000 times faster than, and requires 1/40,000 the power of, a typical PC. It is being developed by Professor Kwabena Boahen at Stanford University.

The circuit board consists of 16 custom-designed “Neurocore” chips which can simulate 1 million neurons and billions of synaptic connections. Certain synapses were enabled to share hardware circuits, saving power.
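To give a sense of what "simulating neurons" means, here is a toy leaky integrate-and-fire population in which one input current is broadcast to every neuron, loosely analogous to synapses sharing circuitry. This is purely illustrative; Neurogrid's actual Neurocores are analog and vastly more sophisticated, and all parameters below are invented.

```python
import numpy as np

# Toy leaky integrate-and-fire (LIF) population. One shared input current
# drives all neurons, a crude stand-in for the hardware sharing described.

rng = np.random.default_rng(1)
n_neurons, steps, dt = 100, 200, 1e-3   # 100 neurons, 200 ms at 1 ms steps
tau, v_thresh, v_reset = 20e-3, 1.0, 0.0

v = np.zeros(n_neurons)                  # membrane potentials
shared_input = 60.0                      # one shared drive for the population
spike_counts = np.zeros(n_neurons, dtype=int)

for _ in range(steps):
    noise = rng.normal(0.0, 5.0, n_neurons)
    # Leaky integration: decay toward rest plus shared drive and noise.
    v += dt * (-(v / tau) + shared_input + noise)
    fired = v >= v_thresh
    spike_counts[fired] += 1
    v[fired] = v_reset                   # reset after a spike

print("mean spikes per neuron:", spike_counts.mean())
```

Even this trivial software loop hints at the cost problem: simulating a million neurons this way on a PC is slow and power-hungry, which is exactly the gap neuromorphic hardware like Neurogrid targets.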

Its speed and low power character could impact the development of prosthetic limbs that are controlled by a similar chip and not tethered to a power source.  Such a limb could have “the speed and complexity of our own actions” according to Professor Boahen.

BCI and robotic prosthetic paralympic competition

Switzerland will host the world’s first Cybathlon, an Olympic-style competition for parathletes using robotic assistive devices and brain-computer interfaces.

It will include six events: a bike race, leg race, wheelchair race, exoskeleton race, arm prosthetic race (including electrical muscle stimulation), and a brain-computer interface race for competitors with full paralysis.

Unlike the Olympics, where athletes can use prosthetics to make themselves only as good as able-bodied athletes, Cybathlon competitors are encouraged to use the best technology. Prizes will be awarded both to the athlete and to the company that created the prosthetic, device or software.  The assistive devices can include commercially available products provided by companies and prototypes developed by research labs.

Cortical-spinal prosthesis directs “targeted movement” in paralyzed limbs

http://www.nature.com/ncomms/2014/140218/ncomms4237/full/ncomms4237.html

Cornell’s Maryam Shanechi, Harvard’s Ziv Williams, and colleagues developed a cortical-spinal prosthesis that directs “targeted movement” in paralyzed limbs. They tested a prosthesis that connects two subjects, enabling one to send its recorded neural activity to control limb movements in a different subject that is temporarily sedated.

The BMI is based on a set of real-time decoding algorithms that process neural signals by predicting their targeted movements. In the experiment, one animal acted as the controller of the movement, or “master”: it “decided” which target location to move to and generated the neural activity that was decoded into this intended movement. The decoded movement was used to directly control the limb of the other animal by electrically stimulating its spinal cord.

The researchers focused on decoding the target endpoint of the movement as opposed to its detailed kinematics. This allowed them to match the decoded target with a set of spinal stimulation parameters that generated limb movement toward that target. They demonstrated that the alert animal could produce two-dimensional movement in the sedated animal’s limb.
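The two-stage scheme described, decoding a discrete target rather than detailed kinematics and then looking up preset stimulation parameters, can be sketched as a nearest-centroid classifier feeding a lookup table. All numbers, electrode labels, and parameters below are hypothetical, not the study's actual values.

```python
import numpy as np

# Stage 1: classify the intended target from neural features.
# Stage 2: map the target to preset spinal-stimulation parameters.
# Everything here is illustrative.

rng = np.random.default_rng(2)
n_units, n_targets = 30, 4
centroids = rng.normal(size=(n_targets, n_units))   # mean activity per target

# Hypothetical lookup: target index -> (electrode, amplitude_mA, freq_Hz)
stim_table = {0: ("E1", 1.2, 40), 1: ("E2", 1.0, 40),
              2: ("E3", 1.4, 50), 3: ("E4", 0.9, 50)}

def decode_target(features):
    """Nearest-centroid classification of the intended movement target."""
    dists = np.linalg.norm(centroids - features, axis=1)
    return int(np.argmin(dists))

# Simulate activity recorded while the "master" intends target 2.
activity = centroids[2] + rng.normal(scale=0.1, size=n_units)
target = decode_target(activity)
print("decoded target:", target, "-> stimulation:", stim_table[target])
```

Decoding only the endpoint keeps the output space small, which is what makes the direct pairing with a finite set of stimulation patterns tractable.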

The experiment focused on two different animals, rather than just one with a temporarily paralyzed limb. The scientists contend that this provided a true model of paralysis, since the master animal’s brain and the sedated animal’s limb had no physiological connection, as is the case for a paralyzed patient.

Trial: Improved sense of touch and control in prosthetic hand

http://stm.sciencemag.org/content/6/222/222ra19

In the ongoing effort to improve the dexterity of prosthetics, a recent trial showed an improved sense of touch and control over a prosthetic hand.  EPFL professor Silvestro Micera and colleagues surgically attached electrodes from a robotic hand to a volunteer’s median and ulnar nerves. Those nerves carry sensations that correspond with the volunteer’s index finger and thumb, and with his pinky finger and the edge of his hand, respectively. The volunteer controlled the prosthetic with small muscle movements detected by sEMG, a non-invasive method that measures electrical signals through the skin.
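The sEMG control side of the trial, small muscle movements measured through the skin and translated into one of three grip strengths, reduces to estimating an effort envelope and quantizing it. The envelope computation and thresholds below are illustrative assumptions, not the study's actual pipeline.

```python
# Hypothetical sketch of mapping an sEMG signal to the trial's three grip
# levels. Thresholds and the envelope method are illustrative only.

def emg_envelope(samples):
    """Mean absolute value of a window of raw sEMG samples (arbitrary units)."""
    return sum(abs(s) for s in samples) / len(samples)

def grip_level(envelope, light_max=0.2, medium_max=0.5):
    """Quantize muscle effort into light / medium / hard grips."""
    if envelope < light_max:
        return "light"
    if envelope < medium_max:
        return "medium"
    return "hard"

window = [0.05, -0.30, 0.28, -0.41, 0.35]   # simulated raw sEMG window
level = grip_level(emg_envelope(window))
print(level)  # medium
```

The trial's key addition was closing the loop: feedback through the implanted nerve electrodes let the volunteer correct his grip rather than relying on open-loop effort alone.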

Over seven days, the volunteer was asked to grasp something with a light grip, a medium grip, and a hard grip, and to evaluate the shape and stiffness of three kinds of objects. During 710 tests, he wore a blindfold and earphones so that he could not use vision or sound to guide the prosthetic. The researchers also sometimes turned off the sensory feedback to test whether he was using timing, rather than touch, to modulate his grip.

The subject was able to complete the requested tasks with his prosthetic thumb and index finger 67 percent of the time the first day and 93 percent of the time by the seventh day of the experiment. His pinky finger was harder to control: he was only able to accomplish the requested grip 83 percent of the time. In both the grip strength tests and in detecting the stiffness of objects, the volunteer made mistakes with the medium setting and object, but he never confused the softest and hardest objects. The ability to modulate his grip strength is this study’s main progress over previous work by the same group.