Category Archives: Prosthetics

Sensors allow more natural sense of touch in prosthetics

Stanford’s Zhenan Bao is developing technology that could restore a more natural sense of touch in prosthetics. Her flexible, thin plastic sensors send signals to the brain that more closely resemble the nerve messages produced by the touch sensors of human skin.

The disruptive technology has not yet been tested on humans, and researchers still need to find a safe way to pass electrical signals from prostheses to the brain for long periods.

Many teams are working toward this goal (see ApplySci coverage from 2013–2015). Previous tactile sensors, however, have been analogue devices: more pressure produces a stronger electrical signal, not a more frequent stream of pulses. The signal must therefore be sent to a separate processing chip that converts its strength into a digital stream of pulses, which only then is passed on to peripheral nerves or brain tissue. Bao’s sensors send digital signals directly.
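
The encoding difference can be sketched in code. This is an illustrative comparison only; the constants, units, and function names are assumptions, not the actual characteristics of Bao’s sensors:

```python
# Two ways a tactile sensor can report pressure. Conventional sensors
# use the first (amplitude) scheme and need a separate chip to convert
# it into pulses; Bao's sensors emit nerve-like pulse trains directly.

def analogue_output(pressure_kpa):
    """Amplitude encoding: output voltage scales with pressure."""
    sensitivity_v_per_kpa = 0.05  # assumed calibration constant
    return pressure_kpa * sensitivity_v_per_kpa

def pulse_frequency_output(pressure_kpa):
    """Nerve-like encoding: pulses per second rise with pressure,
    as in the mechanoreceptors of human skin."""
    base_rate_hz = 5.0      # assumed resting rate
    gain_hz_per_kpa = 2.0   # assumed gain
    return base_rate_hz + gain_hz_per_kpa * pressure_kpa
```

Under these assumed constants, a light 10 kPa touch yields 25 pulses per second and a firm 50 kPa press yields 105, so pressure is conveyed by rate rather than by amplitude.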

Click to view Stanford University video.

WEARABLE TECH + DIGITAL HEALTH SAN FRANCISCO – APRIL 5, 2016 @ THE MISSION BAY CONFERENCE CENTER

NEUROTECH SAN FRANCISCO – APRIL 6, 2016 @ THE MISSION BAY CONFERENCE CENTER

Biocompatible neural prosthetics

Spinal injury patients, and those with lost limbs, sometimes have neural prosthetic devices implanted in an attempt to regain independence. Such devices are used for deep brain stimulation and brain-controlled external prosthetics. However, neural prosthetics are often rejected by the immune system, and can fail because of the mechanical mismatch between soft brain tissue and rigid devices.

University of Pennsylvania’s Mark Allen and colleagues have created an implantable neural prosthetic device that is biocompatible, replacing the silicon and noble metals of conventional implants. The goal is to avoid immune-system rejection, failures due to tissue strain, neurodegeneration, and decreased fidelity of recorded neural signals.

“Lifelike” bionic hand for women and teenagers

bebionic by steeper is a small, “lifelike” bionic hand created for women and teenagers. It is designed around an accurate skeletal structure with 337 mechanical parts. Its 14 grip patterns and hand positions mimic real hand functions.

Its first user, Nicky Ashwell, was born without a right hand. After being fitted with the prosthetic, she was able to ride a bicycle and lift weights for the first time.

The prosthetic’s sensors are triggered by the user’s muscle movements, which are relayed to microprocessors and motors in each finger. It weighs 1 pound and is 6.4 inches long.

WEARABLE TECH + DIGITAL HEALTH NYC 2015 – JUNE 30 @ NEW YORK ACADEMY OF SCIENCES.  REGISTER HERE.

Intent controlled robotic arm with neuroprosthetic implant

Caltech and Keck researchers implanted neuroprosthetics in a part of the brain that controls the intent to move, with the goal of producing more natural and fluid motions. The study, published in Science, was led by Richard Andersen. A quadriplegic implanted with the device was able to perform a fluid handshaking gesture and play “rock, paper, scissors” using a separate robotic arm.

Andersen and colleagues improved the versatility of movement that a neuroprosthetic can offer by recording signals from the PPC brain region. He said: “The PPC is earlier in the pathway (than the motor cortex, a target of earlier neuroprosthetics), so signals there are more related to movement planning—what you actually intend to do—rather than the details of the movement execution. We hoped that the signals from the PPC would be easier for the patients to use, ultimately making the movement process more intuitive. Our future studies will investigate ways to combine the detailed motor cortex signals with more cognitive PPC signals to take advantage of each area’s specializations.”
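
Decoding intent from PPC activity is often framed as classifying a movement goal from a vector of neuron firing rates. A minimal nearest-template sketch of that idea, with fabricated numbers and names (this is not Andersen’s actual decoding pipeline):

```python
# Nearest-template intent decoder. Each template is a mean firing-rate
# pattern (Hz) per recorded neuron for one imagined goal; all values
# here are fabricated for illustration.
from math import dist

templates = {
    "reach_left":  [21.0, 5.5, 11.5, 3.5],
    "reach_right": [4.5, 19.0, 2.5, 14.5],
}

def decode(firing_rates):
    """Return the intended goal whose template pattern is closest
    (Euclidean distance) to the observed firing rates."""
    return min(templates, key=lambda goal: dist(firing_rates, templates[goal]))
```

Real systems fit these templates from recordings made while the user imagines cued movements, then run the decoder continuously to drive the robotic arm.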

Intent controlled prosthetic foot using myoelectric sensors

Ossur’s sensor implant allows amputees to control bionic prosthetic limbs with their minds. Myoelectric sensors are surgically placed in residual muscle tissue. Prosthetic movement is triggered via a receiver.

Ossur’s existing “smart limbs” are capable of real-time learning and automatically adjust to a user’s gait, speed and terrain. However, conscious thought is still required.

According to Thorvaldur Ingvarsson, the company’s R&D lead, “the (implant) technology allows the user’s experience with their prosthesis to become more intuitive and integrative. The result is the instantaneous physical movement of the prosthesis however the amputee intended. They no longer need to think about their movements because their unconscious reflexes are automatically converted into myoelectric impulses that control their Bionic prosthesis.”
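
A common two-site myoelectric scheme can be sketched as follows. This is a generic illustration under assumed signal names and thresholds, not Ossur’s implanted system:

```python
# Two-site myoelectric control: one sensor in a flexor muscle, one in
# an extensor. Whichever is more active (above a noise floor) drives
# the hand. Units and the threshold value are assumptions.

def hand_command(flexor_mv, extensor_mv, noise_floor_mv=0.5):
    """Map rectified EMG amplitudes (mV) to a prosthesis command."""
    if max(flexor_mv, extensor_mv) < noise_floor_mv:
        return "rest"
    return "close" if flexor_mv > extensor_mv else "open"
```

Implanted sensors improve on this mainly by picking up cleaner signals from deep within the residual muscle, so reflexive activity can drive the same kind of mapping without conscious effort.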

Implant to enable prosthetic sensations

Washington University‘s Daniel Moran has received a DARPA grant to test a device that would stimulate nerves in the upper arm and forearm of prosthetic users.  The goal is for the wearer to be able to feel hot, cold, and a sense of touch.  In a related development last year, MC10‘s Roozbeh Ghaffari developed artificial skin for prosthetics that mimics the sensitivity of real skin.  Its silicon and gold sensors detect pressure, moisture, heat and cold (see ApplySci, 12/30/14).

Moran’s electrode is designed to stimulate sensory nerve cells in the ulnar and median nerves in the arms. The ulnar nerve is the largest in the body unprotected by muscle or bone and is connected to the ring finger and pinkie finger on the hand. The median nerve in the upper arm and shoulder is connected to the other fingers on the hand. Together, the two nerves control movement and sensations including touch, pressure, vibration, heat, cold and pain in all of the fingers.

This novel macro-sieve peripheral nerve interface is designed to stimulate regeneration of the ulnar and median nerves to transmit information back into the central nervous system.

The device is in an early stage, and will only be implanted in non-human primates at this time.

App helps orthopedic surgeons plan procedures

Tel Aviv-based Voyant Health’s TraumaCad Mobile app helps orthopedic surgeons plan operations and create result simulations. The system offers modules for hip, knee, deformity, pediatric, upper limb, spine, foot and ankle, and trauma surgery. The iPad version of this decade-old system was recently approved by the FDA.

Surgeons can securely import medical images from the cloud or hospital imaging systems to perform measurements, fix prostheses, simulate osteotomies, and visualize fracture reductions. The app overlays prosthesis templates on radiological images and includes tools for performing measurements on the image and positioning the template.  In total hip replacement surgery, it automatically aligns implants and assembles components to calculate leg length discrepancy and offset.
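
One of the measurements described above, leg length discrepancy, reduces to simple geometry on a calibrated radiograph. The sketch below is illustrative only; the landmark names and calculation are assumptions, not Voyant Health’s actual algorithm:

```python
# Leg length discrepancy (LLD) from a calibrated pelvic radiograph:
# the vertical offset between corresponding landmarks (e.g. the lesser
# trochanters) relative to a pelvic reference line, converted from
# pixels to millimetres with the image calibration factor.

def leg_length_discrepancy(left_landmark_y, right_landmark_y, mm_per_pixel):
    """Return the discrepancy in mm (positive when the left landmark
    sits lower in the image than the right one)."""
    return (left_landmark_y - right_landmark_y) * mm_per_pixel

# e.g. a 14-pixel offset at 0.5 mm/pixel corresponds to a 7 mm LLD
```

The app’s value is in automating the landmark placement and calibration; the arithmetic itself, as here, is straightforward.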

Artificial skin detects pressure, moisture, heat, cold

MC10‘s Roozbeh Ghaffari and a team of researchers from the US and Korea have developed artificial skin for prosthetics that mimics the sensitivity of real skin.  Its silicon and gold sensors detect pressure, moisture, heat and cold.   It is elastic enough for users to stretch and move a bionic hand’s fingers as they would real fingers.  According to Ghaffari, “If you have these sensors at high resolution across the finger, you can give the same tactile touch that the normal hand would convey to the brain.”  A paper detailing the research was published in Nature earlier this month.

BCI enabled 10-D prosthetic arm control

Jennifer Collinger and University of Pittsburgh colleagues have enabled a prosthetic arm wearer to reach, grasp, and place a variety of objects with 10-D control for the first time.

The trial participant had electrode grids with 96 contact points surgically implanted in her brain in 2012.  This allowed 3-D control of her arm. Each electrode point picked up signals from an individual neuron, which were relayed to a computer to identify the firing patterns associated with observed or imagined movements, such as raising or lowering the arm, or turning the wrist. This was used to direct the movements of a prosthetic arm developed by Johns Hopkins Applied Physics Laboratory.  Three months later, she also could flex the wrist back and forth, move it from side to side and rotate it clockwise and counter-clockwise, as well as grip objects, adding up to 7-D control.
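
The decoding step described above, firing patterns mapped to movements, can be sketched as a linear decoder. The weight matrix here is fabricated for illustration; real systems fit it from recordings made while the user watches or imagines cued movements:

```python
# Toy linear population decoder: a vector of neuron firing rates (Hz)
# is mapped to continuous control dimensions (e.g. hand velocity in x
# and y). Three neurons -> two dimensions; all weights are made up.

weights = [
    [0.04, -0.02, 0.01],   # x-velocity weight per neuron
    [-0.01, 0.03, 0.02],   # y-velocity weight per neuron
]

def decode_velocity(rates):
    """Dot each weight row with the firing-rate vector."""
    return [sum(w * r for w, r in zip(row, rates)) for row in weights]

# decode_velocity([50, 10, 20]) -> approximately [2.0, 0.2]
```

Extending from 3-D to 10-D control amounts to adding rows to this mapping, one per controlled degree of freedom, and calibrating each from the participant’s recorded activity.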

The new study, published yesterday, allowed the participant 10-D control — the ability to move the robot hand into different positions while also controlling the arm and wrist.

To bring the total of arm and hand movements to 10, the pincer grip was replaced by four hand shapes: finger abduction, in which the fingers are spread out; scoop, in which the last fingers curl in; thumb opposition, in which the thumb moves outward from the palm; and a pinch of the thumb, index and middle fingers. As before, the participant watched animations and imagined the movements while the team recorded her brain signals, which were used to calibrate the decoder so that she could move the hand into the various positions.

Wearable optical sensor controls prosthetic limbs

Ifor Samuel and Ashu Bansal at the University of St. Andrews have developed a wearable optical sensor that can be used to control the movement of artificial limbs.

Sensors based on plastic semiconductors detect muscle contraction. Light is shone into fibrous muscle and its scattering is observed; when the muscle contracts, the fibers move further apart and the light scatters less. The sensors detect the changed scattering signals and relay the information, as photocurrents, to a prosthetic limb, triggering movement. A robotic arm was controlled using this method in a recent study.
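
The triggering logic implied above can be sketched with a simple threshold rule. The function name, units, and threshold are assumptions for illustration, not the St. Andrews team’s actual implementation:

```python
# Contraction detection from a scattering-derived photocurrent: less
# scattered light reaching the detector means the muscle has
# contracted, so we flag a contraction when the photocurrent drops a
# set fraction below the resting baseline.

def detect_contraction(photocurrent_na, baseline_na, drop_fraction=0.2):
    """Return True when the scattered-light photocurrent (nA) falls at
    least `drop_fraction` below the resting baseline."""
    return photocurrent_na <= baseline_na * (1.0 - drop_fraction)
```

A prosthesis controller would poll this continuously and issue a movement command whenever a contraction is flagged.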

Using disposable wearable optical sensors could eliminate patient risks associated with electrically based sensors, including electromagnetic interference, pain caused by sensing needles, and immune responses.