Focused ultrasound thalamotomy in Parkinson’s Disease

UVA’s Scott Sperling and Jeff Elias, who have already used focused ultrasound to treat essential tremor, have published the results of a small study showing the efficacy of the technology in Parkinson’s disease.

The sound waves were shown to interrupt the brain circuits responsible for the uncontrollable shaking associated with the disease. The researchers claim that their study also offers “comprehensive evidence of safety” regarding the procedure’s effect on mood, behavior and cognitive ability, which had not previously been studied.

According to Sperling, “In this study, we extended these initial results and showed that focused ultrasound thalamotomy is not only safe from a cognitive and mood perspective, but that patients who underwent surgery realized significant and sustained benefits in terms of functional disability and overall quality of life.”

Twenty-seven adults with severe Parkinson’s tremor that had not responded to previous treatment were divided into two groups: twenty received the procedure, and a control group of seven (who were later offered the procedure) did not. At both three and twelve months, participants reported improved quality of life, including a better ability to perform simple daily tasks, improved emotional wellbeing, and a lessened sense of stigma due to their tremor.

The team found that mood, cognition, and the ability to go about daily life ultimately had more effect on participants’ assessment of their overall quality of life than did tremor severity or the amount of tremor improvement.

Some participants showed cognitive decline after the procedure, particularly in naming colors and in thinking of and speaking words. The cause is unclear and must be investigated; the researchers suggested it could be a result of the natural progression of Parkinson’s.


Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22 at Stanford University — Featuring: Zhenan Bao – Christof Koch – Vinod Khosla – Walter Greenleaf – Nathan Intrator – John Mattison – David Eagleman – Unity Stoakes – Shahin Farshchi – Emmanuel Mignot – Michael Snyder – Joe Wang – Josh Duyan – Aviad Hai – Anne Andrews – Tan Le

Proof of concept 3D printed cornea

Newcastle University’s Che Connon has completed proof-of-concept research that could lead to a 3D printed cornea.

Stem cells from a healthy donor cornea were mixed with alginate and collagen to create a printable bio-ink. A 3D printer extruded the bio-ink in concentric circles to form the shape of a human cornea in less than 10 minutes. The stem cells then continued to grow.
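
The team’s actual printing parameters aren’t given here; purely as an illustration of the concentric-circle approach, the sketch below generates a dome-shaped stack of rings that a printer nozzle could trace. All dimensions and names are hypothetical placeholders, not the study’s values.

```python
import math

def corneal_toolpath(base_radius_mm=6.5, dome_height_mm=2.6,
                     layer_height_mm=0.2, points_per_ring=120):
    """Approximate a corneal dome as stacked concentric extrusion rings.

    Returns a list of (x, y, z) waypoints a nozzle could follow.
    Illustrative placeholders only; not the Newcastle team's software.
    """
    path = []
    n_layers = int(dome_height_mm / layer_height_mm)
    for layer in range(n_layers):
        z = layer * layer_height_mm
        # Shrink the ring radius with height to follow a dome-like cap.
        radius = base_radius_mm * math.sqrt(max(0.0, 1 - (z / dome_height_mm) ** 2))
        for i in range(points_per_ring):
            angle = 2 * math.pi * i / points_per_ring
            path.append((radius * math.cos(angle), radius * math.sin(angle), z))
    return path

waypoints = corneal_toolpath()
print(f"{len(waypoints)} nozzle waypoints generated")
```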

According to Connon: “Our unique gel – a combination of alginate and collagen – keeps the stem cells alive whilst producing a material which is stiff enough to hold its shape but soft enough to be squeezed out the nozzle of a 3D printer. This builds upon our previous work in which we kept cells alive for weeks at room temperature within a similar hydrogel. Now we have a ready to use bio-ink containing stem cells allowing users to start printing tissues without having to worry about growing the cells separately.”

The team demonstrated that they could build a cornea to match a patient’s unique specifications, but said that it will be several years before this might be used for transplants.

Click to view Newcastle University video


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab. Speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – Juan Enriquez – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda – Marom Bikson

REGISTRATION RATES INCREASE FRIDAY, JUNE 22nd

fMRI + EEG used to detect consciousness in ICU patients

MGH’s Brian Edlow and colleagues have completed a small study showing the efficacy of using fMRI and EEG to detect consciousness in ICU patients with traumatic brain injury. Previous research has suggested that up to 40% of conscious patients are misclassified as unconscious.

The goal is a more informed care plan and earlier interventions that could improve outcomes.

Sixteen patients with severe traumatic brain injury in MGH’s ICU were studied. At the start, 8 could respond to language, 3 were classified as minimally conscious with no language response, 3 were classified as vegetative, and 2 were in a coma. The study also included a healthy control group of 16.

fMRI scans were performed as soon as the subjects were stable, and EEG readings were usually taken within 24 hours of the fMRI scan. Tests were designed to detect a mismatch between a patient’s ability to imagine performing a task and the ability to physically respond (cognitive motor dissociation).
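
The paper’s analysis pipeline is not detailed in this post; as a rough, hypothetical sketch of the general idea behind such command-following tests, one can compare EEG band power recorded during “imagine squeezing your hand” blocks against rest blocks and ask whether the two are statistically distinguishable.

```python
import numpy as np
from scipy import signal, stats

def band_power(epoch, fs=250, band=(8, 30)):
    """Mean power in a frequency band (e.g., mu/beta) for one EEG epoch."""
    freqs, psd = signal.welch(epoch, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def command_following_score(imagery_epochs, rest_epochs, fs=250):
    """Compare band power during motor-imagery blocks vs. rest blocks.

    A significant difference is (loosely) taken as evidence that the subject
    modulated brain activity on command. Toy illustration only; not the
    published study's pipeline.
    """
    imagery = [band_power(e, fs) for e in imagery_epochs]
    rest = [band_power(e, fs) for e in rest_epochs]
    t, p = stats.ttest_ind(imagery, rest)
    return t, p

# Simulated single-channel epochs (2 s each at 250 Hz) stand in for real data.
rng = np.random.default_rng(0)
rest = [rng.normal(size=500) for _ in range(20)]
imagery = [rng.normal(size=500) * 0.8 for _ in range(20)]  # attenuated rhythm during imagery
t, p = command_following_score(imagery, rest)
print(f"t = {t:.2f}, p = {p:.3f}")
```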

The researchers detected evidence of consciousness in 4 of the 8 patients who were unable to respond to language in bedside exams, including the 3 classified as vegetative. (It was also noted that 25% of the healthy controls had no detectable brain response in a hand-squeeze imagery test.)

The subjects were also exposed to brief recordings of spoken language and music during both fMRI and EEG to detect activity in certain brain regions. Higher-order cortex activity was seen in 2 additional subjects. While higher-order cortical activity doesn’t prove that a patient is conscious, finding a response in those structures could have implications for a patient’s eventual recovery.

A 19-electrode EEG device was used for the study.

Brain health company Neurosteer is attempting to gather similar neural activity data with its continuous, mobile, three-electrode EEG wearable. The company’s CEO, Nathan Intrator, will present this work at ApplySci’s Wearable Tech + Digital Health + Neurotech conference, on September 19th at the MIT Media Lab.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen

Registration rates increase Friday, August 18th.


ANNOUNCING WEARABLE TECH + DIGITAL HEALTH + NEUROTECH SILICON VALLEY – FEBRUARY 26-27, 2018 @ STANFORD UNIVERSITY

Wearable, high resolution, continuous opto-electronic monitoring

MRI, disrupted.

Mary Lou Jepsen’s background in consumer electronics, computers, TV, VR, wearables, healthcare and software at Google X, Facebook, and Oculus has led to the creation of Openwater — a company that aims to let us see the inner workings of the body and brain, at high resolution and continuously.

Using novel opto-electronics, the company aims to replace the functionality of MRI with a wearable. Applications include the detection and treatment of cancer, cardiovascular diseases, internal bleeding, and brain diseases; communication via thought; and, potentially, the uploading, downloading, and augmentation of memories, thoughts and emotions.

The technology uses the scattering of light by the body or brain itself to focus infrared light and scan tissue bit by bit, or voxel by voxel. This is enabled by LCDs with pixels small enough to create reconstructive holographic images that neutralize the scattering, coupled with body-temperature detectors, enabling scanning at MRI resolution and depth.

These LCDs and detectors line the inside of a ski hat, bandage, or other clothing, and are designed to modulate intensity and phase in the near-infrared regime using video-rate, computer-generated holograms integrated with embedded detectors. The brain or body can be scanned systematically or selectively, and the system can also be used in reverse, focusing light on any area of interest in the body or brain (for example, for tumor treatment).
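
Openwater has not published its reconstruction software; the toy sketch below, with simulated stand-ins for the real holographic phase patterns and detector physics, is only meant to illustrate the voxel-by-voxel scanning loop described above.

```python
import numpy as np

rng = np.random.default_rng(1)

def phase_pattern_for(voxel_index, lcd_shape=(64, 64)):
    # Hypothetical stand-in: the real system would compute a holographic
    # phase pattern that cancels tissue scattering for this voxel.
    return rng.uniform(0, 2 * np.pi, size=lcd_shape)

def detector_reading(phase_pattern, voxel_index):
    # Simulated detector response; a real detector would report the light
    # returned from the targeted voxel.
    return float(np.cos(phase_pattern).mean() + 0.1 * voxel_index[0])

def scan_volume(shape=(8, 8, 4)):
    """Scan a volume voxel by voxel, recording one detector value per voxel."""
    volume = np.zeros(shape)
    for idx in np.ndindex(shape):
        pattern = phase_pattern_for(idx)
        volume[idx] = detector_reading(pattern, idx)
    return volume

print(scan_volume().shape)  # (8, 8, 4) simulated voxel readings
```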

Once again, Mary Lou Jepsen is pioneering technology that will transform — and save — lives.  We are honored to include her as a speaker at Wearable Tech + Digital Health + Neurotech Boston, on September 19th at the MIT Media Lab.

Click to view Jepsen’s talk at ApplySci’s February 2017 Wearable Tech + Digital Health + Neurotech Silicon Valley conference at Stanford University


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen – Daniela Rus

Registration rates increase Friday, August 4th.


ANNOUNCING WEARABLE TECH + DIGITAL HEALTH + NEUROTECH SILICON VALLEY – FEBRUARY 26-27, 2018 @ STANFORD UNIVERSITY

Robotic leg brace helps stroke patients walk

Toyota’s Welwalk WW-1000 exoskeleton is designed to help those with paralysis on one side of their body walk again. The frame is worn on the affected leg, with a motor at the knee joint that provides calibrated assistance based on a user’s ability. Wearers are trained to recover their walking ability over time.

The robotic device is paired with a treadmill and harness that are controlled by medical staff. The system will be rented to hospitals in Japan for $9,000, plus $3,200 per month.

The hope is that it will dramatically speed recovery time for stroke patients. The brace integrates sensors that determine exactly how much support to provide at any given point, ensuring that patients aren’t over-reliant on support, or rushed before they’re ready to progress.
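
Toyota has not disclosed the Welwalk control law; as a purely hypothetical sketch of the general “assist only as much as needed” idea, a controller might supply just a fraction of the knee torque the patient cannot yet produce on their own.

```python
def assist_torque(required_torque_nm, patient_torque_nm,
                  assist_gain=0.8, max_assist_nm=30.0):
    """Assist-as-needed: supply only part of the torque the patient cannot produce.

    Hypothetical illustration; not Toyota's published control law.
    assist_gain < 1 deliberately leaves some effort to the patient so they
    are never fully carried by the motor.
    """
    deficit = max(0.0, required_torque_nm - patient_torque_nm)
    return min(assist_gain * deficit, max_assist_nm)

# Example: this point in the gait cycle needs ~25 Nm at the knee,
# and the patient currently produces ~15 Nm on their own.
print(assist_torque(25.0, 15.0))  # -> 8.0 Nm of motor assistance
```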


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Nathan Intrator

Tiny wearable sensor measures blood flow

Kyocera has developed a tiny optical sensor to measure blood flow volume in subcutaneous tissue, meant to be integrated into a phone or wearable. Potential applications include monitoring stress and preventing dehydration, heat stroke, and altitude sickness.

The device will be worn in or on an ear, finger, or forehead, where it measures the frequency of light reflected from blood moving within subcutaneous vessels. The sensor uses the relative shift in frequency (a Doppler shift proportional to blood velocity) and the strength of the reflected light to determine blood-flow volume.
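
Kyocera has not released its processing details. As a hedged illustration of the principle described above (blood velocity appearing as a Doppler frequency shift in the reflected light), a classic laser-Doppler-style estimate weights the detector signal’s power spectrum by frequency.

```python
import numpy as np

def blood_flow_index(detector_signal, fs=20000, band=(20, 10000)):
    """Estimate relative blood flow from a photodetector signal.

    Classic laser-Doppler flowmetry scales perfusion with the first moment
    of the Doppler power spectrum (frequency-weighted power), normalized by
    total power. Illustrative only; not Kyocera's algorithm.
    """
    spectrum = np.fft.rfft(detector_signal - np.mean(detector_signal))
    freqs = np.fft.rfftfreq(len(detector_signal), d=1.0 / fs)
    power = np.abs(spectrum) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    total_power = np.sum(power) + 1e-12
    return np.sum(freqs[mask] * power[mask]) / total_power

# Simulated 0.1 s of detector output: faster-moving blood adds higher
# Doppler frequencies, which raises the index.
t = np.arange(0, 0.1, 1 / 20000)
slow = np.sin(2 * np.pi * 200 * t) + 0.1 * np.random.randn(len(t))
fast = np.sin(2 * np.pi * 2000 * t) + 0.1 * np.random.randn(len(t))
print(blood_flow_index(slow) < blood_flow_index(fast))  # True
```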

ApplySci’s 6th Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth – Shahin Farshchi – Casper de Clercq – Mary Lou Jepsen – Vivek Wadhwa – Dirk Schapeler – Miguel Nicolelis

 

“Mixed Reality” headset could support surgery, rehab, learning

Magic Leap has unveiled its “mixed reality” headset, in which virtual objects are integrated into the real world. In addition to obvious gaming and entertainment applications, the system could be used in healthcare (including surgery, surgery preparation, and orthopedic rehabilitation) and education.

The company remains vague in describing its technology, but head- and hand-tracking functionality appear to have been added. According to founder Rony Abovitz, “Magic Leap doesn’t trick the brain. Rather it shoots photons into the eye that stimulate the cones and rods as if the hologram were real, or neurologically true.”

Click to view Magic Leap video.


Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

Wrist-worn wearable detects Atrial Fibrillation, sends alerts

AliveCor is known for its FDA-approved mobile EKG, which attaches to a phone or tablet. The company has just announced Kardia – an Apple Watch band that, when its sensor is pressed and paired with an app, can provide an accurate EKG, incorporate a user’s spoken symptoms into its analysis, and share data. AliveCor said that the band and app can detect atrial fibrillation, and that results can be shared with a doctor immediately.
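
AliveCor’s detection algorithm is proprietary; as a simplified, hypothetical illustration of one common screening approach, atrial fibrillation can be flagged by how irregular the beat-to-beat (RR) intervals are.

```python
import statistics

def rr_irregularity(rr_intervals_ms):
    """Coefficient of variation of RR intervals: higher means more irregular."""
    mean_rr = statistics.mean(rr_intervals_ms)
    return statistics.stdev(rr_intervals_ms) / mean_rr

def possible_afib(rr_intervals_ms, threshold=0.10):
    """Flag a rhythm as possibly AF when RR variability exceeds a threshold.

    Toy heuristic only; a real screening algorithm (including AliveCor's)
    would use far more features and validation before alerting a clinician.
    """
    return rr_irregularity(rr_intervals_ms) > threshold

regular = [800, 810, 795, 805, 800, 798, 803]        # steady sinus rhythm
irregular = [620, 940, 710, 1050, 560, 880, 760]     # chaotic, AF-like beats
print(possible_afib(regular), possible_afib(irregular))  # False True
```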

Click to view AliveCor video


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

Injectable nanotech device continuously monitors glucose

Kyungsuk Yum at the University of Texas is developing an internal, nanoscale device to continuously analyze blood sugar. A near-infrared optical nanotube biosensor is injected, and an optical scanner reads its data for constant monitoring.

Current continuous glucose monitoring technology for diabetes requires a tube inserted under the skin of the abdomen. This reads glucose levels in tissue, which is not as accurate as a blood reading; it must be calibrated several times per day and changed every week.
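
As a hedged illustration of the calibration step mentioned above (not any specific vendor’s method), a simple linear fit maps raw sensor readings to reference finger-stick glucose values.

```python
def fit_calibration(raw_signals, reference_glucose_mgdl):
    """Least-squares line mapping raw sensor counts to glucose (mg/dL).

    Simplified illustration of a per-day calibration step; real CGM
    calibration is more involved.
    """
    n = len(raw_signals)
    mean_x = sum(raw_signals) / n
    mean_y = sum(reference_glucose_mgdl) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(raw_signals, reference_glucose_mgdl))
    den = sum((x - mean_x) ** 2 for x in raw_signals)
    slope = num / den
    intercept = mean_y - slope * mean_x
    return slope, intercept

def glucose_from_signal(raw, slope, intercept):
    return slope * raw + intercept

# Hypothetical finger-stick references taken alongside raw sensor counts.
slope, intercept = fit_calibration([1200, 1500, 1900], [90, 120, 160])
print(round(glucose_from_signal(1700, slope, intercept)))  # ~140 mg/dL
```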

The traditional glucometer system requires blood-drawing finger pricks throughout the day.

WEARABLE TECH + DIGITAL HEALTH SAN FRANCISCO – APRIL 5, 2016 @ THE MISSION BAY CONFERENCE CENTER

NEUROTECH SAN FRANCISCO – APRIL 6, 2016 @ THE MISSION BAY CONFERENCE CENTER

WEARABLE TECH + DIGITAL HEALTH NYC – JUNE 7, 2016 @ THE NEW YORK ACADEMY OF SCIENCES

NEUROTECH NYC – JUNE 8, 2016 @ THE NEW YORK ACADEMY OF SCIENCES

Smart airline uniforms improve passenger safety

easyJet has partnered with CuteCircuit to create sensor-embedded crew uniforms to improve passenger safety.

Cabin crew uniforms have shoulder LEDs and illuminated hems to provide lighting. Lapel LEDs display flight numbers, and microphones in the fabric enable immediate communication.

Engineers’ uniforms have LEDs in jacket hoods to illuminate work areas, and built-in cameras to share photos for assistance. Integrated air quality sensors and barometers notify workers of environmental issues, and create a city-by-city air quality map for passengers.

WEARABLE TECH + DIGITAL HEALTH SAN FRANCISCO – APRIL 5, 2016 @ THE MISSION BAY CONFERENCE CENTER

NEUROTECH SAN FRANCISCO – APRIL 6, 2016 @ THE MISSION BAY CONFERENCE CENTER