Apple developing health event detection / notification wearable

Apple has filed a patent application for a “Care Event Detection and Alerts” wearable that communicates with a phone to monitor the user’s vital signs and notify caregivers. Detectable events include “a car crash, a bike accident, a medical emergency such as a heart attack or an aneurysm, separation of a child from the child’s caregiver, a dementia patient becoming lost, an avalanche, a fall, a mugging, a fire, and/or any other event for which a user may require medical, police, family, fire rescue, and/or other kind of assistance.”

This could become a tool to help seniors age in place, increase the confidence of people with epilepsy and other medical conditions, and protect children. Regulatory approval for the medical device-like features will be required before the product can be brought to market.
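As a rough illustration of how one such event might be flagged on-device, here is a minimal fall-detection heuristic. This is a generic sketch from the fall-detection literature, not the logic in Apple’s filing; the thresholds and sample data are invented.

```python
import math

# Hypothetical heuristic: a brief free-fall window (acceleration magnitude
# near 0 g) followed by an impact spike (well above 1 g) suggests a fall.
FREE_FALL_G = 0.4   # below this, the wearer may be in free fall
IMPACT_G = 2.5      # above this, a hard impact is likely

def detect_fall(samples):
    """samples: list of (ax, ay, az) accelerometer readings in g.
    Returns True if a free-fall window is followed by an impact spike."""
    saw_free_fall = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag < FREE_FALL_G:
            saw_free_fall = True
        elif saw_free_fall and mag > IMPACT_G:
            return True
    return False

# Normal standing (~1 g), brief free fall, then a hard impact -> fall.
print(detect_fall([(0, 0, 1.0), (0, 0, 0.1), (0.5, 2.0, 2.2)]))  # True
# Quiet wear at ~1 g throughout -> no fall.
print(detect_fall([(0, 0, 1.0), (0, 0, 1.02), (0, 0.1, 0.98)]))  # False
```

A real detector would also use gyroscope data, post-impact stillness, and heart-rate context before alerting anyone.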


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

 

Stretchable robot “skin” can display health data

Robert Shepherd and Cornell colleagues have developed an electroluminescent “skin” that stretches to more than six times its original size while emitting light. It could be used for soft robots that move more naturally and dynamically display information, including health data. The Cornell press release invites us to imagine “a health care robot that displays a patient’s temperature and pulse, and even reacts to a patient’s mood.”

The material uses a “hyper-elastic light-emitting capacitor” made of layers of transparent hydrogel electrodes surrounding an insulating elastomer sheet. The elastomer’s luminance and capacitance change when it is stretched or rolled, so the skin allows soft robots to sense their actuated state and environment and communicate optically. In a demonstration, small robots crawled while displaying light.
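A back-of-the-envelope calculation shows why stretch is readable as a capacitance change. This is textbook parallel-plate physics under an assumed constant-volume stretch, not the Cornell group’s model:

```python
# For an ideal parallel-plate capacitor, C = eps0 * eps_r * A / d.
# If an incompressible elastomer is stretched so its area grows by a
# factor s, its thickness shrinks by 1/s (constant volume), so the
# capacitance scales as s**2 -- a strong, easily sensed signal.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(eps_r, area_m2, thickness_m):
    return EPS0 * eps_r * area_m2 / thickness_m

c0 = capacitance(3.0, 1e-4, 1e-4)               # relaxed sheet
c_stretched = capacitance(3.0, 6e-4, 1e-4 / 6)  # area x6, thickness /6
print(c_stretched / c0)  # ~36: capacitance scales with stretch squared
```

The quadratic scaling is why a six-fold stretch is unambiguous to the sensing electronics.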

Click to view the Cornell University video.


 


Bionic finger, implanted electrodes, enable amputee to “feel” texture

Swiss Federal Institute of Technology (EPFL) and Scuola Superiore Sant’Anna researchers have developed a bionic fingertip that allows amputees to feel textures and differentiate between rough and smooth surfaces.

Electrodes were surgically implanted into the upper arm of a man whose arm had been amputated below the elbow. A machine moved an artificial finger, wired with electrodes, across smooth and rough lines on a plastic strip. The fingertip’s movement generated an electrical signal, which was translated into a series of electrical spikes sent to the brain. The spikes mimicked the language of the nervous system and created the sensation of feeling.

The subject, Dennis Aabo Sørensen, said: “When the scientists stimulate my nerves I could feel the vibration and sense of touch in my phantom index finger. [It] is quite close to when you feel it with your normal finger; you can feel the coarseness of the plates and different gaps and ribs.”
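The signal-to-spike translation can be illustrated with a simple rate-coding sketch: a stronger texture signal produces more spikes per unit time. The encoder, threshold, and signals below are invented for illustration and are not the study’s stimulation parameters.

```python
def encode_spikes(signal, threshold=1.0):
    """Integrate-and-fire style encoder: accumulate the signal and emit
    a spike (1) each time the accumulator crosses the threshold."""
    acc = 0.0
    spikes = []
    for x in signal:
        acc += x
        if acc >= threshold:
            spikes.append(1)
            acc -= threshold
        else:
            spikes.append(0)
    return spikes

rough = [0.9, 0.1, 0.9, 0.1, 0.9, 0.1]   # high-amplitude texture signal
smooth = [0.2, 0.2, 0.2, 0.2, 0.2, 0.2]  # low-amplitude texture signal
print(sum(encode_spikes(rough)))   # more spikes for the rough surface
print(sum(encode_spikes(smooth)))  # fewer spikes for the smooth one
```

The brain then reads spike rate as texture intensity, which is why the mimicked “language of the nervous system” feels natural.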

Click to view EPFL video.


 


Wireless hub syncs health data at home, in clinic

Google’s “Connectivity Bridge” appears to be a wireless hub meant to collect and sync medical data in clinical studies, according to an FCC filing reported by Business Insider. (Google has made no announcement.)

The hub can be installed in facilities or homes, and sensor data can be quickly uploaded to the cloud for analysis. It uses open source software and lets users charge and sync Study Kit devices.



 

 

Voice-controlled wearable supports sight-impaired mobility

Toyota’s Project BLAID is a camera-based assistive device concept meant to help the visually impaired identify bathrooms, escalators, stairs, elevators, doors, signs, and logos.

The wearable, which is in an early stage of development, is worn on the shoulders, wrapped around the neck like an electronic scarf. It will be controlled by voice commands and will relay information via audio and haptic cues. It can also be paired with a phone via Bluetooth. The company plans to add mapping, object identification, and facial recognition technologies at a later stage.



 

AI on a chip for voice, image recognition

Horizon Robotics, led by Yu Kai, Baidu’s former deep learning head, is developing AI chips and software that mimic how the human brain solves abstract tasks, such as voice and image recognition. The company believes this will provide more consistent and reliable services than cloud-based systems.

The goal is to enable fast, intelligent responses to user commands, without an internet connection, to control appliances, cars, and other objects. Health applications are a logical next step, although not yet discussed.



 

BCI-controlled wheelchair

Miguel Nicolelis has developed a brain-computer interface that allows monkeys to steer a robotic wheelchair with their thoughts. The study is meant to demonstrate the potential for humans to do the same.

Signals from hundreds of neurons simultaneously recorded in two brain regions were translated into the real-time operation of a wheelchair.

Nicolelis said: “In some severely disabled people, even blinking is not possible. For them, using a wheelchair or device controlled by noninvasive measures like an EEG may not be sufficient. We show clearly that if you have intracranial implants, you get better control of a wheelchair than with noninvasive devices.”
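The decoding step can be caricatured as a linear map from neural firing rates to velocity commands, fit by least squares. The study’s actual decoders were far richer, and every number below is synthetic:

```python
import numpy as np

# Toy sketch: learn a linear decoder W so that rates @ W approximates
# the wheelchair velocity (vx, vy). All data here is made up.
rng = np.random.default_rng(0)
n_samples, n_neurons = 200, 50

true_W = rng.normal(size=(n_neurons, 2))  # hidden rates->velocity map
rates = rng.poisson(5.0, size=(n_samples, n_neurons)).astype(float)
velocity = rates @ true_W                 # target (vx, vy) per sample

# Fit the decoder by least squares, then decode.
W_hat, *_ = np.linalg.lstsq(rates, velocity, rcond=None)
decoded = rates @ W_hat
print(np.allclose(decoded, velocity))  # True: noiseless toy data fits exactly
```

Real intracranial decoding must cope with noisy, drifting signals, which is one reason hundreds of simultaneously recorded neurons help.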

ApplySci looks forward to the day when non-invasive methods will allow similar brain-driven functioning for the disabled.



 

Machine learning for faster stroke diagnosis

MedyMatch uses big data and artificial intelligence to improve stroke diagnosis, with the goal of faster treatment.

Patient CT scans are immediately compared with hundreds of thousands of other patients’ results. Almost any deviation from a normal CT is quickly detected.

With current methods, medical imaging errors can occur when emergency room radiologists miss subtle aspects of brain scans, leading to delayed treatment. Fast detection of stroke can prevent paralysis and death.

The company claims that it can detect irregularities more accurately than a human can. Findings are presented as 3D brain images, enabling a doctor to make better informed decisions. The cloud-based system allows scans to be uploaded from any location.
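One simple way such deviation detection could work is a per-voxel z-score against a reference distribution built from many normal scans. This is a generic anomaly-detection sketch with synthetic data, not MedyMatch’s algorithm:

```python
import numpy as np

# Build a reference distribution from synthetic "normal" scans
# (500 scans, 64 voxels each, intensities ~ N(100, 10)).
rng = np.random.default_rng(1)
normal_scans = rng.normal(100.0, 10.0, size=(500, 64))
mean = normal_scans.mean(axis=0)
std = normal_scans.std(axis=0)

def is_anomalous(scan, z_threshold=5.0):
    """Flag a scan whose voxels deviate too far from the reference."""
    z = np.abs((scan - mean) / std)
    return bool((z > z_threshold).any())

healthy = rng.normal(100.0, 10.0, size=64)
bleed = healthy.copy()
bleed[10] += 100.0  # a localized dense region, e.g. a hemorrhage
print(is_anomalous(healthy), is_anomalous(bleed))
```

A production system would work on registered 3D volumes and learned features rather than raw per-voxel statistics, but the comparison-against-many-normals idea is the same.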



 

Sleep app uses wearable sensors, cloud analytics

The American Sleep Apnea Association, Apple, and IBM have begun a study of the impact of sleep quality on daily activity level, alertness, productivity, health, and medical conditions. iPhone and Apple Watch sensors and the ResearchKit framework collect data from healthy and unhealthy sleepers, which is sent to the Watson Health Cloud.

The SleepHealth app uses the watch’s heart rate monitor to detect sleep, and gathers movement data with its accelerometer and gyroscope. The app includes a “personal sleep concierge” and nap tracker, meant to help users develop better sleeping habits.

Data is stored and analyzed on the Watson Health Cloud, allowing researchers to see common patterns. The long-term goal is to develop effective interventions.
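A heavily simplified version of heart-rate-plus-movement sleep scoring might look like the following. The rule and its thresholds are hypothetical, not SleepHealth’s classifier:

```python
# Actigraphy-style heuristic: score a minute as "sleep" when heart rate
# is low and movement is minimal. Thresholds here are illustrative only.
def classify_minutes(heart_rates, movement_counts,
                     hr_max=60, movement_max=5):
    """Return one 'sleep'/'wake' label per minute."""
    labels = []
    for hr, moves in zip(heart_rates, movement_counts):
        if hr <= hr_max and moves <= movement_max:
            labels.append("sleep")
        else:
            labels.append("wake")
    return labels

hr = [72, 65, 58, 55, 54, 70]       # bpm, one value per minute
moves = [30, 12, 3, 1, 0, 25]       # accelerometer event counts per minute
print(classify_minutes(hr, moves))
# ['wake', 'wake', 'sleep', 'sleep', 'sleep', 'wake']
```

Aggregating such per-minute labels across many users is the kind of pattern the cloud-side analysis could then mine.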



 

Self-adjusting lenses adapt to user needs

DeepOptics is developing vision-enhancing wearable lenses with sensors that gauge viewing distance and precisely adjust the lenses to bring an object into focus.

Voltage is applied to three-layered liquid crystal lenses, changing the refractive index to provide the specific optical compensation needed to correct vision in each situation.
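The “sense distance, then adjust” loop reduces to simple vergence optics: an object at distance d requires roughly 1/d diopters of focusing power, so a sensed distance maps directly to a target power for the liquid crystal layer to supply. A toy sketch (not DeepOptics’ control law):

```python
# Vergence of an object at a finite distance, in diopters (1/m).
def required_power_diopters(distance_m):
    return 1.0 / distance_m

# Reading distance, a desktop screen, and across the room.
for d in (0.25, 0.4, 2.0):
    print(f"{d} m -> {required_power_diopters(d):.2f} D")
```

Running this prints 4.00 D at reading distance, 2.50 D for a screen, and 0.50 D across the room, illustrating why near work demands the largest lens adjustment.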

The company also believes that its technology can offer VR/AR devices the ability to deliver better experiences.

Click to view the DeepOptics video:

