Smartphone sensor detects cancer in breath

Professor Hossam Haick of the Technion – Israel Institute of Technology has developed a sensor-equipped smartphone that screens a user's breath for early cancer detection.

SNIFFPHONE uses micro- and nano-sensors to read exhaled breath. The information is transferred through the phone to a signal-processing system for analysis. According to Haick, the NaNose system can detect benign and malignant tumors more quickly, efficiently, and cheaply than previously possible, replacing clinical follow-up that would lead to the same intervention. He claims that NaNose has a 90 percent accuracy rate.

This is one of several biomedical sensor breakthroughs that Professor Haick is working on. In July 2013, ApplySci described his flexible sensor that could be integrated into electronic skin, enabling those with prosthetic limbs to feel changes in their environments. This is similar to Roozbeh Ghaffari's work at MC10, which we described last month and which will be included in our June 30th conference, Wearable Tech + Digital Health NYC 2015.

Eye tracking measures brain injury severity

NYU's Uzma Samadani has developed an eye-tracking device that measures the severity of concussion or brain injury. This simple, inexpensive technology could improve the speed and accuracy of TBI diagnosis.

Researchers compared 64 healthy control subjects to 75 trauma patients at Bellevue Hospital. Pupil movement was tracked for 200 seconds while patients watched a music video.

The study showed that 13 patients who had hit their heads and whose CT scans showed new brain damage, and 39 patients who had hit their heads but had normal CT scans, had significantly less ability to coordinate their eye movements than uninjured controls. 23 subjects who had bodily injuries but did not require head CT scans coordinated their eye movements as well as uninjured controls.
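The finding above hinges on a measure of how well the two eyes move together. As a rough illustration (not the study's actual formula), one could score coordination as the variance of the left-right gaze difference while both eyes follow the same moving target; impaired coordination inflates that variance:

```python
import numpy as np

def disconjugacy(left, right):
    """Variability of the left-right gaze difference: near zero when the
    eyes move together, larger when coordination is impaired.
    Illustrative metric only, not Samadani's published measure."""
    return float(np.var(np.asarray(left) - np.asarray(right)))

rng = np.random.default_rng(0)
t = np.linspace(0, 200, 2000)          # 200 seconds of tracking, as in the study
stimulus = np.sin(2 * np.pi * t / 10)  # smoothly moving on-screen target

control_left  = stimulus + rng.normal(0, 0.01, t.size)
control_right = stimulus + rng.normal(0, 0.01, t.size)
injured_left  = stimulus + rng.normal(0, 0.01, t.size)
injured_right = stimulus + rng.normal(0, 0.15, t.size)  # one eye tracks poorly

print(disconjugacy(control_left, control_right))  # small
print(disconjugacy(injured_left, injured_right))  # larger
```

Both traces follow the video equally well in absolute terms; only the *difference* between the eyes separates the groups, which is what makes the test cheap to run with a single camera.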

Wearable Tech + Digital Health NYC 2015 – June 30 @ New York Academy of Sciences

Sensors, software to understand MS progression

Biogen Idec and Google X will use sensors and software to collect and analyze data from MS patients. The goal is to understand the environmental and biological factors that contribute to the disease's progression, and why it progresses differently in each patient. Andrew Conrad, head of Life Sciences at Google X, believes this will lead to earlier interventions and better outcomes.

Ear sensor monitors driver alertness

Fujitsu's FEELythm is a wearable sensor that tracks a driver's pulse to detect drowsiness. An algorithm monitors vital signs via a sensor attached to the earlobe, gauges drowsiness, and notifies the driver. When used commercially, it also notifies the driver's fleet manager, and it can connect to onboard devices and link to fleet-management systems for real-time monitoring.

The company claims it can predict commercial driving dangers before they occur by creating a hazard map for fleet managers, based on sensor data indicating fatigue, stress, and tension.
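Fujitsu has not published FEELythm's algorithm, but the general idea of pulse-based drowsiness detection can be sketched with a hypothetical heuristic: watch a sliding window of beat-to-beat (R-R) intervals and raise an alert when the pulse slows well beyond baseline. All thresholds below are illustrative assumptions:

```python
from collections import deque

def drowsiness_monitor(rr_intervals_ms, baseline_ms=800, window=10, threshold=1.15):
    """Return indices where the mean R-R interval over the last `window`
    beats exceeds baseline by `threshold` -- a crude proxy for a slowing,
    drowsy pulse. Hypothetical logic; FEELythm's algorithm is proprietary."""
    alerts = []
    buf = deque(maxlen=window)
    for i, rr in enumerate(rr_intervals_ms):
        buf.append(rr)
        if len(buf) == window and sum(buf) / window > baseline_ms * threshold:
            alerts.append(i)
    return alerts

awake  = [790, 810, 800, 805, 795] * 4           # steady ~75 bpm pulse
drowsy = awake + [950, 960, 970, 980, 990] * 4   # pulse slowing over time
print(drowsiness_monitor(awake))   # no alerts
print(drowsiness_monitor(drowsy))  # alerts once the window turns slow
```

In a fleet deployment as described above, each alert index would trigger both the in-cab notification and a message to the fleet-management system.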

Noninvasive sensor tattoo detects glucose levels

UC San Diego professor Joseph Wang has developed an ultra-thin, flexible device that sticks to the skin like a tattoo and can detect glucose levels. The sensor has the potential to eliminate finger-pricking for people with diabetes.

The wearable, non-irritating sensor tattoo detects glucose in the fluid just under the skin. It is based on integrating glucose extraction with electrochemical biosensing. Testing on seven volunteers showed that it accurately determined glucose levels; the sensor's response correlated with that of a commercial glucose monitor.
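The validation step described above amounts to checking how strongly paired readings from the two devices correlate. A minimal sketch, using made-up illustrative values rather than the study's data:

```python
import numpy as np

# Hypothetical paired readings (mg/dL): tattoo sensor output converted to
# glucose vs. a commercial fingerstick meter. Values are illustrative only.
tattoo = np.array([ 92, 101, 118, 135, 150, 164, 179])
meter  = np.array([ 95,  99, 120, 131, 148, 168, 175])

# Pearson correlation coefficient between the two devices' readings
r = np.corrcoef(tattoo, meter)[0, 1]
print(round(r, 3))  # close to 1.0 when the devices agree
```

A correlation near 1.0 supports agreement in trend, though clinical glucose monitors are typically also judged on absolute error, not correlation alone.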

Noninvasive monitoring will be one of the disruptive innovations discussed at Wearable Tech + Digital Health NYC 2015: The Health Sensor Revolution, on June 30, 2015 at the New York Academy of Sciences.

Physiological and mathematical models simulate body systems

Another CES standout was LifeQ, a company that combines physiological and bio-mathematical modeling to provide health decision data.

LifeQ Lens is a multi-wavelength optical sensor that can be integrated into wearable devices. Using a proprietary algorithm, it monitors key metrics with what the company claims is laboratory-level accuracy. Raw data is translated through bio-mathematical models, called LifeQ Core, which are turned into digital, virtual simulations of body systems. LifeQ Link is an open-access platform through which partners can use the technology.

The system can be used by athletes; by individuals monitoring nutrition, stress, and sleep; and by doctors seeking data to inform diagnoses and manage chronic conditions.

The company foresees its data providing population-level health analysis for research purposes. It hopes to monitor clinical trials to help create safer medicines and more effective treatments.

Wearables, sensables, and opportunities at CES

It was the year of Digital Health and Wearable Tech at CES. Endless watches tracked vital signs (and many athletes exercised tirelessly to prove the point). New were several ear-based fitness monitors (Bragi) and some interesting TENS pain-relief wearables (Quell). Many companies provided monitoring for senior citizens; the most interesting notified caregivers only when there was a change in learned behavior (GreenPeak). Senior companion robots were missing, although robots capable of household tasks were present (Oshbot). 3D printing was big (printed pizza), but where were 3D-printed bones and organs? Augmented reality was popular (APX, Augmenta), but mostly for gaming or industrial use. AR for health is next.

Two companies continue to stand out in Digital Health. Samsung's Simband is best positioned to take wearables into medical monitoring, with its multitude of sensors, open platform, and truly advanced health technologies. And MC10's electronics that bend, stretch, and flex will disrupt home diagnosis, remote monitoring, and smart medical devices.

We see two immediate opportunities: the brain and the pulse.

1. A few companies at CES claimed to monitor brain activity, and one savvy brand (Muse) provided earphones with soothing sounds while a headband monitored attention. While these gadgets were fun to try, no one at CES presented extensive brain-state interpretation to address cognitive and emotional issues.

2. Every athlete at CES used a traditional finger-based pulse sensor. A slick wearable that forgoes the finger piece would make pulse oximetry during sports fun instead of awkward. As with every gadget, ensuring accuracy is key, as blind faith in wearables can be dangerous.

ApplySci looks forward to CES 2016, and the many breakthroughs to be discovered along the way, many of which will be featured at Wearable Tech + Digital Health NYC 2015.

Implant stimulates vagus nerve, relieves arthritis pain

Academic Medical Center scientists have implanted tiny pacemaker-like devices in the necks of 20 patients with severe rheumatoid arthritis, reducing joint pain without drugs. The trial was led by Professor Paul-Peter Tak.

The implant stimulates the vagus nerve, which connects the brain to major organs and is responsible for automatic body functions. Spleen activity was reduced after impulses were fired for three minutes a day. In less than a week, participants' spleens produced fewer of the chemicals and immune cells that cause abnormal joint inflammation in rheumatoid arthritis.

GlaxoSmithKline, a partner in the study, hopes that the same technique could, in the future, reverse other chronic conditions, including asthma, obesity and diabetes.

Samsung moves from fitness tracking to health monitoring

Samsung’s Simband represents the company’s shift from fitness tracking to health monitoring, allowing medical startups and researchers to develop sensor applications.

Simband is equipped with multiple sensors, including an electrocardiogram, a photoplethysmogram, a galvanic skin response sensor, an accelerometer, and a thermometer. Developers can also add proprietary sensors.

The wearable's three main functions are called "trends," "monitor," and "spot check." Trends displays one's data over time; monitor is a real-time, all-sensor tracking mode; and spot check quickly checks heart rate and blood pressure.

Reversing time improves cancer tissue imaging

Washington University professor Lihong Wang has developed a time-reversal technology that allows researchers to better focus light in tissue. The photoacoustic imaging technique combines light with acoustic waves to form a sharper image several centimeters into the skin. Current high-resolution optical imaging allows researchers to see only 1 millimeter deep.

Time-reversed adapted-perturbation (TRAP) optical focusing sends guiding light into tissue to seek movement. Light that has traversed stationary tissue appears different from light that has passed through something moving, such as blood. By taking two successive images and subtracting the light scattered by stationary tissue, researchers retain only the light scattered by motion. That light is then sent back to its source via a time-reversal process, which improves its focus.
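The subtraction step above can be shown with a toy example: two successive "frames" share an identical static background, so differencing them cancels everything except the moving absorber. This is only an illustration of the frame-differencing idea, not Wang's full optical method:

```python
import numpy as np

# Two successive images: static tissue background plus a moving
# absorber (e.g. a red blood cell) that shifts between frames.
rng = np.random.default_rng(1)
background = rng.random((8, 8))   # stationary tissue speckle

frame1 = background.copy()
frame1[2, 2] += 1.0               # absorber position at time t1
frame2 = background.copy()
frame2[2, 3] += 1.0               # absorber has moved by time t2

diff = frame2 - frame1            # static tissue cancels exactly
print(np.count_nonzero(diff))     # -> 2 pixels: old and new positions
```

Only the motion signal survives the subtraction; in TRAP, that surviving wavefront is what gets phase-conjugated and sent back into the tissue to concentrate light at the moving target.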