Category Archives: Cancer

Machine learning analysis of doctor notes predicts cancer progression


Gunnar Rätsch and Memorial Sloan Kettering colleagues are using AI to find similarities between cancer cases. Rätsch's algorithm has analyzed 100 million sentences taken from the clinical notes of about 200,000 cancer patients to predict disease progression.

In a recent study, machine learning was used to classify patient symptoms, medical histories and doctors' observations into 10,000 clusters. Each cluster represented a common observation in medical records, including recommended treatments and typical symptoms. Connections between clusters were mapped to show interrelationships. In another study, algorithms were used to find hidden associations between written notes and patients' gene-sequencing and blood data.
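The clustering step can be sketched in miniature. The snippet below is an illustrative toy, not the study's actual pipeline: it groups invented note sentences by word overlap (Jaccard similarity), standing in for the far richer models used on real records.

```python
# Toy illustration (not the study's actual pipeline): group invented
# clinical-note sentences into clusters of similar observations using
# word-overlap (Jaccard) similarity.
def tokens(sentence):
    return set(sentence.lower().split())

def jaccard(a, b):
    return len(a & b) / len(a | b)

def cluster(sentences, threshold=0.4):
    clusters = []  # each cluster is a list of sentence indices
    for i, s in enumerate(sentences):
        for c in clusters:
            # join the first cluster whose representative sentence is similar enough
            if jaccard(tokens(sentences[c[0]]), tokens(s)) >= threshold:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters

notes = [
    "patient reports persistent cough",
    "persistent cough reported by patient",
    "recommend adjuvant chemotherapy",
    "recommend adjuvant chemotherapy after surgery",
]
print(cluster(notes))  # → [[0, 1], [2, 3]]
```

At the scale described in the study, simple overlap would be replaced by learned sentence representations, but the grouping idea is the same.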


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences


Sensor + algorithm detect prostate cancer in urine


Chris Probert and University of Liverpool and UWE Bristol colleagues are creating a test that uses gas chromatography to "smell" prostate cancer in urine. If proven accurate, the test could replace current invasive diagnostic procedures and detect the disease at an earlier stage.

155 men were tested: 58 had been diagnosed with prostate cancer, 24 with bladder cancer, and 73 had hematuria or poor stream without cancer. The sensor successfully identified patterns of volatile compounds that allow urine from patients with urological cancers to be classified.

Urine samples are inserted into the "Odoreader," where a 30-meter column lets the urine compounds travel through it at different rates. The algorithm detects cancer by reading the resulting patterns.
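A pattern-reading step of this kind might be sketched as a nearest-neighbour comparison of chromatogram profiles. The profiles, labels, and distance measure below are invented for illustration; they are not Odoreader data or its actual algorithm.

```python
# Hypothetical sketch: classify a chromatogram "pattern" (signal intensity
# per retention-time bin) by nearest-neighbour match against labelled
# reference profiles. All numbers and labels are invented.
import math

def distance(profile_a, profile_b):
    # Euclidean distance between two intensity profiles
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(profile_a, profile_b)))

def classify(sample, references):
    # return the label whose reference profile is closest to the sample
    return min(references, key=lambda label: distance(sample, references[label]))

references = {
    "prostate cancer": [0.1, 0.9, 0.3, 0.7],
    "bladder cancer":  [0.8, 0.2, 0.6, 0.1],
    "no cancer":       [0.2, 0.2, 0.2, 0.2],
}
sample = [0.15, 0.85, 0.35, 0.65]
print(classify(sample, references))  # → prostate cancer
```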



Handheld microscope identifies cancer cells during surgery


University of Washington, Memorial Sloan Kettering Cancer Center, Stanford University and Barrow Neurological Institute researchers are developing a handheld, miniature microscope to let surgeons "see" at the cellular level in the operating room. This could enable more precise brain tumor removal, as surgeons try not to leave cancerous material behind while protecting healthy brain matter.

According to lead author Jonathan Liu, "Surgeons don't have a very good way of knowing when they're done cutting out a tumor. They're using their sense of sight, their sense of touch, pre-operative images of the brain — and oftentimes it's pretty subjective. Being able to zoom and see at the cellular level during the surgery would really help them to accurately differentiate between tumor and normal tissues and improve patient outcomes."


The microscope delivers high-quality images at faster speeds than existing devices. In addition to tumor surgery, the researchers will begin testing it as a cancer-screening tool in dental and dermatological clinics, in an effort to avoid invasive biopsies and lab waiting times.



Nanofiber sensor glove could detect breast cancer


Tokyo University's Takao Someya and Harvard's Zhigang Suo are developing thin, bendable, pressure-sensitive nanofiber sensors that could be incorporated into gloves to detect breast tumors.

The 1.9-inch-square sheet has 144 pressure-measuring locations and can detect pressure even when twisted. Many researchers are developing flexible pressure sensors, but existing designs are vulnerable to bending and twisting.
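Digitized palpation data of this sort could, in principle, be screened with very simple logic. The sketch below reads a small pressure grid (4×4 rather than the sheet's 144 points) and flags locations markedly stiffer than the mean; the readings and threshold are invented for illustration.

```python
# Illustrative sketch of screening digitized palpation data: flag grid
# locations whose pressure reading is markedly above the sheet-wide mean.
# Grid size, readings, and threshold factor are invented.
def flag_stiff_points(grid, factor=1.5):
    flat = [p for row in grid for p in row]
    mean = sum(flat) / len(flat)
    return [(r, c)
            for r, row in enumerate(grid)
            for c, p in enumerate(row)
            if p > factor * mean]

readings = [
    [1.0, 1.1, 0.9, 1.0],
    [1.0, 3.2, 1.1, 1.0],   # one markedly stiffer location
    [0.9, 1.0, 1.0, 1.1],
    [1.0, 1.0, 0.9, 1.0],
]
print(flag_stiff_points(readings))  # → [(1, 1)]
```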

According to Someya, "Sensitive human fingers of a veteran doctor may be able to find a small tumor, but such perceived sensation cannot be measured. The digitization of the sensations means that they could be shared with other doctors who could theoretically experience the same sensations as the physician who performed the examination."



Phone chip may sense cancer in breath


Image:  The Yomiuri Shimbun

A consortium led by Genki Yoshikawa and National Institute for Materials Science colleagues is in the early stages of developing a phone sensor that they claim is capable of detecting cancer by analyzing breath odor.

A tiny chip is said to determine whether substances linked to cancer are present in a person's breath, and to calculate whether that person is suspected of having the disease. The result is displayed on the phone's screen.

NIMS believes that the sensor will eventually be able to distinguish cancer types. It may also be able to detect breath odors linked to diabetes, kidney and liver disease, and asthma. The researchers are working on multiple applications for both clinical and self-quantifying scenarios, and hope for the sensor to be available by 2022.



Respiratory motion system for improved lung tumor imaging


U of T professor Shouyi Wang has developed a personalized respiratory motion system, based on a mathematical model, for more precise lung tumor imaging.

In respiratory gating, a patient's breath-by-breath motion is monitored, and the data are used to focus a radiation beam on the target when the chest cavity is relaxed. This is the stage that provides the best picture of a cancerous site.
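The gating idea can be illustrated with a toy breathing waveform: the beam is enabled only while chest displacement sits in a low, "relaxed" window near end-exhale. The waveform, threshold, and resulting duty cycle below are illustrative and not taken from Wang's model.

```python
# Toy illustration of respiratory gating (not Wang's model): enable the
# beam only near end-exhale, when chest displacement is at its lowest.
import math

def breathing_signal(t, period=4.0):
    # invented breathing waveform: chest displacement (arbitrary units)
    return math.sin(2 * math.pi * t / period)

def beam_enabled(displacement, threshold=-0.8):
    # gate the beam on only when the chest is near its most relaxed position
    return displacement <= threshold

samples = [breathing_signal(t / 10) for t in range(40)]  # one 4-second cycle
duty = sum(beam_enabled(s) for s in samples) / len(samples)
print(duty)  # fraction of the cycle with the beam on → 0.225
```

A real system would also have to predict motion a fraction of a second ahead, which is where a personalized mathematical model comes in.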

Current techniques depend on expensive, uncomfortable scanning equipment pressed against a patient's chest, and often produce only moderately accurate images.



Mobile 3D imaging for early breast cancer detection


Vayyar is a 3D imaging company that aims to turn every mobile device into an advanced imaging system. The company, which focuses on breast cancer detection among other applications, claims that it can detect anomalies better, and at an earlier stage, than traditional methods. The technology will be demonstrated at CES next month. ApplySci looks forward to more detail about its sensors, as well as clinical study results.



Machine learning based cancer diagnostics


Search engine giant Yandex is using its advanced machine learning capabilities to detect cancer predisposition.

Yandex Data Factory has partnered with AstraZeneca to develop the RAY platform, which analyzes DNA testing results, generates a report about patient genome mutations, and provides treatment recommendations and side effect information. Testing will begin next month.

The two companies have signed a cooperation agreement to launch big data projects in epidemiology, pathophysiology, diagnostics and the treatment of diseases (focused on contagious diseases, cancer, endocrinology, cardiology, pulmonology, and psychiatry).


Ultrasound penetrates blood-brain barrier to treat brain tumor


Todd Mainprize at Sunnybrook Hospital has, for the first time, delivered chemotherapy directly to a brain tumor by breaking through the blood-brain barrier with tightly focused ultrasound.

The patient's bloodstream was infused with a chemotherapy drug, as well as microscopic bubbles, which are smaller than red blood cells and pass freely through blood vessels. MRI-guided, low-intensity sound waves targeted blood vessels in the blood-brain barrier near the tumor site. The ultrasound waves vibrated the microbubbles, loosening the tight cell junctions of the blood-brain barrier. The loosened junctions allowed the chemotherapy drug to flow past the barrier and deposit within the targeted tumor site.

This breakthrough could also lead to new treatments for brain diseases such as Parkinson’s and Alzheimer’s.



Remotely controlled capsule endoscope captures lower GI images


A new type of capsule endoscope may improve cancer diagnostics, providing comprehensive, non-invasive imaging, including lower GI images.

The 3D-printed Tadpole Endoscope (TE) has a soft tail that allows it to be remotely guided around the stomach. The technology was developed by Yong Zhong, Ruxu Du and Phillip W. Y. Chiu of the Chinese University of Hong Kong.

The TE is activated immediately after being swallowed. Once it reaches the stomach, a doctor can guide the device to gather images; by adjusting the patient's posture, the whole stomach can be viewed. The TE then moves into the lower GI tract via natural peristalsis, and the patient is sent home wearing a sensor patch that records the lower GI images, which are transmitted to the doctor.

Currently, esophagus and stomach cancer can be diagnosed using gastroscopy; intestinal cancer can be diagnosed using capsule endoscopy; and colorectal cancer can be diagnosed using colonoscopy.
