Category Archives: Eyes

Voice-controlled wearable supports sight-impaired mobility


Toyota’s Project BLAID is a camera-based assistive device concept meant to help the visually impaired identify bathrooms, escalators, stairs, elevators, doors, signs, and logos.

The wearable, which is in an early stage of development, is worn on the shoulders, wrapped around the neck like an electronic scarf. It will be controlled by voice commands and will relay information via audio and haptic cues; it can also be paired with a phone via Bluetooth. The company plans to add mapping, object identification, and facial recognition technologies at a later stage.
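Toyota has not published BLAID’s software, so the following is only a sketch of the interaction the post describes: a spoken request in, audio and haptic cues out. The landmark list comes from the post; every class name, function, and cue string is hypothetical.

```python
# Hypothetical sketch only; Toyota has not released Project BLAID's software.
# It models the loop the post describes: a voice command in, audio + haptic cues out.

from dataclasses import dataclass
from typing import Callable

# Indoor features the post says the device should recognize.
LANDMARKS = {"bathroom", "escalator", "stairs", "elevator", "door", "sign", "logo"}

@dataclass
class Detection:
    label: str           # e.g. "elevator"
    bearing_deg: float   # angle relative to the wearer; negative means left

def handle_voice_command(command: str, detections: list[Detection],
                         speak: Callable[[str], None],
                         vibrate: Callable[[str], None]) -> None:
    """Turn a spoken request like 'find elevator' into audio and haptic cues."""
    target = command.removeprefix("find ").strip().lower()
    if target not in LANDMARKS:
        speak(f"Sorry, I can't look for {target}.")
        return
    matches = [d for d in detections if d.label == target]
    if not matches:
        speak(f"No {target} in view.")
        return
    nearest = min(matches, key=lambda d: abs(d.bearing_deg))
    side = "left" if nearest.bearing_deg < 0 else "right"
    speak(f"{target} ahead, to your {side}.")
    vibrate(side)  # haptic cue on the matching side of the collar

# Example: pretend the camera pipeline has reported one elevator to the right.
handle_voice_command("find elevator", [Detection("elevator", 20.0)],
                     speak=print, vibrate=lambda side: print(f"[buzz {side}]"))
```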


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

 

Self-adjusting lenses adapt to user needs


DeepOptics is developing vision-enhancing wearable lenses with sensors that gauge viewing distance and precisely adjust the lenses to bring an object into focus.

A voltage is applied across three-layered liquid crystal lenses, changing their refractive index to provide the specific optical compensation needed to correct vision in every situation.
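DeepOptics has not detailed its control algorithm, but the optics behind “gauge the distance, then compensate” reduce to simple arithmetic: focusing demand in diopters is the reciprocal of the viewing distance in meters, and the tunable lens supplies whatever power the eye cannot. A minimal sketch, with the residual-accommodation and lens-range figures assumed:

```python
# Back-of-the-envelope optics for a distance-adaptive lens; this is not
# DeepOptics' actual algorithm, which is unpublished.

def required_add_power(distance_m: float,
                       eye_accommodation_d: float = 0.5,
                       max_lens_power_d: float = 3.0) -> float:
    """Return the lens power (diopters) needed to focus at `distance_m`.

    eye_accommodation_d: residual focusing power the wearer still has (assumed).
    max_lens_power_d: tunable range of the liquid crystal lens (assumed).
    """
    demand_d = 1.0 / distance_m                       # total focusing demand
    add_d = max(0.0, demand_d - eye_accommodation_d)  # portion the lens must supply
    return min(add_d, max_lens_power_d)               # clamp to the lens's range

# A presbyopic wearer reading a phone at 40 cm vs. looking across a room:
print(required_add_power(0.4))   # 2.0 D of help needed
print(required_add_power(3.0))   # 0.0 D; distance vision needs no extra power
```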

The company also believes its technology can help VR/AR devices deliver better experiences.




First human optogenetics vision trial


Retina Foundation of the Southwest scientists, in a study sponsored by RetroSense Therapeutics, will for the first time use optogenetics, a combination of gene therapy and light used to control nerve cells, in an attempt to restore human sight. Previously, optogenetic therapies had been tested only in mice and monkeys.

Viruses carrying DNA from light-sensitive algae will be injected into the eye’s ganglion cells, which transmit signals from the retina to the brain, in an attempt to make those cells directly responsive to light. Fifteen legally blind patients will participate in the study, which was first reported by MIT Technology Review.



Smart contact lens could detect glaucoma progression


Columbia University’s C. Gustavo De Moraes has developed a contact lens sensor that can detect glaucoma progression by continuously monitoring intraocular pressure. Doctors check eye pressure during office visits, but those measurements are neither continuous nor taken at night, when eye pressure typically rises.

As eye pressure fluctuates, the curvature of the lens changes. The sensor transmits a signal to a wireless device, which records it and displays pressure changes over time.

In a recent study, 40 open-angle glaucoma patients wore the smart lens for 24 hours, while awake and asleep. Patients with steeper overnight spikes and a greater number of peaks in their signal profiles usually had faster glaucoma progression.
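As a rough illustration of the kind of summary statistic the study leans on, counting peaks in a 24-hour recording, here is a minimal sketch. The data and threshold are invented, and the real lens reports a curvature-driven electrical signal rather than pressure in mmHg.

```python
# Toy peak counter for a 24-hour pressure-related recording (illustrative only).

def count_peaks(signal: list[float], min_rise: float) -> int:
    """Count local maxima that rise at least `min_rise` above both neighbors."""
    peaks = 0
    for i in range(1, len(signal) - 1):
        if (signal[i] - signal[i - 1] >= min_rise and
                signal[i] - signal[i + 1] >= min_rise):
            peaks += 1
    return peaks

# Hypothetical samples: flat daytime readings with two overnight spikes.
recording = [10, 10, 11, 10, 10, 14, 10, 10, 10, 15, 11, 10]
print(count_peaks(recording, min_rise=2))  # -> 2
```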



Coating enhances smart contact lens capabilities


Google and others are developing smart contact lenses meant to be the next wave of wearables. To broaden and enhance their capabilities, Drew Evans of the University of South Australia has created a biocompatible, conducting, nanoscale polymer lens coating. Potential applications include visual assistance through electronic displays and noninvasive glucose measurement through sensors.
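Glucose sensing is only listed as a potential application, but any such lens would need a calibration step that maps raw sensor output to an estimated concentration. A minimal least-squares sketch with entirely made-up numbers (the University of South Australia work covers the conducting coating, not a finished sensor):

```python
# Illustrative calibration: fit glucose ~ slope * sensor_current + intercept.

def fit_linear(currents_nA: list[float], glucose_mgdl: list[float]) -> tuple[float, float]:
    """Least-squares line through (current, reference glucose) calibration points."""
    n = len(currents_nA)
    mean_x = sum(currents_nA) / n
    mean_y = sum(glucose_mgdl) / n
    sxx = sum((x - mean_x) ** 2 for x in currents_nA)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(currents_nA, glucose_mgdl))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Invented calibration points: sensor current in nA vs. reference glucose in mg/dL.
slope, intercept = fit_linear([5, 10, 15, 20], [60, 100, 140, 180])
print(round(slope * 12 + intercept))  # estimate for a 12 nA reading -> 116 mg/dL
```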



“Augmented attention” wearable assists the visually impaired


OrCam, an artificial vision company led by Hebrew University professor Amnon Shashua, creates assistive devices for the visually impaired.

MyMe, its latest product, uses artificial intelligence to respond to audio and visual information in real time. A clip-on camera and Bluetooth earpiece create what the company calls an “augmented attention” experience, meant to enrich interactions.

The device is designed to be aware of daily activities, including the people we meet, conversation topics, visual surroundings, the food we eat, and the activities we participate in. Its visual and audio processing serves as an extension of the wearer’s awareness, and a built-in fitness tracker will also be included.

More details will be available after MyMe is unveiled at CES next week.



Robotic “glove” helps sight-impaired navigate, sense, grab objects


University of Nevada’s Yantao Shen is developing a hand-worn robotic device to help blind and sight-impaired people navigate around obstacles and locate, sense, and grasp objects; examples include picking up a glass or operating a door handle. The technology combines vision, tactile, force, temperature, and audio sensors.

According to Shen: “The visual sensors, very high resolution cameras, will first notify the wearer of the location and shape, and the proximity touch sensors kick in as the hand gets closer to the object. The multiple sensors and touch actuators array will help to dynamically ‘describe’ the shape of the object to the hand when the hand is close to the object, allowing people with vision loss to have more independence and ability to navigate and to safely grasp and manipulate.”
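Shen’s description amounts to a distance-based hand-off between sensing modes: cameras at range, proximity sensors on approach, and the tactile array at contact. A hypothetical sketch of that logic, with thresholds and cue text invented:

```python
# Hypothetical hand-off logic; the actual device's control software is unpublished.

def guidance_cue(distance_cm: float, bearing_deg: float) -> str:
    """Choose which sensing mode drives feedback, based on hand-to-object distance."""
    if distance_cm > 30:
        # Far range: the cameras steer the hand toward the object.
        side = "left" if bearing_deg < 0 else "right"
        return f"vision: move hand {side}, about {distance_cm:.0f} cm to go"
    if distance_cm > 3:
        # Close range: proximity touch sensors refine the approach.
        return f"proximity: object {distance_cm:.0f} cm ahead, slow down"
    # Contact range: the tactile array 'describes' the shape for a safe grasp.
    return "touch: contact made, close fingers gently"

for d, b in [(80, -15), (12, 0), (2, 0)]:
    print(guidance_cue(d, b))
```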


 


LCD glasses treat amblyopia, correct vision


Amblyz are programmable LCD glasses that can treat amblyopia (colloquially known as “lazy eye”).

Instead of a patch, the lens over the child’s stronger eye goes dark for a few seconds out of every 30 during the prescribed treatment period. Because children with amblyopia often have astigmatism or nearsightedness, the glasses simultaneously correct vision (when not occluding it).
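The occlusion schedule itself is simple to express. In the sketch below, the 30-second cycle comes from the description above, while the 3-second dark interval is an assumption standing in for “a few seconds”:

```python
# Timing sketch of the intermittent occlusion; real glasses drive an LCD shutter.

CYCLE_S = 30   # seconds per occlusion cycle (from the description above)
OPAQUE_S = 3   # assumed dark interval within each cycle ("a few seconds")

def lens_should_occlude(elapsed_s: float) -> bool:
    """Return True while the treated lens should be opaque."""
    return (elapsed_s % CYCLE_S) < OPAQUE_S

# Simulate the first minute in 5-second steps instead of driving hardware.
for t in range(0, 60, 5):
    print(f"t={t:2d}s  lens {'opaque' if lens_should_occlude(t) else 'clear'}")
```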

Researchers compared Amblyz with traditional patching in 33 children, ages 3 to 8. For three months, one group wore a patch for two hours per day, while the other wore Amblyz for four hours per day. Both groups showed similar improvement and were able to see an average of two more lines on a reading chart.


AR + Kinect games assist the hearing- and visually impaired


Reflex Arc’s augmented reality games work with Microsoft Kinect to help children learn sign language and assist the visually impaired with exercise. Boris teaches sign language gestures, and The Nepalese Necklace helps those with no or limited sight with mobility training.

The games encourage exercise and are designed to help blind children learn about spatial awareness, balance, coordination, and orientation.


Wearable + navigation service for the visually impaired


Aira.IO combines wearable tech with a remote agent service to guide the visually impaired. Users are connected with agents who interpret the data stream from smart glasses and assist with navigation. A routing algorithm matches users to agents based on user and agent preferences.
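Aira has not published how that routing works; the sketch below shows one simple way user and agent preferences could drive the match, with all fields, weights, and sample data invented:

```python
# Hypothetical preference-based routing; not Aira's actual algorithm.

from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    skills: set[str]                 # task types the agent has handled before
    available: bool = True

@dataclass
class Request:
    task: str                        # what the user needs help with right now
    preferred_agents: list[str] = field(default_factory=list)

def route(request: Request, agents: list[Agent]) -> Agent | None:
    """Pick the available agent with the best preference-plus-skill score."""
    def score(agent: Agent) -> int:
        s = 0
        if agent.name in request.preferred_agents:
            s += 2                   # user has asked for this agent before
        if request.task in agent.skills:
            s += 3                   # agent knows this task type
        return s
    candidates = [a for a in agents if a.available]
    return max(candidates, key=score, default=None)

agents = [Agent("Pat", {"shopping"}), Agent("Lee", {"street navigation"})]
print(route(Request("street navigation", preferred_agents=["Pat"]), agents).name)  # Lee
```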

The company completed a multi-phase beta trial with 100 blind and low-vision participants in simulated and real-life settings. Users wore Google Glass and were assisted by agents via Aira’s dashboard platform during several activities: navigating city streets and intersections; being in a hotel; locating and identifying canned goods in a kitchen; selecting clothing in a simulated department store; shopping at a supermarket; ordering at a coffee shop; and traveling by foot and public transportation.
