The New York Times Bits blog reports that “advocates for the blind say that smartphones and tablets could be the biggest assistive aid to come along since Braille was invented in the 1820s.” Writer Nick Bilton explores some of the many ways, from voice commands to gesture readers, that mobile devices are helping the visually impaired.
Harvard researchers have demonstrated that electrical charges carried by ions, rather than electrons, can be put to meaningful use in fast-moving, high-voltage devices.
These ionic conductors can be stretched to many times their normal area without an increase in resistivity, a problem common in stretchable electronic devices. They can be transparent, making them well suited for optical applications. And the gels used as electrolytes are biocompatible, making it easy to incorporate ionic devices, such as artificial muscles or artificial skin, into biological systems.
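The distinction between resistivity and resistance matters here: even if a material's resistivity stays constant, stretching it still raises its resistance through geometry alone. A minimal sketch of that effect, using illustrative numbers rather than measurements from the Harvard study:

```python
def stretched_resistance(r0_ohm: float, stretch_factor: float) -> float:
    """Resistance of an incompressible conductor under uniaxial stretch.

    R = rho * L / A. At constant volume, length scales by s and
    cross-section by 1/s, so R scales by s**2 even at constant
    resistivity rho. Illustrative model, not data from the study.
    """
    return r0_ohm * stretch_factor ** 2

# A 100-ohm strip stretched to 3x its length:
print(stretched_resistance(100.0, 3))  # 900.0
```

This is why "no increase in resistivity under stretch" is the claim worth making: the geometric penalty is unavoidable, so keeping the material property flat is the best a stretchable conductor can do.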
Signals carried by charged ions are the electricity of the human body, allowing neurons to share information and spurring the heart to beat. Bioengineers aim to mesh artificial organs and limbs with that system. Harvard is trying to commercialize the technology for use in tablets, smartphones, wearable electronics, consumer audio devices, and adaptive optics.
The Sensimed Triggerfish combines a non-invasive wireless soft contact lens sensor with an automated system for recording intraocular pressure (IOP)-related patterns for up to 24 hours. The ambulatory patient wears the device during normal activity, including sleep. At the end of the session, the data is transferred from the recorder to an ophthalmologist’s computer for analysis of the circadian IOP-related pattern.
OrCam, led by Hebrew University Professor Amnon Shashua, one of the most exciting computer vision entrepreneurs in Israel, has developed a device that uses audio feedback to relay visual information to visually impaired people. The tiny wearable computer works with a 5-megapixel camera attached to glasses. A computer vision algorithm enables it to read text, and it can be taught to recognize faces and objects with the help of the user.
University of Bath scientists have been studying the vOICe sensory substitution device, which helps blind people use sounds to build a mental image of their surroundings.
Blindfolded study participants captured an accurate mental image of an object in front of them when a wearable camera and the vOICe device converted its visual image into a cluster of natural sounds delivered to the participant via headphones. The experiment suggested that even without eyesight, humans may still be able to experience a visual sensation.
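A vOICe-style image-to-sound conversion scans the image column by column, mapping a pixel's row to pitch and its brightness to loudness. The sketch below follows that general scheme; the sample rate, frequency range, and scan speed are assumed values for illustration, not the device's actual parameters:

```python
import numpy as np

SAMPLE_RATE = 22_050            # assumed audio sample rate
F_LOW, F_HIGH = 500.0, 5000.0   # assumed pitch range (Hz)
COLUMN_SECONDS = 0.02           # assumed scan time per image column

def image_to_waveform(image: np.ndarray) -> np.ndarray:
    """image: 2D array with values in 0..1, row 0 at the top.
    Returns a mono waveform: left-to-right scan, row height -> pitch,
    brightness -> loudness. Illustrative, not the vOICe algorithm itself."""
    rows, cols = image.shape
    freqs = np.geomspace(F_HIGH, F_LOW, rows)   # top rows sound highest
    n = int(SAMPLE_RATE * COLUMN_SECONDS)
    t = np.arange(n) / SAMPLE_RATE
    chunks = []
    for col in range(cols):                     # left-to-right scan
        amps = image[:, col]                    # brightness -> loudness
        tones = amps[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)
        chunks.append(tones.sum(axis=0) / rows)
    return np.concatenate(chunks)

# A diagonal line renders as a tone sweeping downward in pitch.
wave = image_to_waveform(np.eye(8))
print(wave.shape)  # (3528,)
```

With training, listeners learn to invert this mapping in their heads, which is what lets the study participants reconstruct the shape of an object from its soundscape.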
Professor Jelle De Smet of Ghent University has developed a spherical, curved LCD that can be embedded in a contact lens and display images projected to it wirelessly. This is the first step towards “fully pixelated contact lens displays” with the same detail as a television screen. The technology could lead to a superimposed image projected onto the user’s normal view, similar to Google Glass but without the headgear. The lenses could also be used for medical purposes, including controlling light transmission toward the retina when an iris is damaged, or to display directions or texts from a smartphone.
Vinod Khosla and others have invested in MIT Media Lab’s EyeNetra, a smartphone attachment that claims to diagnose nearsightedness, farsightedness, and astigmatism. The device is positioned as a less bulky alternative to the Shack-Hartmann wavefront sensor. A $2 eyepiece is clipped onto a phone. The user then clicks to align the displayed patterns. The number of clicks required to bring the patterns into alignment indicates the refractive error. Patients connect to corrective lens providers through a cloud-based system. The technology provides access to eye exams in the developing world.
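Diagnosing astigmatism as well as simple near- or farsightedness requires measuring power along several meridians, because a sphero-cylindrical eye follows the standard power profile P(θ) = S + C·sin²(θ − axis). The sketch below shows how per-meridian readings (however they are obtained; the alignment-click procedure is EyeNetra's, but this fitting code is an assumed illustration, not its implementation) can be turned into a prescription by a linear least-squares fit:

```python
import numpy as np

def fit_sphere_cylinder(meridians_deg, powers_d):
    """Fit P(theta) = S + C * sin^2(theta - axis) to per-meridian
    power readings (standard sphero-cylinder model, plus-cylinder form).
    Rewriting via sin^2(x) = (1 - cos(2x))/2 makes the model linear:
    P = m + b*cos(2*theta) + c*sin(2*theta)."""
    t = np.radians(meridians_deg)
    A = np.column_stack([np.ones_like(t), np.cos(2 * t), np.sin(2 * t)])
    m, b, c = np.linalg.lstsq(A, np.asarray(powers_d, float), rcond=None)[0]
    half_cyl = np.hypot(b, c)                       # |C| / 2
    axis = 0.5 * np.degrees(np.arctan2(-c, -b)) % 180
    return m - half_cyl, 2 * half_cyl, axis         # sphere, cylinder, axis

# Synthetic readings from a known eye: S = -2.00 D, C = +1.00 D, axis 30 deg
meridians = [0, 30, 60, 90, 120, 150]
readings = [-2 + np.sin(np.radians(a - 30)) ** 2 for a in meridians]
S, C, axis = fit_sphere_cylinder(meridians, readings)
print(round(S, 2), round(C, 2), round(axis, 1))  # -2.0 1.0 30.0
```

The three fitted numbers (sphere, cylinder, axis) are exactly the quantities a corrective-lens prescription specifies, which is what lets a cloud system pass the result straight to a lens provider.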
USC’s Laurent Itti and researchers from Queen’s University in Ontario have created a data-heavy, low-cost method of identifying brain disorders through eye tracking. Subjects watch a video for 15 minutes while their eye movements are recorded. An enormous amount of data is generated, as the average person makes three to five saccadic eye movements per second. Itti’s team uses advanced machine learning algorithms to enable a computer to recognize patterns without explicit human instruction.
The proof of concept study found that the algorithm could classify mental disorders through eye movement patterns. Parkinson’s patients were identified with nearly 90 percent accuracy. Children with ADHD or fetal alcohol spectrum disorder were identified with 77 percent accuracy. “This is very different from what people have done before. We’re trying to have completely automated interpretation of the eye movement data,” said Itti.
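The raw material for such a classifier is the saccade stream itself. A common first step (shown here as a generic illustration, not Itti's pipeline) is to detect saccades with a velocity threshold and summarize them as features; the 500 Hz sampling rate and 30 deg/s threshold below are assumed values:

```python
import numpy as np

def saccade_features(x_deg, y_deg, hz=500, threshold_deg_per_s=30.0):
    """Detect saccades in a gaze trace by eye-movement speed and return
    simple summary features a classifier could consume. Generic sketch;
    threshold and sampling rate are assumptions."""
    vx = np.gradient(np.asarray(x_deg, float)) * hz   # deg/s
    vy = np.gradient(np.asarray(y_deg, float)) * hz
    speed = np.hypot(vx, vy)
    moving = speed > threshold_deg_per_s
    # Count rising edges of the "moving" mask = number of saccade onsets.
    onsets = np.flatnonzero(moving[1:] & ~moving[:-1]) + 1
    duration_s = len(speed) / hz
    return {
        "saccade_rate_hz": len(onsets) / duration_s,
        "peak_velocity_deg_per_s": float(speed.max()),
    }

# One-second synthetic trace containing a single 5-degree horizontal jump.
t = np.arange(500)
x = np.where(t < 250, 0.0, 5.0)
feats = saccade_features(x, np.zeros_like(x))
print(feats["saccade_rate_hz"])  # 1.0
```

Per-subject feature vectors like these (rate, amplitudes, velocities, fixation statistics, and so on) are what a machine learning model would compare across diagnostic groups to learn the distinguishing patterns automatically.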
Professor Bradley Nelson and researchers at ETH Zurich have created a miniature robot that can be injected into the eye to precisely measure the retina’s oxygen supply. Many diseases, including glaucoma, can interfere with oxygen delivery to the retina. Rapid diagnosis and treatment are essential to preserving vision.
Using a proprietary, patented shape discrimination hyperacuity (SDH) test, myVisionTrack enables patients to regularly assess their visual function. myVisionTrack stores test results, tracks disease progression, and can automatically alert a healthcare provider if it detects significant deterioration of the patient’s visual function. Clinical studies indicate that myVisionTrack’s SDH test has sensitivity and specificity comparable to or higher than those of clinically available standard visual function tests for distinguishing advanced maculopathy from high-risk moderate maculopathy.