DARPA continues to build technology with academic partners to enable amputees to control prosthetic limbs with their minds. Examples follow:
Researchers at the Rehabilitation Institute of Chicago demonstrated a type of peripheral interface called targeted muscle reinnervation (TMR). TMR rewires nerves from the amputated limb into remaining muscles, which then serve as a new interface for controlling the prosthetic.
Researchers at Case Western Reserve University used a flat interface nerve electrode (FINE) to demonstrate direct sensory feedback. By interfacing with residual nerves in the patient’s partial limb, the system restores some sense of touch to the fingers. Other existing prosthetic limb control systems rely solely on visual feedback. Unlike visual feedback, direct sensory feedback allows patients to move a hand without keeping their eyes on it, enabling simple tasks, like searching a bag for small items, that are not possible with today’s prosthetics.
Cornell University researchers have programmed a PR-2 robot to not only carry out everyday tasks, but to anticipate human behavior and adjust its actions.
From a database of 120 3-D videos of people performing common household activities, the robot has been trained to identify human activities by tracking the movements of the body (reduced to a symbolic skeleton for easy calculation), to break those activities down into sub-activities such as reaching, carrying, pouring, or drinking, and to associate the activities with objects.
Observing a new scene with its Microsoft Kinect 3-D camera, the robot identifies the activities it sees and considers what uses are possible with the objects in the scene and how those uses fit with the activities. It then generates a set of possible continuations into the future (such as eating, drinking, cleaning, or putting away) and chooses the most probable. As the action continues, it constantly updates and refines its predictions.
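The update-and-refine loop described above can be sketched as a simple Bayesian update: maintain a probability distribution over possible continuations and reweight it each time a new cue is observed. The activities, counts, and likelihood values below are invented for illustration and are not Cornell's actual model.

```python
# Hypothetical sketch of anticipating the most probable continuation of an
# activity, Bayesian-update style. All numbers here are made up.
from collections import Counter

# Invented prior: how often each continuation followed "reaching for a cup"
continuation_counts = Counter(
    {"drinking": 40, "pouring": 25, "cleaning": 10, "putting_away": 25}
)

# Invented likelihoods: P(cue | continuation) for the cue "cup tilted"
likelihood = {"drinking": 0.7, "pouring": 0.8, "cleaning": 0.1, "putting_away": 0.05}

def update_beliefs(prior_counts, likelihood):
    """One Bayesian update: reweight each continuation by how well it
    explains the newly observed cue, then renormalize."""
    total = sum(prior_counts.values())
    posterior = {a: (c / total) * likelihood[a] for a, c in prior_counts.items()}
    norm = sum(posterior.values())
    return {a: p / norm for a, p in posterior.items()}

beliefs = update_beliefs(continuation_counts, likelihood)
best = max(beliefs, key=beliefs.get)  # the robot acts on the most probable continuation
```

As further cues arrive, the posterior from one step becomes the prior for the next, which is what lets the prediction sharpen as the action unfolds.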
On Capitol Hill, IBM representatives described the Watson supercomputer’s new health-care-related features, including the ability to ingest patients’ medical information and synthesize thousands of medical journals and other reference materials, along with patient preferences, to suggest treatment options.
The Watson team has collaborated with Memorial Sloan-Kettering Cancer Center and the insurer WellPoint to teach the computer about the medical world.
Google, NASA and the Universities Space Research Association will put a 512-qubit machine from D-Wave at the disposal of researchers around the globe. The USRA will invite teams of scientists and engineers to share time on the unique machine. The goal is to study how quantum computing might be leveraged to advance machine learning.
A group at the Tokyo Institute of Technology, led by Dr. Osamu Hasegawa, has advanced SOINN (Self-Organizing Incremental Neural Network), a machine learning algorithm that can now use the internet to learn how to perform new tasks. The system, under development as an artificial brain for autonomous mental development robots, is currently being used to learn about objects in photos via image searches on the internet. It can also combine aspects of other known objects to make guesses about objects it doesn’t yet recognize.
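At its core, a self-organizing incremental network grows a set of prototype nodes on the fly: an input close to an existing node refines that node, while a sufficiently novel input becomes a new node. The sketch below is a heavily simplified, hypothetical version of that idea, not the Tokyo Tech implementation; the fixed distance threshold and the sample points are invented.

```python
# Hypothetical, simplified sketch of SOINN-style incremental learning:
# merge each input into its nearest prototype node, or create a new node
# when the input is too far from everything seen so far.
import math

class IncrementalLearner:
    def __init__(self, threshold):
        self.threshold = threshold  # max distance to merge into an existing node
        self.nodes = []             # list of (prototype_vector, hit_count)

    @staticmethod
    def _dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def learn(self, x):
        """Merge x into the nearest node, or add a new node; return its index."""
        if self.nodes:
            i, (proto, n) = min(
                enumerate(self.nodes), key=lambda kv: self._dist(kv[1][0], x)
            )
            if self._dist(proto, x) <= self.threshold:
                # Move the prototype toward the input (running mean of its inputs)
                new_proto = tuple(p + (xi - p) / (n + 1) for p, xi in zip(proto, x))
                self.nodes[i] = (new_proto, n + 1)
                return i
        self.nodes.append((tuple(x), 1))
        return len(self.nodes) - 1

learner = IncrementalLearner(threshold=1.0)
for point in [(0.0, 0.0), (0.1, 0.1), (5.0, 5.0), (5.2, 4.9)]:
    learner.learn(point)
# Two well-separated groups of inputs end up as two prototype nodes
```

The real SOINN additionally adapts each node's threshold from its neighborhood and prunes noisy nodes, which is what lets it keep learning indefinitely without a predefined number of categories.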
Kurzweil predicts that computers will be able to have a deep understanding of human emotion by 2029. He wants to see search evolve to understand even more complex language that will involve “emotional intelligence, being funny, getting the joke, being sexy, being loving, understanding human emotion.”
Kurzweil hopes to leverage Google’s massive pool of resources and data to develop technology that would create truly intelligent computers that can understand human language on a deep level.