Nerve and muscle interfaces for prosthetic control

http://www.darpa.mil/NewsEvents/Releases/2013/05/30.aspx

DARPA continues to build technology with academic partners to enable amputees to control prosthetic limbs with their minds. Examples follow:

Researchers at the Rehabilitation Institute of Chicago demonstrated a type of peripheral interface called targeted muscle reinnervation (TMR). TMR rewires nerves from the amputated limb to remaining muscles, which then supply the signals used to control the prosthesis.

Researchers at Case Western Reserve University used a flat interface nerve electrode (FINE) to demonstrate direct sensory feedback. By interfacing with residual nerves in the patient’s partial limb, the electrode restores some sense of touch to the fingers. Other existing prosthetic limb control systems rely solely on visual feedback. Direct sensory feedback allows patients to move a hand without keeping their eyes on it, enabling simple tasks, such as searching a bag for small items, that are not possible with today’s prosthetics.

Cornell robots anticipate human actions

http://news.cornell.edu/stories/2013/04/think-ahead-robots-anticipate-human-actions

Cornell University researchers have programmed a PR-2 robot not only to carry out everyday tasks, but also to anticipate human behavior and adjust its actions accordingly.

Using a database of 120 3-D videos of people performing common household activities, the researchers trained the robot to identify human activities by tracking the movements of the body, reduced to a symbolic skeleton for efficient computation; to break those activities down into sub-activities such as reaching, carrying, pouring, or drinking; and to associate the activities with objects.

Observing a new scene with its Microsoft Kinect 3-D camera, the robot identifies the activities it sees, considers what uses are possible with the objects in the scene and how those uses fit the activities; it then generates a set of possible continuations into the future, such as eating, drinking, cleaning, or putting away, and finally chooses the most probable. As the action continues, it constantly updates and refines its predictions.
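The loop described above, scoring possible continuations against the objects in view and picking the most probable, can be sketched roughly as follows. This is not Cornell's code; the activity names, object affordances, and probability table are illustrative assumptions standing in for values the real system learns from its video corpus.

```python
# Likelihood of each next sub-activity given the current one.
# In the real system these would be learned from the 3-D video corpus;
# the values here are made up for illustration.
TRANSITIONS = {
    "reaching": {"carrying": 0.5, "pouring": 0.3, "drinking": 0.2},
    "carrying": {"pouring": 0.4, "placing": 0.6},
    "pouring":  {"drinking": 0.7, "placing": 0.3},
}

# Which sub-activities each visible object affords (also illustrative).
AFFORDANCES = {
    "cup":    {"carrying", "pouring", "drinking", "placing"},
    "kettle": {"carrying", "pouring", "placing"},
}

def anticipate(current_activity, visible_objects):
    """Rank possible next sub-activities, keeping only those that
    at least one visible object affords, and return the most probable."""
    candidates = TRANSITIONS.get(current_activity, {})
    feasible = {
        act: p for act, p in candidates.items()
        if any(act in AFFORDANCES[obj] for obj in visible_objects)
    }
    return max(feasible, key=feasible.get) if feasible else None

print(anticipate("pouring", ["cup"]))  # prints "drinking"
```

As the observed action continues, the real system re-runs this kind of scoring with updated evidence, which is what lets it refine its predictions over time.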

Watson’s health care capabilities described to lawmakers

http://www.washingtonpost.com/blogs/innovations/wp/2013/05/17/watson-goes-to-washington-ibm-shows-off-latest-health-care-work-to-lawmakers/

On Capitol Hill, IBM representatives described the supercomputer’s new health-care-related features, including the ability to ingest patients’ medical information and synthesize thousands of medical journals and other reference materials, along with patient preferences, to suggest treatment options.

The Watson team has collaborated with the Memorial Sloan-Kettering Cancer Center and insurer WellPoint to teach the computer about the medical world.

Quantum computing AI lab from Google, NASA and USRA

http://googleresearch.blogspot.ca/2013/05/launching-quantum-artificial.html

Google, NASA and the Universities Space Research Association will put a 512-qubit machine from D-Wave at the disposal of researchers around the globe. The USRA will invite teams of scientists and engineers to share time on the unique machine. The goal is to study how quantum computing might be leveraged to advance machine learning.

SOINN artificial brain learns from the internet, applies information

http://haselab.info/soinn-e.html

A group at the Tokyo Institute of Technology, led by Dr. Osamu Hasegawa, has advanced SOINN (Self-Organizing Incremental Neural Network), their machine learning algorithm, which can now use the internet to learn how to perform new tasks. The system, under development as an artificial brain for autonomous mental development robots, is currently being used to learn about objects in photos via internet image searches. It can also combine aspects of objects it already knows to make guesses about objects it does not yet recognize.
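SOINN's core published idea is incremental, open-ended learning: the network adds a node for an input that is too far from everything it already knows, and otherwise nudges the nearest known node toward the input. A minimal sketch of that mechanism follows; it is not the group's code, and the feature vectors, labels, and threshold value are illustrative assumptions.

```python
import math

THRESHOLD = 1.0  # assumed similarity threshold; SOINN adapts this per node

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class IncrementalLearner:
    def __init__(self):
        self.nodes = []  # list of (feature_vector, label) pairs

    def observe(self, features, label=None):
        """Add a node for novel inputs; otherwise treat the input as the
        nearest known concept and nudge that node toward it."""
        if not self.nodes:
            self.nodes.append((list(features), label))
            return "new"
        nearest = min(self.nodes, key=lambda n: dist(n[0], features))
        if dist(nearest[0], features) > THRESHOLD:
            self.nodes.append((list(features), label))  # novel concept
            return "new"
        # Known concept: move the stored node slightly toward the input.
        for i, x in enumerate(features):
            nearest[0][i] += 0.1 * (x - nearest[0][i])
        return nearest[1]

learner = IncrementalLearner()
learner.observe([0.0, 0.0], "cup")       # first input: new concept
print(learner.observe([0.1, 0.0]))       # prints "cup" (close to known node)
```

Because nodes are only added as needed, this kind of learner can keep absorbing new object categories indefinitely, which is what makes it suitable for open-ended learning from internet image searches.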