Category Archives: AI

Personalized robot companion for seniors

http://cordis.europa.eu/fetch?CALLER=OFFR_TM_EN&ACTION=D&RCN=11525

A European consortium of research institutes, universities and technology companies has developed a highly customizable robot companion to help seniors to maintain their quality of life, stay healthy and avoid social exclusion.

The robot, a mobile wheeled semi-humanoid figure equipped with cameras, sensors, audio, and a touch-screen interface, can remind users to take their medicine, suggest they have their favorite drink, or prompt them to go for a walk or visit friends if they haven’t been out for a while. As part of a larger smart-home environment that can include smart clothing to monitor vital signs, the system can monitor the user’s health and safety, and alert emergency services if necessary.

Neuromorphic chip mimics human brain in real time

http://www.mediadesk.uzh.ch/articles/2013/chips-die-das-gehirn-imitieren_en.html

University of Zurich and ETH Zurich scientists have created a two-by-two-millimeter microchip with 11,011 electrodes that mimics the brain’s processing power. The brain-like microchips are not sentient beings, but they can carry out complex sensorimotor tasks in real time. Previous brain-like computer systems have been slower and larger; this system, developed by a team led by Professor Giacomo Indiveri, is comparable to an actual brain in both speed and size.

Algorithm analyzes head movements to measure heart rate

http://web.mit.edu/newsoffice/2013/seeing-the-human-pulse-0620.html

MIT researchers have developed an algorithm that gauges heart rate by measuring tiny head movements in video. Subjects’ heart rates were consistently measured to within a few beats per minute of simultaneous electrocardiogram readings. The algorithm also provides estimates of the time intervals between beats, which can be used to identify patients at risk for cardiac events.

The algorithm uses face recognition to distinguish the person’s head from the rest of the image. It then randomly selects 500 to 1,000 distinct points, clustered around the person’s mouth and nose, and tracks their movement from frame to frame. The trajectories are filtered to keep only temporal frequencies within the range of a regular heartbeat – about 0.5 to 5 hertz, or 30 to 300 cycles per minute – eliminating movements that repeat at lower frequencies, such as those caused by regular breathing and slow changes in posture. Principal component analysis then decomposes the resulting signal into constituent signals representing components of the remaining motion that are uncorrelated with one another. Of those signals, the algorithm chooses the one that appears most regular and falls within the typical frequency band of the human pulse.
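
To make that pipeline concrete, here is a rough Python sketch of the same sequence of steps – face detection, feature-point tracking, band-pass filtering, and PCA – using OpenCV, NumPy, and SciPy. The file name, parameter values, and library choices are illustrative assumptions, not details taken from the MIT work.

# Hypothetical sketch of the head-motion pulse pipeline described above.
# "video.mp4" and all parameter values are illustrative, not from the paper.
import cv2
import numpy as np
from scipy.signal import butter, filtfilt

FPS = 30.0
cap = cv2.VideoCapture("video.mp4")
ok, first = cap.read()
gray0 = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)

# 1. Face detection to isolate the head region.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
x, y, w, h = face_cascade.detectMultiScale(gray0, 1.3, 5)[0]
mask = np.zeros_like(gray0)
mask[y:y + h, x:x + w] = 255

# 2. Pick several hundred feature points inside the face region.
pts = cv2.goodFeaturesToTrack(gray0, maxCorners=800, qualityLevel=0.01,
                              minDistance=3, mask=mask)

# 3. Track the points from frame to frame (Lucas-Kanade optical flow),
#    keeping only the vertical coordinate of each trajectory.
tracks = [pts[:, 0, 1]]
prev = gray0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts, st, _ = cv2.calcOpticalFlowPyrLK(prev, gray, pts, None)
    tracks.append(pts[:, 0, 1])
    prev = gray
traj = np.array(tracks)                      # shape: (frames, points)

# 4. Band-pass filter each trajectory to the 0.5-5 Hz pulse band,
#    discarding breathing and slow posture drift.
b, a = butter(3, [0.5 / (FPS / 2), 5.0 / (FPS / 2)], btype="band")
traj = filtfilt(b, a, traj - traj.mean(axis=0), axis=0)

# 5. PCA: decompose the motion into uncorrelated component signals.
_, _, components = np.linalg.svd(traj, full_matrices=False)
signals = traj @ components.T

# 6. Choose the component whose spectrum is most concentrated at a single
#    frequency, and read the heart rate off that peak.
spectra = np.abs(np.fft.rfft(signals, axis=0))
freqs = np.fft.rfftfreq(signals.shape[0], d=1.0 / FPS)
periodicity = spectra.max(axis=0) / spectra.sum(axis=0)
best = signals[:, np.argmax(periodicity)]
peak = freqs[np.argmax(np.abs(np.fft.rfft(best)))]
print(f"Estimated heart rate: {peak * 60:.0f} beats per minute")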

Low cost GPU based neural network simulates the brain

http://stanford.edu/~acoates/papers/CoatesHuvalWangWuNgCatanzaro_icml2013.pdf

In a new paper, Stanford’s Andrew Ng and colleagues describe how to use graphics processing units (GPUs) to build a $20,000 computerized brain comparable to the cat detector he developed with Google last year for $1 million.

To test the hypothesis about GPU-driven deep learning, the team also built a larger version of the platform for $100,000, using 64 Nvidia GTX 680 GPUs in 16 computers. It was able to accomplish the same cat-spotting tasks as the Google system, which needed 1,000 computers to operate. That system, modeling the activities and processes of the human brain, was able to learn what a cat looks like and then translate that knowledge into spotting different cats across multiple YouTube videos.
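
The reason graphics processors suit this workload is that training such networks reduces almost entirely to large dense matrix multiplications. The toy NumPy sketch below – a hypothetical single gradient step for a tied-weight autoencoder, not the model or code from the Stanford paper – shows that structure; the GPU systems described above run essentially the same linear algebra on graphics hardware.

# Minimal, hypothetical sketch of why GPUs help: one gradient step for a
# tied-weight autoencoder is dominated by large dense matrix products.
# Plain NumPy is shown; a GPU array library accelerates the same operations.
import numpy as np

rng = np.random.default_rng(0)
n_input, n_hidden, batch = 4096, 1024, 256
W = rng.normal(0, 0.01, (n_hidden, n_input))   # shared encode/decode weights
X = rng.normal(size=(batch, n_input))          # a mini-batch of image patches

def step(W, X, lr=0.01):
    H = np.tanh(X @ W.T)           # encode: (batch x hidden) matrix product
    R = H @ W                      # decode: reconstruct the input
    E = R - X                      # reconstruction error
    dH = (E @ W.T) * (1 - H ** 2)  # backprop through the nonlinearity
    grad = H.T @ E + dH.T @ X      # two more large matrix products
    return W - lr * grad / batch

W = step(W, X)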

Nerve and muscle interfaces for prosthetic control

http://www.darpa.mil/NewsEvents/Releases/2013/05/30.aspx

DARPA continues to build technology with academic partners to enable amputees to control prosthetic limbs with their minds.  Examples follow:

Researchers at the Rehabilitation Institute of Chicago demonstrated a type of peripheral interface called targeted muscle re-innervation (TMR), which rewires nerves from the amputated limb into remaining muscles so that those muscles can be used to control the prosthesis.

Researchers at Case Western Reserve University used a flat interface nerve electrode (FINE) to demonstrate direct sensory feedback. By interfacing with residual nerves in the patient’s partial limb, the FINE restores some sense of touch in the fingers. Other existing prosthetic limb control systems rely solely on visual feedback. Unlike visual feedback, direct sensory feedback allows patients to move a hand without keeping their eyes on it, enabling simple tasks, like searching a bag for small items, that are not possible with today’s prosthetics.

Cornell robots anticipate human actions

http://news.cornell.edu/stories/2013/04/think-ahead-robots-anticipate-human-actions

Cornell University researchers have programmed a PR-2 robot to not only carry out everyday tasks, but to anticipate human behavior and adjust its actions.

Using a database of 120 3-D videos of people performing common household activities, the robot has been trained to identify human activities by tracking the movements of the body – reduced to a symbolic skeleton for easy calculation – to break them down into sub-activities such as reaching, carrying, pouring, or drinking, and to associate those activities with objects.

Observing a new scene with its Microsoft Kinect 3-D camera, the robot identifies the activities it sees, considers what uses are possible with the objects in the scene and how those uses fit the activities, then generates a set of possible continuations into the future – such as eating, drinking, cleaning, or putting away – and finally chooses the most probable. As the action continues, it constantly updates and refines its predictions.
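
A toy Python sketch of that anticipation step, with invented transition probabilities and object affordances standing in for what the Cornell system learns from its video database, might look like this:

# Toy, hypothetical sketch of the anticipation step described above: score
# possible future sub-activities from the current one and from the
# affordances of objects in the scene, then keep the most probable.
# The numbers and affordance lists below are invented for illustration.
from collections import defaultdict

# Transition likelihoods between sub-activities (made up here; the Cornell
# system learns these from its 120-video training database).
transitions = {
    "reaching": {"moving": 0.5, "pouring": 0.3, "drinking": 0.2},
    "moving":   {"pouring": 0.4, "placing": 0.4, "drinking": 0.2},
    "pouring":  {"drinking": 0.6, "placing": 0.4},
}

# Which sub-activities each object in the scene supports (its affordances).
affordances = {
    "cup":    {"reaching", "moving", "pouring", "drinking", "placing"},
    "kettle": {"reaching", "moving", "pouring", "placing"},
}

def anticipate(current_activity, objects_in_scene):
    """Rank possible next sub-activities given the current one and the scene."""
    usable = set().union(*(affordances[o] for o in objects_in_scene))
    scores = defaultdict(float)
    for nxt, p in transitions.get(current_activity, {}).items():
        if nxt in usable:                  # keep continuations the objects allow
            scores[nxt] += p
    return sorted(scores.items(), key=lambda kv: -kv[1])

# As the robot watches, it re-runs this ranking with each new observation.
print(anticipate("pouring", ["cup", "kettle"]))   # "drinking" ranked first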

Watson’s health care capabilities described to lawmakers

http://www.washingtonpost.com/blogs/innovations/wp/2013/05/17/watson-goes-to-washington-ibm-shows-off-latest-health-care-work-to-lawmakers/

On Capitol Hill, IBM representatives described Watson’s new health-care capabilities, including the ability to ingest a patient’s medical information and synthesize thousands of medical journals and other reference materials, along with patient preferences, to suggest treatment options.

The Watson team has collaborated with Memorial Sloan-Kettering Cancer Center and the insurer WellPoint to teach the computer about the medical world.

Quantum computing AI lab from Google, NASA and USRA

http://googleresearch.blogspot.ca/2013/05/launching-quantum-artificial.html

Google, NASA and the Universities Space Research Association (USRA) will put a 512-qubit machine from D-Wave at the disposal of researchers around the globe. The USRA will invite teams of scientists and engineers to share time on the machine, with the goal of studying how quantum computing might be leveraged to advance machine learning.

SOINN artificial brain learns from the internet, applies information

http://haselab.info/soinn-e.html

A group at the Tokyo Institute of Technology, led by Dr. Osamu Hasegawa, has advanced SOINN (Self-Organizing Incremental Neural Network), their machine learning algorithm, which can now use the internet to learn how to perform new tasks. The system, under development as an artificial brain for autonomous mental development robots, is currently being used to learn about objects in photos via internet image searches. It can also take aspects of other known objects and combine them to make guesses about objects it doesn’t yet recognize.