Cornell University researchers have programmed a PR2 robot not only to carry out everyday tasks, but to anticipate human behavior and adjust its actions accordingly.
Using a database of 120 3-D videos of people performing common household activities, the robot has been trained to identify human activities by tracking the movements of the body – reduced to a symbolic skeleton for easy calculation – breaking them down into sub-activities such as reaching, carrying, pouring, or drinking, and associating each activity with the objects involved.
Observing a new scene with its Microsoft Kinect 3-D camera, the robot identifies the activities it sees, considers what uses are possible with the objects in the scene and how those uses fit with the activities; it then generates a set of possible continuations into the future – such as eating, drinking, cleaning, or putting away – and finally chooses the most probable. As the action continues, it constantly updates and refines its predictions.
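The anticipation loop described above can be sketched in miniature: score candidate next sub-activities by how likely they are to follow the current one, keep only those that some visible object affords, and re-rank as new observations arrive. All activity names, probabilities, and affordances below are illustrative assumptions, not Cornell's actual model or training data.

```python
# Hypothetical transition likelihoods: P(next sub-activity | current one),
# as might be estimated from a corpus of labeled activity videos.
TRANSITIONS = {
    "reaching": {"carrying": 0.5, "pouring": 0.3, "drinking": 0.2},
    "carrying": {"pouring": 0.4, "placing": 0.6},
    "pouring":  {"drinking": 0.7, "placing": 0.3},
}

# Hypothetical object affordances: sub-activities each visible object supports.
AFFORDANCES = {
    "cup":    {"pouring", "drinking", "carrying", "reaching", "placing"},
    "kettle": {"pouring", "carrying", "reaching", "placing"},
}

def anticipate(current, visible_objects):
    """Rank possible continuations: combine the transition likelihood
    with a check that some visible object affords the candidate."""
    candidates = {}
    for nxt, p in TRANSITIONS.get(current, {}).items():
        if any(nxt in AFFORDANCES[obj] for obj in visible_objects):
            candidates[nxt] = p
    # Most probable continuation first.
    return sorted(candidates.items(), key=lambda kv: -kv[1])

# As the action unfolds, re-run anticipation with the newest observation.
for observed in ["reaching", "carrying", "pouring"]:
    ranking = anticipate(observed, ["cup", "kettle"])
    print(observed, "->", ranking[0][0] if ranking else "no prediction")
```

The affordance check matters: with only a kettle in view, "drinking" is filtered out even though it is the likeliest successor of "pouring", so the robot would predict "placing" instead.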
Today’s New York Times reports on the developing trend of robotic companions for the elderly, already popular in Japan.
A typical Japanese example is the Hybrid Assistive Limb, created at Tsukuba University. The battery-powered suit senses and amplifies the wearer’s muscle action when carrying or lifting heavy objects. Caregivers can use the suit to help lift patients from a bed, and patients can wear it to support their own movements. Other Japanese devices include a small, battery-powered trolley to aid independent walking; a portable, self-cleaning bedside toilet; and a monitoring robot that tracks and reports the location of dementia patients.
The Times describes several interesting US-developed robots: Cody, a Georgia Tech-created robotic nurse capable of bathing patients; HERB, a Carnegie Mellon-developed butler which retrieves objects and cleans; Hector, a University of Reading robot which provides medication reminders, locates lost objects, and can assist after a fall; and Paro, a robot resembling a baby seal which calms dementia patients.