Duke researchers link visual stimulus with tactile sensation

http://www.dukehealth.org/health_library/news/touch-and-movement-neurons-shape-the-brain-s-internal-image-of-the-body

Miguel Nicolelis is one of the leading contributors to brain-machine interface research.  In a series of innovative experiments, he has demonstrated the intricate connections the brain uses in its attempt to build a coherent model of multi-sensory input.  His recent experiment shows that monkeys can be tricked when that multi-sensory input is only partially coherent.

A related study from Stockholm’s Karolinska Institute describes distorted self-perception in humans caused by incoherence between visual and tactile input.

In the Nicolelis study, untrained monkeys were implanted with up to 384 electrodes to analyze how stimuli were encoded by the monkey brain. The monkeys were then shown a virtual simulation of their arm being touched while simultaneously having their own arm touched. After a few minutes of synchronized stimulation, the physical component was removed. The areas of the monkeys’ brains responsible for tactile sensations, however, continued to respond to the virtual simulation, indicating that the monkeys felt physical sensation through visual association alone. Their neuronal responses to the virtual stimulation came later than the responses to the physical stimulation, suggesting that the sensation was mediated by a longer pathway involving the visual system.

Instead of the single neuronal pathways previously imagined, seemingly unrelated cortices apparently use a highly dynamic, cross-functional process, forming more of a continuously interacting grid or network.  They also appear to cooperate quite closely in shaping the body schema, the brain’s internal representation of the body.

Human-to-human brain interface – UW researcher controls colleague’s movement

http://www.washington.edu/news/2013/08/27/researcher-controls-colleagues-motions-in-1st-human-brain-to-brain-interface/

University of Washington researchers have performed what they believe is the first noninvasive human-to-human brain interface, with one researcher able to send a brain signal via the Internet to control the hand motions of a fellow researcher.

Using electrical brain recordings and a form of magnetic stimulation, Rajesh Rao sent a brain signal to Andrea Stocco on the other side of the UW campus, causing Stocco’s finger to move on a keyboard.

While researchers at Duke University have demonstrated brain-to-brain communication between two rats, and Harvard researchers have demonstrated it between a human and a rat, Rao and Stocco believe this is the first demonstration of human-to-human brain interfacing.

Magnetic stimulation as a direct brain communication channel is very intriguing.
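Conceptually, the pipeline has three stages: detect a motor-imagery signature in the sender’s EEG, ship a trigger over the network, and fire the stimulator on the receiving end. Below is a minimal sketch of that loop; the EEG detector and TMS trigger are stubbed placeholders (the UW system used real-time EEG classification and a TMS coil positioned over motor cortex), and only the TCP hop is concrete.

```python
# Hypothetical sketch, not the UW team's code: run receiver() on one
# machine and sender() on the other. EEG detection and TMS delivery are
# stubbed out; only the network hop (plain TCP) is concrete.
import random
import socket

def detect_motor_imagery(eeg_window):
    # Stub: a real detector would band-pass filter the EEG and look for
    # the drop in mu-band (8-12 Hz) power that marks imagined movement.
    return sum(eeg_window) / len(eeg_window) < 0.3

def sender(host="localhost", port=9999):
    # Fake EEG samples, biased low so the demo usually fires.
    eeg_window = [random.random() * 0.5 for _ in range(256)]
    if detect_motor_imagery(eeg_window):
        with socket.create_connection((host, port)) as s:
            s.sendall(b"FIRE")  # one-shot trigger across the Internet

def receiver(port=9999):
    with socket.create_server(("", port)) as srv:
        conn, _ = srv.accept()
        with conn:
            if conn.recv(4) == b"FIRE":
                trigger_tms()

def trigger_tms():
    # Stub: in the experiment, a TMS pulse over motor cortex produced
    # the involuntary finger movement.
    print("TMS pulse -> finger moves")
```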

Transcranial direct current stimulation headset receives FCC approval

http://www.extremetech.com/extreme/162581-foc-us-the-first-commercial-tdcs-headset-that-lets-you-safely-overclock-your-brain

The Foc.us headset is an early entrant in a wave of non-invasive devices intended to enhance brain function.  It passes a direct current between a cathode and an anode placed over the prefrontal cortex, making the neurons there more excitable.  This helps them fire more quickly, improving reaction time, and when the current is removed, the neurons retain additional plasticity.
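The excitability claim can be illustrated with a toy leaky integrate-and-fire neuron: a small constant depolarizing current, standing in for anodal tDCS, holds the membrane closer to threshold so it spikes sooner and more often. The sketch below uses made-up parameter values, not physiological measurements.

```python
# Toy leaky integrate-and-fire neuron. A constant depolarizing offset
# (standing in for anodal tDCS) raises the firing rate for the same input.
# All parameter values are illustrative, not physiological.

def spike_count(i_input, i_tdcs=0.0, steps=1000, dt=0.1):
    v, tau, v_thresh, v_reset = 0.0, 10.0, 1.0, 0.0
    spikes = 0
    for _ in range(steps):
        v += dt * (-(v / tau) + i_input + i_tdcs)  # leaky integration
        if v >= v_thresh:                          # threshold crossing
            spikes += 1
            v = v_reset
    return spikes

print(spike_count(0.12))        # baseline firing
print(spike_count(0.12, 0.03))  # with a tDCS-like offset: fires more often
```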

Early studies have shown that tDCS, which can be used to stimulate regions of the brain other than the prefrontal cortex, such as the motor cortex, can provide therapeutic benefits for Parkinson’s and stroke patients.  DARPA has used tDCS to improve the training of snipers, and it has also been used to improve the performance of gamers.

Neuromorphic chip mimics human brain in real time

http://www.mediadesk.uzh.ch/articles/2013/chips-die-das-gehirn-imitieren_en.html

University of Zurich and ETH Zurich scientists have created a two-by-two-millimeter microchip with 11,011 electrodes that mimics the brain’s processing power.  The brain-like microchips are not sentient, but they can carry out complex sensorimotor tasks in real time.  Previous brain-like computer systems have been slower and larger; this system, developed by Professor Giacomo Indiveri, is comparable to an actual brain in both speed and size.

Computer model of the brain simulates daydreams

http://www.jneurosci.org/content/33/27/11239.abstract?sid=f2eef2ee-cad7-4d63-b90a-ac6566847078

Washington University researchers have created a computer model to help scientists learn how the brain’s anatomical structure contributes to the creation and maintenance of resting state networks.   They hope that the model will help them understand why certain portions of the brain work together when a person daydreams or is mentally idle, helping doctors better diagnose and treat brain injuries.

“We can give our model lesions like those we see in stroke or brain cancer, disabling groups of virtual cells to see how brain function is affected,” said Professor Maurizio Corbetta.  “We can also test ways to push the patterns of activity back to normal.”

Based on data from brain scans, the researchers assembled 66 cognitive units in each hemisphere and interconnected them in anatomical patterns similar to the connections present in the brain.  Individual units fired at the random low frequencies previously observed in brain cells in culture and in recordings of resting brain activity.  The researchers let the model run, slowly changing the coupling, or the strength of the connections between units.  At a specific coupling value, the units’ firing began to organize into coordinated patterns of activity.

“Even though we started the cognitive units with random low activity levels, the connections allowed the units to synchronize,” said Professor Gustavo Deco of Universitat Pompeu Fabra. “The spatial pattern of synchronization that we eventually observed approximates very well—about 70 percent—to the patterns we see in scans of resting human brains.”
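This synchronization effect can be reproduced in miniature with a generic Kuramoto model (not the Deco/Corbetta code): units start with random phases, stay incoherent under weak coupling, and lock together once the coupling crosses a critical value. The 132 units below echo the 66-per-hemisphere figure; everything else is illustrative.

```python
# Kuramoto-style sketch: weakly coupled oscillators stay incoherent,
# strongly coupled ones synchronize. Illustrative, not the actual model.
import math, random

N = 132                                               # 66 per hemisphere
omega = [random.gauss(1.0, 0.1) for _ in range(N)]    # natural frequencies
theta0 = [random.uniform(0, 2 * math.pi) for _ in range(N)]  # random phases

def order_parameter(theta):
    # r = 1 means perfect synchrony, r near 0 means incoherence.
    re = sum(math.cos(t) for t in theta) / len(theta)
    im = sum(math.sin(t) for t in theta) / len(theta)
    return math.hypot(re, im)

def step(theta, k, dt=0.01):
    # Mean-field update: each unit is pulled toward the population phase.
    re = sum(math.cos(t) for t in theta) / N
    im = sum(math.sin(t) for t in theta) / N
    r, psi = math.hypot(re, im), math.atan2(im, re)
    return [t + dt * (w + k * r * math.sin(psi - t))
            for t, w in zip(theta, omega)]

for k in (0.05, 0.5):              # below vs. above the critical coupling
    th = list(theta0)
    for _ in range(5000):
        th = step(th, k)
    print(f"coupling {k}: order parameter {order_parameter(th):.2f}")
```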

Low cost GPU based neural network simulates the brain

http://stanford.edu/~acoates/papers/CoatesHuvalWangWuNgCatanzaro_icml2013.pdf

In a new paper, Stanford’s Andrew Ng describes how to use graphics processing units (GPUs) to build a $20,000 computerized brain similar to the cat detector he developed with Google last year for $1 million.

To test his hypothesis about GPU-driven deep learning, he also built a larger version of the platform for $100,000, using 64 Nvidia GTX 680 GPUs spread across 16 computers.  It was able to accomplish the same cat-spotting tasks as the Google system, which required 1,000 computers to operate.  That system, modeling the activities and processes of the human brain, learned what a cat looks like and then applied that knowledge to spotting different cats across multiple YouTube videos.
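The economics come down to one fact: deep network training is dominated by large dense matrix multiplications, which GPUs execute one to two orders of magnitude faster than CPUs. A quick benchmark sketch, using PyTorch as a modern stand-in for the custom GTX 680 code described in the paper:

```python
# Illustrative benchmark of the workhorse operation of a dense layer.
# PyTorch is used as a stand-in; the 2013 system used custom CUDA/MPI code.
import time
import torch

def bench(device, n=4096, reps=10):
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()   # finish any pending GPU work first
    t0 = time.time()
    for _ in range(reps):
        c = a @ b                  # large dense matrix multiply
    if device == "cuda":
        torch.cuda.synchronize()   # wait for the async GPU kernels
    return time.time() - t0

print("cpu :", bench("cpu"))
if torch.cuda.is_available():
    print("cuda:", bench("cuda"))  # typically 10-100x faster than CPU
```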

Thought-controlled flying robot

http://www.livescience.com/27849-mind-controlled-devices-brain-awareness-nsf.html

University of Minnesota researchers led by Professor Bin He have been able to control a small helicopter using only their minds, advancing technology that could help paralyzed or motion-impaired people interact with the world around them.

An EEG cap with 64 electrodes was placed on the head of the person controlling the helicopter. The researchers mapped the controller’s brain activity during certain tasks (for example, making a fist or looking up), then mapped those patterns to the helicopter’s controls. If the researchers mapped “go up” to a clenched fist, the copter went up; after that, the copter would rise automatically whenever the controller clenched a fist.
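In outline, this is a calibrate-then-classify loop: record feature vectors while the controller rehearses each task, then match new EEG windows to the nearest rehearsed pattern and emit the corresponding command. The sketch below reduces feature extraction to a per-electrode average and uses a nearest-centroid classifier; a real BCI would use filtered band-power features and a trained model, and the copter object and command names are hypothetical.

```python
# Hypothetical calibrate-then-control sketch, not the Minnesota system.
COMMANDS = {"fist": "up", "look_up": "forward", "rest": "hover"}

def features(window):
    # Stub: one number per electrode. A real BCI would compute filtered
    # band-power features (e.g., mu-band power over motor cortex).
    return [sum(channel) / len(channel) for channel in window]

def calibrate(labeled_windows):
    # Average the feature vectors recorded while each task was rehearsed.
    centroids = {}
    for label, windows in labeled_windows.items():
        vecs = [features(w) for w in windows]
        centroids[label] = [sum(col) / len(col) for col in zip(*vecs)]
    return centroids

def classify(window, centroids):
    # Nearest centroid: pick the rehearsed task whose pattern is closest.
    f = features(window)
    return min(centroids, key=lambda lbl: sum(
        (a - b) ** 2 for a, b in zip(f, centroids[lbl])))

def control_step(window, centroids, copter):
    copter.send(COMMANDS[classify(window, centroids)])
```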

Nerve and muscle interfaces for prosthetic control

http://www.darpa.mil/NewsEvents/Releases/2013/05/30.aspx

DARPA continues to build technology with academic partners to enable amputees to control prosthetic limbs with their minds.  Examples follow:

Researchers at the Rehabilitation Institute of Chicago demonstrated a type of peripheral interface called targeted muscle re-innervation (TMR). By rewiring nerves from the amputated limb into remaining muscles, the interface allows the patient to control a prosthetic with those existing muscles.

Researchers at Case Western Reserve University used a flat interface nerve electrode (FINE) to demonstrate direct sensory feedback. By interfacing with residual nerves in the patient’s partial limb, some sense of touch by the fingers is restored. Other existing prosthetic limb control systems rely solely on visual feedback. Unlike visual feedback, direct sensory feedback allows patients to move a hand without keeping their eyes on it—enabling simple tasks, like searching a bag for small items, not possible with today’s prosthetics.

Cornell robots anticipate human actions

http://news.cornell.edu/stories/2013/04/think-ahead-robots-anticipate-human-actions

Cornell University researchers have programmed a PR2 robot not only to carry out everyday tasks, but to anticipate human behavior and adjust its actions accordingly.

From a database of 120 3-D videos of people performing common household activities, the robot has been trained to identify human activities by tracking the movements of the body (reduced to a symbolic skeleton for easy calculation), to break them down into sub-activities like reaching, carrying, pouring, or drinking, and to associate those activities with objects.

Observing a new scene with its Microsoft Kinect 3-D camera, the robot identifies the activities it sees, considers what uses are possible with the objects in the scene and how those uses fit the activities, then generates a set of possible continuations into the future (such as eating, drinking, cleaning, or putting away) and chooses the most probable. As the action continues, it constantly updates and refines its predictions.
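The anticipation step can be framed as a simple probabilistic update: combine a prior over likely next activities (given the current sub-activity) with evidence from object affordances, then pick the most probable continuation. The tables and numbers below are invented for illustration and are not Cornell’s learned model.

```python
# Illustrative anticipation sketch; affordance and transition tables are
# made up, not Cornell's learned model.
OBJECT_AFFORDANCES = {  # P(activity | object is involved)
    "cup": {"drinking": 0.5, "pouring": 0.3, "putting_away": 0.2},
    "rag": {"cleaning": 0.7, "putting_away": 0.3},
}

TRANSITIONS = {  # P(next activity | current sub-activity)
    "reaching": {"drinking": 0.3, "pouring": 0.3,
                 "cleaning": 0.2, "putting_away": 0.2},
    "carrying": {"pouring": 0.4, "putting_away": 0.4,
                 "drinking": 0.1, "cleaning": 0.1},
}

def anticipate(current_subactivity, visible_objects):
    # Weight each candidate continuation by its transition prior and by
    # how well the visible objects afford it, then normalize.
    scores = {}
    for nxt, prior in TRANSITIONS[current_subactivity].items():
        afford = max(OBJECT_AFFORDANCES[o].get(nxt, 0.01)
                     for o in visible_objects)
        scores[nxt] = prior * afford
    total = sum(scores.values())
    return {k: v / total for k, v in scores.items()}

# Reaching toward a cup: drinking and pouring dominate the prediction.
print(anticipate("reaching", ["cup"]))
```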