VR + motion capture to study movement and sensory processing in autism, AD, TBI

MoBI (Mobile Brain/Body Imaging), developed by John Foxe at the University of Rochester, combines VR, EEG, and motion capture sensors to study movement difficulties associated with neurological disorders.

According to Foxe, “The MoBI system allows us to get people walking, using their senses, and solving the types of tasks you face every day, all the while measuring brain activity and tracking how the processes associated with cognition and movement interact.”

Motion sensor and EEG data, collected while a subject walks in a virtual environment, are synchronized, allowing researchers to track which brain areas are activated while walking or performing a task. Brain responses are analyzed while subjects move, perform tasks, or do both at the same time.
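As an illustration of the alignment step this requires, here is a minimal sketch of resampling a motion capture channel onto the EEG clock so that both streams share a time base. The sampling rates, signals, and function are illustrative assumptions, not details of the MoBI system.

```python
import numpy as np

def sync_to_eeg_clock(eeg_t, mocap_t, mocap):
    """Linearly interpolate a motion-capture channel at the EEG timestamps,
    so both streams can be analyzed on one shared time base."""
    return np.interp(eeg_t, mocap_t, mocap)

# Illustrative rates: EEG at 512 Hz, motion capture at 120 Hz, 10 s of walking.
eeg_t = np.arange(0, 10, 1 / 512)                # EEG timestamps (s)
mocap_t = np.arange(0, 10, 1 / 120)              # motion-capture timestamps (s)
eeg = np.random.randn(eeg_t.size)                # placeholder EEG channel
gait = np.sin(2 * np.pi * 1.0 * mocap_t)         # placeholder ~1 Hz gait signal

gait_on_eeg_clock = sync_to_eeg_clock(eeg_t, mocap_t, gait)
# eeg and gait_on_eeg_clock are now sample-aligned, so event-locked analyses
# (e.g., EEG epochs around heel strikes) can be computed directly.
assert gait_on_eeg_clock.shape == eeg.shape
```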

This technique could help guide treatment in autism, dementia, and TBI, conditions characterized by difficulty processing sensory information from multiple sources and by abnormal gait.

Click to view University of Rochester video


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab. Speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda – Marom Bikson – Ed Simcox – Sean Lane

PREFERRED REGISTRATION AVAILABLE THROUGH TODAY, SEPTEMBER 7TH

VR-enhanced molecular simulations

University of Bristol researchers, Oracle, and Interactive Scientific have used Oracle’s cloud infrastructure to combine real-time molecular simulations with VR, enabling users to “touch” molecules as they move — highlighting the potential of VR for viewing and manipulating complex 3D structures. The technology could change how drugs are designed, and transform the teaching of chemical structures and dynamics.

The molecules can be virtually folded, knotted, plucked, and reshaped to test how they interact. The cloud allows several people to manipulate them in the same virtual space at the same time.

The team designed a series of molecular tasks to be tested with a mouse and keyboard, touchscreens, and VR, including threading a small molecule through a nanotube, changing the screw-sense of a small organic helix, and tying a small string-like protein into a simple knot. For complex 3D tasks, participants using VR were up to 10 times more likely to succeed.
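To give a flavor of how “touching” a simulated molecule can work, here is a toy sketch of interactive molecular dynamics, in which a user-applied force is simply added to the physical forces at each integration step, so a grabbed atom responds in real time while the system continues to obey its own dynamics. The two-atom harmonic bond and all parameters are illustrative assumptions, not the team’s actual cloud simulation engine.

```python
import numpy as np

def bond_forces(pos, k=100.0, r0=1.0):
    """Forces from a single harmonic bond between two atoms."""
    d = pos[1] - pos[0]
    r = np.linalg.norm(d)
    f0 = k * (r - r0) * d / r      # restoring force on atom 0, along the bond
    return np.array([f0, -f0])

def vv_step(pos, vel, user_force, dt=1e-3, mass=1.0):
    """One velocity-Verlet step; the VR user's pull is just another force."""
    f = bond_forces(pos) + user_force
    vel_half = vel + 0.5 * dt * f / mass
    pos = pos + dt * vel_half
    f_new = bond_forces(pos) + user_force
    vel = vel_half + 0.5 * dt * f_new / mass
    return pos, vel

pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0]])   # slightly stretched bond
vel = np.zeros_like(pos)
pull = np.zeros_like(pos)
pull[1, 0] = 0.5                                     # user tugs atom 1 in +x
for _ in range(1000):                                # ~1 time unit of dynamics
    pos, vel = vv_step(pos, vel, pull)
```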

According to Bristol Professor Adrian Mulholland: “Chemists have always made models of molecules to understand their structure – from how atoms are bonded together to Watson and Crick’s famous double helix model of DNA. At one point in their education, most people have held a molecular model, probably made from plastic or metal. Models like these are particularly important for things we can’t see, such as the nanoscale world of molecules. Thanks to this research we can now apply virtual reality to study a variety of molecular problems which are inherently dynamic, including binding drugs to their targets, protein folding and chemical reactions. As simulations become faster we can now do this in real time, which will change how drugs are designed and how chemical structures are taught.”

Click to view University of Bristol video


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab. Speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – Juan Enriquez – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda – Marom Bikson

REGISTRATION RATES INCREASE JULY 13th

David Axelrod: VR in healthcare & the Stanford Virtual Heart | ApplySci @ Stanford

David Axelrod discussed VR-based learning in healthcare, and the Stanford Virtual Heart, at ApplySci’s recent Wearable Tech + Digital Health + Neurotech conference at Stanford.


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab. Speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – Juan Enriquez – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda – Marom Bikson

REGISTRATION RATES INCREASE JULY 6th

TMS + VR for sensory, motor skill recovery after stroke

EPFL’s Michela Bassolino has used transcranial magnetic stimulation, combined with VR, to create hand movement sensations.

Stimulating the motor cortex activated subjects’ hand muscles and induced short, involuntary movements.

In a recent study, when subjects observed a virtual hand moving at the same time as, and in a similar manner to, their own hand during TMS, they felt that the virtual hand was a body part they could control.

Twenty-five of 32 participants experienced the effect within two minutes of stimulation. Bassolino believes that the effect may also be achieved with less immersive video.

The technology could help patients recover sensory and motor skills after a stroke — and also be used as a gaming enhancement.


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 25, 2018 at the MIT Media Lab

VR + neurofeedback for movement training after stroke


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University, featuring: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg

VR studied for PTSD, phobia treatment

Emory’s Jessica Maples-Keller has published a study demonstrating the effectiveness of VR in treating PTSD, phobias, and other mental illnesses.  She describes the treatment as allowing “providers to create computer-generated environments in a controlled setting, which can be used to create a sense of presence and immersion in the feared environment for individuals suffering from anxiety disorders.”

Small studies on the use of VR in panic disorder, schizophrenia, acute and chronic pain, addiction, and eating disorders have been done, but with limited numbers of participants and a lack of comparison groups. Maples-Keller noted that extensive training is needed before VR approaches are integrated into clinical practice.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda

REGISTER BY MAY 19TH AND SAVE $500

VR therapy could reduce acute and chronic pain

Cedars-Sinai’s Brennan Spiegel has published a study showing that VR therapy could reduce acute and chronic pain.

One hundred gastrointestinal, cardiac, neurological, and post-surgical pain patients, with an average pain score of 5.4, were included. Fifty patients watched a 15-minute nature video; the other fifty watched a 15-minute animated game with VR goggles.
The patients who watched the nature video had a 13% decrease in pain scores; the patients who watched the virtual reality game had a 24% decrease.
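Applied to the average baseline score, those percentages work out roughly as follows (a back-of-the-envelope calculation from the figures above; the study reported the percentage decreases, not these derived values):

```python
baseline = 5.4                        # average pain score at enrollment
video_post = baseline * (1 - 0.13)    # after the 2-D nature video
vr_post = baseline * (1 - 0.24)       # after the immersive VR game
print(round(video_post, 1), round(vr_post, 1))   # 4.7 4.1
```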

The researchers are not sure how VR actually reduces pain, but think that it could be due to immersive distraction. According to Spiegel:

“When the mind is deeply engaged in an immersive experience, it becomes difficult, if not impossible, to perceive stimuli outside of the field of attention. By ‘hijacking’ the auditory, visual, and proprioception senses, VR is thought to create an immersive distraction that restricts the mind from processing pain.”

Potential side effects of VR include dizziness, nausea, vomiting, and epileptic seizures; patients must therefore be carefully screened and monitored.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston – Featuring Ed Boyden, Roz Picard, Tom Insel, John Rogers, Jamshid Ghajar and  Nathan Intrator – September 19, 2017 at the MIT Media Lab

VR training to reduce falls in Parkinson’s, dementia

Tel Aviv University’s Jeff Hausdorff has created a virtual reality treadmill system in an attempt to prevent falls in Parkinson’s and dementia patients.

Current interventions focus on improving muscle strength, balance, and gait. By using VR to integrate training of motor planning, attention, executive control, and judgment, therapies can also address the cognitive issues associated with falls.

In a recent study of 282 participants, 146 did treadmill + VR training, and 136 did treadmill training alone. In the VR group, patients’ foot movements were filmed and shown on a screen, allowing them to “see” their feet walking in real time. The game-like simulation included avoiding and stepping over puddles and hurdles, and navigating pathways. It also provided motivational feedback.

Fall rates were similar in both groups before the training. Six months afterward, those who participated in the VR intervention fell 50% less, while fall rates among those who trained without VR were unchanged. The biggest improvement was seen in Parkinson’s patients.

Patients can receive the combined therapy at the Hausdorff-led Center for the Study of Movement Cognition and Mobility at Tel Aviv’s Ichilov Hospital.

Click to view the Tel Aviv Sourasky Medical Center video.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston – Featuring Roz Picard, Tom Insel, John Rogers and Nathan Intrator – September 19, 2017 at the MIT Media Lab

Eye tracking + VR to improve brain injury diagnosis, track recovery

Eye tracking technology, combined with VR, is proliferating, with myriad medical, gaming, and education applications.

SyncThink uses eye tracking, built into an Oculus Rift, to measure whether a person can keep their eyes synced with moving objects, in order to detect brain injury and track recovery.
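A simple sketch of how such an eye-object sync score might be computed is below: summarize how far gaze strays from a smoothly moving target. The metric, sampling rate, target motion, and lag are illustrative assumptions, not SyncThink’s proprietary analytics.

```python
import numpy as np

def pursuit_sync(target_xy, gaze_xy):
    """Mean and variability of gaze-to-target distance during smooth pursuit.
    Higher variability suggests poorer eye-object synchronization."""
    err = np.linalg.norm(gaze_xy - target_xy, axis=1)
    return err.mean(), err.std()

# Illustrative trial: a target circling at 0.4 Hz, gaze lagging by 50 ms.
t = np.arange(0, 10, 1 / 250)          # 250 Hz eye tracker (assumed rate)
w = 2 * np.pi * 0.4
target = np.c_[np.cos(w * t), np.sin(w * t)]
gaze = np.c_[np.cos(w * (t - 0.05)), np.sin(w * (t - 0.05))]

mean_err, err_sd = pursuit_sync(target, gaze)   # larger values = worse sync
```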

The company has been granted 10 patents for eye-tracking hardware and for analytical techniques for stimulating, measuring, and training brain attention networks. Its technology has been used to detect concussions on the field, and to evaluate soldier readiness and brain impairment after injury. The company describes additional applications, including characterizing and monitoring fatigue, performance, and developmental or neurodegenerative conditions.

Eyefluence, which was acquired by Google today, creates head-mounted display interfaces for AR, VR, and mixed reality. According to the company, its AR application allows critical care professionals to access patient data with their eyes while their hands treat the injured, and its VR integrations humanize experiences, reduce nausea, optimize image resolution, and increase speed.

ApplySci believes that the next step in AR/VR enhancement is integrating mobile EEG into headsets, combining eye tracking, GSR, and brainwave data in various applications.


ApplySci’s 6th Wearable Tech + Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Mary Lou Jepsen – Vivek Wadhwa – Miguel Nicolelis – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth