Samsung’s Early Detection Sensor & Algorithm Package (EDSAP), developed by Se-hoon Lim, is meant to detect early signs of stroke.
A multi-sensor headset records electrical impulses in the brain, algorithms determine the likelihood of a stroke within one minute, and the results are presented in a mobile app. EDSAP can also analyze stress and sleep patterns, and could potentially be used to monitor heart activity. The company believes that the system could one day be built into ordinary eyeglasses.
On October 23rd, ApplySci described Justin Williams’s graphene-based, transparent brain implant sensor.
The Nature paper is now available online. The work could redefine neural implants: because the sensor is transparent, it allows fMRI monitoring of activity around the implant while simultaneously capturing detailed activity from the area itself. Combined with noninvasive EEG, this could help fine-tune very detailed EEG features.
Albert Einstein College of Medicine professor Sophie Molholm has published a paper describing the way that autistic children process sensory information, as determined by EEG. She believes that this could lead to earlier diagnosis (before symptoms of social and developmental delays emerge), hence earlier treatment, which might reduce the condition’s symptoms.
EEG readings were taken from 40 children, ages 6-17, who were diagnosed with autism, and compared to those of unaffected children of similar age. All were given a flash cue, a beep cue, or a combination of the two, and asked to press a button when these stimuli occurred. A 70-electrode cap measured brain responses every two milliseconds, including those that recorded how the brain first processed the information.
The children with autism showed a distinctly different brain wave signature from those without the condition. There were differences in the speed with which sights or sounds were processed, and in how the sensory neurons recruited neurons in other areas of the brain to register and understand the information. The more atypical this multisensory processing was, the more severe the child’s autistic symptoms.
Professor Molholm acknowledges that the sample was too small to use the profile for diagnosing autism, but it could lead to such a test if the results are confirmed and repeated.
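EEG studies of this kind typically average many time-locked trials into an event-related potential (ERP) so that the stimulus response stands out from background noise. A minimal sketch of that averaging step, with hypothetical toy data (the study's actual analysis pipeline is not described in detail here):

```python
# Sketch of event-related potential (ERP) averaging, the standard way
# to extract a stimulus-locked brain response from noisy EEG trials.
# Data here are hypothetical: each trial is a list of voltage samples
# (microvolts) recorded at fixed intervals after the cue.
def average_erp(trials):
    """Average time-locked EEG trials into a single ERP waveform."""
    n = len(trials)
    return [sum(samples) / n for samples in zip(*trials)]

# Three toy trials of four samples each.
trials = [
    [0.0, 1.0, 2.0, 1.0],
    [0.2, 1.2, 1.8, 0.8],
    [-0.2, 0.8, 2.2, 1.2],
]
erp = average_erp(trials)  # [0.0, 1.0, 2.0, 1.0]
```

Averaging works because stimulus-locked activity repeats across trials while unrelated activity tends to cancel out.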
Philips and Accenture are using EEG brainwaves to help ALS patients command electronic devices via a wearable display, a tablet and software. The system can access a medical alert service, a smart TV and wireless lighting, and communicate via pre-configured messages. The wearable display provides visual feedback that allows the user to navigate the application menu.
Professor Florian Holzapfel and colleagues at the Institute of Flight System Dynamics of the Technische Universität München have demonstrated the feasibility of flying via brain control.
The pilots’ brainwaves are measured with EEG electrodes attached to a cap. An algorithm developed by Team PhyPa at the Berlin Institute of Technology deciphers the electrical potentials and converts them into control commands. The brain-computer interface recognizes only the very clearly defined electrical brain impulses required for control.
Called Brainflight, the EU-funded project aims to prove that brain-controlled flight is possible and that pilots with little or no experience can use a BCI to fly. Some of the pilots were able to land the plane, in a simulator, under conditions of poor visibility using their thoughts.
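The key safety property described above — only clearly defined brain impulses are accepted as commands — can be sketched as a thresholded decoder. The feature names, commands, and threshold below are illustrative, not Team PhyPa's actual algorithm:

```python
# Minimal sketch of a thresholded BCI decoder: a command fires only
# when one decoded class is clearly dominant; ambiguous signals
# produce no command at all. All names and values are hypothetical.
COMMANDS = {"left": "roll_left", "right": "roll_right"}
THRESHOLD = 0.8  # minimum classifier confidence to accept a command

def decode(confidences):
    """Map per-class confidences to a flight command, or None."""
    best = max(confidences, key=confidences.get)
    if confidences[best] >= THRESHOLD:
        return COMMANDS[best]
    return None  # too ambiguous: safer to do nothing

print(decode({"left": 0.9, "right": 0.1}))    # roll_left
print(decode({"left": 0.55, "right": 0.45}))  # None
```

Rejecting ambiguous input is what keeps noisy EEG from producing spurious control movements.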
Virginia Tech Carilion Research Institute scientists, led by Professor William Tyler, have demonstrated that ultrasound directed to a specific region of the brain can boost performance in sensory discrimination. This is the first example of low-intensity, transcranial-focused ultrasound modulating human brain activity to enhance perception.
The scientists delivered focused ultrasound to an area of the cerebral cortex that corresponds to processing sensory information received from the hand. To stimulate the median nerve, they placed an electrode on the wrist and recorded brain responses using EEG. Before stimulating the nerve, they began delivering ultrasound to the targeted brain region. The ultrasound decreased the EEG signal and weakened the brain waves responsible for encoding tactile stimulation.
Subjects were then given two neurological tests: the two-point discrimination test, which measures one’s ability to distinguish whether two nearby objects touching the skin are truly two distinct points, rather than one; and the frequency discrimination task, which measures sensitivity to the frequency of a chain of air puffs. The subjects receiving ultrasound showed significant improvements in their ability to distinguish pins at closer distances and to discriminate small frequency differences between successive air puffs.
In a recent experiment, Stanford professors Chris Chafe and Josef Parvizi created audio EEG recordings of both normal brain activity and seizure states. During the state of seizure, tones became more pronounced and their tempo became chaotic. “We could instantly differentiate seizure activity from non-seizure states with just our ears,” Chafe said. “It was like turning a radio dial from a static-filled station to a clear one.”
Since some seizures can occur without immediate behavioral symptoms, Chafe and Parvizi have decided to use this research to develop a tool that lets caregivers use real-time brain data to hear and recognize undetected seizures.
The EEGs Parvizi conducts register brain activity from more than 100 electrodes. Chafe selects certain electrode/neuron pairings and allows them to modulate notes sung by a female singer. As the electrode captures increased activity, it changes the pitch and inflection of the singer’s voice.
Before the seizure begins (during the pre-ictal stage) the notes from each “singer” almost synchronize and fall into a clear rhythm. In the moments leading up to the seizure event, each of the singers begins to improvise. The notes become progressively louder and more scattered as the full seizure event occurs (the ictal state). One can hear the electrical storm originate on one side of the brain and eventually cross over into the other hemisphere. After about 30 seconds of chaos, the singers begin to calm, tapering off into their post-ictal rhythm. Occasionally, one or two will behave erratically, but on the whole, the choir sounds extremely fatigued.
According to Professor Parvizi, this is the perfect representation of the three phases of a seizure event.
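The core of the sonification Chafe describes is a mapping from an electrode's activity level to pitch: calmer activity stays low, and seizure-level activity drives the voice higher. A rough sketch, with an illustrative frequency range (not Chafe's actual mapping):

```python
# Rough sketch of EEG sonification: map a normalized activity level
# (0 = calm, 1 = seizure-level) to a pitch in Hz. The base frequency
# and range are illustrative choices, not the study's parameters.
def activity_to_pitch(activity, base_hz=220.0, span_hz=440.0):
    """Map normalized EEG activity in [0, 1] to a frequency in Hz."""
    activity = max(0.0, min(1.0, activity))  # clamp out-of-range input
    return base_hz + activity * span_hz

print(activity_to_pitch(0.0))  # 220.0 -> calm, low pitch
print(activity_to_pitch(1.0))  # 660.0 -> peak activity, high pitch
```

Running one such mapping per selected electrode, as described above, yields the "choir" effect: synchronized pitches before the seizure, scattered improvisation as it spreads.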
Danilo Mandic at Imperial College in London has developed an EEG device that can be worn inside the ear, like a hearing aid. It will enable scientists to record EEGs for several days at a time, allowing doctors to monitor patients who have recurring problems such as seizures or microsleep.
By nestling the EEG inside the ear, a lot of signal noise usually introduced by body movement is avoided. The engineers can also ensure that the electrodes are always placed in exactly the same spot to make repeated readings more reliable.
US regulators have approved a device that analyzes brain activity to help confirm a diagnosis of attention deficit hyperactivity disorder in children ages 6-17. It records the different kinds of electrical impulses given off by neurons in the brain and the frequency at which the impulses are given off each second.
The noninvasive, EEG-based Neuropsychiatric EEG-Based Assessment Aid test takes 20 minutes and is designed to measure theta and beta brain waves via sensors attached to the child’s head. The ratio of theta to beta waves tends to be higher in children and teens with ADHD.
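The measure behind the test is the theta/beta power ratio. A minimal sketch of computing it, assuming the per-band powers have already been derived from the EEG spectrum (theta is conventionally about 4-8 Hz, beta about 13-30 Hz); the numbers below are hypothetical:

```python
# Sketch of the theta/beta ratio, the kind of measure an EEG-based
# ADHD assessment aid reports. Inputs are per-band powers already
# extracted from the EEG spectrum; the values below are hypothetical.
def theta_beta_ratio(theta_power, beta_power):
    """Ratio of theta-band to beta-band EEG power.

    Elevated ratios have been associated with ADHD in children
    and teens.
    """
    return theta_power / beta_power

print(theta_beta_ratio(60.0, 12.0))  # 5.0
```

In practice the band powers would come from a power spectral density estimate of the recorded EEG, not be supplied by hand.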
University of Minnesota researchers led by Professor Bin He have been able to control a small helicopter using only their minds, pushing the potential of technology that could help paralyzed or motion-impaired people interact with the world around them.
An EEG cap with 64 electrodes was placed on the head of the person controlling the helicopter. The researchers mapped the controller’s brain activity while they performed certain tasks (for example, making a fist or looking up). They then mapped those patterns to the helicopter’s controls. If the researchers mapped “go up” to a clenched fist, the copter went up. After that, the copter went up automatically whenever the controller clenched a fist.
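Once the decoding is done, the mapping step described above is a simple lookup from recognized mental state to flight command. A sketch, with illustrative state and command names (the study's actual labels are not specified here):

```python
# Sketch of the pattern-to-command mapping step: a decoded mental
# state (e.g. an imagined fist clench) triggers a flight command.
# State names and commands are illustrative, not the study's own.
MAPPING = {"clench_fist": "ascend", "look_up": "forward"}

def command_for(decoded_state):
    """Translate a decoded EEG pattern into a flight command."""
    # Unrecognized states default to holding position.
    return MAPPING.get(decoded_state, "hover")

print(command_for("clench_fist"))  # ascend
print(command_for("unknown"))      # hover
```

Defaulting to "hover" for unrecognized states mirrors the same safety idea as the flight BCI above: uncertain input should not move the vehicle.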