EEG could lead to earlier autism diagnosis

Albert Einstein College of Medicine professor Sophie Molholm has published a paper describing the way that autistic children process sensory information, as determined by EEG.  She believes that this could lead to earlier diagnosis (before symptoms of social and developmental delays emerge), hence earlier treatment, which might reduce the condition’s symptoms.

EEG readings were taken from 40 children with autism, ages 6-17, and compared to those of unaffected children of similar ages. All were given a flash cue, a beep cue, or a combination of the two, and asked to press a button when these stimuli occurred. A 70-electrode cap measured brain responses every two milliseconds, including those reflecting the brain's earliest processing of the information.
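For readers curious how evoked responses are typically extracted from this kind of recording, the sketch below is a minimal illustration, not the paper's actual analysis pipeline. It epochs a continuous 500 Hz recording (one sample every two milliseconds) around stimulus onsets and averages the trials; the array shapes and variable names are assumptions.

```python
import numpy as np

FS = 500  # Hz: one sample every two milliseconds, as described above

def evoked_response(eeg, stim_samples, pre_ms=100, post_ms=400):
    """Average EEG epochs time-locked to stimulus onsets.

    eeg          : (n_channels, n_samples) continuous recording from the cap
    stim_samples : sample indices at which a flash, beep, or combined cue occurred
    Returns an (n_channels, epoch_length) averaged evoked response.
    """
    pre = int(pre_ms * FS / 1000)    # samples before the stimulus (baseline)
    post = int(post_ms * FS / 1000)  # samples after the stimulus
    epochs = []
    for s in stim_samples:
        if s - pre < 0 or s + post > eeg.shape[1]:
            continue                 # skip epochs that run off the recording
        epoch = eeg[:, s - pre:s + post].copy()
        epoch -= epoch[:, :pre].mean(axis=1, keepdims=True)  # baseline-correct
        epochs.append(epoch)
    return np.mean(epochs, axis=0)

# Example with simulated data: 70 channels, 60 seconds of recording, one cue per second.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((70, 60 * FS))
stims = np.arange(2 * FS, 58 * FS, FS)
erp = evoked_response(eeg, stims)
print(erp.shape)  # (70, 250): 100 ms before + 400 ms after each cue
```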

The children with autism showed a distinctly different brain wave signature from those without the condition. There were differences in the speed with which the sights or sounds were processed, and in how the sensory neurons recruited neurons in other areas of the brain to register and understand the information. The more atypical this multisensory processing was, the more severe the child's autistic symptoms.
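The reported relationship is essentially a correlation across children between an EEG-derived multisensory measure and a clinical severity score. A hypothetical illustration (the numbers below are invented, not the study's data):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-child values: a multisensory integration index derived from
# the EEG (how much the combined flash+beep response deviates from the sum of
# the unisensory responses) and a clinical symptom-severity score.
integration_deviation = np.array([0.8, 1.1, 0.4, 1.6, 0.9, 1.3, 0.5, 1.8])
symptom_severity      = np.array([12,  15,   8,  21,  14,  18,   9,  24])

r, p = pearsonr(integration_deviation, symptom_severity)
print(f"r = {r:.2f}, p = {p:.3f}")  # larger deviations tracking higher severity
```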

Professor Molholm acknowledges that the sample was too small to use the profile for diagnosing autism, but it could lead to such a test if the results are confirmed in larger, repeated studies.

Google Fit platform aggregates health data

Using a single set of APIs, Google Fit collects and aggregates data from fitness apps and sensors to manage a user’s fitness stream.

The platform will work with wearables and other peripherals. To protect privacy, users must grant permission before their data is collected, and they can delete it at any time. Initially, Adidas, Nike, Intel, LG and Motorola will participate. Nike will add its Fuel number to the Fit stream for other apps to utilize.
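Google has not published the final APIs, so the sketch below is only a hypothetical illustration of the aggregation model described above: many apps and sensors writing into one permissioned stream that the user can prune. None of the class or method names are Google's.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Reading:
    source: str       # e.g. a pedometer app, a wearable, or a partner metric
    metric: str       # "steps", "heart_rate", "fuel", ...
    value: float
    timestamp: float  # seconds since epoch

@dataclass
class FitnessStream:
    """Hypothetical single stream aggregating data from many sources."""
    permitted_sources: set = field(default_factory=set)
    readings: List[Reading] = field(default_factory=list)

    def grant_permission(self, source: str) -> None:
        self.permitted_sources.add(source)          # user opt-in per source

    def add(self, reading: Reading) -> None:
        if reading.source not in self.permitted_sources:
            raise PermissionError(f"{reading.source} was not granted access")
        self.readings.append(reading)

    def delete_source_data(self, source: str) -> None:
        # the user can remove everything a source has contributed
        self.readings = [r for r in self.readings if r.source != source]

    def total(self, metric: str) -> float:
        return sum(r.value for r in self.readings if r.metric == metric)

stream = FitnessStream()
stream.grant_permission("pedometer_app")
stream.add(Reading("pedometer_app", "steps", 4200, 1402000000.0))
print(stream.total("steps"))
```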

Apple announced a similar fitness data aggregation platform, HealthKit, earlier this month. (See ApplySci, June 5, 2014.) Both platforms are expected to go live this fall.

Smart glasses track fatigue

JINS MEME glasses track the correlation between eye strain and fatigue, and send mental and physical tiredness data to a user’s smartphone. 

The glasses monitor a user’s eye movements and gaze. They contain small metallic electrooculography sensors in the parts of the frame that touch the bridge of the nose and the ears, which measure the electrical potential generated by eye movement. Changes in this voltage are analyzed for signs of alertness or fatigue. The frame also contains a motion-tracking sensor that registers body movement, posture and balance, and can provide information about calories burned or exercise speed.
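As a rough illustration of how electrooculography voltage might be turned into a fatigue indicator (a hypothetical sketch, not JINS's algorithm), the code below counts blink-like voltage spikes, since changes in blink rate are a common proxy for drowsiness. The sampling rate and thresholds are assumptions.

```python
import numpy as np

def blink_rate(eog_voltage, fs=100, threshold=3.0, smooth_ms=100):
    """Estimate blinks per minute from a single EOG channel.

    eog_voltage : 1-D array of voltage samples from the nose-bridge/ear electrodes
    fs          : sampling rate in Hz (assumed; not published for the glasses)
    threshold   : z-score above which the smoothed signal counts as a blink
    """
    win = max(1, int(smooth_ms * fs / 1000))
    smoothed = np.convolve(eog_voltage, np.ones(win) / win, mode="same")
    z = (smoothed - smoothed.mean()) / smoothed.std()
    above = z > threshold
    # Count rising edges: samples where the signal crosses the threshold upward.
    blinks = np.count_nonzero(above[1:] & ~above[:-1])
    minutes = len(eog_voltage) / fs / 60
    return blinks / minutes

# Simulated 5-minute recording with roughly 17 blink-like spikes per minute.
rng = np.random.default_rng(1)
fs = 100
signal = rng.standard_normal(5 * 60 * fs)
for onset in rng.choice(len(signal) - 30, size=85, replace=False):
    signal[onset:onset + 20] += 6.0           # crude blink artifact
print(f"estimated blink rate: {blink_rate(signal, fs):.1f} per minute")
```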

Prosthetic arm moves after muscle contraction detected

DEKA is a robotic prosthetic arm that will allow amputees to perform complex movements and tasks. It has just received FDA approval.

Electrodes detect muscle contractions close to where the prosthesis attaches, and a computer translates them into movement. Six “grip patterns” allow wearers to drink a cup of water, hold a cordless drill, or pick up a credit card or a grape, among other functions.
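A hypothetical sketch of the control idea described above (not DEKA's actual control software): the rectified EMG amplitude from the electrodes is compared against preset activation levels to choose a grip pattern. The grip names and thresholds below are invented, and only a few of the six patterns are shown.

```python
import numpy as np

# Invented mapping from an EMG activation level (arbitrary units) to a grip.
GRIP_PATTERNS = [
    (0.15, "open hand"),
    (0.30, "pinch (credit card / grape)"),
    (0.50, "cylindrical (cup of water)"),
    (0.70, "power grip (cordless drill)"),
]

def select_grip(raw_emg):
    """Pick the grip whose (invented) activation threshold the signal clears."""
    activation = np.mean(np.abs(raw_emg))   # mean rectified EMG amplitude
    chosen = "rest"
    for threshold, grip in GRIP_PATTERNS:
        if activation >= threshold:
            chosen = grip                   # keep the highest threshold cleared
    return chosen

# One short window of simulated EMG samples.
window = np.random.default_rng(2).standard_normal(200) * 0.6
print(select_grip(window))
```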

DARPA’s Justin Sanchez believes that DEKA “provides almost natural control of upper extremities for people who have required amputations.”  He claims that “this arm system has the same size, weight, shape and grip strength as an adult’s arm would be able to produce.”