Johns Hopkins engineers have developed a powerful new computer-based process that helps identify the dangerous conditions that lead to concussion-related brain injuries.
Professor K.T. Ramesh led a team that used a technique called diffusion tensor imaging, together with a computer model of the head, to identify injured axons, which are tiny but important fibers that carry information from one brain cell to another. These axons are concentrated in a kind of brain tissue known as “white matter,” and they appear to be injured during the so-called mild traumatic brain injury associated with concussions. Ramesh’s team has shown that the axons are injured most easily by strong rotations of the head, and the researchers’ process can calculate which parts of the brain are most likely to be injured during a specific event.
While in flight, a doctor recently used his iPhone with AliveCor, a mounted sensor capable of delivering clinically accurate electrocardiograms, to measure the vital signs of a passenger experiencing severe chest pains at 30,000 feet.
The results indicated that the passenger was having a heart attack. The doctor recommended an urgent landing, and the passenger survived after being rushed to the hospital.
Kurzweil hopes to leverage Google’s massive pool of resources and data to develop technology that would create truly intelligent computers that can understand human language on a deep level.
PUMA measures six components to evaluate metabolic function: oxygen and carbon dioxide partial pressure, volume flow rate, heart rate, and gas pressure and temperature. From those measurements, PUMA can compute the oxygen uptake, carbon dioxide output and minute ventilation (average expired gas flow rate). A small, embedded computer takes readings of each sensor and relays the data wirelessly to a remote computer via Bluetooth.
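The derived metrics described above can be illustrated with a short sketch. This is not PUMA's actual firmware; the function names, sample format, and the simplified formulas (gas fractions from partial pressures, and uptake/output from the inspired-expired fraction difference times minute ventilation) are illustrative assumptions.

```python
# Illustrative sketch only: how a PUMA-like system might derive metabolic
# metrics from its six raw measurements. Formulas are simplified assumptions.

AMBIENT_O2_FRACTION = 0.2093   # approximate O2 fraction in inspired dry air
AMBIENT_CO2_FRACTION = 0.0004  # approximate CO2 fraction in inspired dry air

def gas_fraction(partial_pressure_kpa, total_pressure_kpa):
    """Convert a measured partial pressure to a gas fraction."""
    return partial_pressure_kpa / total_pressure_kpa

def derived_metrics(samples):
    """samples: per-breath readings, each a dict with expired O2 and CO2
    partial pressures (kPa), total gas pressure (kPa), and expired
    volume flow rate (L/min)."""
    n = len(samples)
    # Minute ventilation: average expired gas flow rate (L/min)
    ve = sum(s["flow_lpm"] for s in samples) / n
    # Mean expired O2 and CO2 fractions
    feo2 = sum(gas_fraction(s["po2_kpa"], s["ptotal_kpa"]) for s in samples) / n
    feco2 = sum(gas_fraction(s["pco2_kpa"], s["ptotal_kpa"]) for s in samples) / n
    # Uptake/output: fraction difference times ventilation (L/min)
    vo2 = ve * (AMBIENT_O2_FRACTION - feo2)
    vco2 = ve * (feco2 - AMBIENT_CO2_FRACTION)
    return {"VE": ve, "VO2": vo2, "VCO2": vco2}

metrics = derived_metrics([
    {"po2_kpa": 16.5, "pco2_kpa": 4.0, "ptotal_kpa": 101.3, "flow_lpm": 8.0},
    {"po2_kpa": 16.2, "pco2_kpa": 4.2, "ptotal_kpa": 101.3, "flow_lpm": 8.4},
])
```

In a real device these computations would run on the embedded computer before the results are relayed over Bluetooth to the remote machine.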
The article explores the possibility of confusing correlation with causation in fMRI analysis.
Devices that collect personal medical information are growing both prolific and inexpensive. The biggest challenges lie not in collecting and transmitting the data, but in building the backend systems that can interpret it.
OSU researchers attempt to reduce the cost of wireless EEG and ECG monitoring to less than a dollar. Applications include self-tracking and enabling doctors to monitor at-risk patients in real time. Multiple chips around the body can continuously track specific metrics.
Researchers from the University of Pittsburgh Medical Center tested four apps to analyze images of 188 moles, including 60 melanomas. All of these moles were pre-evaluated by a dermatologist.
The best-performing app forwarded the images to board-certified dermatologists for review at a cost of $5 per mole, and claims to be accurate 98% of the time. Some are skeptical. We are sure that we will soon see a proliferation of early, at-home detection apps.
In addition to the remote monitoring of chronic conditions, sensors, computerized pattern recognition and links to human responders can detect and head off health threats to the elderly living alone.
The “sensorization” of CES was obvious. Which technologies are meaningful, and which are simply stylish? The health monitoring sector is set to grow exponentially in 2013. It’s important to understand the science behind the gadgets. ApplySci, the crowdfunding platform, is committed to bringing you peer-reviewed, life-enhancing, sensor-based mobile health monitoring technology. And the ApplySci blog will regularly review this technology. We welcome your input.