MIT researchers have developed an algorithm that gauges heart rate by measuring tiny head movements in video data. In tests, the algorithm's heart-rate measurements consistently came within a few beats per minute of readings from electrocardiograms. The algorithm can also estimate the time intervals between beats, a measure that can be used to identify patients at risk for cardiac events.
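The beat-to-beat intervals mentioned above could, in principle, be read off a recovered pulse signal by locating its peaks. Here is a minimal sketch of that idea; the 1.2 Hz synthetic waveform, the 30-frame-per-second sampling rate, and the peak detector are illustrative assumptions, not the researchers' actual method.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 30.0                              # assumed video frame rate (frames/s)
t = np.arange(0, 10, 1 / fs)
pulse = np.sin(2 * np.pi * 1.2 * t)    # synthetic 1.2 Hz (72 bpm) pulse signal

# Detect beats as local maxima at least 0.2 s apart (i.e., below 300 bpm).
peaks, _ = find_peaks(pulse, distance=int(fs / 5.0))
intervals = np.diff(peaks) / fs        # inter-beat intervals in seconds

print(f"mean interval: {intervals.mean():.3f} s "
      f"({60 / intervals.mean():.0f} bpm)")
```

In a real recording the recovered signal would be noisier, so the intervals between successive peaks, rather than just their average, carry the clinically interesting variability.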
The algorithm first uses face recognition to distinguish the person's head from the rest of the image. It then randomly selects 500 to 1,000 distinct points, clustered around the person's mouth and nose, and tracks their movement from frame to frame. Movements whose temporal frequency falls outside the range of a normal heartbeat (about 0.5 to 5 hertz, or 30 to 300 cycles per minute) are filtered out. This eliminates movements that repeat at a lower frequency, such as those caused by regular breathing and gradual changes in posture. Principal component analysis then decomposes the resulting signal into several constituent signals, which represent components of the remaining, uncorrelated movements. Of those signals, the algorithm selects the one that appears most regular and falls within the typical frequency band of the human pulse.