Just a short video clip can now reveal your breathing rate, heart rate, and more

Smartphone camera health monitoring

Contactless health monitoring is all the rage these days. From Google’s Sleep Sensing feature in the new Nest Hub to smart speakers that can read your heart metrics, these innovations are here to complement remote monitoring and telehealth opportunities.

With just an 18-second video clip of a person’s head and shoulders, a new algorithm can determine heart rate, or pulse, based on the changes in light intensity reflected off the skin. Breathing rate, or respiration, is measured from the rhythmic motion of their head, shoulders, and chest. 
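
If you’re curious what that looks like in practice, here’s a minimal Python sketch of the breathing-rate half of the idea: track the gentle rise and fall of the shoulders from frame to frame and find the dominant frequency of that motion. This is only an illustration built on OpenCV and SciPy, not the researchers’ actual pipeline, and the clip name and the shoulder region used here are assumptions for the example.

```python
# Hedged sketch: estimating breathing rate from shoulder/chest motion in a short clip.
# Illustrative only -- the file name and region of interest are placeholders.
import cv2
import numpy as np
from scipy.signal import butter, filtfilt

cap = cv2.VideoCapture("head_and_shoulders.mp4")  # hypothetical 18-second clip
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0           # fall back if the file doesn't report a rate

ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Pick trackable points in the lower half of the frame (roughly the shoulders and chest).
h, w = prev_gray.shape
mask = np.zeros_like(prev_gray)
mask[h // 2:, :] = 255
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50, qualityLevel=0.01,
                              minDistance=10, mask=mask)

vertical_motion = []  # mean vertical position of the tracked points, one value per frame
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = new_pts[status.flatten() == 1].reshape(-1, 2)
    vertical_motion.append(good[:, 1].mean())  # the y-coordinate rises and falls with each breath
    prev_gray, pts = gray, good.reshape(-1, 1, 2)
cap.release()

# Keep only motion in a plausible respiration band (~6-30 breaths per minute).
signal = np.asarray(vertical_motion) - np.mean(vertical_motion)
b, a = butter(2, [0.1, 0.5], btype="bandpass", fs=fps)
filtered = filtfilt(b, a, signal)

# The dominant frequency of the filtered motion approximates the breathing rate.
freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
spectrum = np.abs(np.fft.rfft(filtered))
print(f"Estimated breathing rate: {60.0 * freqs[np.argmax(spectrum)]:.1f} breaths/min")
```

A real system has to cope with people shifting in their seats, changing lighting, and camera shake, which is where the machine learning described below comes in.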

Researchers from Microsoft and the University of Washington presented a novel machine learning system that can generate a personalized model to measure heart and breathing rates from a short video taken with a smartphone camera.

Daniel McDuff, a principal researcher at Microsoft Research, and Xin Liu, a Ph.D. student at the University of Washington, developed and presented the new system.

“Currently there’s no way to do remote vitals collection except for a very small minority of patients who have medical-grade devices at home,” such as a pulse oximeter to detect heart rate and blood oxygen level, or a blood pressure cuff, says McDuff.

“Variations in blood volume influence how light is reflected from the skin,” says McDuff. “So the camera is picking up micro-changes in light intensity, and that can be used to recover a pulse signal. From that, we can derive heart rate variability and detect things like arrhythmias.”
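
To make that concrete, here’s a minimal sketch of the general idea in Python, using OpenCV and SciPy: average the green channel over the face region in each frame, band-pass that trace to heart-rate frequencies, and read off the dominant frequency. It illustrates the principle McDuff describes rather than the Microsoft/UW system itself, and the clip name and the single fixed face region are assumptions for the example.

```python
# Hedged sketch: recovering a rough pulse signal from micro-changes in skin brightness.
# Illustrative only -- not the researchers' published method.
import cv2
import numpy as np
from scipy.signal import butter, filtfilt

cap = cv2.VideoCapture("head_and_shoulders.mp4")  # hypothetical short clip
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0

# Locate the face once on the first frame and keep that region as the skin ROI.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
ok, frame = cap.read()
faces = face_detector.detectMultiScale(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
x, y, w, h = faces[0]

green_means = []  # average green-channel brightness of the skin region, one value per frame
while ok:
    roi = frame[y:y + h, x:x + w]
    green_means.append(roi[:, :, 1].mean())  # OpenCV frames are BGR, so index 1 is green
    ok, frame = cap.read()
cap.release()

# Blood volume changes modulate reflected light very slightly, so band-pass the
# brightness trace to a plausible heart-rate band (~42-240 beats per minute).
signal = np.asarray(green_means) - np.mean(green_means)
b, a = butter(2, [0.7, 4.0], btype="bandpass", fs=fps)
pulse = filtfilt(b, a, signal)

# The dominant frequency of the recovered pulse signal gives the heart rate.
freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fps)
spectrum = np.abs(np.fft.rfft(pulse))
print(f"Estimated heart rate: {60.0 * freqs[np.argmax(spectrum)]:.1f} beats/min")
```

In practice this signal is tiny compared with the noise from motion, video compression, and lighting changes, which is exactly why the researchers rely on machine learning and personalized models rather than a fixed filter like this.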

Smartphone camera systems have been getting a facelift lately. Google recently announced that its Android-powered Pixel phones will be able to track respiratory rate using the phone’s cameras.

The new system isn’t ready for medical use and will need to be validated in clinical trials. To improve its robustness, one approach the team is taking is to train the models on computer-generated images and refine the underlying AI algorithms.

Computer vision-based AI algorithms are becoming sophisticated enough to enable new telehealth features at a reduced cost. We expect to see more innovations in this area in the coming months and quarters.

