In a breakthrough study published this week in Nature Medicine, researchers report a promising discovery: photoplethysmography (PPG) can be used for non-invasive diabetes detection.
Researchers at the Tison Lab at UC San Francisco have come up with a promising method of detecting diabetes using a smartphone camera and deep learning. The research team leveraged Azumio's Instant Heart Rate app to capture heart rate data for this study.
Related reading:
- Companion Medical researching use of Augmented Reality and AI for Diabetes Management
- Apple Researching Bio-authentication processes for the Apple Watch
- 5 Common Dexcom Errors and How to Fix Them
- Withings Smart Scale, Smart Watch, Contactless Thermometer and more for 20% OFF
- Google AI and Diabetic Retinopathy, Hope lies Ahead
- Apple’s AI-based odor sensor technology could transform future Apple Watch Health and Safety offerings
Measuring your fingertip’s heart rate with photoplethysmography (PPG) to identify biomarkers
When a user places a fingertip over the phone’s flashlight and camera, the app measures the PPG signal by capturing the subtle color changes in the fingertip that correspond to each heartbeat. This data is reported back to the user as an instantaneous heart rate.
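To make this concrete, here is a minimal sketch, in Python, of how an app could turn per-frame fingertip brightness values into a heart rate estimate. It is an illustration only, not the Azumio app’s actual code: the `estimate_heart_rate` function, the 30 fps frame rate, and the synthetic signal are all assumptions made for the example.

```python
# Minimal illustrative sketch: estimate heart rate from a fingertip PPG signal.
# Assumes we already have the average brightness of each camera frame; a real
# app would compute these values from the live camera feed while the flash is on.
import numpy as np
from scipy.signal import find_peaks

def estimate_heart_rate(brightness, fps=30.0):
    """Estimate beats per minute from per-frame fingertip brightness values."""
    signal = np.asarray(brightness, dtype=float)
    signal = signal - signal.mean()                 # remove the DC offset
    # Each heartbeat appears as a small rise and fall in brightness; find the
    # peaks, requiring at least ~0.5 s between beats (i.e. at most ~120 bpm).
    peaks, _ = find_peaks(signal, distance=int(0.5 * fps))
    if len(peaks) < 2:
        return None                                 # not enough beats detected
    avg_interval_s = np.mean(np.diff(peaks)) / fps  # mean beat-to-beat interval
    return 60.0 / avg_interval_s                    # convert to beats per minute

# Synthetic example: a 72 bpm (1.2 Hz) pulse sampled at 30 frames per second.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 30.0)
fake_brightness = 0.5 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.05, t.size)
print(f"Estimated heart rate: {estimate_heart_rate(fake_brightness):.0f} bpm")
```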
Using PPG to measure heart rate is not new. Many apps offer this functionality without requiring the user to wear a smartwatch; in fact, Samsung shipped a built-in heart rate monitoring feature based on the same optical principle some years back.
What is more exciting is the use of newer AI techniques to interpret the PPG data collected from the phone’s camera and make inferences about possible biomarkers of diabetes.
The team developed a 39-layer convolutional deep neural network (DNN) to detect prevalent diabetes from smartphone PPG signals
The researchers developed and validated a deep-learning algorithm using nearly 3 million PPG recordings from 53,870 patients.
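To give a feel for what such a model looks like, here is a minimal sketch of a 1-D convolutional classifier that maps a fixed-length PPG waveform to a probability of diabetes, written in Python with PyTorch. This is not the team’s published 39-layer network: the layer sizes, the assumed input length of 2,100 samples, and the `TinyPPGNet` name are all illustrative assumptions.

```python
# Minimal illustrative sketch of a 1-D CNN over PPG waveforms (NOT the
# published 39-layer architecture). Layer sizes and input length are assumed.
import torch
import torch.nn as nn

class TinyPPGNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, stride=2, padding=3),  # coarse temporal filters
            nn.BatchNorm1d(16),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time dimension to one value per channel
        )
        self.classifier = nn.Linear(32, 1)

    def forward(self, x):              # x shape: (batch, 1, n_samples)
        h = self.features(x).squeeze(-1)
        return torch.sigmoid(self.classifier(h))   # probability of prevalent diabetes

# Example: score a batch of four random stand-in "PPG recordings"
# (2,100 samples each, e.g. ~70 s at an assumed 30 Hz sampling rate).
model = TinyPPGNet()
ppg_batch = torch.randn(4, 1, 2100)
print(model(ppg_batch))
```

In practice a network like this would be trained with a binary cross-entropy loss against diabetes labels; the study’s actual model was developed and validated on millions of recordings, far beyond this toy example.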
Impressively, the algorithm correctly identified the presence of diabetes in 82% of patients with diabetes, as well as the absence of diabetes in 97% of patients without diabetes.
The algorithm’s performance improved even further when its output was combined with patient data such as age, gender, and BMI, as well as information about co-occurring conditions (comorbidities).
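For readers who want to see exactly what those percentages mean, the 82% figure is the model’s sensitivity and the 97% figure is its specificity; the short Python snippet below shows how both are computed from a confusion matrix. The counts used here are made up purely to reproduce those rates and are not the study’s data.

```python
# Illustrative only: how sensitivity ("82% of patients with diabetes") and
# specificity ("97% of patients without diabetes") are computed from counts.
# These counts are invented for the example; they are not the study's data.
def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # share of diabetic patients correctly flagged
    specificity = tn / (tn + fp)   # share of non-diabetic patients correctly cleared
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=82, fn=18, tn=97, fp=3)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")   # 82%, 97%
```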
Diabetes affects more than 451 million people worldwide and nearly half are undiagnosed.
Its symptoms can go unnoticed for years before a diagnosis is made, yet diabetes affects nearly every organ system, causing substantial suffering and life-threatening complications including heart attacks, stroke, and kidney failure.
This digital biomarker of diabetes could serve as a readily attainable complement to other established tools, providing novel information about vascular and autonomic sequelae of diabetes for clinical applications ranging from screening to therapeutic monitoring.
This is a huge breakthrough! Feel free to also check out and join the Health eHeart Study, which is being led by the team at UCSF.
Source: Tison Lab @UCSF
This would mean no more finger pricks or insertion of sensors on your body to monitor your diabetes?
No.
This will not replace my sensor readings upon which I will base my insulin dosages. Not even CLOSE.
Why make such a claim that has no basis in truth and that makes life harder for us because people may quote your commentary on this?
Placing my finger on the camera is not a viable alternative to having readings automatically sent to my phone every 5 minutes.