Google looking to build “next gen clinical visit experience”

It appears that Google is doubling down on its plans to improve clinical visits for folks around the country. CNBC reports that the company has posted several job listings under its “Google Brain” initiative.

Google Medical Digital Assist

These job listings are for a new research project called “Medical Digital Assist”. As the name may suggest, Google is looking to use artificial intelligence (AI) to improve doctor visits.

But instead of focusing on the patient, this new project focuses on the doctors. Google wants to use voice recognition to take notes during a visit.


This will likely build on Google’s already impressive speech recognition software, the same technology that has helped Google Assistant become more widely used and more capable.

Additionally, these job listings state that Google is looking to build the “next gen clinical visit experience”. The company plans to use both audio and touch technologies for these visits.

Speeding up the process

Google is looking to bring new tests to at least one health-care partner by the end of the year. The company is clearly not wasting any time in its bid to revolutionize the experience.

The reason for using voice recognition instead of typed input is pretty simple: dictating notes lets doctors and clinicians stay focused on the patient rather than risk keying wrong data into the computer.

Accuracy is another big issue: a simple mistake, such as a note recording “hyper” instead of “hypo”, can be potentially life-threatening, especially if the doctor doesn’t thoroughly check the note.

The technology would “listen in” to the visit and “parse out the relevant information”. Not only would this improve the visit as a whole, but it would also give doctors more hands-on time with patients, which translates to quicker visits and more appointments handled in a day.

Google and Stanford Medicine

Google is currently partnered with Stanford Medicine for a “digital scribe” study. This study uses both speech recognition and machine learning tools. These tools help doctors fill out electronic health records automatically.

The current collaboration concludes in August. However, talks are confirmed to move on to a second phase of the study, which could last for at least another year.

It will definitely be interesting to see what happens with this Medical Digital Assist initiative. Let us know what you think about this in the comments below, and whether you think it will be helpful for doctors and patients alike.

Sudz Niel Kar
