In this article, we will discuss ambient intelligence in healthcare. When it comes to clinical documentation, several terms come up frequently: digital scribe, clinical documentation tool, and digital voice assistant, among others.

Now, what is ambient intelligence? It is what emerges when we put all of the tools above together and create an environment that continuously supplies intelligent, contextual data: knowledge that can be transformed into wisdom.

This article is part of the larger series — Digital Voice Assistant in Telehealth — which covers four major topics:

  1. True Impact of Telehealth
  2. Natural Language Processing in Healthcare and Overview
  3. Ambient Intelligence in Healthcare
  4. Cloud vs Edge Computing in Healthcare


Digital Scribe vs. Clinical Documentation Tool (CDT):

What do these terms really mean?

Digital Scribe — This refers to transcribing the entire doctor-patient conversation (for example, over the phone), either with automated speech-recognition software or with the help of a human transcriptionist.

Clinical Documentation Tool — The next phase was the clinical documentation tool, which lets you automate certain actions by interacting with parts of the note. For example, if you click and say "Start X note", it can trigger an automated workflow within that note. This goes beyond pure transcription, where text is pushed into the EMR, signed, and buried among PDFs; instead, the document itself has automation built in, creating better workflows.

Another example: with a command like "Insert last hemoglobin" or "Insert last BP", the data is not only extracted but also positioned appropriately within the note.
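As a rough sketch of the idea, such voice macros could be expanded by looking up the most recent value in the patient's record and splicing it into the note. Everything here (the `expand_macros` function, the sample record, the bracketed macro syntax) is hypothetical, not any vendor's API:

```python
# Minimal sketch of a clinical documentation tool's macro expansion.
# All names and the macro syntax are illustrative, not a real product's API.

# A toy patient record keyed by observation name, oldest value first.
PATIENT_RECORD = {
    "hemoglobin": [("2024-01-10", "13.1 g/dL"), ("2024-03-02", "12.4 g/dL")],
    "bp": [("2024-03-02", "128/82 mmHg")],
}

def last_value(record, observation):
    """Return the most recent (date, value) pair for an observation, or None."""
    history = record.get(observation, [])
    return history[-1] if history else None

def expand_macros(note_text, record):
    """Replace '[Insert last X]' macros with the latest value for X."""
    expanded = note_text
    for observation in record:
        macro = f"[Insert last {observation}]"
        result = last_value(record, observation)
        if result and macro in expanded:
            date, value = result
            expanded = expanded.replace(macro, f"{value} ({date})")
    return expanded

note = "Labs: [Insert last hemoglobin]. Vitals: [Insert last bp]."
print(expand_macros(note, PATIENT_RECORD))
# → Labs: 12.4 g/dL (2024-03-02). Vitals: 128/82 mmHg (2024-03-02).
```

The point is that the tool understands the note's structure, so the value lands in the right place rather than being pasted as raw transcription.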

Digital Voice Assistant:

What we truly want is a genuine digital voice assistant, which, generally speaking, does not exist at the moment. It would help with clinical documentation, place orders, and search for data by voice. Like any other voice assistant, saying "Hey, X" would invoke it, and it would guide you through the journey of clinical documentation.

And that digital voice assistant could be dialed into specific areas. For example, with "Hey, X, provide me a stroke summary", it would know what I need and gather data such as antiplatelets, anticoagulation agents, and when the patient last had a stroke. That is where we want to get to. The next step is clinical decision support: not just expert systems, but contextual data that can transform physicians' workflows and help them make better, faster decisions, because data is the key to everything in evidence-based medicine. Ambient intelligence is embedded in the surroundings, ready for us.
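Under the hood, a command like that amounts to intent routing: recognize what the clinician asked for, then run a routine that assembles the relevant chart data. The sketch below is purely illustrative (the intent table, chart fields, and function names are all assumptions, and real assistants use trained NLP models rather than substring matching):

```python
# Hypothetical sketch of voice-command routing in a clinical assistant.
# Intent names, chart fields, and handlers are illustrative only.

def stroke_summary(chart):
    """Gather the data points a clinician wants in a stroke summary."""
    return {
        "antiplatelets": chart.get("antiplatelets", []),
        "anticoagulants": chart.get("anticoagulants", []),
        "last_stroke": chart.get("last_stroke", "none on record"),
    }

# Map each recognized intent phrase to the routine that assembles the answer.
INTENT_HANDLERS = {
    "stroke summary": stroke_summary,
}

def handle_command(utterance, chart):
    """Very naive intent matching: look for a known intent phrase."""
    for intent, handler in INTENT_HANDLERS.items():
        if intent in utterance.lower():
            return handler(chart)
    return None  # unrecognized command

chart = {
    "antiplatelets": ["aspirin 81 mg"],
    "anticoagulants": ["apixaban 5 mg"],
    "last_stroke": "2023-11-04",
}
print(handle_command("Hey X, provide me a stroke summary", chart))
```

New intents (medication reconciliation, last imaging, and so on) would just be new entries in the handler table.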

Road Map of Tech Giants:

Recognizing these futuristic implications, Microsoft was among the first to pursue this with Project EmpowerMD (1). While the physician talks to the patient, the whole conversation is recorded and transcribed like a normal text conversation, correctly distinguishing who said what. Beyond that, it transforms the conversation into a proper SOAP note.

It can actually differentiate the sections: this part was the history of present illness, this was the review of systems, social history, physical exam, and so on. The next step would be prompts like, "Hey, in the last note, you said you needed to discuss the digestive side effects of the Parkinson's medication you started last time." The system can then remind you, "By the way, you haven't discussed that yet." That would be the second step, what we call clinical decision support.
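To make the section-sorting idea concrete, here is a deliberately crude sketch that routes transcript sentences into SOAP-style note sections. Real systems like EmpowerMD use trained language models; this keyword lookup, with made-up keyword lists, only illustrates the shape of the problem:

```python
# Toy sketch: route transcript sentences into note sections.
# Real ambient-documentation systems use trained NLP models;
# these keyword lists are invented for illustration.

SECTION_KEYWORDS = {
    "History of Present Illness": ["started", "since", "pain began"],
    "Review of Systems": ["denies", "no fever", "no chest pain"],
    "Social History": ["smokes", "alcohol", "lives with"],
    "Physical Exam": ["on exam", "auscultation", "palpation"],
}

def classify_sentence(sentence):
    """Assign a sentence to the first section with a matching keyword."""
    lowered = sentence.lower()
    for section, keywords in SECTION_KEYWORDS.items():
        if any(keyword in lowered for keyword in keywords):
            return section
    return "Unclassified"

print(classify_sentence("The headache started three days ago."))
print(classify_sentence("Patient denies chest pain or fever."))
```

The real challenge, and what makes the ML approach necessary, is that clinical conversation rarely announces which section it belongs to.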

But with ambient intelligence in place, you would only have to say "we wanted to discuss your neurology visit", and the last neurology consult note would automatically pop up so you can review it while talking with the patient. This ambient intelligence is where we want to get to, and it is quite possible in this day and age.

As a matter of fact, Microsoft bought Nuance for exactly this reason (2,3); Nuance is one of the companies that have been doing this work for years (4). Microsoft paid roughly $20 billion to acquire the company (5), which tells you how important this is to the future of healthcare. If companies are investing that much money, there must be a reason for it.

If you look at the roadmaps, it is not just Microsoft: EMR companies like Cerner (6) and Epic (7) have clearly committed to developing this as well. The reason is consumer demand, and not only from patients; clinicians want it too, because they are deeply burned out by the EMR.

Another big tech company working on this is Google, which partnered with Suki (8), one of the first clinical digital voice assistants. Amazon, meanwhile, has its own medical transcription service (9), which is well behind Microsoft and Google. But at the end of the day, all the major tech companies are investing in this space.

Heavy EHR — The Clinician’s Nightmare:

Heavy EMR use has led to clinician burnout and exhaustion. It makes clinicians spend more time in front of a computer screen than in front of the patient, and it has driven many away from clinical work out of sheer frustration with their computers. Articles like "Death by 1,000 Clicks" (10) summarize this well and explain why it was a botched operation from the beginning.

The whole idea behind electronic medical records was to share data rather than silo it, and then to produce intelligence that could improve outcomes. Unfortunately, that is not what happened. Now the Office of the National Coordinator for Health Information Technology (ONC) (11) is implementing policy to make data sharing the default. Regulation is catching up: patients over paperwork and data sharing by default will lower the overall cost of care and reduce the physician documentation burden. In fact, AI systems are already decreasing that burden, and Saykara is another example (12). These systems are essential for providing better, faster, and more accurate care.

If anyone saved a life, it would be as if he saved the life of all Mankind.

References:

  1. https://youtu.be/jnGlOCBK3kM
  2. https://news.microsoft.com/2019/10/17/nuance-and-microsoft-partner-to-transform-the-doctor-patient-experience/
  3. https://news.microsoft.com/2020/09/15/nuance-and-microsoft-announce-the-integration-of-dragon-ambient-experience-and-microsoft-teams-for-virtual-telehealth-consults/
  4. https://www.geekwire.com/2021/nuance-acquires-seattle-digital-health-startup-saykara-boost-healthcare-ai-products/
  5. https://news.microsoft.com/2021/04/12/microsoft-accelerates-industry-cloud-strategy-for-healthcare-with-the-acquisition-of-nuance/
  6. https://www.healthcareitnews.com/news/hey-cerner-company-seeks-health-systems-help-test-new-voice-assist-tech
  7. https://www.epic.com/epic/post/hey-epic-tell-voice-assistant-clinicians
  8. https://resources.suki.ai/in-the-news/suki-partners-with-google-cloud
  9. https://aws.amazon.com/transcribe/medical/
  10. https://ehrintelligence.com/news/heavy-ehr-workload-leads-to-higher-clinician-burnout-exhaustion
  11. https://www.healthit.gov/topic/about-onc
  12. https://www.healthcareittoday.com/2020/07/24/how-ai-assistants-are-decreasing-the-physician-documentation-burden/