
Medical Applications for Natural Language Processing

May 07, 2020 | Jonathan Maisel


Many healthcare providers are contending with a growing sense that the field is becoming bogged down in regulations stemming from the continued development of electronic health record (EHR) systems. The recent transition from the EHR Incentive Programs to the Medicare Access and CHIP Reauthorization Act (MACRA) has left many providers feeling swamped by volumes of data that are not easy to mine for useful information. One tool that is transforming the use of all this new unstructured data is natural language processing (NLP).

What Is Natural Language Processing?


NLP is a broad term describing the use of algorithms and artificial intelligence (AI) to identify elements in the language we use every day and derive meaning from the input. It brings together multiple elements of computer science, such as computational linguistics and various disciplines in machine learning.

Most of us are familiar with NLP as it relates to consumer goods on the Internet of Things (IoT). When you tell your smart speaker to play a certain song or bring up your smartphone assistant to set an appointment on your calendar, you are benefiting from NLP. In recent years, however, there has been a significant push to apply natural language processing to the healthcare industry.

What Role Does Natural Language Processing Play in Healthcare?

In healthcare, NLP serves mainly to help users access data by pinpointing and retrieving relevant details from immense repositories of patient data. Some of the common tasks an NLP system carries out include:

  • Summarizing large pieces of text like those found in academic journal articles or clinical notes through the identification of specific key concepts and phrases in the original material.
  • Mapping of data elements from unstructured text into structured fields, improving the clinical integrity of data in EHRs.
  • Taking data from machine-readable formats and translating it into natural human language for use in reports and education.
  • Answering queries by synthesizing from multiple sources of data.
  • Performing optical character recognition (OCR), turning images such as PDFs or scans of patient charts into text files that can be translated into machine-readable formats.
  • Providing speech recognition, allowing users to speak clinical notes and other information that is then translated into text.
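To make the unstructured-to-structured mapping above concrete, here is a toy Python sketch. The field names, patterns and sample note are invented for illustration; real clinical NLP systems rely on trained models and standardized terminologies rather than hand-written rules like these.

```python
import re

def extract_structured_fields(note: str) -> dict:
    """Pull a few structured fields out of a free-text clinical note.

    A toy illustration of mapping unstructured text into structured
    EHR fields; production systems use trained models, not regexes.
    """
    fields = {}
    # Blood pressure written as "BP: 128/84" or "BP 128/84"
    bp = re.search(r"\bBP[:\s]+(\d{2,3})/(\d{2,3})\b", note)
    if bp:
        fields["systolic_bp"] = int(bp.group(1))
        fields["diastolic_bp"] = int(bp.group(2))
    # Temperature written as "temp 99.1" or "temperature: 99.1"
    temp = re.search(r"\btemp(?:erature)?[:\s]+([\d.]+)\b", note, re.IGNORECASE)
    if temp:
        fields["temperature_f"] = float(temp.group(1))
    return fields

note = "Pt presents with cough. BP: 128/84, temp 99.1. No known allergies."
print(extract_structured_fields(note))
# {'systolic_bp': 128, 'diastolic_bp': 84, 'temperature_f': 99.1}
```

Even this crude version shows why structured output matters: once vitals live in named fields rather than prose, they can be queried, trended and validated automatically.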

One benefit of NLP is that many systems are built to “learn” as they operate over time. Much like the human brain, such systems can absorb and evaluate the results of each interaction to determine the accuracy of results and provide ever more useful answers to queries in the future.

How Natural Language Processing Is Being Used in the Medical Community


Medical natural language processing applications have become more common and more effective in the healthcare field. The following four applications represent real progress in the development and deployment of NLP:

1. Better Quality Care

In the switch from the EHR Incentive Programs to MACRA and its associated programs, the ability to measure provider performance and find the gaps in a patient’s care has become even more imperative for organizations hoping to receive value-based reimbursement from the Centers for Medicare & Medicaid Services (CMS).

Recent research has shown that NLP can simplify the benchmarking process used to evaluate physicians by automating the evaluation of free-text feedback. This reduces the time and human resources needed to perform such benchmarking tasks.

Quality Care Use Cases

One study investigating how NLP compares to human assessments shows that natural language processing is on par with evaluations made by humans much of the time. In the 2017 study, the algorithms tested agreed with human assessments of the same material up to 98 percent of the time in certain categories. The NLP systems were able to identify and evaluate language terms relating to the providers’ soft skills, opening the door to further such assessments in the near future.

The study had eight different machine learning algorithms comb through free text comments regarding 548 doctors across multiple specialties and practice areas in the United Kingdom. The algorithms were told to identify terms relating to these non-clinical categories:

  • Innovation
  • Interpersonal skills
  • Popularity
  • Professionalism
  • Respect among colleagues

The overall agreement of the NLP systems with human assessments ranged between 68 and 83 percent. The algorithms proved to be more accurate in certain categories like popularity, in which the agreement rate was 97 percent, and innovation, where the agreement rate was 98 percent.

Now that patient satisfaction is becoming more critical in care quality measurements, one of the most important natural language processing healthcare applications will be structuring data provided in patient surveys.

While patient satisfaction questionnaires have traditionally been composed of a multiple-choice component and an opportunity to enter free text, the free text part of the equation has not been meaningfully quantifiable. With the development of natural language processing healthcare systems, organizations can produce more quantifiable information from unstructured data while allocating a minimum of human resources to the task. For instance, an NLP system might eventually learn to draw a connection between patient use of the word “rude” in free text answers and lower interpersonal skills, allowing physicians to pinpoint areas of improvement for better care and maximized reimbursement from CMS.
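As a simple illustration of how free-text survey comments might be tied to quality categories, consider the keyword-based Python sketch below. The categories and keyword lists are hypothetical placeholders; a production system would learn these associations from labeled survey data rather than hard-coding them.

```python
# Hypothetical keyword lists; a deployed NLP system would learn these
# associations from labeled survey responses, not a hand-written table.
CATEGORY_KEYWORDS = {
    "interpersonal skills": {"rude", "dismissive", "condescending"},
    "professionalism": {"late", "unprepared", "disorganized"},
}

def flag_survey_comment(comment: str) -> dict:
    """Map a free-text survey comment to quality categories by keyword."""
    # Normalize: strip punctuation and lowercase each word
    words = {w.strip(".,!?").lower() for w in comment.split()}
    return {
        category: sorted(words & keywords)
        for category, keywords in CATEGORY_KEYWORDS.items()
        if words & keywords
    }

print(flag_survey_comment("The doctor was rude and arrived late to my visit."))
# {'interpersonal skills': ['rude'], 'professionalism': ['late']}
```

A real system would go well beyond keyword matching, but the output shape is the point: free text in, quantifiable category flags out.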

2. Better Provider-Patient Relationships

While EHRs have contributed greatly to the amount and quality of information available on patients, the experience of interacting with these systems is lacking for both patients and providers. A study from Stanford Medicine revealed that of an average 31 minutes spent with each patient, 19 are devoted to interacting with the EHR. This unfortunate statistic can lead to a breakdown in the provider-patient relationship if the patient becomes frustrated with the lack of face-to-face interaction time. Natural language processing can improve this relationship.

Improved Relationship Use Cases

The most direct way NLP can improve the doctor-patient relationship is through speech recognition. Speech recognition allows physicians to dictate their notes out loud during appointments, which offers a number of benefits.

  • Face time: When a physician is dictating notes, he or she can look at the patient directly rather than turning away to type and click within the EHR system. This can give doctors more time to ask and answer questions and notice more nuanced physiological symptoms.
  • Fact-checking: Patients listening to notes being dictated are able to follow what is being noted and provide corrections if necessary, leading to more accurate and useful information.
  • Intuitive flow: Dictating notes allows the physician to create a natural narrative that may capture more nuance and provide a clearer picture of the patient’s condition.

Patients who feel listened to are more likely to provide important details during appointments and ask questions when they need to. Both of these tendencies are essential in the improvement of clinical decision-making. Natural language processing through speech recognition can help physicians take back more time for patient interactions, helping to build trust and rapport that increases satisfaction as well as the quality of care.

3. Improved Health Literacy

One stubborn problem in the healthcare field is increasing patient engagement and literacy in terms of personal health information. A survey conducted in 2016 asked 500 people about their knowledge of and access to health information through their EHR portal.

About 60 percent of patients reported being able to access their electronic health information through a patient portal, but 55 percent only used their EHR access to stay updated on their clinical data. Only 22 percent of the respondents reported using their data to better engage in patient-provider discussions and make medical decisions.

Increasing patient literacy is a critical goal for medical natural language processing, and it demonstrates how machine learning can change the way people interact with their health information.

Improved Health Literacy Use Cases

The average patient does not have a deep understanding of medical terminology, which can end up reducing their quality of care. When someone needs their doctor to provide definitions of terms and explain the meanings of certain diagnoses or lab results, it takes valuable time out of the appointment slot. NLP can be used within patient portals to provide accessible information to patients in terms they understand.

In a 2017 study, NLP tools were used to match medical terms in patient charts with plain-language counterparts. Compared to baseline systems, the NLP algorithms performed better when given unlabeled data sets.

A separate study involved the development of a new NLP tool specifically to connect medical jargon to plain language definitions. Using feedback from physicians, researchers were able to make significant improvements to clarity and usability, which led to greater success in the algorithm recalling plain language definitions. When patients are able to understand the information available to them, they are better able to participate in their own care and the clinical decision-making process. NLP helps bridge the significant gap between complex medical terms and language the average person can fully understand.
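A minimal sketch of the jargon-to-plain-language idea might look like the following Python snippet. The glossary entries here are illustrative; real tools draw on curated consumer health vocabularies rather than a hand-written dictionary.

```python
import re

# Illustrative jargon glossary; real systems draw on curated consumer
# health vocabularies, not a small hard-coded mapping like this one.
JARGON_TO_PLAIN = {
    "hypertension": "high blood pressure",
    "myocardial infarction": "heart attack",
    "edema": "swelling",
}

def simplify(text: str) -> str:
    """Replace known medical terms with plain-language equivalents."""
    # Replace longer phrases first so multi-word terms are not split up.
    for term in sorted(JARGON_TO_PLAIN, key=len, reverse=True):
        text = re.sub(re.escape(term), JARGON_TO_PLAIN[term],
                      text, flags=re.IGNORECASE)
    return text

print(simplify("History of hypertension and lower-limb edema."))
# History of high blood pressure and lower-limb swelling.
```

Embedded in a patient portal, this kind of substitution (backed by a vetted vocabulary) lets patients read their own records without a medical dictionary at hand.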

4. Enhanced Care Coordination

The coordination of care for individual patients is a problem that has been addressed only partially by continued advances in EHR technology. Patients with complex conditions and histories often have care teams made up of multiple physicians, specialists and other providers, leading to health records with mountains of information to sort through. This data often includes nuanced information on non-clinical factors like food insecurity, mental illness or housing instability that may be important determinants of health. NLP tools can make coordination of care for these patients easier for care teams.

Enhanced Coordination Use Cases

In one example, a research team from Massachusetts General Hospital used NLP tools in their EHR with the intent of helping providers identify terms related to social determinants of health. In the end, the researchers pinpointed 22 terms that were specific enough to identify high-risk patients on the basis of psychological, behavioral and social factors.

This type of NLP tool can also address a widespread problem in the healthcare industry: the reactive model. Typically, providers and organizations look to what has happened historically to predict what is likely to happen in the future. With new NLP tools able to explore and mine patient data, the healthcare industry can shift to a more predictive and proactive model targeted specifically to high-risk patients.

NLP can also spot incongruities and potential mistakes that humans don’t have the time or eye for detail to identify. Drug interactions and adverse drug events (ADEs) are a risk that increases with the number of providers a patient has. Some NLP tools can sift through information from disparate providers collected in the EHR and look for interactions or evidence that a patient may experience an adverse reaction to a specific drug.
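In very simplified form, the interaction screening described above amounts to a pairwise lookup against an interaction table, as the Python sketch below shows. The drug names and interactions are illustrative placeholders; clinical systems query curated drug-interaction knowledge bases rather than a hard-coded set.

```python
from itertools import combinations

# Toy interaction table; clinical systems query curated drug-interaction
# knowledge bases, not a hard-coded dictionary like this one.
KNOWN_INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"lisinopril", "spironolactone"}): "hyperkalemia risk",
}

def check_interactions(medications: list[str]) -> list[tuple[str, str, str]]:
    """Flag every interacting pair in a patient's combined medication list."""
    alerts = []
    # Check every unordered pair of medications once
    for a, b in combinations(sorted(m.lower() for m in medications), 2):
        risk = KNOWN_INTERACTIONS.get(frozenset({a, b}))
        if risk:
            alerts.append((a, b, risk))
    return alerts

meds = ["Warfarin", "Metformin", "Aspirin"]  # pooled from multiple providers
print(check_interactions(meds))
# [('aspirin', 'warfarin', 'increased bleeding risk')]
```

The NLP contribution in practice is upstream of this lookup: extracting the medication list itself from free-text notes written by different providers, so that the pooled list is complete enough to screen.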

Medical Transcription and the Expansion of Natural Language Processing


By far the most common healthcare applications of natural language processing are those that use speech recognition. Speech recognition technology offers significant benefits, such as increased time with patients and reduced time spent in the EHR, but in its current state it has too many accuracy issues to be used on its own without the assistance of human transcriptionists.

One study published in JAMA Network Open tackles the significant accuracy problems that plague speech recognition technology when it is used by itself. The researchers looked at 217 sets of dictation at three stages: the original speech recognition transcriptions, versions edited by transcriptionists and final versions reviewed by clinicians. The observations of the unedited speech recognition transcripts included:

  • Over seven percent of words involved errors.
  • One in every 250 words involved a clinically significant error.
  • Over 96 percent of raw speech recognition transcriptions contained at least one error.
  • About 63 percent of raw speech recognition transcripts contained at least one clinically significant error.

These figures highlight the long way NLP has to go in speech recognition applications before it is safe enough to use on its own. The same study also revealed that a human hand in transcription dramatically reduces error rates. When a medical transcriptionist reviewed the documentation after the initial speech recognition draft, error rates plummeted to just 0.4 percent, an astounding turnaround from the raw draft error rates. When it comes to clinical decision making and care coordination, every percentage point of error reduced can contribute to lives saved.

Currently, the most efficient, cost-effective and accurate method of generating clinical documentation is through a hybridized system like the one used by ZyDoc. Our process involves the generation of an initial transcription draft using cutting-edge NLP speech recognition technology. That draft is then carefully reviewed by highly trained human transcriptionists who identify and weed out any errors before returning the documentation. If the provider uses our medical transcription services with EHR integration, the documentation is also inserted back into the EHR automatically, removing yet another burden from physicians.

Trust ZyDoc With the Most Advanced Transcription Technology


Natural language processing healthcare uses are expanding all the time, and speech recognition is one of them. However, there is still no substitute for the attention of an experienced medical transcriptionist when it comes to accurate clinical documentation. ZyDoc’s mission is to utilize the most modern speech recognition technology in combination with human expertise to provide the swiftest, most accurate clinical documentation at the fairest prices.

If you would like to know more about our medical transcription services, call 1-800-546-5633 and ask about our free 14-day trial. You can also view our plans and pricing to determine which service tier is right for your organization.
