
Contributed Content: Will AI Change the Outlook for Primary Care?

Primary care providers are struggling. Can augmented intelligence tools give them the support they need to enjoy healthcare again?

Editor’s Note: Michael S. Barr, MD, MBA, MACP, is a board-certified internist and the executive vice president for the Quality Measurement & Research Group at the National Committee for Quality Assurance (NCQA).

Primary care is in trouble

According to a report from the Milbank Memorial Fund, The Physicians Foundation, and the Robert Graham Center about the crisis facing primary care:

  • The number of primary care physicians per capita has declined from 68.4 primary care physicians (PCPs) per 100,000 people in 2012 to 67.2 PCPs per 100,000 in 2021.
  • Only 15% of all physicians who enter residency training practice primary care three to five years after residency.
  • Nearly half of family physicians rate the usability of electronic health records (EHRs) as poor or fair, and more than one-third are unsatisfied with their EHRs.

KFF Health News journalist Elisabeth Rosenthal put it best when she wrote:

“American physicians have been abandoning traditional primary care practice — internal and family medicine — in large numbers. Those who remain are working fewer hours. And fewer medical students are choosing a field that once attracted some of the best and brightest because of its diagnostic challenges and the emotional gratification of deep relationships with patients.”

Much of the frustration experienced by physicians relates to burnout. The American Medical Association (AMA) defines burnout as “a long-term stress reaction that can include emotional exhaustion, depersonalization (i.e., lack of empathy for or negative attitudes toward patients), [or the] feeling of decreased personal achievement.”

According to the Agency for Healthcare Research and Quality (AHRQ), “Burnout can also threaten patient safety and care quality when depersonalization leads to poor interactions with patients and when burned-out physicians suffer from impaired attention, memory, and executive function.” An AMA survey in 2022 identified that primary care physicians (Internal Medicine, Family Medicine, Pediatrics) were among the top six physician specialties with the highest burnout (52% to 58%).

What causes burnout?

EHRs, administrative burdens, and organizational factors are the leading contributors to frustration and physician burnout. A Harvard Health Blog post, written to educate patients, includes the following apt description:

“The causes of physician burnout are complex, but have to do in part with increasing workload, constant time pressures, chaotic work environments, declining pay, endless and unproductive bureaucratic tasks required by health insurance companies that don’t improve patient care, and increasingly feeling like cogs in large, anonymous systems. Parasitic malpractice lawyers are always circling, which causes us to waste an enormous amount of time with defensive documentation. The transition from paper charts to electronic medical records, which seemingly were designed to maximize revenues instead of clinical care, has created a technological barrier between doctor and patient, and between doctors.”


The American College of Physicians (ACP) and other medical professional societies are focused on addressing this issue through policy and advocacy. A 2017 position paper from ACP titled “Putting Patients First by Reducing Administrative Tasks in Healthcare” takes an analytical approach to categorizing administrative tasks to identify and mitigate their adverse effects on clinicians, patients, and the healthcare system, pointing out that:

“Tasks that become burdensome may differ from payer to payer; appear one month without notice, then reappear modified or changed the next; and often result from not using documentation that already exists in the medical record.”

Can AI help?

Given AI’s ubiquity, most people will tell you that it stands for artificial intelligence. Most of us have given a lot of thought to how AI is poised to affect our lives and livelihoods in the years to come. Many fear that AI-powered software might eventually make their jobs redundant.

The AMA takes a more optimistic approach. They’ve decided to use AI as an acronym for augmented intelligence, “as a conceptualization of artificial intelligence that focuses on AI’s assistive role, emphasizing that its design enhances human intelligence rather than replaces it.”

That framing is useful: augmented intelligence is emerging as a potentially valuable “partner” for clinical teams, helping address common challenges in primary care, many of which contribute to burnout and frustration in practice.

An AMA survey report (2023) found enthusiasm for AI in healthcare, with 65% of physicians surveyed seeing an advantage to AI. The report found particular enthusiasm for AI tools that can help reduce administrative burdens such as documentation and prior authorization, and to support diagnosis and workflow. At the same time, 41% of physicians reported equal excitement and concern, with their ambivalence stemming mostly from the potential impact on patient-physician relationships and patient privacy.

Proponents of AI in healthcare hope and expect that, when appropriately trained, maintained, and implemented in the clinical workflow, it will produce significant benefits by:

  • Reducing the documentation burden (e.g., generating progress notes via ambient AI, drafting replies to patient messages, completing prior authorization requests, producing referral notes and discharge summaries).
  • Identifying at-risk/high-risk populations for early, proactive interventions and support.
  • Producing actionable patient summaries and reports.
  • Supporting improvements in risk adjustment and appropriate coding.
  • Generating guideline-concordant clinical recommendations (e.g., smart alerts, clinical decision support).
  • Providing diagnosis support to reduce missing, delayed, or incorrect diagnoses.
  • Suggesting treatment plans based on clinical conditions accounting for patient needs, preferences, and other factors.
  • Handling repetitive and predictable administrative tasks (e.g., eligibility checks, appointment reminders, standard reports).
  • Providing translation services.

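As an illustration of the population-health use case in the list above, a minimal rule-based risk flag might look like the following sketch. The field names and point thresholds here are purely hypothetical, chosen for illustration; they are not drawn from any cited system or guideline.

```python
# Hypothetical sketch: flag patients for proactive outreach using simple
# chart-derived criteria. Fields and thresholds are illustrative only.

def flag_at_risk(patient):
    """Return True if a patient record meets the simple outreach criteria."""
    risk_points = 0
    if patient.get("a1c", 0) >= 9.0:                # poorly controlled diabetes
        risk_points += 2
    if patient.get("ed_visits_last_year", 0) >= 3:  # frequent emergency use
        risk_points += 2
    if patient.get("missed_appointments", 0) >= 2:  # engagement gap
        risk_points += 1
    return risk_points >= 3

patients = [
    {"id": "p1", "a1c": 9.4, "ed_visits_last_year": 4, "missed_appointments": 0},
    {"id": "p2", "a1c": 6.1, "ed_visits_last_year": 0, "missed_appointments": 1},
]
outreach = [p["id"] for p in patients if flag_at_risk(p)]
print(outreach)  # only p1 meets the threshold
```

In practice such flags would come from trained models rather than hand-set rules, but the output is the same kind of worklist a care team can act on.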
The effectiveness and success of AI in healthcare will depend on the appropriate and ethical application of the technology. This includes transparency about its limits, biases, and potential to cause unintentional harm. Importantly, clinical recommendations and summaries should link to the source documentation to allow clinical teams the opportunity to review and confirm the accuracy and validity of the information.
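The source-linking principle above can be sketched as a simple data shape: each AI-generated recommendation carries references back to the chart entries it was derived from, and a recommendation without sources is treated as unverifiable. The class and field names below are illustrative assumptions, not taken from any specific product.

```python
from dataclasses import dataclass, field

@dataclass
class SourceRef:
    """Pointer back to the chart entry an inference was derived from."""
    document_id: str
    excerpt: str

@dataclass
class Recommendation:
    """An AI-generated suggestion paired with its supporting evidence."""
    text: str
    sources: list = field(default_factory=list)

    def is_verifiable(self):
        # A recommendation with no linked sources should not be surfaced
        # to the clinical team, since it cannot be reviewed or confirmed.
        return len(self.sources) > 0

rec = Recommendation(
    text="Consider repeat HbA1c; last recorded value was elevated.",
    sources=[SourceRef(document_id="lab-2023-104", excerpt="HbA1c 9.4%")],
)
print(rec.is_verifiable())  # True: a clinician can trace the claim
```

The design choice is that verifiability is structural: the link to source documentation travels with the recommendation rather than being an optional afterthought.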

Other considerations include user acceptance of the technology (i.e., the usability of the AI interfaces and reports), the cost to implement and maintain, and potential liability from inaccurate guidance that could lead to patient harm. Data privacy and security are also critical: clinical teams and health systems must be confident that appropriate protections are in place and consistent with HIPAA and other regulations.

Future of Health: The Emerging Landscape of Augmented Intelligence in Health Care, a research paper produced by AMA in collaboration with Manatt Health, provides a good framework for understanding the issues, identifying use cases, and planning for AI implementation in practice. Many of the use cases highlighted are non-clinical – that is, they address the administrative hassles and tasks at the root of clinician burnout.

Will AI make a difference in primary care?

There are good indications that AI, appropriately designed and implemented in the workflow of busy clinicians, can reduce the stress associated with administrative tasks, documentation, and clinical care.

However, integrating AI into healthcare must be done carefully, ethically, and with an understanding of its promise and limitations. Organizations such as the new Coalition for Health AI (CHAI) are focused on developing guidelines “to drive high-quality healthcare through the adoption of credible, fair, and transparent health AI systems.” The CHAI Blueprint for Trustworthy AI Implementation Guidance and Assurance for Healthcare provides an excellent academic approach to addressing these imperatives.

But primary care needs help now. I am optimistic that AI systems can address many tedious administrative tasks that cause significant frustration in practice. With appropriate transparency, usability testing, and sufficient clinician training, AI systems could also be used to support clinical documentation, quality gap closure, population health initiatives, and risk adjustment.

But clinicians and their team members will always need to be able to view the clinical evidence used by AI systems to generate inferences and recommendations. Those systems are there to augment physicians, not to replace them. Decision-making and clinical interventions will always remain the responsibility of clinicians.
