By Miranda L. Martinez-Moad, DO
Dr. Miranda L. Martinez-Moad is a PGY-2 at Kaiser Permanente Fontana Medical Center.
As a family medicine resident in my second year of training, I have some concerns about my future and about maintaining a healthy work-life balance. A common sentiment among family physicians is that we chose this path for the continuity and variety in medical care. But we know that administrative bloat is a major cause of burnout, pushing charting outside office hours and ultimately leaving less face-to-face time with patients. Residents around me are adopting AI with eyes wide open and discovering real time savings.
AI is being woven into almost every industry, and healthcare is no exception. The integration of this technology is both exciting and daunting. In family medicine, I know AI can be instrumental in handling administrative tasks, optimizing workflows, and streamlining note-taking during patient encounters. But are we ready for it? And is it ready for us?
"AI scribing has been an excellent adjunctive tool for note-writing that has saved me time and energy during my busy clinic days. It has allowed me to focus more of my attention on clinical decision-making rather than administrative tasks, lessening my feelings of burnout and fatigue. Although I do not use it for every patient, its value in acute visits, follow-up visits, and telephone/video visits have been greatly appreciated during each of my clinics." - Darryn Wong, DO (PGY-3 Family Medicine Chief Resident at Kaiser Permanente, Fontana)
Right now at my institution, Abridge (a scribing platform integrated with Epic) is offered to physicians and third-year residents. The response has been largely positive among the residents who have elected to use the software. Northern California Kaiser published "Ambient Artificial Intelligence Scribes to Alleviate the Burden of Clinical Documentation" and has gone all in on generative AI.
At this point you may be thinking to yourself, “Is AI safe? Is it ethical?” There’s no shortage of ethical concerns: bias, fairness, transparency, accountability, liability, patient privacy, and data security, to name a few. We know that we will always need to review AI-generated notes and have a system in place for quality control. We also know that models should be developed using diverse datasets to reduce bias, and that their decision-making should be explainable. Lives depend on this. If an error occurs, we need a clear system of accountability. Developers and system leaders know this too, and they are on the case.
AI isn’t just coming; it’s here. Despite the uncertainty, I welcome the transformation AI brings to the way we document patient encounters. The best way to navigate AI’s expanding landscape is to stay abreast of this ever-changing environment.
Resources