The Promises and Challenges of Generative AI

Jeri Koester, Heather Nelson, CHCIO, Lisa S. Stump, FASHP

The genie is out of the bottle: artificial intelligence. Generative AI. ChatGPT. It’s all the buzz now. In healthcare IT, CIOs are being asked by leaders, physicians and staff: “Can we use ChatGPT?” “Is ChatGPT safe … accurate … will it save me time?” ChatGPT holds incredible promise to streamline work for harried clinicians, to enhance communication with patients and to drive efficiencies in many administrative functions. So, what is the role of the CIO when it comes to partnering with your organization to define the who, what, when, why and how of it all?

AI, Ethics and Interconnected Healthcare Ecosystems

Generative AI is not like traditional software solutions such as an EHR, which is implemented, supported, optimized and managed through a governance process. Generative AI is a dynamic, learning technology that can produce varied types of content, including essays, images, poems or computer code. It can solve problems and even summarize large volumes of content. It is a very powerful tool, and as Winston Churchill said, “Where there is great power there is great responsibility.”

The potential for generative AI to augment and complement our work is very exciting—and a bit frightening. What if the AI model builds its conclusions and recommendations off a data set fraught with bias? We have software modules that help clinicians identify disparities and care gaps in their patient populations (remember the boom of “pop health tools” in 2013 and 2014?) with the hope they will allow us to demarginalize cohorts of patients and focus on caring for the individual when and where they need it. While we continue to enhance those technologies and slice and dice the data from them, does AI introduce another hurdle we will need to clear to ensure health equity for our patients?

AI is also giving patients the opportunity to do more of their own due diligence before stepping into the doctor’s office or sitting in front of a screen for a virtual visit. We know that, in some instances, physicians and care team members can do the very same thing. Who is responsible for validating the information that scrolls onto our screens when an AI tool is asked a question? How can we as patients feel secure knowing that the conversations, the diagnosis and the treatment plan come from a trained provider and not some algorithm on an AI site? Does this change how we disclose, store or use data for research? Will consent forms need to be changed to reflect this emerging new world?

All of this is to say, healthcare leaders need a thoughtful plan that carefully evaluates the use of AI tools to identify and mitigate any unintended consequences, to protect the privacy of our patient data, to maintain the confidentiality of intellectual property and business intelligence, and to avoid the propagation of misinformation that could significantly affect healthcare decisions and health outcomes. This means that we need to think about what data is shared with AI platforms and how the data is delivered and used. 
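
To make that concrete, consider the simplest kind of guardrail: screening what leaves the organization before it ever reaches an AI platform. The short Python sketch below is purely illustrative; the regex patterns and the redact helper are hypothetical, and a real program would rely on a vetted de-identification pipeline covering every HIPAA identifier category. Still, it shows the principle: replace obvious identifiers with placeholders before a prompt is sent.

```python
import re

# Illustrative patterns only. A real de-identification pipeline would use
# a vetted library and cover all 18 HIPAA identifier categories; these few
# regexes are hypothetical examples for the sketch.
REDACTION_PATTERNS = {
    "MRN": re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def redact(text: str) -> str:
    """Replace obvious identifiers with typed placeholders so the prompt
    that leaves the organization carries no raw patient data."""
    for label, pattern in REDACTION_PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

if __name__ == "__main__":
    prompt = "Summarize the visit for MRN: 00123456, DOB 04/12/1987, phone 555-867-5309."
    print(redact(prompt))
    # Summarize the visit for [MRN REDACTED], DOB [DATE REDACTED], phone [PHONE REDACTED].
```

Even a filter this simple forces the right conversation: which fields are safe to send, who decides, and where in the workflow the check happens.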

This will require strong governance and accountability, guided by the input of privacy and compliance experts, technology leaders, legal advisers and clinicians. While we work to establish the right protections, we want to encourage the innovation and flexibility to explore and study these tools within those boundaries, exercising caution and restraint throughout this exciting exploration.

It is on all of us to ensure this new opportunity is purposeful, effective and safe. It also needs to be transparent and explainable to clinicians, care teams and patients. Healthcare is built on trust between care teams and patients. This technology must not be allowed to damage that empathetic relationship; put to good use, it should augment the care we provide.

Striking this balance is no small effort. By coming together as an industry, we can ensure that these innovations make a positive impact that benefits us all.


Jeri Koester, Carol Emmott Fellow Class of 2017, is chief information and digital officer, Marshfield Clinic Health System. Heather Nelson, CHCIO, Carol Emmott Fellow Class of 2020, is the senior vice president/CIO, Boston Children’s Hospital. Lisa S. Stump, FASHP, Carol Emmott Fellow Class of 2018, is the senior vice president/chief information and digital transformation officer, Yale New Haven Health System and Yale Medicine.