Anyone who has ever walked out of a doctor's office feeling confused or overwhelmed will appreciate the work Professor Richard Zraick is doing to make patient education more accessible.

"Health literacy surveys have revealed that 1 in 3 adults in the U.S. has difficulty understanding basic health information," Zraick says.

Zraick is a leading expert and advocate for health literacy in the discipline of communication sciences and disorders. In one of his research mentoring classes this fall, he and his students in the School of Communication Sciences and Disorders are exploring whether the artificial intelligence (AI) website ChatGPT can improve how healthcare information is created, conveyed and understood.

The group is gathering existing materials from key websites that address common medical conditions faced by patients with communication disorders, information ranging from hearing loss to swallowing challenges to voice disorders. The content of this material is then entered into ChatGPT using prompts (text-based input, such as a question or instruction). Students develop and refine prompts that seek to simplify the language, then apply readability formulas to assess whether the new text is more readable.
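The prompt-and-check loop the students describe can be sketched as a reusable prompt template. This is a minimal illustration in Python; the `build_simplify_prompt` helper, its wording, and the target grade level are assumptions for demonstration, not the team's actual prompts.

```python
def build_simplify_prompt(source_text: str, grade_level: int = 6) -> str:
    """Build a hypothetical instruction asking an AI model to rewrite
    patient education text in plain language."""
    return (
        f"Rewrite the following patient education text in plain language "
        f"at roughly a grade-{grade_level} reading level. Keep the medical "
        f"facts unchanged and define any term a layperson may not know.\n\n"
        f"Text:\n{source_text}"
    )

# Example: a prompt for a short passage on a swallowing disorder.
prompt = build_simplify_prompt("Dysphagia is difficulty swallowing.")
```

The returned string would then be submitted to the model, and the response scored with readability formulas to judge whether the simplification worked.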

Upon completion of the project, the team will have evaluated and revised about three dozen web-based patient education documents.

They plan to submit their findings to a peer-reviewed journal next semester. The goal is to determine whether AI can be an effective tool for simplifying complex medical information and improving health literacy and patient education. Tools like ChatGPT, a large language model developed by OpenAI, may offer healthcare professionals new ways to efficiently deliver patient education materials. The technology may not only streamline the process but also enhance the clarity of information delivered to patients, helping them better understand their health conditions and treatment plans and ultimately leading to better health outcomes.

One of the most promising applications of ChatGPT in patient education is its ability to simplify medical jargon and make health information more accessible to the public, Zraick says.

"If you give AI, for example ChatGPT, the correct prompt, it will either create, edit or suggest revisions to an existing document that you might be trying to create for a patient, or that exists for a patient or their family."

This ability could allow healthcare professionals to quickly produce patient-friendly materials that meet readability standards recommended by institutions like the Institute of Medicine, which suggests health information be written at a fifth- or sixth-grade reading level.

AI also has clear advantages in terms of speed. "You can ask ChatGPT to give you a script…that somebody with limited health literacy could understand, and it will do that in 20 seconds," Zraick says.

He emphasizes that while AI serves as a helpful starting point, students still need to ensure the information is accurate and apply it in their interactions. Tools like ChatGPT offer efficiency, but they are not flawless, and the role of the content expert remains crucial.

“We are the content experts, so I never trust ChatGPT 100%, but it’s a starting point. And then I look at it and review it for content,” he says.

Human review ensures that the information is not only accurate, but also contextually appropriate for the intended audience.

Beyond simplifying language, AI can assist in evaluating the readability of existing documents.

"There's no one readability formula that captures all kinds of documents. We usually use more than one formula to get a variety of metrics, and they tend to agree with each other," Zraick says.
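Two of the most widely used formulas, Flesch Reading Ease and the Flesch-Kincaid grade level, can be computed from just sentence, word, and syllable counts. A minimal sketch in Python, assuming a naive vowel-group syllable counter (the article does not say which formulas or tools the team uses, and production tools rely on more accurate dictionary-based counters):

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count runs of consecutive vowels,
    # dropping a silent trailing 'e' (but keeping '-le' and '-ee').
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def flesch_metrics(text: str) -> tuple[float, float]:
    """Return (Flesch Reading Ease, Flesch-Kincaid grade level)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)   # average words per sentence
    spw = syllables / len(words)        # average syllables per word
    reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return reading_ease, grade_level
```

Applied before and after simplification, the two metrics typically move together, as Zraick notes: shorter sentences and fewer syllables per word raise the Reading Ease score and lower the grade level.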

Materials that are easier to understand are also more actionable, increasing the likelihood that patients will follow through with medical advice.

For Kelly Clevenger, a School of Communication Sciences and Disorders senior, the project gave her an opportunity for a deeper dive into AI, something she had only used superficially for things like checking grammar for class assignments. As part of the project, she attended a "prompt engineering" workshop designed to fine-tune her ability to leverage ChatGPT's functionality.

"People kind of think it just runs itself, and I think something that people should realize is that you really need to have a good idea of exactly how you want it to work before you even start prompting," Clevenger says. "If you don't give it specific enough direction, it won't give you exactly what you want."

She notes that while the tool isn't perfect, it significantly cuts down the time required, enabling researchers to focus on higher-level analysis and interpretation.

This is not Zraick's first foray into AI in health communications. This year, he and colleagues published an article in a journal of the American Speech-Language-Hearing Association examining the use of ChatGPT as a tool to teach students in communication sciences and disorders how to write in plain language. The researchers believe that AI tools hold promise; the technology can enhance students' abilities as well as offer an interactive environment that encourages active participation and critical thinking.

AI adds a new element for Zraick, who has taught students about health literacy for decades. Some of his courses include assignments that have students describe medical concepts in plain language and participate in role-playing exercises that stress clear communication. His ongoing research is assessing how effective this work is and whether teaching new graduate clinicians to use plain language will enhance the clarity and actionability of their patient reports.

Students will one day serve different audiences as speech-language pathologists and audiologists, and there's a difference between writing for professionals and for patients, Zraick says.

"If a student is writing a report or a treatment update for another healthcare provider, it's a technical writing exercise," he says. "But for patients, they need a plain language summary."

Clevenger also underscores the challenges of using AI in research.

"You can't just throw any dataset at it and expect good results," she says. "We've been working on refining the prompts we use to get better, more accurate outputs from the model. It's a learning process, but the more we work with it, the better it gets."

As the use of AI in healthcare continues to expand, the focus will likely shift toward refining these tools to ensure even greater accuracy and relevance, Zraick says.

"Clinicians and educators have more tools to fine-tune skills and expand the skill set of a speech-language pathologist, or an audiologist, beyond the core content knowledge that they have to have," he says. "It's like practicing for the 22nd century, not just the 21st century."