Can ChatGPT Answer Geriatric Knowledge Questions Like Medical Trainees?
Author Information
Author(s): Cheng Huai
Primary Institution: Minneapolis VA Hospital
Hypothesis
Can ChatGPT perform well on validated geriatric knowledge questions compared to medical students, residents, and geriatrics fellows?
Conclusion
ChatGPT performed reasonably well on geriatric knowledge questions, scoring higher than medical students, about the same as residents, and lower than geriatric medicine fellows.
Supporting Evidence
- ChatGPT scored a mean of 14.25 out of 18 on the geriatric knowledge questions (roughly 79% correct).
- ChatGPT's performance was higher than that of medical students.
- ChatGPT's performance was similar to that of residents.
- ChatGPT's performance was lower than that of geriatric medicine fellows.
- Half of ChatGPT's outputs provided a good rationale for the selected answer.
Takeaway
This study found that ChatGPT answered geriatric care questions better than medical students and about as well as residents, though not as well as geriatric medicine fellows, suggesting it may be a useful supplemental learning tool.
Methodology
The full set of 18 validated geriatric knowledge questions was posed to ChatGPT on four separate occasions, and the resulting scores were compared with those of medical students, residents, and geriatric medicine fellows.
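To make the scoring concrete, here is a minimal sketch in Python, assuming a simple multiple-choice format. The answer key and recorded responses below are hypothetical placeholders rather than the study's actual 18 validated items; the code only illustrates the idea of grading each prompt run against a key and averaging the four run scores (the study's reported mean was 14.25).

```python
# Minimal illustrative sketch (not the study's actual code): grade repeated
# ChatGPT prompt runs against an answer key and average the per-run scores.
# The key and responses are hypothetical placeholders; the real question set
# had 18 validated geriatric knowledge items.
from statistics import mean

ANSWER_KEY = ["B", "D", "A", "C", "B", "A"]  # placeholder key (6 items for brevity)

# Each inner list holds one prompt run's recorded answers, in question order.
runs = [
    ["B", "D", "A", "C", "A", "A"],
    ["B", "C", "A", "C", "B", "D"],
    ["B", "D", "A", "B", "B", "A"],
    ["B", "D", "A", "C", "B", "D"],
]

def score_run(responses, key):
    """Count how many responses match the answer key."""
    return sum(r == k for r, k in zip(responses, key))

run_scores = [score_run(run, ANSWER_KEY) for run in runs]
print("Per-run scores:", run_scores)   # [5, 4, 5, 5] for these placeholders
print("Mean score:", mean(run_scores)) # 4.75 here; the study reported 14.25 out of 18
```

In the study itself, the analogous mean across the four runs was compared with the scores of the trainee groups; the sketch only shows how a per-run score and a mean across runs are obtained.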
Limitations
The study's sample size was small: only four ChatGPT prompt runs were analyzed.
Participant Demographics
Participants included medical students, residents, and geriatric medicine fellows.