The Doctor Will Polygraph You Now
Author Information
Author(s): James Anibal, Jasmine Gunkel, Shaheen Awan, Hannah Huth, Hang Nguyen, Tram Le, Jean-Christophe Bélisle-Pipon, Micah Boyer, Lindsey Hazen, Yael Bensoussan, Olivier Elemento, Anais Rameau, Alexandros Sigaras, Satrajit Ghosh, Maria Powell, Vardit Ravitsky, David Dorr, Phillip Payne, Alistair Johnson, Ruth Bahr, Donald Bolser, Frank Rudzicz, Jordan Lerner-Ellis, Stephanie Watts, Jennifer Siu, Karim Hanna, Theresa Zesiewicz, Robin Zhao, Lochana Jayachandran, Samantha Salvi Cruz
Hypothesis
Can AI methods accurately predict social behaviors, such as smoking, from patient-reported information, and can they do so while addressing the ethical concerns that such verification raises?
Conclusion
The study highlights significant ethical concerns regarding the use of AI for verifying patient-reported information, particularly issues of trust and bias.
Supporting Evidence
- AI methods can predict social behaviors from clinical data, but this raises ethical concerns.
- Biases in AI systems can lead to unfair treatment of marginalized groups.
- Patients expect control over their health narratives, which AI systems may violate.
- Trust between patients and providers is crucial and can be undermined by AI verification.
- Excessive trust in AI outputs ("AI self-trust") can lead providers to prioritize model-derived data over patient-reported information.
- Ethical use of AI requires careful oversight and respect for patient autonomy.
- AI systems may misinterpret patient data, leading to incorrect conclusions.
- Future research should focus on real-world data to better understand AI biases.
Takeaway
This study shows that using AI to check whether patients are telling the truth about their health is risky and may erode the trust between doctors and patients.
Methodology
Experiments were conducted using synthetic audio data to evaluate how large language models (LLMs) predict smoking status from patient-reported information and acoustic features, and how those predictions compare with patients' own reports.
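The core comparison described above, a model's prediction of smoking status versus the patient's self-report, can be sketched in a few lines. Everything here is hypothetical and is not the study's actual pipeline: the feature names (`jitter`, `shimmer`), the threshold rule standing in for an LLM, and the example records are all invented for illustration; the point is the discordance metric, i.e., how often the model contradicts what the patient reported.

```python
from dataclasses import dataclass


@dataclass
class PatientRecord:
    """Synthetic patient record: self-report plus hypothetical acoustic features."""
    reported_smoker: bool
    jitter: float   # hypothetical vocal-perturbation feature
    shimmer: float  # hypothetical amplitude-variation feature


def predict_smoker(record: PatientRecord, threshold: float = 0.5) -> bool:
    """Stand-in for an LLM/classifier prediction from acoustic features.

    A real study would query a language model with audio-derived features;
    a simple weighted threshold keeps this sketch runnable.
    """
    score = 0.6 * record.jitter + 0.4 * record.shimmer
    return score > threshold


def discordance_rate(records: list[PatientRecord]) -> float:
    """Fraction of cases where the model contradicts the self-report,
    the quantity whose ethical implications the study examines."""
    disagreements = sum(predict_smoker(r) != r.reported_smoker for r in records)
    return disagreements / len(records)


records = [
    PatientRecord(reported_smoker=False, jitter=0.8, shimmer=0.7),  # model contradicts report
    PatientRecord(reported_smoker=True,  jitter=0.9, shimmer=0.6),  # agreement
    PatientRecord(reported_smoker=False, jitter=0.1, shimmer=0.2),  # agreement
    PatientRecord(reported_smoker=True,  jitter=0.2, shimmer=0.1),  # model contradicts report
]
print(discordance_rate(records))  # 0.5
```

Each disagreement in this metric is a case where a deployed system would effectively accuse a patient of misreporting, which is exactly the trust-eroding scenario the paper flags.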
Potential Biases
The study identified risks of bias in AI models, particularly against marginalized groups, which could lead to unfair outcomes.
Limitations
The study was limited by a small and demographically narrow dataset, and it used synthetic data rather than real patient data.
Participant Demographics
The dataset included 44 patients with respiratory or voice conditions, but specific demographic details were not provided.