David Waterhouse, MD, MPH, Chief Innovation Officer of Oncology Hematology Care and an Editorial Board Member of AI in Precision Oncology, Terence Cooney-Waterhouse, from VandHus LLC, and coauthors emphasize the importance of trust as a prerequisite for successful integration of AI into clinical practice.
Despite recognizing the potential benefits of AI, patients express significant concerns about data privacy, algorithmic bias, and the lack of transparency in AI decision-making processes. Physicians' reservations, by contrast, tend to be driven by doubts about the clinical validation and interpretability of AI systems.
To build confidence in AI applications, the authors advocate "for the implementation of robust data governance frameworks, enhanced transparency, and active involvement of stakeholders in AI development." They underscore "the necessity of addressing ethical implications and ensuring equitable access to AI-driven innovations."
"Integrating artificial intelligence into oncology care is much like introducing a new colleague to an established clinical team. Just as we wouldn't immediately trust a new team member with critical decisions without proper vetting, training, and transparency, we must approach AI implementation with similar rigor and care. Trust isn't granted - it's earned through demonstrated reliability, transparent processes, and consistent results. By prioritizing ethical frameworks, clinical validation, and patient-centered approaches, we can transform AI from a misunderstood technological tool into a trusted ally in the fight against cancer," says Douglas Flora, MD, Editor-in-Chief of AI in Precision Oncology.
Cooney-Waterhouse T, Ou W, Mukherji S, Frytak J, Saha P, Waterhouse D.
Bridging the Trust Gap in Artificial Intelligence for Health Care: Lessons from Clinical Oncology.
AI in Precision Oncology, 2025. DOI: 10.1089/aipo.2025.0001