Journal Article

Human-Centered Explanations: Lessons Learned from Image Classification for Medical and Clinical Decision Making

Document Type

Text/Journal Article

Additional Information

Date

2024

Journal Title

KI - Künstliche Intelligenz

Journal ISSN

1610-1987
Publisher

Springer

Abstract

To date, there is no universal explanatory method for making the decisions of an AI-based system transparent to human decision makers. This is because the requirements for the expressiveness of explanations vary depending on the application domain, data modality, and classification model. Explainees, whether experts or novices (e.g., in medical and clinical diagnosis) or developers, have different information needs. To address this explanation gap, we motivate human-centered explanations and demonstrate the need for combined and expressive approaches based on two image classification use cases: digital pathology and clinical pain detection from facial expressions. Various explanatory approaches that emerged from or were applied in the three-year research project “Transparent Medical Expert Companion” are briefly reviewed and categorized by expressiveness according to their modality and scope. Their suitability for different explanation contexts is assessed with regard to the explainees’ information needs. The article highlights open challenges and suggests future directions for integrative explanation frameworks.

Citation

Finzel, Bettina (2024): Human-Centered Explanations: Lessons Learned from Image Classification for Medical and Clinical Decision Making. KI - Künstliche Intelligenz: Vol. 38, No. 3. DOI: 10.1007/s13218-024-00835-y. Springer. ISSN: 1610-1987
