Evaluating Explainable AI for healthcare

L-R: Jeff Clark and Raul Santos-Rodríguez

Artificial Intelligence (AI) is set to transform healthcare decision-making, but building trust remains crucial. We hear from Dr Jeff Clark, one of our AI in Health awardees, about his project which aims to evaluate how clinicians understand and interact with explainable AI (xAI) so that systems can better meet their needs.

Explainable AI explained

It is critical that AI systems designed for use in healthcare environments can be quickly and clearly understood by clinicians. Existing xAI research often targets developers rather than clinicians, leading to a gap between AI innovation and practical healthcare use.

Our project aims to address this gap by quantitatively investigating clinicians’ preferences among various xAI techniques and assessing how they interact with such systems. We will measure speed and accuracy of interpretation, and analyse sensor data, to inform the development of more intuitive and user-friendly AI systems, thereby fostering trust and integration in clinical workflows.

Interdisciplinary collaborations

This interdisciplinary project brings together academic technical expertise from the University of Bristol’s School of Engineering Mathematics and Technology, led by Dr Jeff Clark with co-investigators Dr Alexander Hepburn and Professor Raul Santos-Rodríguez, in collaboration with medics from the intensive care unit at University Hospitals Bristol and Weston NHS Foundation Trust, led by Dr Chris Bourdeaux. Experiments will be conducted at the Bristol Digital Futures Institute.

Exploring usability and adoption

Previous studies have qualitatively explored clinical preferences for xAI, and this project builds on some of our own previous work where we interviewed a range of clinicians to gather their requirements for xAI systems within intensive care.

However, how xAI fits into the clinical decision-making workflow has not been well explored quantitatively, especially under the time constraints and cognitive demands associated with clinical practice. We are preparing to conduct a clinical user study to assess how clinicians engage with different xAI examples, measuring comprehension and usability. We will compare established xAI techniques and measure participants’ responses both through a questionnaire and through sensor recordings taken whilst they engage with the explanations. Sensors will include eye tracking, so that we can identify which parts of the explanations clinicians spend the most time looking at, and wearable devices capturing signals such as heart rate, so that physiological state can be assessed in relation to cognitive load.
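To illustrate the kind of analysis such eye-tracking data supports, the sketch below sums fixation durations per area of interest (AOI) of an explanation display, giving a dwell-time measure of where attention is spent. This is a minimal, hypothetical example, not the project's actual pipeline: the data format, AOI names, and coordinate values are all invented for illustration.

```python
# Minimal dwell-time sketch (illustrative only, not the study's real pipeline).
# A fixation is (x, y, duration_ms); an AOI is a named screen rectangle.

def dwell_time_per_aoi(fixations, aois):
    """Sum fixation durations falling inside each AOI.

    fixations: list of (x, y, duration_ms) tuples
    aois: dict mapping AOI name -> (x0, y0, x1, y1) bounding box
    """
    totals = {name: 0 for name in aois}
    for x, y, duration in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += duration
                break  # assumes non-overlapping AOIs
    return totals

# Hypothetical layout: two regions of an xAI dashboard.
aois = {
    "feature_importance": (0, 0, 400, 300),
    "raw_values": (0, 300, 400, 600),
}
fixations = [(120, 150, 240), (200, 100, 180), (150, 450, 300)]
print(dwell_time_per_aoi(fixations, aois))
# {'feature_importance': 420, 'raw_values': 300}
```

Relative dwell time across regions like these is one simple way to quantify which parts of an explanation clinicians attend to most.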

The project will provide insights into how clinicians interact with different types of xAI outputs, helping to develop clinical dashboards that are both informative and easy to interpret. Ultimately, this will support the safe adoption of AI in healthcare by making it more transparent and trustworthy.

Improving patient care

So far on the project we have engaged with our clinical collaborators, and with two long-standing international collaborators with xAI expertise, to develop the experimental design. We have devised a protocol to capture the most important factors with regard to xAI understanding and engagement in the context of intensive care unit decision making. As we will make use of the facilities at the Bristol Digital Futures Institute, we have identified the available sensors, are liaising with the team there, and are preparing for a pilot study.

By quantifying clinicians’ interaction with xAI outputs, we will generate evidence that guides the design of future healthcare AI systems, ensuring that such systems can be quickly and clearly understood even under cognitive load. Once complete, results from our project will be submitted for publication at a leading academic venue, fostering broader knowledge sharing and supporting external funding bids.

Our goal is to better understand how clinicians interact with xAI to enable development of systems that are not only technically sound but also practically valuable to clinicians, ultimately improving patient care.

Future plans

We are currently preparing to conduct experiments at the Bristol Digital Futures Institute, where we will be collecting sensor data whilst clinical participants are presented with xAI outputs.

In addition to supporting future grant applications to continue this academic research, results from this project will also contribute to the design and development of decision support systems for intensive care that we are already building with our clinical collaborators here in Bristol.

For more information about this project, please contact Dr Jeff Clark at jeff.clark@bristol.ac.uk.