Checklist for responsible deep learning modeling of medical images based on COVID-19 detection studies
  • Weronika Hryniewska, Przemyslaw Bombinski, Patryk Szatkowski, Paulina Tomaszewska, Artur Przelaskowski, Przemysław Biecek
  • Pattern Recognition
  • p. 108035
Pareto optimization of deep networks for COVID-19 diagnosis from chest X-rays
Let AI Perform Better Next Time—A Systematic Review of Medical Imaging-Based Automated Diagnosis of COVID-19: 2020–2022
This paper presents an in-depth discussion of existing automated diagnosis models and identifies three significant problems: biased model performance evaluation, inappropriate implementation details, and low reproducibility, reliability, and explainability.
Requirement analysis for an artificial intelligence model for the diagnosis of the COVID-19 from chest X-ray data
  • T. Kalliokoski
  • Medicine
    2021 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
  • 2021
The main findings are that a clinically usable AI needs extremely good documentation, a comprehensive statistical analysis of possible biases and of performance, and an explainability module.
Explainable Artificial Intelligence (XAI) in Biomedicine: Making AI Decisions Trustworthy for Physicians and Patients
This review focuses on the requirement that XAIs must be able to explain in detail the decisions made by the AI to the experts in the field.
Research on Music Teaching and Creation Based on Deep Learning
Against the background of quality education, music learning is gradually shifting from shallow learning to deep learning. Deep learning is a new teaching concept, which pays full
LIMEcraft: Handcrafted superpixel selection and inspection for Visual eXplanations
This work proposes LIMEcraft, an approach that lets a user interactively select semantically consistent areas and thoroughly examine the prediction for an image instance with many image features; it improves model safety by inspecting model fairness for image regions that may indicate model bias.
An eXplainable Deep Learning approach to detect COVID-19 from computed tomography (CT) scan images is proposed; it produces highly interpretable results that may help specialists detect the disease early.
Deep Learning COVID-19 Features on CXR Using Limited Training Data Sets
Experimental results show that the proposed patch-based convolutional neural network approach achieves state-of-the-art performance and provides clinically interpretable saliency maps, which are useful for COVID-19 diagnosis and patient triage.
Exploration of Interpretability Techniques for Deep COVID-19 Classification using Chest X-ray Images
Five different deep learning models and their ensemble have been used to classify COVID-19, pneumonia, and healthy subjects from chest X-ray images, and qualitative results showed the ResNets to be the most interpretable model.
Accurate Screening of COVID-19 Using Attention-Based Deep 3D Multiple Instance Learning
This paper proposes an attention-based deep 3D multiple instance learning method (AD3D-MIL) in which a patient-level label is assigned to a 3D chest CT viewed as a bag of instances; the method can semantically generate deep 3D instances following the possible infection area.
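The core of attention-based MIL pooling is a learned softmax weighting over the instances in a bag, so the bag embedding is a weighted sum of instance embeddings. A minimal illustrative sketch follows (toy one-layer attention in the style of gated/attention MIL; all names, shapes, and random values here are assumptions for illustration, not taken from the AD3D-MIL paper):

```python
import numpy as np

rng = np.random.default_rng(42)

def attention_mil_pool(instances, V, w):
    """Attention-based MIL pooling: score each instance embedding h_k with
    w^T tanh(V^T h_k), softmax the scores into attention weights a_k, and
    return the bag embedding sum_k a_k * h_k together with the weights."""
    scores = np.tanh(instances @ V) @ w          # (K,) unnormalized attention scores
    a = np.exp(scores - scores.max())            # numerically stable softmax
    a = a / a.sum()                              # attention weights sum to 1
    return a @ instances, a                      # bag embedding, per-instance weights

# Toy bag: 6 instance embeddings of dimension 8 (illustrative sizes).
bag = rng.normal(size=(6, 8))
V = rng.normal(size=(8, 4))                      # hypothetical attention parameters
w = rng.normal(size=(4,))
z, a = attention_mil_pool(bag, V, w)
```

The attention weights `a` are also what lets such a model localize the probable infection area: high-weight instances indicate the slices or patches driving the bag-level prediction.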
COVID-DenseNet: A Deep Learning Architecture to Detect COVID-19 from Chest Radiology Images
A deep learning-based approach using DenseNet-121 to detect COVID-19 patients effectively, together with a website that takes chest radiology images as input and generates probabilities of the presence of COVID-19 or pneumonia and a heatmap highlighting the probable infected regions.
Interpretable artificial intelligence framework for COVID-19 screening on chest X-rays
This study presents an interpretable AI framework assessed by expert radiologists on the basis of how well the attention maps focus on diagnostically relevant image regions, achieving an overall area under the curve of 1 for a binary classification problem across a 5-fold training/testing dataset.
Estimating Uncertainty and Interpretability in Deep Learning for Coronavirus (COVID-19) Detection
This paper investigates how drop-weights-based Bayesian convolutional neural networks (BCNN) can estimate uncertainty in a deep learning solution to improve the diagnostic performance of the human-machine team, using a publicly available COVID-19 chest X-ray dataset, and shows that the uncertainty in prediction is highly correlated with the accuracy of prediction.
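The drop-weights BCNN idea is closely related to Monte Carlo dropout: keep the stochastic masking active at test time, run several forward passes, and read the spread across passes as predictive uncertainty. A minimal sketch on a toy one-layer model (all names, shapes, and values are illustrative assumptions, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_predict(x, weights, T=100, p=0.5):
    """Monte Carlo dropout-style prediction: apply a fresh random weight mask
    on each of T stochastic forward passes and return the mean prediction and
    its standard deviation across passes as an uncertainty proxy."""
    preds = []
    for _ in range(T):
        mask = rng.random(weights.shape) > p      # randomly drop weights
        logits = x @ (weights * mask) / (1 - p)   # rescaled stochastic pass
        preds.append(1 / (1 + np.exp(-logits)))   # sigmoid for a binary output
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)  # mean prediction, uncertainty

# Toy single-layer "model": 4 input features -> 1 binary output.
x = np.array([0.5, -1.2, 0.3, 0.8])
w = rng.normal(size=(4, 1))
mean, std = mc_dropout_predict(x, w)
```

In a triage setting, cases with a large `std` would be flagged for human review, which is the human-machine-team benefit the paper argues for.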
DeepCOVIDExplainer: Explainable COVID-19 Diagnosis from Chest X-ray Images
An explainable deep neural network (DNN)-based method for automatic detection of COVID-19 symptoms from chest radiography (CXR) images, called ‘DeepCOVIDExplainer’, which provides human-interpretable explanations for the diagnosis.
Evaluation of Contemporary Convolutional Neural Network Architectures for Detecting COVID-19 from Chest Radiographs
This study trains and evaluates three model architectures proposed for chest radiograph analysis under varying conditions, finds issues that discount the impressive model performances reported by contemporary studies on this subject, and proposes methodologies to train models that yield more reliable results.