Raw Data
PDF available

Medical Imaging Multimodality Breast Cancer Diagnosis User Interface: BIRADS Survey Template File

Abstract

In breast cancer diagnosis, the BIRADS Survey document is a simple seven-item breast severity scale. The document categorizes the severity and findings of a mammogram screening into well-defined categories. It is intended for use in several user tests in which researchers retrieve BIRADS information from clinicians. Repository: https://github.com/MIMBCD-UI/survey-birads
BIRADS Survey
Prototype: Single-Modality / Multi-Modality
Institution: Universidade de Lisboa (ULisboa) / Instituto Superior Técnico

BIRADS: 0 / 1 / 2 / 3 / 4 / 5 / 6
Figure 1: BIRADS Final Assessment Categories.
Information
The following fields record detailed data about the user tests of this
research work. These data are important for the subsequent BIRADS analysis.
Participant Name:
Date:
Test Location:
Prototype Version:
Support
List of sponsors, honors, and donors:
... We were especially interested in the use of several modalities and AI assistance to detect and classify lesions. We conducted quantitative and qualitative studies in nine health institutions to understand the medical practices surrounding radiomics in breast cancer, including the classification of lesion severity using the BI-RADS [22] score. Next, we describe the workflow practices that lead to the design implications of our assistant, as well as our design goals and design methods. ...
Article
In this research, we take an HCI perspective on the opportunities provided by AI techniques in medical imaging, focusing on workflow efficiency and quality and on preventing errors and variability of diagnosis in breast cancer. Starting from a holistic understanding of the clinical context, we developed BreastScreening to support Multimodality and to integrate AI techniques (using a deep neural network for automatic and reliable classification) into the medical diagnosis workflow. This was assessed across a significant number of clinical settings and radiologists. Here we present: i) user study findings from 45 physicians across nine clinical institutions; ii) a list of design recommendations for visualizations that support breast screening radiomics; iii) evaluation results of a proof-of-concept BreastScreening prototype under two conditions, Current (without AI assistant) and AI-Assisted; and iv) evidence of the impact of a Multimodality and AI-Assisted strategy on the diagnosis and severity classification of lesions. These strategies allow us to draw conclusions about the behaviour of clinicians when an AI module is present in a diagnostic system, a behaviour with a direct impact on the clinicians' workflow that is thoroughly addressed herein. Our results show a high level of acceptance of AI techniques by radiologists and point to a significant reduction in cognitive workload and an improvement in diagnosis execution.
... In the radiology room, medical imaging annotations [8,10] are one of the main activities of radiologists, and the quality of annotation depends on the clinician's experience as well as on the number of studied cases [38]. Manual annotations [18] are very useful for extracting features such as contours, intersections, margins, and shapes, which can be used in the lesion segmentation (i.e., masses and calcifications) and classification (i.e., BIRADS [17]) processes performed by automatic (AI-Assisted) agents [79]. In this document, we propose a new method and process that generate a standardized [42,54] dataset of medical imaging annotations across the breast cancer domain, adopting a Multimodality strategy (i.e., MG, US and MRI) in order to provide clinicians with a tool for the production of qualified datasets. ...
Technical Report
Full-text available
In this invention proposal, we propose a method and process using a system that provides a User Interface (UI) to annotate and visualize masses and calcifications of breast cancer lesions in a Multimodality strategy. The Multimodality strategy supports the following image modalities: (i) MammoGraphy (MG), in both CranioCaudal (CC) and MedioLateral Oblique (MLO) views; (ii) UltraSound (US); and (iii) Magnetic Resonance Imaging (MRI) volumes. The interface receives a set of medical images to be annotated. To annotate the medical images, the UI comprises two lesion tools: (1) a free-hand polygon tool for annotating the masses of breast cancer lesions; and (2) a bullet probe on the image for annotating the calcifications of breast cancer lesions. These tools generate a dataset of manual annotations from which features (e.g., lesion contours, intersections, and shapes) can be extracted and used in the lesion segmentation and classification computations made by automatic agents. Such automatic agents can integrate algorithms from the Artificial Intelligence (AI), Machine Learning (ML), or Deep Learning (DL) literature.
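The two lesion tools suggest a simple shape for the annotation records: a closed polygon of points for masses and a single marker point for calcifications. The following is a minimal sketch under that assumption; the record and field names are illustrative, not the proposal's actual schema.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical record types for the two lesion tools; names are
# illustrative and not taken from the proposal itself.

@dataclass
class MassAnnotation:
    """Free-hand polygon outlining a mass lesion."""
    modality: str                       # e.g. "MG-CC", "MG-MLO", "US", "MRI"
    contour: List[Tuple[float, float]]  # polygon vertices (x, y) in image pixels

@dataclass
class CalcificationAnnotation:
    """Bullet-probe marker for a calcification."""
    modality: str
    position: Tuple[float, float]       # marker location (x, y) in image pixels
```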
... The AI classification uses the well-known BI-RADS [3,38] scale for breast cancer. The values in the scale [41] represent the state of the breast cancer, by categories: ...
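For reference, the seven BI-RADS final assessment categories (the same 0 to 6 scale shown in Figure 1) follow the American College of Radiology definitions; a simple lookup, sketched in Python (the variable name is ours):

```python
# ACR BI-RADS final assessment categories (the 0-6 scale of Figure 1).
BIRADS_CATEGORIES = {
    0: "Incomplete (additional imaging evaluation needed)",
    1: "Negative",
    2: "Benign finding",
    3: "Probably benign (short-interval follow-up suggested)",
    4: "Suspicious abnormality (biopsy should be considered)",
    5: "Highly suggestive of malignancy",
    6: "Known biopsy-proven malignancy",
}
```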
Technical Report
Full-text available
Breast screening aims to identify breast cancer at earlier stages of the disease, when treatment can be more successful. Despite the existence of screening systems in clinical institutions, the interpretation of medical imaging is affected by high rates of medical error. Hoping to reduce medical errors, AI is increasingly being used in medical imaging systems to support physicians' decision making. This has spurred the field of eXplainable AI (XAI), which mitigates those medical errors while explaining AI results to physicians. Our work seeks to strengthen empirical clinical applications of XAI by exploring the underpinnings of medical decision making, drawing from the field of medical imaging. To facilitate physicians' understanding of AI, we propose the development of a novel XAI framework for clinical purposes. We also expect an improvement in both the physicians' interaction with the AI and the physicians' accuracy. We want to build a visual and interactive interface that helps the physician make a more informed decision. The visual interface's goal is to call the physician's attention to what is important and to surface valuable information right away. On the interaction side, we want the physician to be able to simulate other hypotheses with the AI and to consider all possible cases, with the goal of making a decision as informed as possible. With this XAI feature we hope to improve the physician-AI interaction and also to improve the accuracy, specificity, sensitivity, and precision of classification.
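The four metrics named at the end of the abstract have standard confusion-matrix definitions; a minimal sketch in Python (function and argument names are ours, not the framework's):

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard binary classification metrics from confusion-matrix counts."""
    return {
        "accuracy":    (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),  # recall: malignant cases correctly flagged
        "specificity": tn / (tn + fp),  # benign cases correctly cleared
        "precision":   tp / (tp + fp),  # flagged cases that are truly malignant
    }
```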
... This project is a collaborative project, and the framework will be the base of a bigger project that will include Artificial Intelligence (AI) [20,21]. It will analyze a modality and identify lesions (Section 2.3); if lesions exist, they will be annotated and classified with a BI-RADS [11,12,22] stage. This information will be processed and given to the physician to help diagnose the patient. ...
Technical Report
Full-text available
Nowadays, diagnoses in radiology are made with old tools; although medical exams are improving, the tools used to perform the diagnosis are not. Furthermore, the Deep Learning (DL) approach has increased the potential for autonomous medical diagnoses, at the cost of building qualified datasets to train such supervised Machine Learning (ML) methods. One of the main activities of radiologists is lesion annotation in breast cancer diseases. On the one hand, the quality of annotation depends on the physician's experience; on the other hand, it also depends on the number of studied cases: (i) annotations made manually by physicians are very useful for extracting features like lesion contours, shapes, and margins; (ii) these features can be used in the lesion segmentation process; and (iii) in the classification made by AI assistants. We propose a project that takes advantage of several Human-Computer Interaction (HCI) techniques to develop a Computer-Aided Diagnosis (CADx) system that will support a better diagnosis in the breast cancer field. In this project, we introduce four new features: (i) Recorded View; (ii) Temporal Comparison View; (iii) Coordinated View; and (iv) 3D Module View. The features aim to lower the time it takes to make a diagnosis and to improve its confidence. Design Thinking is the HCI method that we chose to develop this system, focused on the physicians who make diagnoses daily. We will do three iterations of Design Thinking during this year. Special attention is given to the design choices, theories, and assumptions, as well as to the implementation and technological details. With this project we expect to deliver a final system that has an impact on the way diagnosis is made and on patients, who will be able to start their treatment sooner, thus reducing the mortality of this disease. In this document, we propose the design of novel interactive techniques in a platform that enhances the annotation process of medical images in the breast cancer domain. We will do that by adopting several novel interaction techniques to improve engagement and the production of qualified datasets, also fostering their sharing and practical evaluation among physicians.