ParaGlyder:
Probe-driven Interactive Visual Analysis for
Multiparametric Medical Imaging Data
Eric Mörth1,2[0000−0003−1625−0146], Ingfrid S. Haldorsen2,3[0000−0001−9313−7564],
Stefan Bruckner1,2[0000−0002−0885−8402], and Noeska N. Smit1,2[0000−0002−3719−4625]
1Department of Informatics, University of Bergen, Norway
2Mohn Medical Imaging and Visualization Centre,
Haukeland University Hospital, Norway
3Department of Clinical Medicine, University of Bergen, Norway
Abstract. Multiparametric imaging in cancer has been shown to be
useful for tumor detection and may also depict functional tumor charac-
teristics relevant for clinical phenotypes. However, when confronted with
datasets consisting of multiple values per voxel, traditional reading of
the imaging series fails to capture complicated patterns. These patterns
of potentially important imaging properties of the parameter space may
be critical for the analysis, but standard approaches do not deliver suffi-
cient details. Therefore, in this paper, we present an approach that aims
to enable the exploration and analysis of such multiparametric studies
using an interactive visual analysis application to remedy the trade-offs
between details in the value domain and in spatial resolution. This may
aid in the discrimination between healthy and cancerous tissue and po-
tentially highlight metastases that evolved from the primary tumor. We
conducted an evaluation with eleven domain experts from different fields
of research to confirm the utility of our approach.
Keywords: Medical Visualization · Visual Analysis · Multiparametric Medical Imaging Data
1 Introduction
Multiparametric medical imaging scans are commonly used in screening pro-
cedures and in targeted diagnostics. Basing decisions on the analysis of these
datasets is not an easy task and often involves visual inspection of different jux-
taposed representations [6]. Multiparametric datasets are generated in medical imaging, e.g., by Magnetic Resonance Imaging (MRI) scanners, where varying acquisition parameters results in imaging data with different contrasts. In the
analysis of medical imaging data, the main task is usually to identify discernible
patterns to distinguish pathologic from healthy tissue and identify, e.g., malignant tumors. Metastases, which are likely to share characteristic imaging properties with the primary tumor, may be difficult to spot using only one modality, although identifying them at primary diagnostic work-up is essential for developing more tailored and targeted treatment strategies in various cancers. In order to improve the workflow of tumor diagnosis and metastases
identification, we have developed a tool for analyzing multiparametric medical
imaging data together with gynecological cancer, machine learning and neuro-
logical cancer research experts. By employing different views displaying multi-
parametric data at different levels of detail, we can present imaging data without
having to visually compare several modalities in side-by-side views. We enable
highlighting of target structures, based on multiparametric similarity, which was
not possible before. Medical experts are used to working with 2D slice views.
Overlaying multiparametric data on top of these views produces insights which
are easy for them to put into a spatial context. Showing multiparametric images
in one view reduces the cognitive load and allows the medical experts to see the
relevant information at a glance. Our main contributions are the following: (1)
We present visualizations that remedy the trade-offs between revealing details in
the multiparametric value domain and spatial resolution by introducing a mul-
tiparametric star glyph map-based visualization. (2) We present an interactive
analysis application primarily targeting cancer imaging, as well as additional
workflows in different application areas. (3) We evaluate our system with eleven
experts using the System Usability Scale (SUS) [3] and a qualitative evaluation
to demonstrate the utility of our approach.
2 Medical Background
Modern imaging techniques are routinely used at many centers in the preoper-
ative diagnostic work-up in endometrial cancer. Imaging markers derived from
these advanced MRI techniques have been shown to be linked to endometrial can-
cer subtype and stage [6,11,10,9,2]. According to previous findings, low tumor blood flow and a low rate constant for contrast agent intravasation (i.e., the backflow of injected contrast into nearby vessels), based on dynamic contrast-enhanced (DCE)-MRI, are associated with high-risk histologic subtypes and poor
prognosis. Gathering information from parametric maps based on DCE-MRI is
usually done using juxtaposed images of the same slice in the different modali-
ties. These maps are derived from a single dynamic acquisition and are therefore
co-registered by nature. Examining the images involves comparing the images
mentally or by using a manually placed region of interest (ROI). If advanced
imaging methods can be utilized to validly predict the aggressiveness of a tu-
mor, this could lead to better risk-stratified treatment algorithms that may be
beneficial for the patients. Less invasive treatment regimens may then be given
in low-risk patients, and the more invasive treatments can be reserved for high-
risk patients in whom the expected survival benefit justifies the increased side
effects.
3 Related Work
Lawonn et al. [16] provide an extensive overview of different visualization tech-
niques for multimodal medical imaging datasets. Gleicher et al. [8] introduced a
taxonomy of visual comparison approaches and surveyed existing methods according to it. Friendly [7] proposed radial boxplots as a means to visualize
data variations. Ropinski et al. [22] provide a thorough overview of different
glyph-based visualization techniques in the field of multivariate medical data
visualization. Wickham et al. [27] introduced a visualization technique called
glyph maps. Opach et al. [20] described that the effectiveness of polyline versus
Fig. 1. The ParaGlyder prototype application, featuring a subject overview (A), central
view (B), Stixels view (C), and radial boxplot view (D).
star glyphs is task-dependent. The effective combination of star glyphs present-
ing non-spatial data and geospatial data has been demonstrated by Friendly [7] and more recently by Jäckle et al. [12]. In contrast to this, we use star
glyphs to present an abstract version of multiparametric spatial data on top of
spatial data. Smit et al. [24] presented a method to spatially query data by plac-
ing a sphere in a 3D view, and interaction techniques to effectively place spheres
in volume renderings [25]. Bruckner et al. [4] introduced a probing tool for en-
abling visual queries. Mlejnek et al. [18] presented interactive glyphs for probing
tissue characteristics in medical data. In contrast to these approaches, we provide
a probing interaction that acts like a digital biopsy of our multiparametric med-
ical imaging datasets. More closely related to our approach, Stoppel et al. [26]
used small multiples to visualize spatio-temporal data in a spatial context. Malik
et al. [17] introduced a comparative visualization technique that visualizes up to
five modalities together in one view. Jönsson et al. [13] presented a visual envi-
ronment for hypothesis generation using spatial and abstract data. In contrast
to these related publications, our approach enables the exploration and analysis
of multiparametric medical imaging datasets of more than five modalities. We
provide targeted functionality for the analysis of pathology, which allows for inspection of the multiparametric imaging data in linked spatial and non-spatial
data visualizations.
4 Requirement Analysis
Following the nested model for visualization design by Munzner [19], we charac-
terized the problem domain. To meet the requirements and the demands of the
target audience, we consulted experts in gynecological cancer imaging, neurolog-
ical imaging, and machine learning. We identified application-related challenges
they face in their research practice. Cancer imaging is performed to assess tumors and metastases: in gynecological cancer imaging within the pelvic area, and in neurological imaging within the brain. Cancerous tissue is discernible because it
differs from its surrounding healthy tissue. Besides analysis of the extent and
size of the tumor, analyzing different sub-regions within a tumor may be of in-
terest. Finding abdominal lymph node metastases is a challenging task, as the
metastases have variable size, ranging from a few millimeters to sizes exceeding
the primary tumor. Metastases often share some of the characteristic imaging
features of the primary tumor. Based on our analysis we present the following
requirements for our interactive analysis application:
– R1: Visual analysis of multiparametric imaging data in a single view
– R2: Multiparametric inhomogeneity analysis
– R3: Comparing regions within the multiparametric imaging data
– R4: Comparing multiparametric imaging data of multiple subjects
– R5: Multiparametric similarity analysis based on a digital biopsy
– R6: Interactive parameter selection for automatic multiparametric segmentation tasks
By satisfying these requirements, we support gynecological imaging researchers,
neurological imaging experts and machine learning experts in their research or
clinical routine with the ultimate goal of improving patient care by providing
better diagnostic tools that can guide tailored and individual treatment strate-
gies.
5 ParaGlyder
In this section, we present our visualization and interaction design decisions
based on the requirement analysis. In Figure 2, we present the different compo-
nents of our method and their interplay. Our design combines spatial and non-
spatial visualizations, linked by a view that embeds a non-spatial visualization in a spatial context. Our approach consists of several visualization and interaction
methods for the interactive analysis of multiparametric data described in the
following.
5.1 Data Processing
Our method relies on multidimensional co-registered volumetric data. Our gyne-
cological cancer experts already deliver co-registered volumes due to the nature
of the data source. Co-registration is therefore not part of our application but
may be performed using state-of-the-art applications such as Elastix [14].
Fig. 2. The ParaGlyder application combines spatial (volumetric view) and non-spatial (radial boxplot) visualizations to enable multiparametric analysis and exploration. In between, the Stixels view depicts a combination of both.
Standard MRI imaging data cannot be converted to physical units and is therefore highly dependent on the scanner and sequences employed. In order to allow for comparison, normalization is required. In our application,
we perform two types of normaliza-
tion. When we use a slice view, we
normalize the data of the slice using a
min-max normalization of the selected
slice. In the 3D volume visualization,
we normalize the whole volume by us-
ing the min-max value of the volume.
This results in the most appropriate
normalization based on the tasks the
visualizations support.
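The following minimal sketch illustrates the two normalization modes described above; it is not taken from the ParaGlyder implementation, and the array shapes and names are illustrative only.

```python
import numpy as np

def min_max_normalize(values, eps=1e-8):
    """Scale values to [0, 1] using their own minimum and maximum."""
    vmin, vmax = values.min(), values.max()
    return (values - vmin) / max(vmax - vmin, eps)

# volume: one MRI parameter as a 3D array (z, y, x); placeholder data.
volume = np.random.rand(32, 256, 256).astype(np.float32)

# Slice view: normalize only the currently selected slice.
slice_index = 16
normalized_slice = min_max_normalize(volume[slice_index])

# 3D view: normalize using the min/max of the whole volume.
normalized_volume = min_max_normalize(volume)
```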
5.2 The Stixels View
Based on requirement R1, the goal is to raise the level of detail in the value
domain but still keep the details in the spatial resolution. To facilitate this, we
Fig. 3. The Stixels view reveals an inhomogeneous tumor in one subject (a) and a more
homogeneous tumor in another subject (b). The outline in red shows the tumor extent
for illustration purposes. A tooltip provides details on demand in a radial boxplot (c). The Stixels view reveals oedema in the brain after surgery (d).
employ a glyph map approach, presented in the middle of Figure 2, which is
called the Stixels (Star glyph pixels) view. The glyph map is based on a regular
grid which is overlaid on a 2D view of a slice. For every grid cell, we calculate
statistics of the multiparametric medical imaging data. The star glyphs are then
created by summarizing the statistics within each of the cells. The grid size and
the star glyph size can be adapted, depending on the granularity of the structure
of interest. By cropping the slice view to a region of interest, the glyph map adapts to the selection and allows for an even more detailed view of the selected structures. We use star glyphs instead of polyline-based glyphs since, according to Opach et al. [20], star glyphs are a better choice for finding differences. For
the star glyph design, we display the average value of each parameter within
the grid cell on the axes. The area described by connecting these points forms a
glyph which describes the relation of average parameters within the cell. When
designing a star glyph, a homogeneous shape is favorable [21,15]. Therefore,
the order in which the parameters are presented is adjustable. While even more
information could be encoded on the axes of the star glyph, we opted for a design
that is easier to interpret and presents all necessary information at a glance to
prevent a steep learning curve. The star glyph map provides an overview that allows the user to identify the tumor, since tumor tissue differs from healthy tissue in the multiparametric dimensions. In addition, the inhomogeneity of the tumor can be analyzed. When an interesting part of the tumor is spotted, the area can be investigated more closely using the interactive probing.
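As an illustration of the glyph-map computation, the sketch below averages each parameter within regular grid cells of a normalized multiparametric slice; one mean vector per cell would then define the axes of one star glyph. The function and data names are placeholders, not the actual implementation.

```python
import numpy as np

def stixel_means(multiparam_slice, cell_size):
    """Average each parameter within regular grid cells of a slice.

    multiparam_slice: array of shape (height, width, n_params),
    already normalized per parameter.
    Returns an array of shape (rows, cols, n_params) with one mean
    vector per grid cell; each vector would drive one star glyph.
    """
    h, w, p = multiparam_slice.shape
    rows, cols = h // cell_size, w // cell_size
    # Crop to a multiple of the cell size and average over each cell.
    cropped = multiparam_slice[:rows * cell_size, :cols * cell_size]
    cells = cropped.reshape(rows, cell_size, cols, cell_size, p)
    return cells.mean(axis=(1, 3))

# Example: a 256x256 slice with 5 parameters, 16x16-pixel Stixels.
slice_data = np.random.rand(256, 256, 5)
glyph_values = stixel_means(slice_data, cell_size=16)  # shape (16, 16, 5)
```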
5.3 3D Probing Visualization
Requirements R3 and R4 call for analyzing different parts of the tumor independently, enabling identification of tumor patterns. Probing spheres deliver
detailed information from data within selected regions. Regions of interest can
be specified by using multiple probing spheres. This enables a comparison of
different regions within the imaging data for a single patient, e.g., healthy tissue
and cancerous tissue. All voxels from all parameters within the spheres can be
used in the statistical analysis, like the approach used for the star glyph map. For
the visual encoding of the probed regions a radial boxplot is used. It shows the
user the summary statistics for selected regions of interest at a glance. Compar-
ison is enabled by the superposition of multiple radial boxplots. Radial boxplots
are favorable because they align with the use of star glyphs in the Stixels view
and represent a more detailed view of selected areas. Differences and similarities
Fig. 4. Volume probing using two different probing spheres (a) results in live updates
to the radial boxplot view (b). Probing interaction within another subject (c) results
in a radial boxplot comparing data across subjects (d).
over all modalities can be analyzed by placing multiple spheres either within
the data of a single patient or multiple patients. To establish visual correspon-
dence between the probing spheres and radial boxplots, both the spheres and
boxplot share the same color hue. Interactive probing can be used to define a
multiparametric pattern which describes different tumor characteristics based
only on imaging data and may also be found in other patients suffering from a
similar tumor type.
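A minimal sketch of such a digital biopsy, assuming the multiparametric volume is stored as a NumPy array: per-parameter quartiles and medians are gathered from all voxels inside the sphere, matching the summary statistics later shown in the radial boxplot. Names, shapes, and probe positions are illustrative only.

```python
import numpy as np

def probe_sphere(volume, center, radius):
    """Summarize a spherical probe as median and quartiles per parameter.

    volume: array of shape (z, y, x, n_params); center: (z, y, x) in voxels.
    Returns per-parameter 25th, 50th, and 75th percentiles, i.e. the
    values that could be plotted on the axes of a radial boxplot.
    """
    z, y, x = np.indices(volume.shape[:3])
    cz, cy, cx = center
    inside = (z - cz) ** 2 + (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2
    samples = volume[inside]                      # shape (n_voxels, n_params)
    q25, median, q75 = np.percentile(samples, [25, 50, 75], axis=0)
    return q25, median, q75

# Example: two probes in one subject, e.g. healthy vs. suspected tumor tissue.
volume = np.random.rand(32, 128, 128, 5)
healthy = probe_sphere(volume, center=(16, 40, 40), radius=6)
tumor = probe_sphere(volume, center=(16, 90, 90), radius=6)
```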
5.4 Interaction
To support requirements R3 and R4, various interaction methods are provided.
The placement of the probing sphere can be performed either in 2D images or
in 3D volumes. The size of the sphere can be adapted to fit the scale of the
region of interest, and the sphere can be placed freely. The quickest option is free placement, where the sphere is positioned at the closest visible point in the volume along a ray cast from the mouse position on the screen towards the volume. In addition to this quick initial placement of the sphere, we introduce a mode in which the sphere can only be translated within the X-Y plane it is currently located in. Another option adapts only the depth of the sphere along the Z-axis. When using a 2D
view, it may occur that the probing sphere is behind the current slice and thus
occluded. To remedy this, we provide an option to snap the sphere back to the
slice. To support working with brain data, placing a sphere that is automatically
mirrored to the other hemisphere is also possible.
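A simplified sketch of the free-placement mode described above, under the assumption that the closest visible point can be approximated by marching along the ray and stopping at the first voxel whose transfer-function opacity exceeds a threshold; the actual camera model and visibility test in ParaGlyder are not specified here, so all names and parameters are illustrative.

```python
import numpy as np

def first_visible_point(opacity, origin, direction, threshold=0.05, step=0.5):
    """March along a ray and return the first voxel whose opacity
    exceeds the threshold, i.e. an approximation of the closest visible point.

    opacity: 3D array of per-voxel opacities after transfer-function lookup.
    origin, direction: ray in voxel coordinates (direction need not be unit length).
    """
    direction = np.asarray(direction, dtype=float)
    direction /= np.linalg.norm(direction)
    pos = np.asarray(origin, dtype=float)
    shape = np.array(opacity.shape)
    while np.all(pos >= 0) and np.all(pos < shape - 1):
        voxel = tuple(np.round(pos).astype(int))
        if opacity[voxel] > threshold:
            return voxel  # place the probing sphere here
        pos += step * direction
    return None  # ray missed all visible structures

# Example: a ray entering the volume from the front face.
opacity = np.zeros((64, 64, 64))
opacity[30:40, 30:40, 30:40] = 1.0  # a visible block
hit = first_visible_point(opacity, origin=(0, 32, 32), direction=(1, 0, 0))
```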
5.5 Similarity Visualization
Requirements R5 and R6 state that a similarity analysis and an interactive parameter selection are beneficial in tumor analysis. Analyzing the tumor extent
and possible metastases in surrounding tissue is a typical task for radiologists.
In addition, segmentation of tumors is an active field of machine learning re-
search, where some algorithms require feature selection. To support these tasks,
we employ the multiparametric contents of a probed area in a similarity func-
tion. We decided to use the Euclidean distance over all dimensions because they are considered equally important. When applying this function to each multiparametric
voxel in the volume, we derive a new volume consisting of similarity values be-
tween 0 and 1 which can be displayed with an appropriate transfer function.
A transfer function that highlights regions of high similarity through color and
opacity enables users to identify structures such as tumors and possible metas-
tases and enables a visual clustering with soft boundaries. Metastases which
share the same imaging properties as the primary tumor are highlighted using
direct volume rendering. Editing the transfer function enables the user to ex-
plore the inhomogeneity (R2) and the extent of different parts of the tumor. In
addition, this similarity function-based visual encoding is also applied to the star
glyph map. The fact that the similarity is based on the user-selected parameters
enables the user to perform interactive feature selection (R6).
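A sketch of this similarity computation is given below. The Euclidean distance over the selected parameters follows the description above, while the mapping from distance to a similarity value in [0, 1] (one minus the distance scaled by its maximum) and the use of the mean probe vector as reference signature are assumptions made for illustration, not details confirmed by the paper.

```python
import numpy as np

def similarity_volume(volume, probe_mean, selected_params=None):
    """Per-voxel similarity to a probed signature.

    volume: array (z, y, x, n_params), normalized per parameter.
    probe_mean: mean parameter vector of the probed region (digital biopsy).
    selected_params: indices of the parameters to include (R6); all by default.
    Returns values in [0, 1]; the distance-to-similarity mapping below
    (1 - distance / max distance) is an assumption.
    """
    if selected_params is None:
        selected_params = np.arange(volume.shape[-1])
    diff = volume[..., selected_params] - probe_mean[selected_params]
    distance = np.sqrt(np.sum(diff ** 2, axis=-1))  # Euclidean, all dims equal
    return 1.0 - distance / distance.max()

# Example: probe signature from a small region, similarity over three of five maps.
volume = np.random.rand(32, 128, 128, 5)
probe_mean = volume[16, 90:96, 90:96].reshape(-1, 5).mean(axis=0)
sim = similarity_volume(volume, probe_mean, selected_params=[0, 2, 4])
```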
6 Results
The ParaGlyder application is depicted in Figure 1 and consists of a center view
(Figure 1B), which provides common spatial visualization features, such as a
3D view, 2D slice-based views, cropping, and transfer function editing, and a
probing functionality. Next to the main view, the Stixels view is located (Figure
1C), which consists of a 2D slice view and an overlaid glyph map consisting of star
glyphs. The last view is the probing view, component D in Figure 1. It consists of
a radial boxplot based on probing sphere input. We analyzed different datasets
of endometrial cancer patients, provided by one of our co-authors, as well as a publicly available brain tumor dataset provided by Schmainda and Prah [23]
via the Cancer Imaging Archive (TCIA) [5]. The endometrial cancer dataset
comprises standard multiparametric MR sequences and derived parameter maps
visualizing physical parameters, e.g., blood flow and plasma volume. The data is
co-registered due to its origin. For the brain tumor and inflammation data, we
have access to the standard parameters acquired in multiparametric MR, such
as T1-, T2- and diffusion-weighted images.
6.1 Tumor Detection and Multiparametric Homogeneity
Assessment
To detect tumors and assess their multiparametric homogeneity, the Stixels view
is used. The user selects the slice and the parameter to show. A detailed view of an individual Stixel is presented as a tooltip when the user hovers the mouse over it, as visualized in Figure 3c. In order to support
region of interest (ROI) selection, we employ volumetric cropping to select an
appropriate Stixel window. The grid of the Stixels adapts accordingly and then
probes smaller regions determined by the ROI. When placing a probing sphere,
the Stixels are colored by the multiparametric similarity, measured based on
Euclidean distance, using the Viridis colormap. The similarity Stixels view, vis-
ible in Figures 3a and 3b, additionally reveals the inhomogeneity of the tumor.
The red line marks the outline of the tumor and the color and shape variations
of the star glyphs represent the inhomogeneity within the primary tumor. In
Figure 3a, a tumor with a high degree of inhomogeneity is visible, while Fig-
ure 3b reveals a more homogeneous tumor. The inhomogeneity analysis enables
the user to spot distinct parts within the tumor, e.g., a necrotic core, and allows
for further analysis of these specific parts in detail.
6.2 Region Comparison for Tumor Characteristic Assessment
Probing spheres are used to analyze different parts within one patient or across
multiple patients. This probing interaction is conceptually similar to a digital
Fig. 5. The similarity view highlighting the uterine primary tumor in the center and
two metastatic lymph nodes (a). When an insufficient number of dimensions is selected,
the similarity view fails to capture the tumor and metastases (b). The similarity view
captures brain inflammation (c), while simple thresholding on one modality would
capture the skull as well.
biopsy. The result of the probing interaction is a radial boxplot, visible in component D in Figure 1. Figure 4a showcases placement of two spheres for a single subject, while Figure 4c shows a sphere placed in another subject to compare regions across subjects. The resulting radial boxplots are shown in Figures 4b and 4d. On each axis, the median
value is presented as a dot, and these dots are connected by lines. In addition to
the median value, the 25% and the 75% quantile ranges are visible as an overlaid
band. This representation allows the user to see the inhomogeneity of the data
within the sphere. The maximum values of the axes can be adapted to fit the
selected data range. The spheres are used to characterize tumor tissue and to derive specific signature shapes in the radial boxplot that can be used
to classify the imaging data of new patients. The interaction responsiveness is
ensured by providing a real-time update of the radar chart with the probed val-
ues of the volumetric multiparametric imaging data while the sphere is moved
interactively through the volume.
6.3 Similarity Visualization for Metastases Detection and Feature
Selection
The similarity view, visible in Figure 1B and Figure 5, visualizes the extent of
a tumor and potential nearby metastases. Figure 5(a) shows the similarity volume when using all multiparametric images and Figure 5(b) shows the similarity volume with only three out of five of the multiparametric images. The figure
shows that the three selected images do not contain enough information to seg-
ment the tumor and the metastases. The similarity-colored Stixels are presented in Figures 3a and 3b. For both approaches, the Viridis colormap is chosen as a transfer function,
where opacity is mapped to similarity, i.e., the visibility of regions that differ
from the current selection is reduced. In component B of Figure 1, the similarity
view of parameter maps of a patient with endometrial cancer is visible. This
similarity analysis enables a clear and distinct visualization of the tumor (the
lower right structure in the inset), by placing a probing sphere inside the tumor
tissue. Due to their multiparametric similarity, metastases in the lymphatic sys-
tem (structures to the left and above the primary tumor) are also highlighted.
When analyzing only one of the multiparametric images at a time, the detection of metastases is much more difficult because they are not clearly visible. When
probing inflammatory data within the brain, the similarity view provides a quick
segmentation of inflamed tissue. Unlike a standard thresholding operation based only on T2 FLAIR data, the segmentation does not include the bone, as visualized in Figure 5(c) and Figure 5(d). This demonstrates that the multiparametric sim-
ilarity function facilitates a rapid multiparametric segmentation, which could be
used in diagnosis or treatment planning, as well as feature analysis as input to
automatic segmentation methods in a machine learning context.
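As an illustration of the described color and opacity mapping, the sketch below converts similarity values to RGBA using the Viridis colormap, with opacity tied to similarity so that dissimilar regions fade out; the gamma exponent that shapes the opacity ramp is an illustrative choice rather than the transfer function actually used in ParaGlyder.

```python
import numpy as np
import matplotlib.pyplot as plt

def similarity_transfer_function(similarity, opacity_gamma=2.0):
    """Map similarity values in [0, 1] to RGBA for direct volume rendering.

    Color follows the Viridis colormap; opacity is tied to similarity so
    that dissimilar regions fade out. The gamma exponent is an
    illustrative choice, not taken from the paper.
    """
    viridis = plt.get_cmap("viridis")
    rgba = viridis(similarity)                  # shape (..., 4)
    rgba[..., 3] = similarity ** opacity_gamma  # overwrite the alpha channel
    return rgba

# Example: colorize a similarity slice for display.
similarity_slice = np.random.rand(128, 128)
colored = similarity_transfer_function(similarity_slice)  # (128, 128, 4)
```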
Table 1. The responses of the experts on a 5-point Likert scale. The values range from 1: Strongly disagree to 5: Strongly agree. Statements marked with a star were rephrased to present the positive form in this table, and the scores have been inverted accordingly. The rightmost column presents the average value over all experts, and the last row presents the result of the SUS questionnaire.
Statement N1 N2 N3 Gy1 Gy2 Gy3 Gy4 Gy5 M1 M2 M3 Avg.
G1 The linked interactions between the center view and the radar chart are well established and intuitive 3 5 4 5 5 5 5 3 5 5 4 4,45
G2 The linked interactions between the analyze view and the Stixel view are well established and intuitive 5 4 4 5 5 5 5 4 5 5 5 4,73
G3 I see myself using the MRI Explorer in the future* 3 5 3 5 5 3 5 5 5 4 4 4,27
G4 I would like to contribute in the future development of the application 5 5 5 5 5 4 5 3 3 5 4 4,45
G5 I can see the application as a part of my daily work routine* 3 5 2 5 4 1 3 5 4 1 1 3,09
G6 The application is more applicable for research than for daily clinical practice 3 4 5 5 4 5 4 3 5 4 2 4,00
G7 The application should be part of the software used in clinical practice* 1 5 1 3 4 3 5 5 4 4 5 3,64
P1 The navigation in 3D is easy to understand and I can place the sphere where I want* 5 4 2 2 5 5 5 4 5 5 5 4,27
P2 The resizing operation of the sphere is easy to understand and to carry out* 4 4 2 5 4 5 5 5 5 4 5 4,36
P3 I can place the sphere anywhere on the plane using the provided keyboard interactions 5 5 3 4 5 4 5 5 5 5 5 4,64
P4 Setting the probing sphere to a specific depth in the volume is intuitive 2 5 2 3 5 4 4 4 5 5 4 3,91
P5 Snapping the probing sphere to the current slice selection is useful 5 4 3 5 5 5 5 4 5 5 3 4,45
P6 The probing interaction is responsive* 3 5 4 5 5 5 5 5 5 5 5 4,73
P7 The automatic update of the Radar chart is beneficial* 5 5 4 5 5 5 4 5 5 5 5 4,82
P8 The radar chart helps me to interpret multimodal data 5 5 5 5 5 5 5 5 5 4 5 4,91
P9 I am confident in interpreting the values that the radar chart presents 3 5 3 4 5 3 4 3 5 5 3 3,91
P10 With the probing functionality, I am able to compare different regions within one subject* 5 5 3 4 5 5 5 5 4 5 4 4,55
P11 The probing function enables me to compare regions between different subjects 3 5 4 5 5 5 4 5 5 3 4 4,36
St1 I understand what the Stixels view shows me and can interpret the star glyphs used. 3 5 4 4 5 3 4 3 5 5 3 4,00
St2 The Stixels view helps me to gather insight of the inhomogeneity of the data* 5 5 3 5 5 5 5 5 5 3 4 4,55
St3 The cropping functionality helps me to focus the Stixel view on the most important region of the subject's data* 1 5 2 5 5 5 5 5 5 5 4 4,27
St4 The different grid sizes help me to first get an overview and add details on demand 5 5 4 5 5 4 5 5 4 5 4 4,64
St5 The tooltip helps me to see more details in the Stixels view when I need them 5 5 4 5 5 5 4 5 5 4 4 4,64
S1 I understand the color coding of the Stixels in terms of similarity* 3 5 4 5 5 5 5 5 5 5 5 4,73
S2 The similarity coloring of the Stixels helps me to adapt my probing selection 3 5 3 5 5 4 4 5 4 5 4 4,27
S3 The similarity volume visualization shows me interesting parts of the volumetric data 5 4 5 5 5 5 5 5 5 5 4 4,82
S4 The similarity view is useful to me and I would like to use it in my work routine/research* 5 5 5 5 5 5 5 5 5 1 4 4,55
Gys1 The application can improve the analysis of the inhomogeneity of gynecological cancer 5 5 5 5 5 5,00
Gys2 The application can support hypothesis generation for linking parameters with aggressiveness of gynecological cancer 5 5 5 5 5 5,00
Gys3 I would find this application useful when analyzing patients' gynecological cancer MR data* 5 5 5 5 5 5,00
Gys4 I would like to use this application to explain pathology and treatment to patients 4 5 1 3 3 3,20
Gys5 I would like to use this application to plan a biopsy for analyzing biomarkers of the tumor X 5 1 3 5 3,50
Gys6 The application is useful for finding metastases* 5 4 2 5 5 4,20
Gys7 The similarity view shows me the structure of the tumor 5 4 5 5 4 4,60
Gys8 The similarity view shows me the size and structure of possible metastases 5 5 3 5 5 4,60
Ns1 The application helps me to visualize lesions in the brain 5 4 1 3,33
Ns2 Comparing different regions within the brain using the comparison picker is particularly useful for me* 5 5 5 5,00
Ns3 The similarity view helps me to get a better volumetric view of the lesion* 5 5 4 4,67
Ns4 I would like to use this tool to further analyse multiparametric brain imaging data 3 5 4 4,00
Ns5 The interaction with the comparison tool is suitable for brain images 5 5 3 4,33
Ns6 The application helps me see the intensity relations of the different tissue types between modalities* 5 5 5 5,00
Ms1 The application helps me to carry out feature selection prior to applying my machine learning algorithms 4 4 3 3,67
Ms2 I find the similarity view useful to identify which modalities are important for me* 4 4 5 4,33
Ms3 I can imagine using this tool before applying machine learning algorithms* 5 4 4 4,33
Ms4 This application is particularly useful for segmentation based on machine learning 5 4 5 4,67
SUS System usability scale result 97,5 85 40 75 85 80 87,5 80 95 92,5 85 81,75
7 Evaluation
We conducted a qualitative evaluation with eleven experts (6 male, 5 female)
from the scientific fields of neurological imaging (N1-3), gynecological cancer
imaging (Gy1-5), and machine learning research (M1-3). One expert is a co-author and provided us with clinical data of gynecological cancer patients, and one expert from each domain (N1, Gy2, M2) was involved in interviews during the
development of our application. We were especially interested in validating the ef-
fectiveness of the various visualization components and identifying opportunities
to make our application more suitable for daily research or even clinical practice.
The individual evaluation started with a short demonstration of the tool; afterwards, experts were encouraged to explore and analyze the multiparametric data
themselves. They were invited to comment using a think-aloud protocol. The gy-
necological cancer and machine learning experts worked with endometrial cancer
data and the neurological imaging experts with data provided by Schmainda and
Prah [23] via the Cancer Imaging Archive (TCIA) [5]. After this phase, which
lasted around 30 minutes, we conducted a semi-structured interview with the
experts. Finally, a questionnaire consisting of 27 generally applicable statements and 4-8 targeted statements for the different expert groups was administered. The
experts were asked to indicate their level of agreement using a five-point Likert
scale. In addition to our targeted evaluation form, we asked the experts to fill
out the System Usability Scale (SUS) questionnaire by Brooke [3]. The evaluation
results of the eleven participants are shown in Table 1.
We conclude from the results presented in Table 1 that the application is
overall valuable for the experts. The probing interaction was rated favorably, although two participants would appreciate a guided 3D placement of the probe. All study participants think that the Stixels view helps them to see inhomogeneous regions within the slice view. The similarity view received the most positive feedback
and is potentially useful for all involved experts. The targeted statements demon-
strate that the application is applicable in different scenarios, albeit for different
reasons. The gynecological experts envision that the application could improve
the assessment of tumor heterogeneity both in primary tumors and metastases.
The SUS scores range from 40 to 97.5, where the second lowest score is 75. On average, the SUS score is 81.75. According to Bangor et al. [1], this score can be interpreted as between good and excellent.
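For reference, the standard SUS scoring that yields values such as 81.75 is sketched below: odd-numbered (positive) items contribute their rating minus one, even-numbered (negative) items contribute five minus their rating, and the sum is multiplied by 2.5.

```python
def sus_score(responses):
    """Compute the System Usability Scale score for one participant.

    responses: ten ratings on a 1-5 scale, in questionnaire order.
    Odd-numbered (positive) items contribute (r - 1), even-numbered
    (negative) items contribute (5 - r); the sum is scaled by 2.5,
    yielding a score between 0 and 100.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Example: answering 4 to every positive and 2 to every negative item gives 75.
print(sus_score([4, 2] * 5))  # 75.0
```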
8 Conclusion and Future Work
We present ParaGlyder, a multiparametric image visualization tool. The tool
provides different views for tumor detection, inhomogeneity analysis, feature se-
lection, and diagnosis in multiparametric medical images, by a tight coupling
of spatial and non-spatial data visualization techniques. Our tool is based on a
combination of star glyph maps and radar charts. A built-in similarity visualiza-
tion of the volumetric data enables the visualization of, e.g., primary tumor and
the corresponding metastases. The qualitative evaluation confirmed the utility
of our application for diverse application areas. In the future, we plan to extend
our approach to the analysis of larger patient cohorts in order to assess whether
this visualization tool could aid in the detection of metastases. Furthermore, the
application has the potential to unravel patient-specific imaging features that
may be linked to specific clinical phenotypes and outcomes, thus representing a
promising tool to facilitate more personalized treatment strategies.
References
1. Bangor, A., Kortum, P., Miller, J.: Determining what individual SUS scores mean:
Adding an adjective rating scale. J. Usability Studies 4(3), 114–123 (May 2009)
2. Berg, A., Fasmer, K.E., Mauland, K.K., Ytre-Hauge, S., Hoivik, E.A., Husby, J.A.,
Tangen, I.L., Trovik, J., Halle, M.K., Woie, K., Bjørge, L., Bjørnerud, A., Salvesen,
H.B., Werner, H.M.J., Krakstad, C., Haldorsen, I.S.: Tissue and imaging biomark-
ers for hypoxia predict poor outcome in endometrial cancer. Oncotarget 7(43),
69844–69856 (2016). https://doi.org/10.18632/oncotarget.12004
3. Brooke, J.: SUS – a quick and dirty usability scale. In: Jordan, P., Thomas, B., McClelland, I., Weerdmeester, B. (eds.) Usability Evaluation In Industry, chap. 10, pp. 266–290. CRC Press (2004)
4. Bruckner, S., Solteszova, V., Gröller, E., Hladuvka, J., Bühler, K., Yu, J., Dickson, B.: BrainGazer – visual queries for neurobiology research. IEEE Transactions on Visualization and Computer Graphics 15(6), 1497–1504 (2009). https://doi.org/10.1109/TVCG.2009.121
5. Clark, K., Vendt, B., Smith, K., Freymann, J., Kirby, J., Koppel, P., Moore, S.,
Phillips, S., Maffitt, D., Pringle, M., Tarbox, L., Prior, F.: The cancer imaging
archive (tcia): Maintaining and operating a public information repository. Journal
of Digital Imaging 26(6), 1045–1057 (Dec 2013). https://doi.org/10.1007/s10278-
013-9622-7
6. Fasmer, K.E., Bjørnerud, A., Ytre-Hauge, S., Grüner, R., Tangen, I.L., Werner, H.M., Bjørge, L., Salvesen, Ø.O., Trovik, J., Krakstad, C., Haldorsen, I.: Preoperative quantitative dynamic contrast-enhanced MRI and diffusion-weighted imaging
predict aggressive disease in endometrial cancer. Acta Radiologica 59(8), 1010–
1017 (2018). https://doi.org/10.1177/0284185117740932
7. Friendly, M.: A.-M. Guerry’s Moral Statistics of France: Challenges for
multivariable spatial analysis. Statistical Science 22(3), 368–399 (2007).
https://doi.org/10.1214/07-STS241
8. Gleicher, M., Albers, D., Walker, R., Jusufi, I., Hansen, C.D., Roberts, J.C.: Visual
comparison for information visualization. Information Visualization 10(4), 289–309
(2011). https://doi.org/10.1177/1473871611416549
9. Haldorsen, I.S., Stefansson, I., Grüner, R., Husby, J.A., Magnussen, I.J.,
Werner, H.M.J., Salvesen, Ø.O., Bjørge, L., Trovik, J., Taxt, T., Akslen,
L.A., Salvesen, H.B.: Increased microvascular proliferation is negatively cor-
related to tumour blood flow and is associated with unfavourable outcome
in endometrial carcinomas. British Journal of Cancer 110(1), 107–114 (2014).
https://doi.org/10.1038/bjc.2013.694
10. Haldorsen, I.S., Grüner, R., Husby, J.A., Magnussen, I.J., Werner, H.M.J.,
Salvesen, Ø.O., Bjørge, L., Stefansson, I., Akslen, L.A., Trovik, J., Taxt, T.,
Salvesen, H.B.: Dynamic contrast-enhanced MRI in endometrial carcinoma identi-
fies patients at increased risk of recurrence. European Radiology 23(10), 2916–2925
(Oct 2013). https://doi.org/10.1007/s00330-013-2901-3
11. Haldorsen, I.S., Salvesen, H.B.: What is the best preoperative imaging
for endometrial cancer? Current Oncology Reports 18(4), 25 (Feb 2016).
https://doi.org/10.1007/s11912-016-0506-0
12. Jäckle, D., Fuchs, J., Keim, D.A.: Star glyph insets for overview preservation
of multivariate data. IS and T International Symposium on Electronic Imag-
ing Science and Technology pp. 1–9 (2016). https://doi.org/10.2352/issn.2470-
1173.2016.1.vda-506
13. Jönsson, D., Bergström, A., Forsell, C., Simon, R., Engström, M., Ynnerman, A., Hotz, I.: A visual environment for hypothesis formation and reasoning in studies with fMRI and multivariate clinical data. In: Kozlíková, B., Linsen, L., Vázquez, P.P., Lawonn, K., Raidou, R.G. (eds.) Eurographics Workshop on Vi-
sual Computing for Biology and Medicine. The Eurographics Association (2019).
https://doi.org/10.2312/vcbm.20191232
14. Klein, S., Staring, M., Murphy, K., Viergever, M.A., Pluim, J.P.W.: Elastix: A tool-
box for intensity-based medical image registration. IEEE Transactions on Medical
Imaging 29(1), 196–205 (Jan 2010). https://doi.org/10.1109/TMI.2009.2035616
15. Klippel, A., Hardisty, F., Weaver, C.: Star plots: How shape characteristics influ-
ence classification tasks. Cartography and Geographic Information Science 36(2),
149–163 (2009). https://doi.org/10.1559/152304009788188808
16. Lawonn, K., Smit, N., Bühler, K., Preim, B.: A survey on multimodal med-
ical data visualization. Computer Graphics Forum 37(1), 413–438 (2017).
https://doi.org/10.1111/cgf.13306
17. Malik, M.M., Heinzl, C., Gröller, M.E.: Comparative visualization for parame-
ter studies of dataset series. IEEE Transactions on Visualization and Computer
Graphics 16(5), 829–840 (Sep 2010). https://doi.org/10.1109/TVCG.2010.20
18. Mlejnek, M., Ermes, P., Vilanova, A., van der Rijt, R., van den Bosch, H., Ger-
ritsen, F., Gröller, M.E.: Profile flags: a novel metaphor for probing of T2 maps. In: C. T. Silva, E. Gröller, H.R. (eds.) Proceedings of IEEE Visualization 2005. pp.
599–606. IEEE CS (Oct 2005)
19. Munzner, T.: A nested model for visualization design and validation. IEEE Trans-
actions on Visualization and Computer Graphics 15(6), 921–928 (2009)
20. Opach, T., Popelka, S., Dolezalova, J., Rød, J.K.: Star and polyline
glyphs in a grid plot and on a map display: which perform better?
Cartography and Geographic Information Science 45(5), 400–419 (2018).
https://doi.org/10.1080/15230406.2017.1364169
21. Peng, W., Ward, M.O., Rundensteiner, E.A.: Clutter reduction in multi-
dimensional data visualization using dimension reordering. Proceedings -
IEEE Symposium on Information Visualization, INFO VIS pp. 89–96 (2004).
https://doi.org/10.1109/INFVIS.2004.15
22. Ropinski, T., Oeltze, S., Preim, B.: Visual computing in biology and
medicine: Survey of glyph-based visualization techniques for spatial mul-
tivariate medical data. Comput. Graph. 35(2), 392–401 (Apr 2011).
https://doi.org/10.1016/j.cag.2011.01.011
23. Schmainda, K., Prah, M.: Data from brain-tumor-progression (2018).
https://doi.org/10.7937/K9/TCIA.2018.15quzvnb
24. Smit, N.N., Kraima, A.C., Jansma, D., Ruiter, M.C.d., Botha, C.P.: A Unified Rep-
resentation for the Model-based Visualization of Heterogeneous Anatomy Data. In:
Meyer, M., Weinkauf, T. (eds.) EuroVis - Short Papers. The Eurographics Associ-
ation (2012). https://doi.org/10.2312/PE/EuroVisShort/EuroVisShort2012/085-
089
25. Smit, N.N., Haneveld, B.K., Staring, M., Eisemann, E., Botha, C.P., Vilanova,
A.: RegistrationShop: An Interactive 3D Medical Volume Registration System.
In: Viola, I., Buehler, K., Ropinski, T. (eds.) Eurographics Workshop on Vi-
sual Computing for Biology and Medicine. The Eurographics Association (2014).
https://doi.org/10.2312/vcbm.20141193
26. Stoppel, S., Hodneland, E., Hauser, H., Bruckner, S.: Graxels: Information rich
primitives for the visualization of time-dependent spatial data. In: Eurographics
Workshop on Visual Computing for Biology and Medicine. pp. 183–192 (sep 2016).
https://doi.org/10.2312/vcbm.20161286
27. Wickham, H., Hofmann, H., Wickham, C., Cook, D.: Glyph-maps for visually
exploring temporal patterns in climate data and models. Environmetrics 23(5),
382–393 (2012). https://doi.org/10.1002/env.2152