Immersive Analytics
Tom Chandler, Maxime Cordeil, Tobias Czauderna, Tim Dwyer, Jaroslaw Glowacki, Cagatay Goncu,
Matthias Klapperstueck, Karsten Klein, Kim Marriott, Falk Schreiber and Elliot Wilson
Faculty of IT
Monash University
Email: {tom.chandler,max.cordeil,tobias.czauderna,tim.dwyer,jaroslaw.glowacki,cagatay.goncu,
Abstract—Immersive Analytics is an emerging research thrust
investigating how new interaction and display technologies can
be used to support analytical reasoning and decision making.
The aim is to provide multi-sensory interfaces that support
collaboration and allow users to immerse themselves in their
data in a way that supports real-world analytics tasks. Immersive
Analytics builds on technologies such as large touch surfaces,
immersive virtual and augmented reality environments, sensor
devices and other, rapidly evolving, natural user interface devices.
While there is a great deal of past and current work on improving
the display technologies themselves, our focus in this position
paper is on bringing attention to the higher-level usability and
design issues in creating effective user interfaces for data analytics
in immersive environments.
We are living in the era of big data [1], [2]. The amount
and complexity of the data available for analysis in the
physical, life and social sciences—as well as that available
for guiding business and government decision making—is
growing exponentially. Several overlapping research fields are
concerned with the development of methods to support the
analysis of such data; particularly: information visualisation,
scientific visualisation, machine learning, data mining, data
science, human computer interaction and visual analytics.
Visual Analytics was introduced a decade ago as “the
science of analytical reasoning facilitated by interactive visual
interfaces” [3]. Techniques and practices developed by this
discipline are now common-place in (for example): business
intelligence; transport and logistics; scientific applications in-
cluding astronomy, biology and physics; environmental moni-
toring and personal information management. It is now a key
technology for dealing with big data [4], and there is massive
potential for its use in further emerging application areas like
health informatics and many others.
The definition for visual analytics given above is agnostic
of the actual interface devices employed by visual analysis
systems. Nevertheless, the affordances of display and input
devices used for analysing data strongly affect the experience
of the users of such systems. It is now well understood that
user experience in interactive systems strongly affects their
degree of engagement and, hence, productivity [5]. Ultimately,
therefore, it affects the adoption and ubiquity of data analysis
tools. For practical visual analysis tools used in industry
and in the areas of data science described above, the platform for
interaction is almost always a standard desktop computer: a
single average-sized display, keyboard and mouse.
In recent years we have seen significant advances in the
development of new technologies for human-computer inter-
faces. On one hand, we have seen the development of expen-
sive bespoke technologies mainly for scientific visualisation.
For example, immersive environments like a Cave Automatic
Virtual Environment (CAVE) [6] can make use of ultra-high
resolution technology, and combine 2D and 3D visualisations
to allow users to immerse themselves in computer generated
scenes. A classic use-case for the CAVE is “walk-throughs”
of human-scale architectural and engineering models [7].
On the other hand, we have also seen the development of
inexpensive commodity technologies—such as the Leap Motion,
Kinect, Oculus Rift and (soon) Microsoft HoloLens—for
providing natural user interfaces, virtual and augmented
reality. Industry—in particular the entertainment industry—
has driven this development and it has progressed rapidly,
providing engaging and immersive experiences for a fraction
of the cost of the CAVE. As a consequence, voice- and gesture-
based control as well as 3D visualisation are becoming more
and more a part of everyday life, and corresponding devices are
becoming available and affordable for small businesses and the
general public. The amount of new, low-cost interaction tech-
nology hitting the market is unprecedented, and given the rise
of new manufacturing technologies (additive manufacturing)
and business models (crowdfunding, etc.) the number of new
and affordable display and interaction devices is only going to grow.

Visual analytics and information visualisation researchers
sometimes publish studies exploring various data visualisation
applications in 3D, for example [8]. However, this has never
been a core topic in information visualisation. Meanwhile,
the rise in popularity of consumer touch-screen devices has,
arguably, led to an explosion in research visualisation tools
incorporating multitouch interaction. In other words, touch is
an example of a recent natural user-interface technology reaching
maturity and acceptance in the information visualisation
community; a compelling research agenda on this topic is
put forward by Isenberg et al. [9]. Yet, to date,
3D user interface researchers have tended to focus more on
the lower-level challenges such as making gesture recognition,
head-tracking and rendering techniques robust and reliable.
This is understandable, given the significant algorithmic and
hardware challenges involved. The higher-level concerns of
applications and effective user experience, let alone the use
of this technology in data analysis applications, have lagged
behind. A systematic approach to developing practical
visual analysis tools in immersive environments is therefore lacking.

Fig. 1. A use-case to be explored by immersive analytics: collaborative data analysis in augmented and virtual reality. (a) Wearable display devices that can
augment the world around us with arbitrary text and graphics enable a new degree of flexibility in data analysis. In this concept sketch, two people in the same room
(Peter and Sarah) discuss complex data. They can summon visual representations into one another's field of view with a simple hand gesture or voice command.
Just as information graphics can be overlaid on their environment, a third person (John) can join the meeting and be given a virtual presence. At John's end,
he also sees Peter and Sarah as if they were physically in the same room, together with the data visuals. The devices to at least prototype such experiences are
now cheap and reliable enough that we can focus on the application and ergonomics rather than on underlying technologies such as motion tracking and image
registration. (b) Wearable VR headsets can immerse us in our data: a prototype using the Oculus Rift that we are building to try to realise this scenario.
We propose a new facet of data analytics research that seeks
to unify these efforts to identify the most enabling aspects of
these emerging natural user interface and augmented reality
technologies for real-world analysis of data. We call this new
research thrust Immersive Analytics, a topic that will explore
the applicability and development of emerging user-interface
technologies for creating more engaging and immersive expe-
riences and seamless workflows for data analysis applications.
Immersive Analytics builds on technologies such as large
touch surfaces, immersive virtual and augmented reality envi-
ronments like Oculus Rift and CAVE2 [6], and tracking de-
vices such as Leap Motion and Kinect to provide environments
beyond the classical monitor, keyboard and mouse desktop
configuration [10]. The environments we envision, developed
by immersive analytics researchers, will be usable
by experts and analysts for the detailed analysis of
complex, big data sets. However, they will also be accessible to
decision makers—that is, the managers who spend more time
working face-to-face with others than in front of a desktop
computer—and to the everyday public to help them in tasks
like shopping, understanding public transport and social and
political issues affecting their lives.
In this position paper, we first give examples of the current
state of main stream research into visual analytics, virtual
reality and collaboration. We then present immersive analytics
by describing near-future scenarios and a number of projects
that are currently being developed by the Immersive Analytics
research group at Monash University. These projects are all
striving to move beyond traditional desktop visualisation and
provide a much richer, more immersive experience. We finish
by presenting a number of research questions that we feel this
new field should address.
Research on immersive analytics comprises the develop-
ment and evaluation of innovative interfaces and devices, as
well as the corresponding interaction metaphors and visual-
isations that allow these to be used for understanding data
and for decision making. Immersive analytics is therefore
inherently multi-disciplinary and involves researchers from
human-computer interaction, visual analytics, augmented re-
ality, and scientific and information visualisation. Relevant
conferences include, for instance, HCI (Human Computer
Interaction), 3DUI (3D User Interfaces), VR (Virtual Reality),
AR (Augmented Reality), CSCW (Computer Supported
Cooperative Work), TableTop (Interactive Tabletops and
Surfaces), Vis and InfoVis, and VAST (Visual Analytics
Science and Technology). At some of these conferences we
are already seeing workshops and special sessions that
address aspects of immersive analytics, e.g. "Death of the
Desktop" and "3DVis". However, at this stage there is no
coordinated exploration of immersive analytics.

Fig. 2. "Immersing people in their data" does not necessarily involve 3D or stereo display. (a) In this concept, analysts work in a purpose-built, collaborative
data analytics room. Interaction technologies such as pen, touch and gesture control allow them to interact directly with their data and collaborate in a more
egalitarian way than keyboards and mice, which tether individual users to a desktop. Immersive analytics will systematically research how many kinds of
emerging interaction technologies can be harnessed to engage and enable people to work together to better understand data. (b) A prototype
that we are building using the Monash CAVE2 immersive visualisation facility to try to realise this scenario. (c) Ultimately, the so-called "ContextuWall"
project is intended to enclose a natural space where people can work collaboratively, their discussions supported by the contextual display of information at
high resolution on the CAVE2 wall.

During the last three decades, research in Virtual Reality
Environments (VREs) has focused on the use of head-mounted
displays (HMDs) and CAVEs to immerse users in
data graphics. VREs have proven their effectiveness in many
scientific applications such as brain tumour analysis [11],
archaeology [12], [13], geographic information systems [14],
geosciences [15], [16] or physics [17]. These studies focused
on the perception of abstract data visualisation for a single user.
VREs have also been studied to support collaborative work,
showing benefits for collaborative tasks such as
puzzle solving [18], navigation with individualised views [19],
or complex manipulations such as moving a ring along a
U-shaped hoop [20]. Szalavári et al. [21] developed a collaborative
augmented reality system with see-through HMDs in order
to support collaborative scientific visualisation. They observed
that their system seemed superior to a classic desktop
environment, but did not provide formal results.
Huang et al. [22] investigate Personal Visual Analytics
(PVA): visual analytics used within a personal context,
though not restricted to personal data. They describe a design space
for PVA and present a literature survey on the topic. This
application is a natural fit for augmented reality as headsets
become lighter and less obtrusive.
Collaboration can play an important role in information
visualisation by allowing groups of humans to make sense of
data [23], [24]. Collaboration might also be the key feature to
successfully understanding big and complex data [6], [25]. Yet
little research has focused on collaborative and immersive envi-
ronments for abstract visualisation. Mahyar and Tory explored
how communication and coordination can be supported to
facilitate synchronous collaborative sensemaking activities in
Visual Analytics [26]. Recently, Donalek et al. [25] published a
progress report of the exploration of VR as a collaborative plat-
form for information visualisation. They provided a descrip-
tion of iVIZ, a web-distributed collaborative VR visualisation
system that supports the Oculus Rift. Their studies are still
at an exploratory level and thus the authors did not provide
evidence of how effective this system is for collaborative
visualisation of big and complex data. Telearch [12], a virtual
reality system for collaborative archaeology, and Shvil [27],
an augmented reality system for collaborative land navigation,
both support distributed collaboration. However, both systems
are designed for specific use cases. A further prominent use of
augmented reality is in industrial applications, e.g. to support
asynchronous collaboration [28].
To conclude, VREs featuring 3D stereoscopic vision, high
resolution and head tracking can provide considerable benefits
for collaborative visual analytics. The current technologies each
have their own advantages and disadvantages. For example, the
CAVE2 supports collaborative work of a considerable group of
people in a large room (diameter 8m), with a 3D stereoscopic
display. Yet the CAVE2 provides head tracking for only a
single user. Conversely, HMDs such as the Oculus Rift provide
3D stereoscopic vision with head tracking for every user, but
prevent users from moving freely in space since their HMD is
tethered to a desktop machine. Despite these limitations, the
current technologies are now adequate to allow us to explore
a design space for immersive data exploration. The following
section explores several use-cases from researchers at Monash
University that exploit these emerging display technologies—
but also other media and modalities—to allow people to
experience their data “immersively”.
In this section we introduce what we mean by immersive
analytics using a number of examples.
The most obvious developments in technology that allow
immersive analytics are the new low-cost virtual reality HMDs
such as Oculus Rift, augmented reality HMDs such as Google
Glass or (soon) Microsoft HoloLens, as well as augmented
reality through mobile phones or tablets. These support immersive
3D data visualisation as well as allowing data visualisation to
be physically embedded in the real world. In addition, there is
great potential to support distributed collaboration. A serious
issue is interaction: motion and gesture tracking devices like
the Kinect and Leap Motion provide one possible approach.
Figure 1 gives a potential use case for such devices that can
be explored by immersive analytics.
An obvious and already popular use for ‘immersive’ dis-
play technology is in constructing virtual environments that
allow walk-through and ‘fly-over’ of (for example) engineering
or architectural designs. This technology is being put to novel
use by the Monash Immersive Analytics lab in reconstructing
ancient artefacts for archaeological analysis. Figure 3 shows
a virtual reconstruction of the ancient Angkor site. However,
our visualisation is not just about reconstructing a view of the
site, but rather reconstructing activity in and around the site
using multi-agent technology. We explore data gathered from
this simulation using visual analytic techniques such as over-
laid heatmaps of activity, e.g. Figure 3(b). Besides supporting
the evaluation of hypotheses, e.g. on the settlement density,
detailed 3D interactive animations like the Angkor virtual
reconstruction can also be used to disseminate knowledge, e.g.
for educational purposes [29].
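To make the heatmap idea concrete, the aggregation step can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: the function name `activity_heatmap` and the sample agent data are hypothetical, and only the binning of simulated agent positions into a 2D activity grid is shown.

```python
import numpy as np

def activity_heatmap(positions, extent, bins=64):
    """Bin simulated agent positions into a 2D activity grid.

    positions: (N, 2) array of agent (x, y) coordinates from a
    multi-agent simulation; extent: (xmin, xmax, ymin, ymax) of the site.
    Returns a bins x bins array normalised to [0, 1], suitable for
    rendering as a semi-transparent overlay on the 3D reconstruction.
    """
    xmin, xmax, ymin, ymax = extent
    grid, _, _ = np.histogram2d(
        positions[:, 0], positions[:, 1],
        bins=bins, range=[[xmin, xmax], [ymin, ymax]])
    peak = grid.max()
    return grid / peak if peak > 0 else grid

# Hypothetical example: 10,000 agents clustered around a central complex.
rng = np.random.default_rng(0)
pts = rng.normal(loc=[500.0, 500.0], scale=120.0, size=(10_000, 2))
heat = activity_heatmap(pts, extent=(0, 1000, 0, 1000))
```

The normalised grid can then be mapped to colour and opacity and draped over the terrain mesh in the renderer.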
The term ‘immersive’ suggests 3D virtual reality style dis-
plays. However, this is only one possible direction for research.
We see touchscreens that allow direct interaction with data as
being more natural, and therefore more 'immersive', than a mouse
and desktop display. There has been exciting research, for
example, about the natural collaborative experiences supported
by tabletop interfaces [9]. Large displays utilising projectors
or tiled displays have been in use for many years, with
particular applications in command and control centres,
immersive scientific visualisation, and immersive display of
engineering and architectural designs. Next-generation CAVEs
provide higher resolution and contrast than their predecessors.
The rapidly decreasing cost of video projectors and of 3D and
Ultra-High-Definition (4K) monitors that can be combined in
tiled displays mean that CAVEs and 'mini-CAVEs' are now
affordable for widespread use in research groups and industry.
Combining large displays with touch or spatial tracking can
support multiple users in a way that allows them to collaborate
in a seamless way that ‘immerses’ them in their data. Figure 2
explores a use-case exploiting such devices. Another driver for
immersive analytics is that scientists, engineers, architects and
analysts are increasingly trying to understand heterogeneous
data. Some components are physical and have a natural 3D
representation while other components are more abstract and
are better visualised in 2D. The development of cheap 3D
display technology means that it is now possible to provide
hybrid 2D/3D visualisations, for example by combining
3D-tracked monitors such as zSpace with 2D displays.

Fig. 3. (a) A virtual reconstruction of the ancient Cambodian town and temple of Angkor. Our visualisation is not just about reconstructing a view of the site,
but rather reconstructing activity in and around the site using multi-agent technology. (b) We explore data gathered from this simulation using visual analytic
techniques such as overlaid heatmaps of activity.
Immersive analytics is not only about visualisation. While
vision is our most important sense for quickly exploring our
environment, sound and touch are also powerful ways of
understanding the world. Their use in data analytics has largely
been overlooked. A particularly important application is for
presenting data to people who are blind or have severe vision
impairment. One of the most disabling consequences of vision
impairment is lack of access to information: the development
of audio and haptic presentation devices for such users is a
potential application area of immersive analytics with huge
social benefit. Figure 4 shows the GraVVITAS system [30]
developed at Monash University to enable blind people to
interact with information graphics through touchscreen devices
with sonic and haptic feedback.
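One common way to give such sonic feedback is to map the data value under the user's finger to the pitch of a tone, so that tracing a line chart produces a rising or falling sound. The sketch below illustrates that general mapping only; it is not GraVVITAS's actual scheme, and the function names are hypothetical.

```python
import numpy as np

def value_to_frequency(value, vmin, vmax, fmin=220.0, fmax=880.0):
    """Map a data value linearly onto a tone frequency in [fmin, fmax] Hz."""
    t = (value - vmin) / (vmax - vmin)
    return fmin + t * (fmax - fmin)

def tone(freq_hz, duration_s=0.1, sample_rate=44_100):
    """Synthesise a short sine-wave feedback tone for the touched point."""
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    return np.sin(2 * np.pi * freq_hz * t)

# Touching the midpoint of a 0..100 series yields a mid-range pitch.
f = value_to_frequency(50.0, 0.0, 100.0)   # 550.0 Hz
samples = tone(f)
```

In an interactive system the touch event would supply the data value and the resulting samples would be streamed to the audio device, optionally alongside vibrotactile feedback.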
Despite much progress with technologies for creating
immersive virtual and augmented reality experiences, until
we have perfect brain-computer interfaces a gap will remain
between the tangibility of real physical objects and virtual
objects rendered in immersive environments. However, it is
already possible to experiment with “physicalisations” of data
in order to study the benefits of “perfect” immersion. Indeed,
people have been constructing physical representations of data
for some time. An early and famous data physicalisation was a
tangible matrix display by Bertin [31]. Bertin was an advocate
of reordering tabular data displays to better show high-level
structure (such as clustering). His physicalisations featured
crude interactivity, allowing people to physically reorder matrix
rows of dots mounted on skewers.

Fig. 4. Immersion is not just visual. Sound and haptic feedback can make
information graphics and data analytics applications accessible to people with
severe vision impairment.
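The effect of Bertin's reordering can be sketched in software. The snippet below uses a crude barycenter heuristic (Bertin reordered his matrices by hand, so this is an illustrative stand-in, and it assumes every row and column contains at least one mark): permuting rows and columns can turn a scrambled 0/1 matrix into visible blocks, i.e. clusters.

```python
import numpy as np

def reorder(matrix):
    """Sort rows, then columns, of a 0/1 matrix by the mean position
    (barycenter) of their marks, to surface block structure."""
    m = np.asarray(matrix)
    row_key = [np.nonzero(r)[0].mean() for r in m]   # each row needs a mark
    m = m[np.argsort(row_key)]
    col_key = [np.nonzero(c)[0].mean() for c in m.T]  # each column too
    return m[:, np.argsort(col_key)]

scrambled = [[1, 0, 1, 0],
             [0, 1, 0, 1],
             [1, 0, 1, 0],
             [0, 1, 0, 1]]
tidy = reorder(scrambled)  # two 2x2 blocks emerge on the diagonal
```

The same permutation indices could drive a physical display, which is exactly what Bertin's skewer-mounted rows made possible by hand.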
Another data physicalisation is shown in Figure 5(a).
This physicalisation of inflation vs. unemployment data for
OECD countries over ten years was an early experiment
by a current member of the Monash Immersive Analytics
team in avoiding the limitations of digital display and haptic
technology while testing the cognitive effectiveness of 3D data
spatialisation [32]. In 2003 construction of this physicalisation
was a time-consuming process, constrained by the materials
and tools available. In 2015 digital fabrication is emerging as a
technology with huge potential to allow people to engage with
data in new ways [33]. Machines such as 3D-printers, laser
cutters and CNC-routers have all fallen massively in price in
recent years making it easier than ever for people to create
physical representations of their data automatically. Figure
5(b) shows a data physicalisation recently produced in our lab
very quickly using laser-cutter and vinyl-printer/cutter devices.
Furthermore, with such automatic fabrication techniques we
can produce physicalisations of large data-sets with ease.
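As a toy illustration of such an automatic pipeline (our lab's actual tooling differs; the function name and dimensions here are invented for the example), a data series can be turned directly into a cut file: the SVG below describes one acrylic layer of a bar-chart physicalisation that a laser cutter can cut.

```python
def bars_to_svg(values, bar_width_mm=10.0, max_height_mm=50.0):
    """Emit an SVG of cut rectangles, one bar per data value,
    with bar heights scaled so the largest value fills max_height_mm."""
    peak = max(values)
    rects = []
    for i, v in enumerate(values):
        h = max_height_mm * v / peak
        x = i * bar_width_mm
        rects.append(
            f'<rect x="{x:.1f}" y="{max_height_mm - h:.1f}" '
            f'width="{bar_width_mm:.1f}" height="{h:.1f}"/>')
    width = len(values) * bar_width_mm
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{width:.1f}mm" height="{max_height_mm:.1f}mm">'
            + "".join(rects) + "</svg>")

svg = bars_to_svg([3.0, 7.0, 5.0, 10.0])  # four bars, tallest 50 mm
```

Stacking one such layer per data series (or per level of a group hierarchy, as in Figure 5(b)) yields a complete physicalisation with essentially no manual construction.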
Immersive analytics shares many goals with visual ana-
lytics, in particular: how to derive insight from complex, big
data sets. However, we can see from these various scenarios
and case studies that, in contrast to visual analytics, immersive
analytics focuses on investigating the use of new immersive
technologies and considers multi-sensory interaction,
not only visualisation. In short:
Immersive Analytics investigates how new interac-
tion and display technologies can be used to support
analytical reasoning and decision making. The aim
is to provide multi-sensory interfaces for analytics
approaches that support collaboration and allow
users to immerse themselves in their data. Immersive
Analytics builds on technologies such as large touch
surfaces, immersive virtual and augmented reality
environments, haptic and audio displays and modern
fabrication techniques.
The overarching goal of immersive analytics research is
to understand how (and whether) new interface and display
technologies can be used to create a more immersive kind of
data analysis and exploration. The kinds of devices and envi-
ronments include augmented and virtual reality displays, large
high-resolution 2- and 3-D displays, haptic and audio feedback
and gesture and touch controlled interaction. These potentially
require very different interaction and visualisation models and
techniques to those used in standard visual analytics. Some of
the main research questions are:
1) Much research has been devoted to computer-assisted
collaboration, both synchronous and asynchronous,
local and remote. The new devices and environments
potentially support new models for collaboration,
as shown in Figures 1 and 2. What paradigms are
potentially enabled by these new interaction modali-
ties? How do we evaluate them?
2) Traditionally 3D visualisation has been used in the
physical sciences, engineering and design; while 2D
visualisations have been used to display statistical
and abstract data in information visualisations. In-
creasingly there is a need to combine both sorts of
visualisation in holistic visualisations. For instance,
in the life sciences different aspects of a cell are
displayed using 2D images and 3D volumes, 2D
network data and the various -omics data. Can these
new technologies support more holistic visualisations
of such data incorporating 3D spatial information as
well as abstract data?
3) What new questions do technologies like augmented
reality raise for data and visual analytics?
For instance, traditional information visualisation
supports open-ended exploration based on Shneiderman's
information mantra: overview first, zoom
and filter, then details on demand. In our view a
different model is required for analytical applications
grounded in the physical world. In this case objects
in the physical environment provide the immediate
and primary focus and so the natural model is to
provide detailed information about these objects and
only provide contextual information on demand.
4) What are the interface ‘tricks’ and affordances such as
high-resolution displays, sound, touch and responsive
interaction that change the user's perception from an
allocentric view of the data to a more egocentric and
immersive view of the data?
5) What are the lessons that can be learnt from pre-
vious research into the use of 3D visualisation for
information visualisation? Do the new technologies
invalidate the current wisdom that it is better to use
2D visualisation for abstract data since the designer
of the visualisation has complete freedom to map data
to an occlusion free 2D display?
6) What are the most fertile application areas for im-
mersive analytics? For example, these could be in
life sciences, disaster and emergency management,
business (immersive) intelligence and many more.
What are domain-specific requirements, what are
general requirements? What are typical workflows
for data analysis in the different domains, and are they
domain-specific or generic across the different domains?

Fig. 5. Data physicalisations: (a) A physicalisation of inflation vs. unemployment data for OECD countries over ten years. The physical instantiation of this
dataset was an early experiment [32] in avoiding the limitations of digital display and haptic technology while testing the effectiveness of 3D data spatialisation.
This model, constructed in the 2000s, was time-consuming and tedious to build with conventional tooling. (b) A physicalisation of a grouped graph. The group
hierarchy is represented by stacked levels. Using modern fabrication techniques such as 3D printing, laser cutters, vinyl cutter/printers, etc. we can prototype
such physicalisations very quickly and easily.
7) How do we develop generic platforms that support
immersive analytics? Currently there is a wide range
of different development platforms and existing cross-platform
tools; however, they do not quite have a
broad enough focus for immersive analytics. For
example, Unity is designed for gaming applications
rather than analytics, while the Visualization Toolkit
(VTK) is targeted at scientific visualisation applications.

Research and development for approaches related to
immersive analytics are scattered across several research
communities and domains. In addition, a large part of the development
for emerging technologies is done in companies and start-ups.
With our Immersive Analytics initiative we would like to bring
together these efforts and establish a community of researchers,
developers, educators and users from both academia and industry
in this area. There are not yet any dedicated events,
such as a conference or workshop, for the discussion and
dissemination of immersive analytics research. We would
therefore like to foster community building with two dedicated
Immersive Analytics seminars at Dagstuhl and Shonan.
The authors would like to acknowledge support through the
ARC DP 140100077. Thanks to Jon McCormack, director of
the SensiLab at Monash University which hosts the Immersive
Analytics project. Thanks to Tom Chandler and his team
working on the Visualising Angkor project. Thanks also to
Bruce Thomas, Nathalie Henry Riche, Takayuki Itoh, Uwe
Wössner and Wolfgang Stuerzlinger who are helping us to
organise forthcoming workshops on the topic of Immersive Analytics.
[1] Nature specials, "Big Data," Nature, vol. 455, no. 7209, pp. 1–136, 2008.
[2] Science special, “Dealing with data,” Science, vol. 331, no. 6018, pp.
639–806, 2011.
[3] J. J. Thomas and K. A. Cook, Eds., Illuminating the path: The research
and development agenda for visual analytics. IEEE Computer Society
Press, 2005.
[4] D. A. Keim, F. Mansmann, J. Schneidewind, J. Thomas, and H. Ziegler,
"Visual Analytics: Scope and Challenges," in Visual Data Mining, ser.
Lecture Notes in Computer Science, S. J. Simoff, M. H. Böhlen, and
A. Mazeika, Eds. Springer Berlin Heidelberg, 2008, vol. 4404.
[5] M. Hassenzahl and N. Tractinsky, “User experience-a research agenda,”
Behaviour & information technology, vol. 25, no. 2, pp. 91–97, 2006.
[6] A. Febretti, A. Nishimoto, T. Thigpen, J. Talandis, L. Long, J. D. Pirtle,
T. Peterka, A. Verlo, M. Brown, D. Plepys, and others, “CAVE2: a
hybrid reality environment for immersive simulation and information
analysis,” in Proceedings IS&T / SPIE Electronic Imaging, vol. 8649.
SPIE, 2013, pp. 864 903.1–12.
[7] D. Tutt and C. Harty, "Journeys through the CAVE: The use of 3D
immersive environments for client engagement practices in hospital
design," in Proceedings 29th Annual ARCOM Conference, S. D. Smith
and D. D. Ahiaga-Dagbui, Eds. Association of Researchers in
Construction Management, 2013, pp. 111–121.
[8] K. Reda, A. Febretti, A. Knoll, J. Aurisano, J. Leigh, A. Johnson,
M. Papka, and M. Hereld, "Visualizing large, heterogeneous data in
hybrid-reality environments," IEEE Computer Graphics and Applications,
vol. 33, no. 4, pp. 38–48, 2013.
[9] P. Isenberg, T. Isenberg, T. Hesselmann, B. Lee, U. von Zadow, and
A. Tang, "Data visualization on interactive surfaces: A research agenda,"
IEEE Computer Graphics and Applications, vol. 33, no. 2, pp. 16–24, 2013.
[10] J. C. Roberts, P. D. Ritsos, S. K. Badam, D. Brodbeck, J. Kennedy, and
N. Elmqvist, "Visualization beyond the Desktop–the Next Big Thing,"
IEEE Computer Graphics and Applications, vol. 34, no. 6, pp. 26–34, 2014.
[11] S. Zhang, C. Demiralp, D. Keefe, M. DaSilva, D. Laidlaw, B. Green-
berg, P. Basser, C. Pierpaoli, E. Chiocca, and T. Deisboeck, “An immer-
sive virtual environment for DT-MRI volume visualization applications:
a case study,” in Proceedings Visualization 2001. IEEE, 2001, pp. 437–
[12] G. Kurillo and M. Forte, "Telearch – Integrated visual simulation
environment for collaborative virtual archaeology," Mediterranean
Archaeology and Archaeometry, vol. 12, no. 1, pp. 11–20, 2012.
[13] N. G. Smith, K. Knabb, C. DeFanti, P. Weber, J. Schulze, A. Prud-
homme, F. Kuester, T. E. Levy, and T. A. DeFanti, “ArtifactVis2: Man-
aging real-time archaeological data in immersive 3D environments,” in
Proceedings Digital Heritage International Congress, vol. 1. IEEE,
2013, pp. 363–370.
[14] R. Bennett, D. J. Zielinski, and R. Kopper, “Comparison of Interactive
Environments for the Archaeological Exploration of 3D Landscape
Data,” in IEEE VIS International Workshop on 3DVis, 2014.
[15] T.-J. Hsieh, Y.-L. Chang, and B. Huang, “Visual Analytics of Terres-
trial Lidar Data for Cliff Erosion Assessment on Large Displays,” in
Proceedings SPIE Satellite Data Compression, Communications, and
Processing VII, vol. 8157. SPIE, 2011, pp. 81570D.1–17.
[16] C. Helbig, H.-S. Bauer, K. Rink, V. Wulfmeyer, M. Frank, and
O. Kolditz, "Concept and workflow for 3D visualization of atmospheric
data in a virtual reality environment for analytical approaches,"
Environmental Earth Sciences, vol. 72, no. 10, pp. 3767–3780, 2014.
[17] A. Kageyama, Y. Tamura, and T. Sato, “Visualization of Vector Field
by Virtual Reality,Progress of Theoretical Physics Supplement,vol.
138, pp. 665–673, 2000.
[18] I. Heldal, M. Spante, and M. Connell, “Are two heads better than
one?: object-focused work in physical and in virtual environments,
in Proceedings of the ACM Symposium on Virtual Reality Software and
Technology. ACM, 2006, pp. 287–296.
[19] H. Yang and G. M. Olson, “Exploring Collaborative Navigation:: The
Effect of Perspectives on Group Performance,” in Proceedings of the
4th International Conference on Collaborative Virtual Environments.
ACM, 2002, pp. 135–142.
[20] M. Narayan, L. Waugh, X. Zhang, P. Bafna, and D. Bowman, “Quan-
tifying the Benefits of Immersion for Collaboration in Virtual Envi-
ronments,” in Proceedings of the ACM Symposium on Virtual Reality
Software and Technology. ACM, 2005, pp. 78–81.
[21] Z. Szalav´
ari, D. Schmalstieg, A. Fuhrmann, and M. Gervautz, “Studier-
stube: An environment for collaboration in augmented reality,” Virtual
Reality, vol. 3, no. 1, pp. 37–48, 1998.
[22] D. Huang, M. Tory, B. A. Aseniero, L. Bartram, S. Bateman, S. Carpen-
dale, A. Tang, and R. Woodbury, “Personal visualization and personal
visual analytics,” IEEE Transactions on Visualization and Computer
Graphics, vol. 21, no. 3, pp. 420–433, 2015.
[23] J. Heer and M. Agrawala, “Design Considerations for Collaborative
Visual Analytics,” Information Visualization, vol. 7, no. 1, pp. 49–62,
[24] P. Isenberg, N. Elmqvist, J. Scholtz, D. Cernea, K.-L. Ma, and H. Ha-
gen, “Collaborative visualization: Definition, challenges, and research
agenda,” Information Visualization, vol. 10, no. 4, pp. 310–326, 2011.
[25] C. Donalek, S. Djorgovski, A. Cioc, A. Wang, J. Zhang, E. Lawler,
S. Yeh, A. Mahabal, M. Graham, A. Drake, S. Davidoff, J. Norris, and
G. Longo, “Immersive and collaborative data visualization using virtual
reality platforms,” in 2014 IEEE International Conference on Big Data
(Big Data), 2014, pp. 609–614.
[26] N. Mahyar and M. Tory, “Supporting communication and coordination
in collaborative sensemaking,IEEE Transactions on Visualization and
Computer Graphics, vol. 20, no. 12, pp. 1633–1642, 2014.
[27] N. Li, A. S. Nittala, E. Sharlin, and M. Costa Sousa, “Shvil: Collab-
orative augmented reality land navigation,” in Proceedings Conference
on Human Factors in Computing Systems (CHI 2014). ACM, 2014,
pp. 1291–1296.
[28] A. Irlitti, S. Von Itzstein, L. Alem, and B. Thomas, “Tangible interac-
tion techniques to support asynchronous collaboration,” in 2013 IEEE
International Symposium on Mixed and Augmented Reality (ISMAR),
2013, pp. 1–6.
[29] T. Chandler, B. McKee, E. Wilson, and M. Yeates, “Exploring Angkor
with Zhou Daguan,” ABC Splash Digibook supporting Australian Year
8 History curriculum, 2015.
[30] C. Goncu and K. Marriott, “GraVVITAS: Generic Multi-touch Pre-
sentation of Accessible Graphics,” in Human-Computer Interaction –
INTERACT 2011, ser. Lecture Notes in Computer Science, P. Campos,
N. Graham, J. Jorge, N. Nunes, P. Palanque, and M. Winckler, Eds.
Springer Berlin Heidelberg, 2011, vol. 6946, pp. 30–48.
[31] J. Bertin, Semiology of Graphics. University of Wisconsin Press, 1983.
[32] T. Dwyer, “Two-and-a-half-dimensional visualisation of relational net-
works,” Ph.D. dissertation, School of Information Technologies, Faculty
of Science, University of Sydney, 2005.
[33] Y. Jansen, P. Dragicevic, P. Isenberg, J. Alexander, A. Karnik, J. Kildal,
S. Subramanian, and K. Hornbæk, “Opportunities and Challenges for
Data Physicalization,” in CHI 2015 - Proceedings of the SIGCHI
Conference on Human Factors in Computing Systems. ACM, 2015,
pp. 3227–3236.