Immersive Analytics
Tom Chandler, Maxime Cordeil, Tobias Czauderna, Tim Dwyer, Jaroslaw Glowacki, Cagatay Goncu,
Matthias Klapperstueck, Karsten Klein, Kim Marriott, Falk Schreiber and Elliot Wilson
Faculty of IT
Monash University
Melbourne
Email: {tom.chandler,max.cordeil,tobias.czauderna,tim.dwyer,jaroslaw.glowacki,cagatay.goncu,
matthias.klapperstueck,karsten.klein,kim.marriott,falk.schreiber,elliott.wilson}@monash.edu
Abstract—Immersive Analytics is an emerging research thrust
investigating how new interaction and display technologies can
be used to support analytical reasoning and decision making.
The aim is to provide multi-sensory interfaces that support
collaboration and allow users to immerse themselves in their
data in a way that supports real-world analytics tasks. Immersive
Analytics builds on technologies such as large touch surfaces,
immersive virtual and augmented reality environments, sensor
devices and other, rapidly evolving, natural user interface devices.
While there is a great deal of past and current work on improving
the display technologies themselves, our focus in this position
paper is on bringing attention to the higher-level usability and
design issues in creating effective user interfaces for data analytics
in immersive environments.
I. INTRODUCTION
We are living in the era of big data [1], [2]. The amount
and complexity of the data available for analysis in the
physical-, life- and social-sciences—as well as that available
for guiding business and government decision making—is
growing exponentially. Several overlapping research fields are
concerned with the development of methods to support the
analysis of such data; particularly: information visualisation,
scientific visualisation, machine learning, data mining, data
science, human computer interaction and visual analytics.
Visual Analytics was introduced a decade ago as “the
science of analytical reasoning facilitated by interactive visual
interfaces” [3]. Techniques and practices developed by this
discipline are now common-place in (for example): business
intelligence; transport and logistics; scientific applications in-
cluding astronomy, biology and physics; environmental moni-
toring and personal information management. It is now a key
technology for dealing with big data [4], and there is massive
potential for its use in further emerging application areas like
health informatics and many others.
The definition of visual analytics given above is agnostic
of the actual interface devices employed by visual analysis
systems. Nevertheless, the affordances of display and input
devices used for analysing data strongly affect the experience
of the users of such systems. It is now well understood that
user experience in interactive systems strongly affects their
degree of engagement and, hence, productivity [5]. Ultimately,
therefore, it affects the adoption and ubiquity of data analysis
tools. For practical visual analysis tools used in the industry
and areas of data science described above, the platform for
interaction is almost always a standard desktop computer: a
single average-sized display, keyboard and mouse.
In recent years we have seen significant advances in the
development of new technologies for human-computer inter-
faces. On one hand, we have seen the development of expen-
sive bespoke technologies mainly for scientific visualisation.
For example, immersive environments like a Cave Automatic
Virtual Environment (CAVE) [6] can make use of ultra-high
resolution technology, and combine 2D and 3D visualisations
to allow users to immerse themselves in computer generated
scenes. A classic use-case for the CAVE is “walk-throughs”
of human-scale architectural and engineering models [7].
On the other hand, we have also seen the development of
inexpensive commodity technologies—such as the Leap Mo-
tion1, Kinect2, Oculus Rift3 and (soon) Microsoft HoloLens4—
for providing natural user interfaces, virtual and augmented
reality. Industry—in particular the entertainment industry—
has driven this development and it has progressed rapidly,
providing engaging and immersive experiences for a fraction
of the cost of the CAVE. As a consequence, voice- and gesture-
based control as well as 3D visualisation are becoming more
and more a part of everyday life, and corresponding devices are
becoming available and affordable for small businesses and the
general public. The amount of new, low-cost interaction tech-
nology hitting the market is unprecedented, and given the rise
of new manufacturing technologies (additive manufacturing)
and business models (crowdfunding, etc.) the number of new
and affordable display and interaction devices is only going to
increase.
Visual analytics and information visualisation researchers
sometimes publish studies exploring various data visualisation
applications in 3D, for example [8]. However, this has never
been a core topic in information visualisation. Meanwhile,
the rise in popularity of consumer touch-screen devices has,
arguably, led to an explosion in research visualisation tools
incorporating multitouch interaction. In other words, touch is an example of a recent natural user-interface technology reaching maturity and acceptance in the information visualisation community; a compelling research agenda on this topic, for example, is put forward by Isenberg et al. [9]. Yet, to date,
3D user interface researchers have tended to focus more on
the lower-level challenges such as making gesture recognition,
head-tracking and rendering techniques robust and reliable.
This is understandable, given the significant algorithmic and
1http://www.leapmotion.com
2https://www.microsoft.com/en-us/kinectforwindows/
3https://www.oculus.com
4https://www.microsoft.com/hololens
Fig. 1. A use-case to be explored by immersive analytics: collaborative data analysis in augmented and virtual reality. (a) Wearable display devices that can augment the world around us with arbitrary text and graphics enable a new degree of flexibility in data analysis. In this concept sketch, two people in the same room (Peter and Sarah) discuss complex data. They can summon visual representations into one another's field of view with a simple hand gesture or voice command. Just as information graphics can be overlaid on their environment, a third person (John) can join the meeting and be given a virtual presence. At John's end, he also sees Peter and Sarah as if they are physically in the same room, together with the data visuals. The devices to at least prototype such experiences are now cheap and reliable enough that we can focus on the application and ergonomics rather than the underlying technologies such as motion tracking and image registration. (b) Wearable VR headsets can immerse us in our data; shown is a prototype using the Oculus Rift that we are building to try to realise this scenario.
hardware challenges involved. The higher-level concerns of applications and effective user experience, let alone the use of this technology in data analysis applications, have lagged behind. A systematic approach to developing practical visual analysis tools in immersive environments is therefore lacking.
We propose a new facet of data analytics research that seeks
to unify these efforts to identify the most enabling aspects of
these emerging natural user interface and augmented reality
technologies for real-world analysis of data. We call this new
research thrust Immersive Analytics, a topic that will explore
the applicability and development of emerging user-interface
technologies for creating more engaging and immersive expe-
riences and seamless workflows for data analysis applications.
Immersive Analytics builds on technologies such as large
touch surfaces, immersive virtual and augmented reality envi-
ronments like Oculus Rift and CAVE2 [6], and tracking de-
vices such as Leap Motion and Kinect to provide environments
beyond the classical monitor, keyboard and mouse desktop
configuration [10]. The environments we envision, developed by immersive analytics researchers, will be usable by experts and analysts to help in the detailed analysis of complex, big data sets. However, they will also be accessible to
decision makers—that is, the managers who spend more time
working face-to-face with others than in front of a desktop
computer—and to the everyday public to help them in tasks
like shopping, understanding public transport and social and
political issues affecting their lives.
In this position paper, we first give examples of the current state of mainstream research into visual analytics, virtual
reality and collaboration. We then present immersive analytics
by describing near-future scenarios and a number of projects
that are currently being developed by the Immersive Analytics
research group at Monash University5. These projects are all
striving to move beyond traditional desktop visualisation and
provide a much richer, more immersive experience. We finish
by presenting a number of research questions that we feel this
new field should address.
II. BACKGROUND
Research on immersive analytics comprises the develop-
ment and evaluation of innovative interfaces and devices, as
well as the corresponding interaction metaphors and visual-
isations that allow these to be used for understanding data
and for decision making. Immersive analytics is therefore
inherently multi-disciplinary and involves researchers from
human-computer interaction, visual analytics, augmented re-
ality, and scientific and information visualisation. Relevant
conferences include, for instance, HCI6, 3DUI7, VR8, AR9, CSCW10, TableTop11, Vis and InfoVis12 and VAST13. At some
of these conferences we are already seeing workshops and
special sessions that address aspects of immersive analytics:
e.g. “Death of the Desktop”14, “3DVis”15, etc. However, at
this stage there is no coordinated exploration of immersive
analytics.
During the last three decades, research in Virtual Reality
Environments (VRE) has been focused on the use of head
5https://immersive-analytics.infotech.monash.edu.au/
6Human Computer Interaction www.sigchi.org
73D User Interfaces www.3dui.org
8Virtual Reality www.ieeevr.org
9Augmented Reality www.ismar.vgtc.org
10Computer Supported Cooperative Work www.cscw.acm.org
11Interactive Tabletops and Surfaces www.its2014.org
12(Information) Visualization www.ieeevis.org
13Visual Analytics Science and Technology www.vacommunity.org
14http://beyond.wallviz.dk
15https://sites.google.com/site/3dvisieeevis2014/
Fig. 2. "Immersing people in their data" does not necessarily involve 3D or stereo display. (a) In this concept, analysts work in a purpose-built, collaborative data analytics room. Interaction technologies such as pen, touch and gesture control allow them to interact directly with their data and collaborate in a more egalitarian way than keyboards and mice, which tether individual users to a desktop. Immersive analytics will systematically research how many kinds of emerging interaction technologies can be harnessed to engage and enable people to work together to better understand data. (b) The figure shows a prototype that we are building using the Monash CAVE2 immersive visualisation facility to try to realise this scenario. (c) Ultimately, the so-called "ContextuWall" project is intended to enclose a natural space where people can work collaboratively, their discussions supported by the contextual display of information at high resolution on the CAVE2 wall.
mounted displays (HMDs) and CAVEs to immerse users into
data graphics. VREs have proven their effectiveness in many
scientific applications such as brain tumour analysis [11],
archaeology [12], [13], geographic information systems [14],
geosciences [15], [16] or physics [17]. These studies focused
on the perception of abstract data visualisation for a single user.
VREs have also been studied to support collaborative work.
VREs have been shown to benefit collaborative tasks such as puzzle solving [18], navigation with individualized views [19], or complex manipulations such as moving a ring on a U-shaped hoop [20]. Szalavári et al. [21] developed a collaborative
augmented reality system with see-through HMDs in order
to support collaborative scientific visualisation. They observed
that their system seems to be superior to a classic desktop
environment, but did not provide formal results.
Huang et al. [22] investigate Personal Visual Analytics (PVA): visual analytics used within a personal context, though not restricted to personal data. They describe a design space
for PVA and present a literature survey on the topic. This
application is a natural fit for augmented reality as headsets
become lighter and less obtrusive.
Collaboration can play an important role in information
visualisation by allowing groups of humans to make sense of
data [23], [24]. Collaboration might also be the key feature to
successfully understanding big and complex data [6], [25]. Yet
little research has focused on collaborative and immersive envi-
ronments for abstract visualisation. Mahyar and Tory explored
how communication and coordination can be supported to
facilitate synchronous collaborative sensemaking activities in
Visual Analytics [26]. Recently, Donalek et al. [25] published a progress report on their exploration of VR as a collaborative platform for information visualisation. They described iVIZ, a web-distributed collaborative VR visualisation
system that supports the Oculus Rift. Their studies are still
at an exploratory level and thus the authors did not provide
evidence of how effective this system is for collaborative
visualisation of big and complex data. Telearch [12], a virtual
reality system for collaborative archaeology, and Shvil [27],
an augmented reality system for collaborative land navigation,
both support distributed collaboration. However, both systems
are designed for specific use cases. A further prominent use of augmented reality is in industrial applications, e.g. to support asynchronous collaboration [28].
To conclude, VREs featuring 3D stereoscopic vision, high
resolution and head tracking provide considerable benefits for
collaborative visual analytics. The current technologies each
have their own advantages and disadvantages. For example, the
CAVE2 supports collaborative work of a considerable group of
people in a large room (diameter 8m), with a 3D stereoscopic
display. Yet the CAVE2 provides head tracking for only a
single user. Conversely, HMDs such as the Oculus Rift provide
3D stereoscopic vision with head tracking for every user, but
prevent users from moving freely in space since their HMD is
tethered to a desktop machine. Despite these limitations, the
current technologies are now adequate to allow us to explore
a design space for immersive data exploration. The following
section explores several use-cases from researchers at Monash
University that exploit these emerging display technologies—
but also other media and modalities—to allow people to
experience their data “immersively”.
III. IMMERSIVE ANALYTICS
In this section we introduce what we mean by immersive
analytics using a number of examples.
The most obvious developments in technology that allow
immersive analytics are the new low-cost virtual reality HMDs
such as Oculus Rift, augmented reality HMDs such as Google
Glass or (soon) Microsoft HoloLens as well as augmented real-
ity through mobile phones or tablets. These support immersive
3D data visualisation as well as allowing data visualisation to
be physically embedded in the real world. In addition, there is
great potential to support distributed collaboration. A serious
issue is interaction: motion and gesture tracking devices like
the Kinect and Leap Motion provide one possible approach.
Figure 1 gives a potential use case for such devices that can
be explored by immersive analytics.
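As a purely illustrative sketch of the interaction layer behind the Figure 1 scenario, the following Python fragment maps recognised gesture or voice events to commands that place a shared visualisation into a collaborator's field of view. Every name in it (GestureEvent, SharedScene, the event labels) is hypothetical; a real system would receive such events from a tracking runtime such as the Kinect or Leap Motion SDKs and render the result in the HMD.

```python
# Illustrative only: a hypothetical event model for the Figure 1 scenario.
# None of these classes correspond to a real tracking SDK; a production system
# would receive recognised events from, e.g., the Kinect or Leap Motion runtime.
from dataclasses import dataclass, field


@dataclass
class GestureEvent:
    user: str         # who issued the command ("Peter", "Sarah", ...)
    kind: str         # recognised label, e.g. "summon" or "dismiss"
    target_view: str  # whose field of view the command addresses
    visual: str       # identifier of the visualisation to show or hide


@dataclass
class SharedScene:
    """Shared model of which visuals are currently visible to which participant."""
    visible: dict = field(default_factory=dict)

    def summon(self, viewer: str, visual: str) -> None:
        self.visible.setdefault(viewer, set()).add(visual)

    def dismiss(self, viewer: str, visual: str) -> None:
        self.visible.get(viewer, set()).discard(visual)


def handle(event: GestureEvent, scene: SharedScene) -> None:
    """Dispatch a recognised gesture/voice command to the shared scene state."""
    if event.kind == "summon":
        scene.summon(event.target_view, event.visual)
    elif event.kind == "dismiss":
        scene.dismiss(event.target_view, event.visual)


if __name__ == "__main__":
    scene = SharedScene()
    handle(GestureEvent("Peter", "summon", "Sarah", "transport-heatmap"), scene)
    print(scene.visible)  # {'Sarah': {'transport-heatmap'}}
```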
An obvious and already popular use for ‘immersive’ dis-
play technology is in constructing virtual environments that
allow walk-through and ‘fly-over’ of (for example) engineering
or architectural designs. This technology is being put to novel
use by the Monash Immersive Analytics lab in reconstructing
ancient artefacts for archaeological analysis16. Figure 3 shows
a virtual reconstruction of the ancient Angkor site. However,
our visualisation is not just about reconstructing a view of the
site, but rather reconstructing activity in and around the site
using multi-agent technology. We explore data gathered from
this simulation using visual analytic techniques such as over-
laid heatmaps of activity, e.g. Figure 3(b). Besides supporting
the evaluation of hypotheses, e.g. on the settlement density,
detailed 3D interactive animations like the Angkor virtual
reconstruction can also be used to disseminate knowledge, e.g.
for educational purposes [29].
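The heatmap overlay itself is a conventional visual-analytics computation; as a minimal sketch (with random coordinates standing in for the actual simulation output), agent positions logged by such a multi-agent simulation can be binned into a density grid with NumPy and then draped over the 3D terrain as a colour-mapped texture:

```python
import numpy as np

# Hypothetical agent trajectory log: (x, z) ground-plane positions sampled from
# the multi-agent simulation; random data stands in for the real output here.
rng = np.random.default_rng(0)
positions = rng.normal(loc=[500.0, 300.0], scale=[120.0, 80.0], size=(10_000, 2))

# Bin the positions into a 2D density grid covering the site extent (in metres).
heat, x_edges, z_edges = np.histogram2d(
    positions[:, 0], positions[:, 1],
    bins=(64, 64), range=[[0.0, 1000.0], [0.0, 600.0]],
)

# Normalise to [0, 1] so the grid can be mapped onto a colour ramp and draped
# over the reconstructed terrain as an overlay texture.
heat /= heat.max()
print(heat.shape, float(heat.max()))  # (64, 64) 1.0
```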
The term ‘immersive’ suggests 3D virtual reality style dis-
plays. However, this is only one possible direction for research.
We see touchscreens that allow direct interaction with data as
being more natural and therefore more ‘immersive’ than mouse
and desktop display. There has been exciting research, for
example, about the natural collaborative experiences supported
by tabletop interfaces [9]. Large displays utilising projectors
or tiled displays have been in use for many years. Particular applications have included command and control centres, immersive scientific visualisation and the immersive display of engineering and architectural designs. Next-generation CAVEs
provide higher resolution and contrast than their predecessors.
The rapidly decreasing cost of video projectors and of 3D and Ultra-High-Definition (4K) monitors that can be combined into tiled displays means that CAVEs and ‘mini-CAVEs’ are now affordable for widespread use in research groups and industry.
Combining large displays with touch or spatial tracking can
support multiple users in a way that allows them to collaborate
in a seamless way that ‘immerses’ them in their data. Figure 2
explores a use-case exploiting such devices. Another driver for
immersive analytics is that scientists, engineers, architects and
analysts are increasingly trying to understand heterogeneous
data. Some components are physical and have a natural 3D
representation while other components are more abstract and
are better visualised in 2D. The development of cheap 3D
display technology means that it is now possible to provide
16http://www.infotech.monash.edu.au/research/groups/3dg/projects/
visualising-angkor.html
Fig. 3. (a) A virtual reconstruction of the ancient Cambodian town and temple of Angkor. Our visualisation is not just about reconstructing a view of the site,
but rather reconstructing activity in and around the site using multi-agent technology. (b) We explore data gathered from this simulation using visual analytic
techniques such as overlaid heatmaps of activity.
hybrid 2D/3D visualisations, for example, by combining 3D
tracked monitors such as ZSpace17 with 2D displays.
Immersive analytics is not only about visualisation. While
vision is our most important sense for quickly exploring our
environment, sound and touch are also powerful ways of
understanding the world. Their use in data analytics has largely
been overlooked. A particularly important application is for
presenting data to people who are blind or have severe vision
impairment. One of the most disabling consequences of vision
impairment is lack of access to information: the development
of audio and haptic presentation devices for such users is a
potential application area of immersive analytics with huge
social benefit. Figure 4 shows the GraVVITAS system18 [30]
17http://zspace.com
18http://gravvitas.infotech.monash.edu
developed at Monash University to enable blind people to
interact with information graphics through touchscreen devices
with sonic and haptic feedback.
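As a hedged illustration of the kind of mapping such a system performs, rather than GraVVITAS's actual implementation, the sketch below converts the data value under a touch point into a short sine tone so that higher values sound at a higher pitch; the data grid and the frequency range are invented for the example.

```python
import numpy as np

# Invented 2D data grid standing in for an information graphic; in GraVVITAS the
# graphic and its feedback rules are authored rather than derived like this.
data = np.linspace(0.0, 1.0, 100).reshape(10, 10)

def value_at(touch_x: float, touch_y: float) -> float:
    """Map a normalised touch position (0..1, 0..1) to the underlying data value."""
    row = min(int(touch_y * data.shape[0]), data.shape[0] - 1)
    col = min(int(touch_x * data.shape[1]), data.shape[1] - 1)
    return float(data[row, col])

def tone_for(value: float, sample_rate: int = 44_100, duration: float = 0.15) -> np.ndarray:
    """Synthesise a short sine tone whose pitch rises with the data value."""
    freq = 220.0 + value * (880.0 - 220.0)           # map value onto 220-880 Hz
    t = np.arange(int(sample_rate * duration)) / sample_rate
    return 0.3 * np.sin(2.0 * np.pi * freq * t)      # mono samples in [-0.3, 0.3]

samples = tone_for(value_at(0.75, 0.25))
print(samples.shape)  # one short audio buffer, ready to hand to an audio backend
```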
Despite much progress with technologies for creating
immersive virtual and augmented reality experiences, until
we have perfect brain-computer interfaces a gap will remain
between the tangibility of real physical objects and virtual
objects rendered in immersive environments. However, it is
already possible to experiment with “physicalisations” of data
in order to study the benefits of “perfect” immersion. Indeed,
people have been constructing physical representations of data
for some time. An early and famous data physicalisation was a
tangible matrix display by Bertin [31]. Bertin was an advocate
of reordering tabular data displays to better show high-level
structure (such as clustering). His physicalisations featured crude interactivity, allowing people to physically reorder matrix rows of dots mounted on skewers.
Fig. 4. Immersion is not just visual. Sound and haptic feedback can make information graphics and data analytics applications accessible to people with severe vision impairment.
Another data physicalisation is shown in Figure 5(a).
This physicalisation of inflation vs. unemployment data for
OECD countries over ten years was an early experiment
by a current member of the Monash Immersive Analytics
team in avoiding the limitations of digital display and haptic
technology while testing the cognitive effectiveness of 3D data
spatialisation [32]. In 2003 construction of this physicalisation
was a time-consuming process, constrained by the materials
and tools available. In 2015 digital fabrication is emerging as a
technology with huge potential to allow people to engage with
data in new ways [33]. Machines such as 3D-printers, laser
cutters and CNC-routers have all fallen massively in price in
recent years making it easier than ever for people to create
physical representations of their data automatically. Figure
5(b) shows a data physicalisation recently produced in our lab
very quickly using laser-cutter and vinyl-printer/cutter devices.
Furthermore, with such automatic fabrication techniques we
can produce physicalisations of large data-sets with ease.
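As an illustration of how directly such fabrication can be driven from data, the following sketch writes a laser-cuttable SVG "skyline" profile for a hypothetical series of values; material thickness, scaling and kerf compensation are deliberately ignored here.

```python
# Generate a laser-cuttable SVG profile (a "skyline" of bars) from a data series.
# Values and dimensions are made up; a real job would add kerf compensation.
values = [3.2, 5.1, 4.4, 7.8, 6.0, 2.5]    # e.g. one measurement per year
bar_w, scale, base = 20.0, 10.0, 100.0     # mm per bar, mm per data unit, baseline y

points = [(0.0, base)]
for i, v in enumerate(values):
    x0, x1, y = i * bar_w, (i + 1) * bar_w, base - v * scale
    points += [(x0, y), (x1, y)]           # top edge of each bar
points.append((len(values) * bar_w, base)) # drop back down to the baseline

path = "M " + " L ".join(f"{x:.1f},{y:.1f}" for x, y in points) + " Z"
svg = (
    '<svg xmlns="http://www.w3.org/2000/svg" width="140mm" height="110mm" '
    'viewBox="0 0 140 110">\n'
    f'  <path d="{path}" fill="none" stroke="red" stroke-width="0.1"/>\n'
    '</svg>\n'
)
with open("physicalisation_profile.svg", "w") as f:
    f.write(svg)
```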
Immersive analytics shares many goals with visual ana-
lytics, in particular: how to derive insight from complex, big
data sets. However, we can see from these various scenarios
and case-studies that in contrast to visual analytics, immersive
analytics focuses on the investigation of the use of new im-
mersive technologies and considers multi-sensory interaction,
not only visualisation. In short:
Immersive Analytics investigates how new interac-
tion and display technologies can be used to support
analytical reasoning and decision making. The aim
is to provide multi-sensory interfaces for analytics
approaches that support collaboration and allow
users to immerse themselves in their data. Immersive
Analytics builds on technologies such as large touch
surfaces, immersive virtual and augmented reality
environments, haptic and audio displays and modern
fabrication techniques.
IV. RESEARCH QUESTIONS
The overarching goal of immersive analytics research is
to understand how (and whether) new interface and display
technologies can be used to create a more immersive kind of
data analysis and exploration. The kinds of devices and envi-
ronments include augmented and virtual reality displays, large
high-resolution 2- and 3-D displays, haptic and audio feedback
and gesture and touch controlled interaction. These potentially
require very different interaction and visualisation models and
techniques to those used in standard visual analytics. Some of
the main research questions are:
1) Much research has been devoted to computer-assisted collaboration, both synchronous and asynchronous, and both local and remote. The new devices and environments potentially support new models for collaboration, as shown in Figures 1 and 2. What paradigms are potentially enabled by these new interaction modalities? How do we evaluate them?
2) Traditionally, 3D visualisation has been used in the physical sciences, engineering and design, while 2D visualisations have been used to display statistical and abstract data in information visualisation. Increasingly there is a need to combine both sorts of visualisation in holistic visualisations. For instance, in the life sciences different aspects of a cell are displayed using 2D images, 3D volumes, 2D network data and various -omics data. Can these
new technologies support more holistic visualisations
of such data incorporating 3D spatial information as
well as abstract data?
3) What questions do technologies like augmented reality raise for data and visual analytics? For instance, traditional information visualisation supports open-ended exploration based on Shneiderman's information-seeking mantra: overview first, zoom and filter, then details on demand. In our view a
different model is required for analytical applications
grounded in the physical world. In this case objects
in the physical environment provide the immediate
and primary focus and so the natural model is to
provide detailed information about these objects and
only provide contextual information on demand.
4) What are the interface ‘tricks’ and affordances such as
high-resolution displays, sound, touch and responsive
interaction that change the user's perception from an
allocentric view of the data to a more egocentric and
immersive view of the data?
5) What are the lessons that can be learnt from pre-
vious research into the use of 3D visualisation for
information visualisation? Do the new technologies
invalidate the current wisdom that it is better to use
2D visualisation for abstract data since the designer
of the visualisation has complete freedom to map data
to an occlusion free 2D display?
6) What are the most fertile application areas for im-
mersive analytics? For example, these could be in
life sciences, disaster and emergency management,
business (immersive) intelligence and many more.
What are domain-specific requirements and what are general requirements? What are typical workflows for data analysis in the different domains, and are they domain-specific or generic across the different domains?
7) How do we develop generic platforms that support immersive analytics? Currently there is a wide range of different development platforms and existing cross-platform tools; however, they do not quite have a broad enough focus for immersive analytics. For example, Unity19 is designed for gaming applications rather than analytics, while the Visualization Toolkit (VTK)20 is targeted at scientific visualisation applications.
Fig. 5. Data physicalisations: (a) A physicalisation of inflation vs. unemployment data for OECD countries over ten years. The physical instantiation of this dataset was an early experiment [32] in avoiding the limitations of digital display and haptic technology while testing the effectiveness of 3D data spatialisation. This model, constructed in the 2000s, was time-consuming and tedious to build with conventional tooling. (b) A physicalisation of a grouped graph. The group hierarchy is represented by stacked levels. Using modern fabrication techniques such as 3D printing, laser cutters, vinyl cutter/printers, etc. we can prototype such physicalisations very quickly and easily.
V. CONCLUSION
Research and development for approaches related to im-
mersive analytics are scattered across several research commu-
nities and domains. In addition, a large part of the development
for emerging technologies is done in companies and start-ups.
With our Immersive Analytics initiative we would like to bring
together these efforts and establish a community of researchers,
developers, educators and users from both academia and indus-
try in this area. There are as yet no dedicated events, such as a conference or workshop, for the discussion and dissemination of immersive analytics research; we therefore would like to foster community building through two dedicated Immersive Analytics seminars at Dagstuhl and Shonan in 2016.
19www.unity3d.org
20www.vtk.org
ACKNOWLEDGMENT
The authors would like to acknowledge support through the
ARC DP 140100077. Thanks to Jon McCormack, director of
the SensiLab at Monash University which hosts the Immersive
Analytics project. Thanks to Tom Chandler and his team
working on the Visualising Angkor project. Thanks also to
Bruce Thomas, Nathalie Henry Riche, Takayuki Itoh, Uwe Wössner and Wolfgang Stuerzlinger, who are helping us to organise forthcoming workshops on the topic of Immersive
Analytics.
REFERENCES
[1] Nature specials, “Big Data,” Nature, vol. 455, no. 7209, pp. 1–136,
2008.
[2] Science special, “Dealing with data,” Science, vol. 331, no. 6018, pp.
639–806, 2011.
[3] J. J. Thomas and K. A. Cook, Eds., Illuminating the path: The research
and development agenda for visual analytics. IEEE Computer Society
Press, 2005.
[4] D. A. Keim, F. Mansmann, J. Schneidewind, J. Thomas, and H. Ziegler,
“Visual Analytics: Scope and Challenges,” in Visual Data Mining, ser. Lecture Notes in Computer Science, S. J. Simoff, M. H. Böhlen, and A. Mazeika, Eds. Springer Berlin Heidelberg, 2008, vol. 4404, pp.
76–90.
[5] M. Hassenzahl and N. Tractinsky, “User experience-a research agenda,”
Behaviour & information technology, vol. 25, no. 2, pp. 91–97, 2006.
[6] A. Febretti, A. Nishimoto, T. Thigpen, J. Talandis, L. Long, J. D. Pirtle,
T. Peterka, A. Verlo, M. Brown, D. Plepys, and others, “CAVE2: a
hybrid reality environment for immersive simulation and information
analysis,” in Proceedings IS&T / SPIE Electronic Imaging, vol. 8649.
SPIE, 2013, pp. 864 903.1–12.
[7] D. Tutt and C. Harty, “Journeys through the CAVE: The use of 3D
immersive environments for client engagement practices in hospital
design,” in Proceedings 29th Annual ARCOM Conference, S. D. Smith
and D. D. Ahiaga-Dagbui, Eds. Association of Researchers in
Construction Management, 2013, pp. 111–121.
[8] K. Reda, A. Febretti, A. Knoll, J. Aurisano, J. Leigh, A. Johnson,
M. Papka, and M. Hereld, “Visualizing large, heterogeneous data in
hybrid-reality environments,” IEEE Computer Graphics and Applica-
tions, vol. 33, no. 4, pp. 38–48, 2013.
[9] P. Isenberg, T. Isenberg, T. Hesselmann, B. Lee, U. Von Zadow, and
A. Tang, “Data visualization on interactive surfaces: A research agenda,”
IEEE Computer Graphics and Applications, vol. 33, no. 2, pp. 16–24,
2013.
[10] J. C. Roberts, P. D. Ritsos, S. K. Badam, D. Brodbeck, J. Kennedy, and
N. Elmqvist, “Visualization beyond the Desktop–the Next Big Thing,”
IEEE Computer Graphics and Applications, vol. 34, no. 6, pp. 26–34,
2014.
[11] S. Zhang, C. Demiralp, D. Keefe, M. DaSilva, D. Laidlaw, B. Green-
berg, P. Basser, C. Pierpaoli, E. Chiocca, and T. Deisboeck, “An immer-
sive virtual environment for DT-MRI volume visualization applications:
a case study,” in Proceedings Visualization 2001. IEEE, 2001, pp. 437–
440.
[12] G. Kurillo and M. Forte, “Telearch – Integrated visual simulation
environment for collaborative virtual archaeology,” Mediterranean Ar-
chaeology and Archaeometry, vol. 12, no. 1, pp. 11–20, 2012.
[13] N. G. Smith, K. Knabb, C. DeFanti, P. Weber, J. Schulze, A. Prud-
homme, F. Kuester, T. E. Levy, and T. A. DeFanti, “ArtifactVis2: Man-
aging real-time archaeological data in immersive 3D environments,” in
Proceedings Digital Heritage International Congress, vol. 1. IEEE,
2013, pp. 363–370.
[14] R. Bennett, D. J. Zielinski, and R. Kopper, “Comparison of Interactive
Environments for the Archaeological Exploration of 3D Landscape
Data,” in IEEE VIS International Workshop on 3DVis, 2014.
[15] T.-J. Hsieh, Y.-L. Chang, and B. Huang, “Visual Analytics of Terres-
trial Lidar Data for Cliff Erosion Assessment on Large Displays,” in
Proceedings SPIE Satellite Data Compression, Communications, and
Processing VII, vol. 8157. SPIE, 2011, pp. 81570D.1–17.
[16] C. Helbig, H.-S. Bauer, K. Rink, V. Wulfmeyer, M. Frank, and
O. Kolditz, “Concept and workflow for 3D visualization of atmospheric
data in a virtual reality environment for analytical approaches,” Envi-
ronmental Earth Sciences, vol. 72, no. 10, pp. 3767–3780, 2014.
[17] A. Kageyama, Y. Tamura, and T. Sato, “Visualization of Vector Field
by Virtual Reality,” Progress of Theoretical Physics Supplement, vol. 138, pp. 665–673, 2000.
[18] I. Heldal, M. Spante, and M. Connell, “Are two heads better than
one?: object-focused work in physical and in virtual environments,”
in Proceedings of the ACM Symposium on Virtual Reality Software and
Technology. ACM, 2006, pp. 287–296.
[19] H. Yang and G. M. Olson, “Exploring Collaborative Navigation: The
Effect of Perspectives on Group Performance,” in Proceedings of the
4th International Conference on Collaborative Virtual Environments.
ACM, 2002, pp. 135–142.
[20] M. Narayan, L. Waugh, X. Zhang, P. Bafna, and D. Bowman, “Quan-
tifying the Benefits of Immersion for Collaboration in Virtual Envi-
ronments,” in Proceedings of the ACM Symposium on Virtual Reality
Software and Technology. ACM, 2005, pp. 78–81.
[21] Z. Szalavári, D. Schmalstieg, A. Fuhrmann, and M. Gervautz, “Studier-
stube: An environment for collaboration in augmented reality,” Virtual
Reality, vol. 3, no. 1, pp. 37–48, 1998.
[22] D. Huang, M. Tory, B. A. Aseniero, L. Bartram, S. Bateman, S. Carpen-
dale, A. Tang, and R. Woodbury, “Personal visualization and personal
visual analytics,” IEEE Transactions on Visualization and Computer
Graphics, vol. 21, no. 3, pp. 420–433, 2015.
[23] J. Heer and M. Agrawala, “Design Considerations for Collaborative
Visual Analytics,” Information Visualization, vol. 7, no. 1, pp. 49–62,
2008.
[24] P. Isenberg, N. Elmqvist, J. Scholtz, D. Cernea, K.-L. Ma, and H. Ha-
gen, “Collaborative visualization: Definition, challenges, and research
agenda,” Information Visualization, vol. 10, no. 4, pp. 310–326, 2011.
[25] C. Donalek, S. Djorgovski, A. Cioc, A. Wang, J. Zhang, E. Lawler,
S. Yeh, A. Mahabal, M. Graham, A. Drake, S. Davidoff, J. Norris, and
G. Longo, “Immersive and collaborative data visualization using virtual
reality platforms,” in 2014 IEEE International Conference on Big Data
(Big Data), 2014, pp. 609–614.
[26] N. Mahyar and M. Tory, “Supporting communication and coordination
in collaborative sensemaking,” IEEE Transactions on Visualization and
Computer Graphics, vol. 20, no. 12, pp. 1633–1642, 2014.
[27] N. Li, A. S. Nittala, E. Sharlin, and M. Costa Sousa, “Shvil: Collab-
orative augmented reality land navigation,” in Proceedings Conference
on Human Factors in Computing Systems (CHI 2014). ACM, 2014,
pp. 1291–1296.
[28] A. Irlitti, S. Von Itzstein, L. Alem, and B. Thomas, “Tangible interac-
tion techniques to support asynchronous collaboration,” in 2013 IEEE
International Symposium on Mixed and Augmented Reality (ISMAR),
2013, pp. 1–6.
[29] T. Chandler, B. McKee, E. Wilson, and M. Yeates, “Exploring Angkor
with Zhou Daguan,” ABC Splash Digibook supporting Australian Year
8 History curriculum, 2015.
[30] C. Goncu and K. Marriott, “GraVVITAS: Generic Multi-touch Pre-
sentation of Accessible Graphics,” in Human-Computer Interaction –
INTERACT 2011, ser. Lecture Notes in Computer Science, P. Campos,
N. Graham, J. Jorge, N. Nunes, P. Palanque, and M. Winckler, Eds.
Springer Berlin Heidelberg, 2011, vol. 6946, pp. 30–48.
[31] J. Bertin, Semiology of Graphics. University of Wisconsin Press, 1983.
[32] T. Dwyer, “Two-and-a-half-dimensional visualisation of relational net-
works,” Ph.D. dissertation, School of Information Technologies, Faculty
of Science, University of Sydney, 2005.
[33] Y. Jansen, P. Dragicevic, P. Isenberg, J. Alexander, A. Karnik, J. Kildal,
S. Subramanian, and K. Hornbæk, “Opportunities and Challenges for
Data Physicalization,” in CHI 2015 - Proceedings of the SIGCHI
Conference on Human Factors in Computing Systems. ACM, 2015,
pp. 3227–3236.