When Augmented Reality meets Big Data
Carlos BERMEJO, Zhanpeng HUANG, Tristan BRAUD, and Pan HUI
System and Media Laboratory,
Department of Computer Science and Engineering,
The Hong Kong University of Science and Technology,
Clear Water Bay, Kowloon, Hong Kong
Email: cbf@cse.ust.hk, soaroc@cse.ust.hk, braudt@ust.hk, panhui@cse.ust.hk
Abstract—We live in an era where we are overloaded with data, and this data can be the key to gaining rich insights about our world. Augmented reality (AR) gives us the possibility to visualize and analyse this growing torrent of data on an interactive canvas: we can display complex data structures in simpler, more understandable ways that were not possible before. Big data is a new paradigm resulting from myriad data sources such as transactions, the Internet, social networks, health care devices, and sensor networks. AR and big data have reached a maturity at which they will inevitably converge, and the trend of harnessing them together to breed interesting new applications is starting to have a tangible presence. In this paper, we explore the potential to capture value from the marriage between AR and big data technologies, followed by several challenges that must be addressed to fully realize this potential.
I. INTRODUCTION
The development of the Internet, social networks, and mobile devices has had an impact on the amount of data transmitted and shared across the globe. Moreover, we are facing a new business era in which data-driven decisions are better decisions.
Human intuition tells us it will be easier to make sense of
and interact with information if it is merged with the physical
world [15]. As a modality to display information by overlaying
virtual content on the current view of the world around us,
AR enhances the way we acquire, understand, and display information without distracting us from the physical world [10].
Over the past few years, AR has been progressing by leaps
and bounds in terms of technology and its applications.
However, even though technologies such as sensing, tracking, and display have improved, AR applications broadly follow the lines of prototypes and demonstrations, such as virtual objects popping up on 2D markers and sample data visualizations. A major impediment to AR adoption is the lack of data sources described by MacIntyre et al. [15]. AR is data-hungry and requires more data than most applications. Apart from the application-specific content that users see and interact with, AR needs to feed applications with knowledge about their surroundings and descriptions of how those surroundings relate to the application data. Imperative environmental information may include geospatial coordinates and models of nearby buildings, features and semantic descriptions of objects, and linkage between physical and virtual content. Previous works acquired data from legacy databases, but such data may be incomplete or out of date due to sparse sensing and the absence of persistent maintenance. The first AR prototype was developed by Sutherland in the 1960s [21], but AR only started to draw attention in the last two decades. A typical AR system exhibits the three characteristics defined by Azuma [1]: it combines the real and the virtual; it is interactive in real time; and it is registered in 3D.
Applications in areas such as tourism [19], advertisement1, education [5], and assembly [7] are looking for ways to supplement the physical world with virtual content rather than replace it, as VR applications do. Recently, AR's popularity has grown on mobile devices; this subset of AR is known as Mobile AR (MAR). Pokemon GO2, for example, is a well-known MAR application that offers a location-based AR mobile gaming experience. Pokemon GO's predecessor, Ingress3, generated almost 2 million US dollars in the first days after its release.
The high penetration of technologies spanning mobility, social networks, and the Internet of Things (IoT) has catapulted the world into the era of big data. Touted as a game changer, big data has been identified by the US government as a research frontier that is accelerating progress across a wide range of priorities [9]. AR and big data have each been shaping their own landscapes in various fields for a few years. However, the intersection of these two disruptive technologies has not attracted much attention yet. The rich insights of big data and the novel display modality of AR are promoting their convergence, and AR has great opportunities to bring innovation to big data in terms of visualization and interaction. In this paper, we explore the potential opportunities arising from the convergence of both disruptive technologies, alongside several challenging problems that should be addressed. According to a McKinsey report [16], big data has the potential to reduce product development and assembly costs by 50% and to increase retailers' operating margins by 60%. It is estimated to create savings of 300 billion dollars in healthcare every year in the US alone. Big data analysis has been widely used in healthcare4, energy saving, and financial risk analysis [4].
II. AR-POWERED BIG DATA
Interpretation is a major phase of the big data analysis
pipeline, which is critical for users to extract actionable
knowledge from massive and highly complex datasets. As an
1http://www.businessinsider.com/11-amazing-augmented-reality-ads-2012-1
2http://www.pokemongo.com
3https://www.ingress.com
4https://www.nsf.gov/pubs/2012/nsf12512/nsf12512.htm
intuitive form of interpretation, visualization transforms plain and boring numbers into compelling stories that help users understand the data. In addition, big data analysis requires human-in-the-loop collaboration at all stages of the analysis pipeline [13]. Powerful interactions help users explore and understand the data more easily and fully; understanding the data is the key to success in the current data-driven business ecosystem.
A. Data Visualization
With visualization, we turn the big data into a landscape
that we can explore with our eyes. A visual information
map is useful when we are drowning in information. Data
visualization has the ability to take the complex abstract
symbols and transform them into simpler visual concepts that
we can quickly understand. Visualization-based data discovery
tools have already delivered greater customer and market
insights to businesses around the world5. Gartner estimated that data visualization tools would grow at a 30% compound annual growth rate through 2015 [20]. In the past, we
have employed diverse ways of visualizing data using tabular
presentations, interactive bubble charts, treemaps, heatmaps,
3D data landscapes, and other types of graphics. The data
is generally displayed on flat media such as desktop and, more recently, mobile screens, which separates the visualization from the data source and the user context. For instance, a
virtual array of gauges to display temperatures would fail
to fully describe the spatial distribution as the temperatures
are strongly related to real world objects in a physical user
context. In many cases, it is easier to gain insight from the
data if the visualization is embedded in the physical world. Big
data visualization can be enhanced if an AR layer is overlaid
on real-time streaming data or the user context. AR enables users to be immersed in a world where the data is much more easily understood because it corresponds explicitly to real content, as shown in Figure 1. In this scenario, since the data is directly connected to the physical context, the display simplification that impairs users' understanding on traditional displays is no longer required. For instance, to find a book among millions
of others in a library, traditional methods require digital 2D or
3D maps and floating bubbles for coarse positioning. Powered
with AR, users are able to see through walls and shelves
to look for indications, e.g., a highlighted contour of the book. As the indication is positioned relative to the user's current view, it becomes much easier to find the book.
In combination with AR, big data is especially suitable for in-situ visualization and field diagnosis. For example, with the rapid growth of building information modeling (BIM), AR steps into every phase of a building's life cycle, from construction to maintenance [23]. This enables field workers to view the on-site feasibility of a design and assists construction with virtual simulations and clash detection. It also facilitates asset management in daily inspection activities based on the torrent of data from built-in sensors [12]. To achieve success
5http://www.intel.eu/content/www/eu/en/big-data/data-insights-peer-research-report.html
Fig. 1: Visualization of a numerical flow field with real
buildings makes the influence of the building on wind
movement easily understood. (source: http://emcl.iwr.uni-heidelberg.de/research_sv.html)
with big data visualization, we need to rethink how to mix digital data with the physical world and present the information to users. Apart from application-specific requirements, other considerations should be kept in mind. Floating bubbles are widely used by many AR applications; however, they often seem pointless and offer no improvement over a 2D map [23], especially when the data content cannot be seamlessly integrated into the real world. Content should be merged into the physical world in a way that users perceive as a real counterpart, just as the 1st and Ten system used by ESPN in American professional football broadcasts. To achieve this, visualization requires considerable graphics effort as well as additional information for visual occlusion (e.g., something hidden behind a physical building) and ambient lighting.
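As a minimal sketch of how such an overlay can be anchored (in Python, with invented names and hard-coded intrinsics for illustration, not the pipeline of any particular AR framework), a data value attached to a world coordinate can be projected into the camera image with a pinhole model, and a depth test against known scene geometry decides whether the label should be drawn as occluded:

import numpy as np

def project_to_screen(p_world, R, t, fx, fy, cx, cy):
    # Project a 3D point (world frame, metres) into pixel coordinates with a
    # pinhole camera model. R, t describe the world-to-camera transform.
    p_cam = R @ np.asarray(p_world, float) + t
    if p_cam[2] <= 0:                      # behind the camera: nothing to draw
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return u, v, p_cam[2]                  # pixel position and depth

def place_label(text, anchor_world, camera_pose, depth_map, intrinsics):
    # Return a draw command for a label attached to a physical object. depth_map
    # holds the scene depth per pixel (e.g. from a building or shelf model), so
    # content hidden behind real geometry can be rendered as occluded.
    R, t = camera_pose
    fx, fy, cx, cy = intrinsics
    projected = project_to_screen(anchor_world, R, t, fx, fy, cx, cy)
    if projected is None:
        return None
    u, v, depth = projected
    h, w = depth_map.shape
    col, row = int(round(u)), int(round(v))
    if not (0 <= col < w and 0 <= row < h):
        return None
    occluded = depth_map[row, col] + 0.1 < depth   # a wall or shelf is in front
    return {"text": text, "pixel": (col, row), "occluded": occluded}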
B. User Interaction
Today’s world is becoming a canvas for big data from a
wide range of sources, which profoundly influences people’s
perception and understanding of the environment around them.
This requires a user-friendly interface for interacting with the digital world. Traditional user interface design is constrained by finite physical dimensions, a constraint that becomes especially severe when mobile devices are the preferred medium for interacting with torrents of data from social media, online transactions, and telecommunications.
The ability to associate data with the physical world dis-
closes the causality between data and reality. In addition,
mashing up data from various sources dramatically increases
the probability of discovering relevant and interesting things.
Furthermore, not only is the user interface (UI) a key factor in user engagement for AR applications, but the user experience (UX), which involves user behaviour and emotions towards a specific artefact, also needs to be considered in any current application design.
A few works [8] have further employed AR as a user interface to redefine physical objects and attach new functions to them. This is even more compelling when AR is integrated with accessories such as AR glasses and AR contact lenses. Wearable and invisible accessories reduce device intrusion, which provides a hands-free interaction experience. Smart glasses and contact
Fig. 2: Left: Sight, a short futuristic film by Eran May-raz and Daniel Lazo that looks into a future of AR with retinal lenses. Data from sensors, apps, and the Internet augment the current view (source: http://vimeo.com/46304267); Right: A prototype of bionic contact lenses with a built-in LED for virtual display (source7).
lenses6, such as Google Glass, have made huge strides in this area. With implanted and invisible accessories, AR allows users to interact with data securely: as the user interface is seen and manipulated only by the user, it reduces the risk of data and privacy leakage. Data can be viewed through the AR interface as personal media or by multiple users in a collaborative way. As a personal information center, it collects data from various sources and displays it on an intangible interface without physical constraints. In the collaborative mode, multiple users share the same data set and view it from their own angles.
AR applications enable a new approach to visualizing complex big data structures in an easier manner and provide a better experience for users interacting with them. Moreover, AR can be a very usable canvas for visualizing big data sets, as we can see in science fiction movies such as Avatar (Figures 2 and 3).
III. BIG-DATA-DRIVEN AR
Big data and AR have shaped new business regimes that
were irrelevant in the past, but the landscape is undergoing
a seismic shift with advances in technology convergence and
connectivity. A majority of AR applications are constrained to an enclosed or pre-compiled environment because the required dataset is either not available or too sparse to use. As the rapid penetration of mobile devices, social networks, and the IoT generates considerable amounts of data, it makes sense for big data to make AR more feasible for practical use.
A. Retail
AR has been in use in the retail business for the last five years. A majority of applications, such as the Junaio and Wikitude AR browsers, overlay geospatial data on the current view to offer general information. A few AR apps promote virtual advertisements to catch customers' eyes. However, it is difficult to boost customers' interest in products if their behaviors and preferences are not taken into account. Without adequate
6http://www.engr.washington.edu/facresearch/highlights/ee_contactlens.html
Fig. 3: Large data visualization and interaction among multiple users in the science fiction movie Avatar, directed by James Cameron, which may be a portrayal of a future user interface with AR. (source: http://www.avatarmovie.com/index.html)
information from customers, AR is less attractive for practical
use and more like a gaudy, flashy technology. The preference for mobile devices in purchasing and social communication has led to the birth of digital consumers8, which also brings us into the era of product digitalization. Digitally active consumers have changed their modes of transaction, communication, and purchase decision making. Consumers use mobile devices for online transactions and social media for advice on what to buy and where to shop, leaving trails of their actions, states, and decisions. It is obviously valuable to explore these large amounts of data to understand consumers' shopping preferences and behavior, which helps to tailor promotions and recommendations to customers. However, it can be even more promising when combined with AR technology. Harnessed with big data, AR promotes vertical retail to individual consumers. A conscious and activated shopping context has an impact on customers' mental representation [2], helping them to be active, informed, and assertive in their shopping decisions.
When new eye-gaze and facial-expression technologies, along with other physiological measurements, eventually gain a tangible presence, they will enable us to better understand customers' focus and emotions and to provide more accurate recommendations and advertisements. A few works have used eye-tracking glasses to collect customers' point-of-gaze information for shopping behavior analysis. According to a Marks & Spencer report [24], people using mobile shopping channels spend eight times as much as people shopping in stores. Meeting consumers where they are is the key to future consumer engagement [22]. With the advantages of high mobility and the consistent virtual content provided by big data, AR breaks physical constraints to enhance the virtual shopping experience anywhere and at any time.
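As a purely illustrative sketch of how such gaze data could be combined with purchase history (the product names, weights, and field layout below are assumptions, not a description of any deployed system), dwell time per product can be accumulated from an eye tracker and blended with past purchases to rank items for an AR promotion overlay:

from collections import defaultdict

def rank_products(gaze_samples, purchase_counts, dwell_weight=0.7):
    # gaze_samples: iterable of (product_id, dwell_seconds) from eye tracking.
    # purchase_counts: dict mapping product_id to the number of past purchases.
    # Returns product ids ordered by a blended interest score.
    dwell = defaultdict(float)
    for product_id, seconds in gaze_samples:
        dwell[product_id] += seconds
    max_dwell = max(dwell.values(), default=0.0) or 1.0
    max_buy = max(purchase_counts.values(), default=0) or 1

    def score(pid):
        attention = dwell.get(pid, 0.0) / max_dwell        # normalised gaze interest
        history = purchase_counts.get(pid, 0) / max_buy    # normalised past interest
        return dwell_weight * attention + (1.0 - dwell_weight) * history

    candidates = set(dwell) | set(purchase_counts)
    return sorted(candidates, key=score, reverse=True)

# Example: highlight the top-ranked product in the customer's AR view.
top_pick = rank_products([("tea", 4.2), ("coffee", 1.1)], {"coffee": 3})[0]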
B. Tourism
In terms of environmental awareness, AR is concerned with presenting contextual information and assisting in daily activities, which is particularly helpful when people are unfamiliar
with the environment around them. By highlighting interesting
8http://www.businessinsider.com/11-amazing-augmented-reality-ads-2012-1
features or bringing history to life, AR provides intuitive
means to enhance a touring experience. As travel is normally
associated with geo-spatial exploration, most AR applications
for travel guides are based on geo-spatial information. A
user’s position is tracked using GPS and built-in sensors, and
it is then used to search and locate multimedia information
from data sources such as point of interest (POI) databases,
geocoded Tweets, and Flickr.
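A minimal sketch of this geospatial lookup is given below (in Python; the POI structure, field names, and search radius are illustrative assumptions): the user's GPS fix is compared against aggregated POI records with the haversine distance, and the nearby entries become candidate AR annotations.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two GPS fixes.
    earth_radius = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * earth_radius * math.asin(math.sqrt(a))

def nearby_pois(user_lat, user_lon, pois, radius_m=300.0):
    # pois: list of dicts with 'name', 'lat', 'lon', e.g. aggregated from a POI
    # database, geocoded tweets, or photo metadata. Returns annotations sorted
    # by distance, ready to be overlaid on the camera view.
    hits = []
    for poi in pois:
        d = haversine_m(user_lat, user_lon, poi["lat"], poi["lon"])
        if d <= radius_m:
            hits.append({"label": poi["name"], "distance_m": round(d)})
    return sorted(hits, key=lambda h: h["distance_m"])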
Trends such as the fast deployment of sensor networks and
the growing use of mobile devices and social networks are
generating large amounts of both structured and unstructured
data. Aggregating and compiling the redundant fragmented
data helps us to build a detailed and complete environmental
model, which enables AR to understand users’ surroundings
better even in an open and unfamiliar environment. According to a Business Insider survey9, intelligent recommendation is regarded as the most attractive prospect for tourism. As the world gets smarter with big data, it gains the ability to track and measure tourists' needs and behaviors to ensure a responsive and intelligent trip experience. For instance, personalized travel guide information can be overlaid on the tourist's current view to avoid distraction from tourist spots and getting lost. Signs in the local language can be automatically translated into readable words that are overlaid on their original locations. To go a step further, information such as the locations of nearby rest sites and restaurants can be recommended according to tourists' needs, based on walking distance and time.
C. Health Care
When we are making life-and-death decisions, immediate access to relevant and necessary information is of the utmost importance. AR's importance in the healthcare industry is attributed to its ability to instantly display relevant information in situ when required. AR has been used in the medical field for nearly ten years. We have seen AR's powerful x-ray vision ability, which has been used to provide contextual cues for diagnosing patients and learning tools for medical students by projecting computerized tomography (CT) scans or medical images onto the current view10. In one example, images of veins are overlaid on a nurse's view of a patient's hand to help the nurse insert an IV in one painless attempt. In another AR application, developed by the German research institute Fraunhofer11, a digital overlay of key blood vessels is displayed on an iPad when the doctor holds its camera over a patient's body, to avoid accidentally cutting them. Although these early examples have proven AR's ability to change the healthcare landscape, AR's great lifesaving potential for the healthcare industry cannot be fully realized without big data support.
Without adequate data sets, AR is merely a tool for medical education and bedside-manner training. Decisions are largely made based on doctors' hunches and experience
9http://www.businessinsider.com/infographic-how-technology-will-eliminate-travel-frustration-2012-8
10https://www.youtube.com/watch?v=VF2mKavngHE
11http://sixrevisions.com/user-interface/the-future-of-user-interfaces/
Fig. 4: Corning’s vision of a future operation
room augmented with remote presence. (source:
https://www.youtube.com/watch?v=VF2mKavngHE)
rather than on the data itself. According to a survey by Manhattan Research12, 72% of physicians routinely use tablet computers every day. With the substantially increasing usage of tablets among physicians and of digital care devices among patients, paper prescriptions and manual health records are being replaced by electronic health records (EHRs). Patient data is being digitalized, leading to a flood of digital data from which medical decisions that were previously based on guesswork and experience can instead be made based on the data itself (Figure 4). Creating a virtual viewfinder to display a pertinent health record enables the doctor to quickly access valuable information in the patient's context. In-situ visualization of historical illnesses or tissue damage over the patients themselves helps the doctor understand his or her patients better. In the future, AR may even provide a virtual operating room allowing doctors in different places to diagnose a patient collaboratively.
Working closely with us in daily life, AR can be significantly important for self-tracked health, as wearable devices that sense heart rate, blood oxygen, or even cholesterol levels become available to everybody. With each one of us becoming a walking data generator [17], we can keep track of our own basic health statistics. AR can display real-time notifications that allow us to immediately understand our own health condition, and it may even make suggestions based on health statistics and diet.
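As a toy illustration of such a notification pipeline (the thresholds and fields below are invented for illustration and are in no way clinical guidance), a stream of wearable readings can be smoothed and turned into an AR alert when it leaves a configured range:

from collections import deque

class HealthMonitor:
    # Smooths a stream of wearable readings (e.g. heart rate in bpm) and raises
    # an AR notification when the smoothed value leaves a safe range.
    def __init__(self, low, high, window=10):
        self.low, self.high = low, high
        self.samples = deque(maxlen=window)

    def update(self, value):
        self.samples.append(value)
        average = sum(self.samples) / len(self.samples)
        if average < self.low:
            return {"level": "warning", "text": f"Low reading: {average:.0f}"}
        if average > self.high:
            return {"level": "warning", "text": f"High reading: {average:.0f}"}
        return None   # nothing to display

# Example: a resting heart-rate monitor feeding an AR heads-up display.
monitor = HealthMonitor(low=45, high=120)
alert = monitor.update(128)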
D. Public Services
The government is responsible for providing public services,
which both produce and consume large amounts of data. As
the largest spender in any economy, the government has the
most diverse channels for collecting data from public services, which include transportation, social services, national security, defense, and environmental stewardship, among others. According to a joint global survey by Bloomberg Businessweek and SAP [18], 81% of the respondents believe that the public sector will inevitably be transformed by big data. The government can use big data to provide better services to citizens. Public services, such as transportation and security protection,
12http://www.prnewswire.com/news-releases/new-study-reveals-that-physicians-embrace-patient-self-tracking-202522041.html
will be more efficient and productive if they are delivered di-
rectly to individuals in their own contexts with AR technology.
For instance, displaying the upcoming traffic flow on a screen will help drivers avoid traffic accidents. Personal information overlaid on passengers will enable security specialists to verify identities very quickly and reduce screening congestion.
Augmented government [3] has been proposed to improve
government services with AR technology in a few US public
sectors. The AR strategy for government service delivery
will be more promising if it is supported by big data from
surveillance systems, mobile and sensor networks, and social
media. As sensors and wireless networks continue covering
vehicles, it is much easier to collect massive traffic statistics
to make us aware of the traffic situation. For example, cars within a given range can share their GPS position, speed, and direction over a vehicular ad-hoc network (VANET). AR can display this information in front of drivers for threat assessment and for predicting potential car crashes, as sketched at the end of this section. In particular, by harnessing the x-
ray vision capability, drivers can see through buildings or
vehicles to watch for vehicles positioned in their blind spots.
Overlaying essential information such as the driver's license and the vehicle's location and speed directly over the vehicle will also help traffic police quickly determine whether the driver has violated any traffic rules. A similar strategy can be employed
by security agencies to rapidly identify suspects. The city is
being digitalized by ubiquitous sensor and social networks,
which makes it transparent for city managers to look into.
To give an example, in a civil engineering maintenance scenario, a virtual image of subsurface infrastructure can be superimposed on field workers' vision of the site, enabling rapid perception of the underground network layout. Field workers can collaborate from different perspectives through contextualized views for each supporting role. Each individual view is personalized and annotated for that worker's context, such as an electrical-line view for the electrician and a plumbing-line view for the plumber, which harnesses the collective intelligence of all roles to improve efficiency. In another
example, a virtual bird’s eye view directly overlaid on an
emergency staff’s vision will greatly assist in the search and
rescue of persons trapped in a burning or collapsed building.
As civil infrastructure such as electrical and water supply networks gets smarter with the IoT, torrents of data can be used for in-situ visual analysis, without damage in the field, by using AR technology.
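The time-to-collision sketch referred to above is given here (illustrative only; positions are in metres in a shared local frame and velocities in m/s, both assumed to be derived from the state messages exchanged over the VANET): it estimates when two vehicles will be closest and raises an AR warning if the predicted gap is too small.

import numpy as np

def time_to_closest_approach(own_pos, own_vel, other_pos, other_vel):
    # Positions (m) and velocities (m/s) as 2D vectors in a shared local frame.
    # Returns the time (s) of closest approach, clamped to the future, and the
    # predicted distance (m) between the two vehicles at that time.
    dp = np.asarray(other_pos, float) - np.asarray(own_pos, float)
    dv = np.asarray(other_vel, float) - np.asarray(own_vel, float)
    rel_speed_sq = float(dv @ dv)
    t_star = 0.0 if rel_speed_sq < 1e-9 else max(0.0, -float(dp @ dv) / rel_speed_sq)
    gap = float(np.linalg.norm(dp + dv * t_star))
    return t_star, gap

def collision_warning(own, other, horizon_s=4.0, min_gap_m=3.0):
    # Return an AR warning when the predicted gap within the horizon is too small.
    t, gap = time_to_closest_approach(own["pos"], own["vel"], other["pos"], other["vel"])
    if t <= horizon_s and gap <= min_gap_m:
        return {"text": f"Collision risk in {t:.1f} s", "vehicle": other["id"]}
    return None

# Example: a neighbouring car approaching head-on in the same lane.
warning = collision_warning({"pos": (0, 0), "vel": (10, 0)},
                            {"id": "HK-1234", "pos": (60, 0), "vel": (-10, 0)})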
IV. CHALLENGES
We have seen an emerging shift in mindsets towards big data and AR in industry and business, but there are still significant barriers to overcome. Several barriers, such as intrusive displays, battery life, and highly fragmented data, are practical; others are conceptual. We should first address these challenges before we can see the full potential of the two technologies in action. Converging big data and AR brings practical technical challenges from both sides. Heterogeneity, scale, and complexity are problems with big data that impede all phases of the process, from data acquisition to aggregation and analysis. AR applications are also constrained by extensive calibration, incomplete reference modelling, and poor environmental sensing. A comprehensive discussion of the technical problems on each side is out of the scope of this paper; interested readers can instead refer to Huang's [10] and Labrinidis' [13] papers for details. Herein we explore emerging practical technological problems and common conceptual barriers induced by converging both technologies.
A. Timeliness
AR applications generally require real-time performance to guarantee fluent user interaction, which demands immediate analysis results (within 7 to 20 ms, according to a Valve Software study13). However, large-scale data analysis usually takes much longer due to the voluminous and highly fragmented data. The problem becomes considerably more challenging as the trend towards miniaturization of AR devices conflicts with the growing volume and fragmentation of big data. The cloud is able to store large amounts of data and handle computationally intensive tasks within a fixed time cap, and a few applications [11] have proven the cloud's ability to meet these requirements in a timely fashion. In addition, offloading computation and data storage enables client-side AR devices to remain small and sustainable without being intrusive.
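As a rough illustration of how an AR client could respect such a latency budget (all timing figures and names below are placeholders, not measurements), the per-frame decision of whether to offload can compare the estimated local processing time against the estimated round trip to the cloud:

def choose_execution(local_ms, cloud_compute_ms, network_rtt_ms, budget_ms=20.0):
    # Decide where to run a per-frame analysis task.
    # local_ms: estimated on-device processing time.
    # cloud_compute_ms + network_rtt_ms: estimated cost when offloaded.
    # budget_ms: end-to-end latency budget for a fluid AR experience.
    cloud_ms = cloud_compute_ms + network_rtt_ms
    if cloud_ms <= budget_ms and cloud_ms < local_ms:
        return "cloud"      # offloading is both feasible and faster
    if local_ms <= budget_ms:
        return "local"      # keep the task on the device
    return "degrade"        # neither fits: reduce quality or frame rate

# Example: a slow on-device analysis task with a nearby edge or cloud server.
decision = choose_execution(local_ms=35.0, cloud_compute_ms=8.0, network_rtt_ms=9.0)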
B. Interpretation
By integrating big data with AR, we acquire data analysis abilities. However, the ability to analyze data is of limited value if the results cannot be understood in AR. Users of big data analytical systems are data scientists, while AR users are customers without much technical background. The present data analysis pipeline has been designed explicitly to keep a human in the loop, because many patterns are obvious to humans but difficult for computers to understand [13]; AR, however, prefers to be intuitive and used without interruption. Analysis results require interpretation in the context of AR. For instance, the output of a customer behavior analysis system is normally customer statistics, but AR is responsible for how to use those statistics: AR should be able to interpret the results as preference information so as to provide a recommendation in a customer's specific context. Although big data is good at discovering correlations, especially subtle correlations that cannot be found in a small data set, it does not tell us which correlations are meaningful, while AR requires semantically meaningful information to relate to the user's context. There is no easy way for big data and AR to intelligently interpret each other. However, a collaborative effort to embrace AR content and provide native APIs for AR to interpret semantically tagged data from all data generators is a possible solution. A standard data format such as the Augmented Reality Markup Language (ARML) [15] is an essential step in the right direction.
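As a toy illustration of this interpretation step (the tag vocabulary and record layout below are invented for illustration and are not the ARML schema), an AR client could filter semantically tagged analysis results against the user's current context before rendering them:

def interpret_for_ar(analysis_results, user_context):
    # analysis_results: records produced by a big data pipeline, each carrying
    # semantic tags, e.g. {"tags": {"category": "coffee"}, "payload": "20% off espresso"}.
    # user_context: the user's current semantic context, e.g.
    # {"location": "supermarket", "category": "coffee"}.
    # Returns only the results that are meaningful in this context, as annotations.
    annotations = []
    for result in analysis_results:
        tags = result.get("tags", {})
        shared = set(tags) & set(user_context)
        # keep a result only if every tag it shares with the context agrees
        if shared and all(tags[key] == user_context[key] for key in shared):
            annotations.append({"text": result["payload"], "tags": tags})
    return annotations

# Example: a promotion tagged with the product category the customer is looking at.
ads = interpret_for_ar(
    [{"tags": {"category": "coffee"}, "payload": "20% off espresso"}],
    {"location": "supermarket", "category": "coffee"})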
13http://blogs.valvesoftware.com/abrash/latency-the-sine-qua-non-of-ar-and-vr/
C. Privacy
There is growing concern about privacy in the context of both AR and big data. In order to provide personalized recommendations, AR generally requires access to users' personal information and a record of their location. Studies show that users' identities and their movement patterns are closely correlated [6]. Even when people pay attention to their personal information, an attacker can infer private information from their location data [13]. Although the data from individual sources is fragmented and incomplete, it is interrelated and contains frequent patterns and redundant knowledge. When data is aggregated from numerous sources, hidden relationships and models can be extracted by data analysis. Privacy is both a technological and a sociological problem. Forceful laws and regulations may be required to prevent inappropriate use of personal data and malicious AR applications. From a technical standpoint, differential privacy is a possible way of accessing data with limited privacy risk; however, the information is often reduced too far to be useful in practice, and the approach is ill-suited to dynamically changing data.
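For reference, the Laplace mechanism commonly used for differential privacy can be sketched in a few lines (a textbook sketch; sensitivity and epsilon must be chosen per query, and the caveats above about reduced utility and dynamic data still apply):

import numpy as np

def laplace_release(true_value, sensitivity, epsilon):
    # Release a noisy query answer satisfying epsilon-differential privacy.
    # sensitivity: how much one individual's data can change the true answer
    # (1.0 for a simple counting query); smaller epsilon means stronger privacy.
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(0.0, scale)

# Example: the number of users whose movement traces passed a given location,
# released with a privacy budget of epsilon = 0.5.
noisy_count = laplace_release(true_value=412, sensitivity=1.0, epsilon=0.5)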
V. CONCLUSION
In this paper, rather than providing just an up-to-date survey of big data and AR technologies, we have broadened our outlook to merging them to breed new applications. The factors promoting the convergence of the two technologies also create challenges not previously experienced by either technology alone. The key principle is to combine the main features of each technology as we devise novel applications using both. Although the technologies are still in their infancy, attention has always been a necessary component of a convergent AR and big data strategy. While the concepts are not unfamiliar to us, most participants have not yet harnessed their full potential. Only when we take full action can we get a real picture of what it will take for big data and AR to move from being just hype to becoming real game changers.
ACKNOWLEDGMENT
This research has been supported, in part, by General
Research Fund 26211515 from the Research Grants Council
of Hong Kong and the Innovation and Technology Fund
ITS/369/14FP from the Hong Kong Innovation and Technol-
ogy Commission.
REFERENCES
[1] Ronald T Azuma. A survey of augmented reality. Presence: Teleoper-
ators and virtual environments, 6(4):355–385, 1997.
[2] Benedict GC Dellaert, Theo A Arentze, and Harry JP Timmermans.
Shopping context and consumers mental representation of complex
shopping trip decision problems. Journal of Retailing, 84(2):219–232,
2008.
[3] C Doolin, A Holden, and V Zinsou. Augmented government: Transforming government service through augmented reality. Deloitte Consulting LLP. URL: http://www.deloitte.com/assets/Dcom-UnitedStates/Local%20Assets/Documents/Federal/us_fed_augmented_government_0606, 13, 2013.
[4] Mark Flood, HV Jagadish, Albert Kyle, Frank Olken, and Louiqa
Raschid. Using data for systemic financial risk management. In CIDR,
pages 144–147, 2011.
[5] Rubina Freitas and Pedro Campos. Smart: a system of augmented reality
for teaching 2nd grade students. In Proceedings of the 22nd British HCI
Group Annual Conference on People and Computers: Culture, Creativity,
Interaction-Volume 2, pages 27–30. British Computer Society, 2008.
[6] Marta C Gonzalez, Cesar A Hidalgo, and Albert-Laszlo Barabasi. Un-
derstanding individual human mobility patterns. Nature, 453(7196):779–
782, 2008.
[7] Steven J Henderson and Steven K Feiner. Augmented reality in the
psychomotor phase of a procedural task. In Mixed and Augmented
Reality (ISMAR), 2011 10th IEEE International Symposium on, pages
191–200. IEEE, 2011.
[8] Valentin Heun, Shunichi Kasahara, and Pattie Maes. Smarter objects:
using ar technology to program physical objects and their interactions.
In CHI’13 Extended Abstracts on Human Factors in Computing Systems,
pages 961–966. ACM, 2013.
[9] John P Holdren, Eric Lander, and H Varmus. Report to the president
and congress: Designing a digital future: federally funded research and
development in networking and information technology. Executive Office
of the President and Presidents Council of Advisors on Science and
Technology, 2010.
[10] Zhanpeng Huang, Pan Hui, Christoph Peylo, and Dimitris Chatzopoulos.
Mobile augmented reality survey: a bottom-up approach. arXiv preprint
arXiv:1309.4413, 2013.
[11] Zhanpeng Huang, Weikai Li, Pan Hui, and Christoph Peylo. Cloudridar:
A cloud-based architecture for mobile augmented reality. In Proceed-
ings of the 2014 workshop on Mobile augmented reality and robotic
technology-based systems, pages 29–34. ACM, 2014.
[12] Javier Irizarry, Masoud Gheisari, Graceline Williams, and Bruce N
Walker. Infospot: A mobile augmented reality method for accessing
building information through a situation awareness approach. Automa-
tion in Construction, 33:11–23, 2013.
[13] Alexandros Labrinidis and Hosagrahar V Jagadish. Challenges and
opportunities with big data. Proceedings of the VLDB Endowment,
5(12):2032–2033, 2012.
[14] Doug Laney. 3d data management: Controlling data volume, velocity
and variety. META Group Research Note, 6:70, 2001.
[15] Blair MacIntyre, Hafez Rouzati, and Martin Lechner. Walled gardens:
Apps and data as barriers to augmenting reality. IEEE computer graphics
and applications, 33(3):77–81, 2013.
[16] James Manyika, Michael Chui, Brad Brown, Jacques Bughin, Richard
Dobbs, Charles Roxburgh, and Angela H Byers. Big data: The next
frontier for innovation, competition, and productivity. 2011.
[17] Andrew McAfee, Erik Brynjolfsson, Thomas H Davenport, DJ Patil, and
Dominic Barton. Big data. The management revolution. Harvard Bus
Rev, 90(10):61–67, 2012.
[18] Joe Mullich. Closing the big data gap in public sector. SURVEY
REPORT— Real-Time Enterprise,(Sep. 2013), 2013.
[19] Dieter Schmalstieg and Daniel Wagner. Experiences with handheld
augmented reality. In Mixed and Augmented Reality, 2007. ISMAR 2007.
6th IEEE and ACM International Symposium on, pages 3–18. IEEE,
2007.
[20] Dan Sommer, Rita L Sallam, and James Richardson. Emerging technol-
ogy analysis: Visualization-based data discovery tools. Gartner, June,
17:3, 2011.
[21] Ivan E Sutherland. A head-mounted three dimensional display. In
Proceedings of the December 9-11, 1968, fall joint computer conference,
part I, pages 757–764. ACM, 1968.
[22] P Udhas, S Mittal, and N Chandrasekaran. Six converging technology
trends: Driving a tectonic shift in the business-consumer ecosystem.
KPMG, India, 2013.
[23] David Wilcox, Marvin Johnson, and Peter Carrato. Augmented reality:
Bringing bim to life. Journal of Building Information Modeling no. Fall,
2012:18–19, 2012.
[24] Z Wood. Marks and Spencer gambles on bringing internet age to the shop floor. The Guardian Online. Retrieved from: www.theguardian.com/business/2012/sep/02/marks-andspencer-multichannel-shopping, 2012.