Virtual Museum ‘Takeouts’ and DIY Exhibitions
Augmented Reality Apps for Scholarship, Citizen Science
and Public Engagement
Sandra Woolley1[0000-0002-7623-2866], James Mitchell2[0000-0001-6051-2567],
Tim Collins3[0000-0003-2841-1947], Richard Rhodes4[0000-0002-6955-7216],
Tendai Rukasha5[0000-0003-3378-0843], Erlend Gehlken6[0000-0003-0332-4759],
Eugene Ch’ng7[0000-0003-3992-8335] and Ashley Cooke8
1,2,4,5 Keele University, Staffordshire, UK
3 Manchester Metropolitan University, Manchester, UK
6 Goethe-Universität, Frankfurt am Main, Germany
7 Nottingham Ningbo University, Ningbo, China
8 National Museums Liverpool (World Museum), Liverpool, UK
Abstract. This paper presents an Augmented Reality (AR) project for the cura-
tion of virtual museum ‘takeouts’ and DIY exhibitions. The project’s outputs in-
clude novel AR app technology demonstrators to support co-design with museum
users and stakeholders - the goal being to create useful and easy-to-use AR apps
for scholars, citizen scientists and the interested public.
The apps were designed for users to create, display, animate and interact with
exhibitions of selected 3D artefacts that could, for example, reflect academic spe-
cialisms for sharing with fellow researchers, support curators in exhibition plan-
ning or enable friends and students to share eclectic favourites from museum vis-
its. The overarching project ambition was to create AR apps to support research,
engagement and education, and to enable interactive and personalized visualiza-
tions of individual artefacts as well as reconstructed forms. As presented in the
paper, these forms are exemplified in the AR apps with 3D models of a cuneiform
envelope and its tablet contents, viewable either as i) separate artefacts or ii) in
their reconstructed enveloped form, with the AR apps enabling animated opening
and ‘X-ray views’ of the contents within. In this way, the apps can enable users
to visualize individual objects and reconstructions that could, for example, incor-
porate artefacts held in different museums.
Keywords: Augmented reality, Digital heritage, Virtual reconstruction.
1 Introduction
Augmented Reality (AR) offers new and exciting opportunities to enrich museum and
art gallery visits and provide additional interactions and experiences for visitors [1-4].
AR can also provide digital heritage visualization opportunities, for example, for archaeological finds [5] and reconstructed artefacts such as glass plates [6], as well as affording the potential for engaging educators and student learners in cultural heritage experiences [7].
The final publication is available at Springer via DOI: 10.1007/978-3-030-73043-7.
Notable museum AR examples include Salzburg’s Museum of Celtic Heritage ‘Speaking Celt’ historical AR avatar [8], the Cleveland Museum of Art’s ArtLens AR app that provides enriched tour experiences of displayed artworks [9] and The Franklin Institute’s AR app that enables visitors to interact with virtual 3D models of Terracotta
Warriors [10]. These AR apps enrich the in-museum experiences of displayed artefacts
[11, 12] but, like other heritage AR apps, they do not support interaction with artefacts
that are not on display or enable the creation of virtual exhibitions. More broadly,
amongst art and digital heritage AR apps, there is scope for improved personalization
and better ways of revealing non-visible components and communicating artefact in-
formation [13].
The Virtual Museum ‘Takeouts’ and DIY Exhibitions project [14] evolved from The
Virtual Cuneiform Tablet Reconstruction Project [15] whose original ambitions in-
cluded support for virtual access to museum artefacts and the creation of tools to sup-
port artefact reconstruction [16] as exemplified by the virtual reconstruction of the At-
rahasis cuneiform tablet [17-19]. The aims of the Virtual Museum ‘Takeouts’ and DIY
Exhibitions project were (i) to create technology demonstrators that could supplement
the co-design of useful and easy-to-use AR apps aimed at benefiting diverse user groups
that include scholars, citizen scientists and the interested public, and (ii) to provide in-
teractive, informative and personalized views of individual artefacts and virtual recon-
structions. The creation of technology demonstrators was important because co-design
improves technology outcomes [20] and functional technology demonstrators can sup-
port the process by helping to inspire ideas [21]. In addition, technology demonstrators
are useful when some user groups may be unfamiliar with a new technology.
2 AR Apps for Museum ‘Takeouts’ and DIY Exhibitions
The requirements for the AR apps were to incorporate functionality to demonstrate:
- virtual presentations of museum artefacts that may or may not be on physical display,
- the collection and arrangement of virtual life-sized artefacts in augmented reality,
- alternative point cloud and wire mesh views of artefacts for interest and for educational insights into the structure of 3D models,
- the optional display of artefact information,
- artefact rotation to provide all-around views,
- the option to display individual artefacts or to view them in their assembled or reconstructed forms.
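Collectively, these requirements map naturally onto a per-artefact view state. The sketch below is our own illustration in Java; the class and field names are assumptions, not taken from the project's code:

```java
/** Illustrative per-artefact view state covering the requirements above.
 *  All names are assumptions for illustration, not the project's code. */
public class ArtefactViewState {
    public enum RenderMode { POINT_CLOUD, WIRE_MESH, PHOTO_TEXTURED }

    public RenderMode mode = RenderMode.PHOTO_TEXTURED; // default rendered view
    public boolean showInfo = false;   // optional artefact information display
    public boolean assembled = false;  // individual vs assembled/reconstructed form
    public boolean rotating = false;   // all-around rotation toggle

    /** Cycle through the point cloud, wire mesh and rendered views in turn. */
    public void cycleMode() {
        RenderMode[] all = RenderMode.values();
        mode = all[(mode.ordinal() + 1) % all.length];
    }
}
```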
2.1 3D-Model Acquisitions
The models for the AR apps were acquired in 2018 and 2019 from National Museums
Liverpool (World Museum), UK using the Virtual Cuneiform Tablet Reconstruction
Project photogrammetric turntable system [22] and an Einscan-SP 3D scanner with
Discovery Pack. The models included a ‘shabti’ figurine and cuneiform tablets.
Shabtis are small funerary figurines that were placed in Egyptian tombs to act as
servants for the deceased. The shabti scanned for use in the app is over 2,500 years old
and was once owned by Florence Nightingale, who spent several months in Egypt.
Cuneiform tablets are early written records from Mesopotamia, created by making
impressions in clay tablets. The tablets scanned for the app included one with an ac-
companying envelope bearing cylinder seal impressions. This envelope originally se-
cured a silver purchase recorded on a tablet within (which had been removed and the
envelope resealed). The scanned cuneiform tablets are included in the collection of
World Museum Liverpool tablets published in [23] and in the Cuneiform Digital Li-
brary Initiative (CDLI) database [24].
2.2 AR App Development
The Android AR app was developed using the Processing for Android AR Java library,
which itself uses the Google ARCore library. The iPhone AR app was coded in Swift
with Xcode using ARKit. The apps were tested in development by researchers and testers accessing pre-release versions via the Google Play test track and Apple TestFlight. Both apps have minimum device requirements: at the time of writing, Android’s ARCore requires Android 8.1 or later [25] and the iPhone app requires iOS 11 and an A9 or later processor [26]. Example screenshots of the Android and iPhone apps
are shown in Fig. 1.
ARKit and Swift for iOS.
ARKit is Apple’s library for iOS device augmented reality development, supporting
motion tracking, surface detection and light estimation. It uses the smartphone’s camera
to identify and track features in the environment and estimate ambient light so that it
can position and illuminate selected objects in the camera view of the environment.
When running, ARKit extracts features from the camera images and builds a topo-
graphic map of the real world.
Xcode is Apple’s Integrated Development Environment (IDE) for iOS development [27]. Xcode version 11.7 (11E801a) was used to create the prototype application, with Swift being the predominant coding language. The app was developed using SwiftUI (iOS 13), which allows user interfaces to be built for any Apple device using just one set of tools and APIs.
Fig. 1. Android and iPhone AR app screen examples. For Android: (i) initial Android splash screen, (ii) a positioned shabti and (iii) help menu explaining interface buttons. Below, for iPhone: (iv) the splash screen, (v) a positioned shabti artefact with information display ON and (vi) a screenshot from the opening/closing animation of the cuneiform envelope that reveals the cuneiform tablet within.
ARCore and Processing for Android.
ARCore [28] is Google’s library for Android smartphone augmented reality devel-
opment, supporting motion tracking, surface detection and light estimation. It uses the
smartphone’s camera to identify and track features in the environment and estimate
ambient light so that it can position and illuminate selected objects in the camera view
of the environment. When running, ARCore extracts features from the camera images
and builds a 3D map of the real world. A collection of features located on a horizontal
or vertical plane is known as a ‘trackable’ and can have virtual objects attached to it
using an ‘anchor’. The Android AR app was created using ‘Processing for Android’
[29] which enables the creation of Processing ‘sketch’ programs for Android devices
and includes support for ‘AR in Processing’ [30] via the ARCore library.
Processing for Android has an advantage over development using Android Studio in
terms of ease-of-use and the simplicity of the code. A rudimentary demonstration AR
Processing app can be produced using less than 30 lines of code whereas an equivalent
Android Studio project would require the creation and navigation of a folder structure
containing 50 or more different files. However, the ease of getting started in Processing for Android should not trivialize the coding required for full app development. At the
time of writing, the current version of the AR Museum app comprises six files that add
up to approximately 1200 lines of code (much of which is required for the loading and
interpretation of 3D object files).
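As an indication of what that loading code involves, a minimal parser for the header of an ASCII .ply file can be sketched as follows. This is an illustrative stand-in, not the app's actual loader, and the class name is our own:

```java
import java.util.ArrayList;
import java.util.List;

/** Minimal parser for the header of an ASCII .ply file, illustrating the
 *  kind of 3D-object-file handling the app requires (illustrative only). */
public class PlyHeader {
    public int vertexCount = -1;
    public int faceCount = -1;
    public final List<String> vertexProperties = new ArrayList<>();

    public static PlyHeader parse(String headerText) {
        PlyHeader h = new PlyHeader();
        String current = "";
        for (String line : headerText.split("\n")) {
            String[] tok = line.trim().split("\\s+");
            switch (tok[0]) {
                case "element":                 // e.g. "element vertex 1398"
                    current = tok[1];
                    int n = Integer.parseInt(tok[2]);
                    if (current.equals("vertex")) h.vertexCount = n;
                    else if (current.equals("face")) h.faceCount = n;
                    break;
                case "property":                // e.g. "property float x"
                    if (current.equals("vertex"))
                        h.vertexProperties.add(tok[tok.length - 1]);
                    break;
                case "end_header":
                    return h;                   // body (vertices/faces) follows
            }
        }
        throw new IllegalArgumentException("no end_header found");
    }
}
```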
There are also limitations to the Processing for Android approach; not all of the fea-
tures of the ARCore library are exposed to programmers by the Processing interface.
For example, depth mapping, augmented images and cloud anchors [28] cannot (at the
time of writing) be utilised in Processing. Additionally, the multiple ‘activities’ made
available by Android Studio are not available in Processing for Android. For example,
implementation of an initial 2D splash screen could be easily achieved in Android Stu-
dio by creating a new activity. In Processing, however, an overlay is needed to obscure
the main activity. The Processing splash screen screenshot is shown in Fig. 1(i).
To enhance the functionality of the app, user interface control buttons overlaying the
3D display were added. No library functions for this are provided. Normally, imple-
menting a 2D overlay on a 3D view in Processing would simply require the ‘camera’
to be placed in its default position meaning a 3D coordinate of (x, y, 0) would corre-
spond to a screen coordinate of (x, y); then, the controls can be drawn using standard
2D graphics functions. When using the AR library, however, camera positioning is dis-
abled so a lower level approach was required. Instead of utilizing Processing’s camera
placement function, the OpenGL projection matrix coefficients were directly reset to
achieve the same effect.
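The coefficients involved can be sketched as follows: the column-major matrix below reproduces glOrtho(0, width, height, 0, -1, 1), so that a vertex drawn at (x, y, 0) maps to screen pixel (x, y). This is an illustrative reconstruction of the idea, not the app's code:

```java
/** Illustrative sketch of the 2D-overlay trick: load an orthographic
 *  projection so a "3D" coordinate (x, y, 0) lands on screen pixel (x, y). */
public class OverlayProjection {

    /** Column-major 4x4 matrix equivalent to glOrtho(0, width, height, 0, -1, 1). */
    public static float[] ortho2D(float width, float height) {
        float[] m = new float[16];      // all other entries remain zero
        m[0]  =  2f / width;            // x scale
        m[5]  = -2f / height;           // y scale (flip so y grows downwards)
        m[10] = -1f;                    // z scale for near = -1, far = 1
        m[12] = -1f;                    // x translation
        m[13] =  1f;                    // y translation
        m[15] =  1f;
        return m;
    }

    /** Apply the matrix to (x, y, 0, 1); returns normalised device coords. */
    public static float[] toNdc(float[] m, float x, float y) {
        return new float[] { m[0] * x + m[12], m[5] * y + m[13] };
    }
}
```

With this matrix in place, the control buttons can be drawn with standard 2D graphics calls: pixel (0, 0) maps to the top-left corner (-1, 1) in normalised device coordinates.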
3 Design Decisions and View Options
Several design decisions were required during app development. For example, An-
droid’s ARCore library enables the detection of any flat surfaces, which would allow
objects to be impossibly attached to walls or suspended from ceilings. To simplify the
surface detection process, only horizontal surface detections were permitted. Additional
decisions were required to create a functional user interface. This was achieved using
intuitive icon designs based on past research [16]. The icons were placed away from
the centre of the camera view and artefact information was placed so that it remained
in view of its ON/OFF control. Improving the usability and user experience of AR apps [31] is an important goal of further research and co-design. In addition to improving the interface to the objects, it is also important to guide users in the basics of getting started with AR apps, for example, by making recommendations on ambient light levels, smartphone movement and surface selection, because dim lighting, rapid camera movement and smooth, featureless surfaces all make surface detection difficult.
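The horizontal-only restriction can be illustrated numerically: a detected plane is kept only if its unit normal points close to straight up. ARCore itself reports a plane's type directly, so the threshold test below is just an illustration of the underlying geometry, with a threshold angle we have chosen arbitrarily:

```java
/** Illustration of horizontal-plane filtering by surface normal.
 *  ARCore exposes plane types directly; this shows the geometry only. */
public class PlaneFilter {

    /** Accept planes whose unit normal is within ~10 degrees of vertical,
     *  i.e. nearly parallel to the world 'up' vector (0, 1, 0). */
    public static boolean isHorizontal(double nx, double ny, double nz) {
        double len = Math.sqrt(nx * nx + ny * ny + nz * nz);
        double cos = Math.abs(ny / len);            // dot product with (0, 1, 0)
        return cos > Math.cos(Math.toRadians(10));  // 10-degree tolerance (assumed)
    }
}
```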
As shown in Fig. 1, artefacts were placed on labelled plinths with clickable information ‘i’ buttons in both the Android and iPhone apps to toggle the display of artefact information. Fig. 1(ii) shows the Android interface and (iii) shows the help screen (activated by the top right ‘?’ help button) indicating the function of each of the interface buttons.
3.1 3D Model Views
One of the educational goals of the apps was to provide insights into the 3D graphics underpinning virtual objects. For this purpose, we implemented alternative artefact view options to enable users to see low-resolution point cloud views of models and wire mesh views, as well as photographically rendered views, as shown in Fig. 2.
Fig. 2. 3D model views. (i) low-fidelity point cloud view, (ii) wire mesh view and (iii) 3D pho-
tographically rendered view.
For the iPhone AR app, which did not natively support point cloud and mesh views of objects, 3D models were created comprising small 3D sphere ‘points’ and meshes made from narrow cylinder ‘wires’, respectively, to give the appearance of these views.
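The geometry for these substitutes is straightforward: each mesh edge becomes a thin cylinder whose centre is the edge midpoint and whose height is the edge length. A minimal helper, as our own illustration rather than the app's internals:

```java
/** Illustrative helper for emulated wire-mesh views: a mesh edge is drawn
 *  as a narrow cylinder between its two endpoint vertices. */
public class WireEdge {

    /** Midpoint of the edge, used as the cylinder's centre position. */
    public static double[] centre(double[] a, double[] b) {
        return new double[] {
            (a[0] + b[0]) / 2, (a[1] + b[1]) / 2, (a[2] + b[2]) / 2
        };
    }

    /** Euclidean edge length, used as the cylinder's height. */
    public static double length(double[] a, double[] b) {
        double dx = b[0] - a[0], dy = b[1] - a[1], dz = b[2] - a[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz);
    }
}
```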
An additional supporting webpage multi-view interface [32, 33] was also created for
viewing from desktop computers and other devices, as shown in Fig. 3. The web page
also provides links to 3D file downloads for artefact models in both .ply and .stl formats and at low and medium resolutions, for example, for printing artefacts at different sizes, as shown by the 3D printed shabti models in Fig. 4. Models were also made available for the cuneiform tablet [32].
Fig. 3. Interactive desktop browser interface with 3D multi-view options [33].
(i) Photographically rendered and (ii) wire mesh view.
Fig. 4. 3D prints of the shabti figurine. Transparent clear resin prints at full and reduced sizes.
3.2 Exhibition Views and Interactions
As shown in the example screenshots in Fig. 5, users can (i) arrange artefacts on de-
tected surfaces (with or without information display) and (ii) select an automated posi-
tioning format to arrange artefacts in line with each other.
The apps also support interaction with the artefacts, for example, with a toggle arte-
fact “rotate” function in the Android app.
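The automatic line-up can be realised by centring the n artefacts on a shared anchor and spacing them evenly along one axis. One possible scheme, sketched here as an assumption rather than the apps' actual layout logic:

```java
/** Illustrative automatic line-up: n artefacts centred on a shared anchor,
 *  evenly spaced along one axis (an assumed scheme, not the apps' code). */
public class LineUp {

    /** Returns the offset of each artefact along the line-up axis,
     *  symmetric about zero with the given spacing between neighbours. */
    public static double[] positions(int n, double spacing) {
        double[] xs = new double[n];
        double start = -spacing * (n - 1) / 2.0;   // leftmost artefact
        for (int i = 0; i < n; i++) xs[i] = start + i * spacing;
        return xs;
    }
}
```

For three artefacts at 0.5 m spacing this centres the middle artefact on the anchor, which keeps the exhibition balanced wherever the user taps to place it.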
Fig. 5. Views of eclectic Android app exhibition arrangements: (i) user-positioned artefacts with information display ON and background grid OFF and (ii) exhibits viewed with automatic line-up ON, information display OFF and background grid ON.
3.3 Individual and Reconstructed Artefact Views
Fragmented artefacts can be separated within collections or between museum collec-
tions that may be many kilometres apart [17]. For example, joining pieces of cuneiform
tablets have been found separated by 1,000 kilometres, in the British Museum and the
Musée d’Art et d’Histoire in Geneva [18, 19]. AR apps can enable the viewing of indi-
vidual fragmented or component artefacts or their constructed forms. For example, as
shown in Fig. 6, cuneiform envelopes and their tablet contents can be shown separately
or assembled with the contents (as it would have originally been) inside the envelope.
In Fig. 6 (i) the iPhone app animation opens and closes the envelope to reveal the contents; in (ii) the Android app, on close inspection, reveals the enveloped contents to the user in an ‘X-ray’ view.
Fig. 6. Reconstructed views showing animated and ‘X-ray’ inside views, respectively for the
cuneiform tablet envelope contents in (i) the iPhone AR app and (ii) the Android AR app.
4 Opportunities and Future Development
4.1 Opportunities
3D models of artefacts offer new opportunities for curators in museum exhibition plan-
ning and design. Models provide accurate data for the arrangement of artefacts within
the fixed space of a showcase, giving a greater visual impression before installation,
conceivably reducing the need for last-minute changes in the days before opening. This
is of particular advantage when creating exhibitions that bring together artefacts from
numerous institutions and private collections. There is a growing call for museums to extend the legacy of short-lived temporary exhibitions beyond the usual publication of a catalogue. 3D models could be used to enhance a virtual tour of the gallery with inscribed artefacts such as cuneiform tablets and shabtis, offering greater intellectual stimulation for virtual visitors.
4.2 Future Development
The apps and their 3D models make demands on smartphone storage space and, in the current apps, the example models are downloaded as part of the app installation. Ideally,
the app and models would be separate downloads and new models would be down-
loadable individually when selected. This would speed up updates to the app and allow
for new models to be downloaded more efficiently. In museum contexts this could be
via a local Wi-Fi service to reduce the burden on mobile data transfer limits.
In physical exhibition spaces, QR codes could reveal the details and locations of
available AR ‘takeout’ artefacts, enable easy selection and download, and provide links
to more information and multimedia resources. To engage and incentivize visitors, the
app could support social media sharing of artefacts, and user achievements could be
recognised for artefact collection milestones.
Achieving the functionality of the AR app with a minimal and uncluttered interface
was one of the main challenges of the app development. Further work and co-design
with different users and stakeholder groups is needed to evolve the apps from technol-
ogy demonstrators to fully functioning apps. Ideally, the apps would be supported by
infrastructure and data repositories to enable the download of 3D artefacts at appropri-
ate scale and resolution, and in suitable formats. Ideally, the apps would also support
language options, for example, from Google Translate, so that exhibit information dis-
plays could be shown in the user's chosen language. A further enhancement to this
would include the option for text-to-speech delivery of the information.
Acknowledgements: The development of the app was supported by Keele University
Faculty of Science Awards. The authors thank Ash Leake, from the School of Compu-
ting and Mathematics at Keele University, for the 3D printing of artefacts. The authors
also wish to thank National Museums Liverpool for permissions and support in achiev-
ing the acquisitions and apps.
References
1. Ding, M.: Augmented reality in museums. Museums & augmented reality – A collection of essays from the arts management and technology laboratory, 1-15 (2017)
2. Morozova, A.: How to use augmented reality in museums: examples and use cases, last accessed 2020/09/13.
3. González Vargas, J.C., Fabregat, R., Carrillo-Ramos, A., Jové, T.: Survey: Using augmented reality to improve learning motivation in cultural heritage studies. Applied Sciences, 10(3), 897 (2020)
4. Challenor, J., Ma, M.: A review of augmented reality applications for history education and heritage visualisation. Multimodal Technologies and Interaction, 3(2), 39 (2019)
5. Fernández-Palacios, B.J., Rizzi, A., Nex, F.: Augmented reality for archaeological finds. In: Euro-Mediterranean Conference, pp. 181-190. Springer, Berlin, Heidelberg (2012)
6. Abate, A.F., Barra, S., Galeotafiore, G., Díaz, C., Aura, E., Sánchez, M., Mas, X., Vendrell, E.: An augmented reality mobile app for museums: virtual restoration of a plate of glass. In: Euro-Mediterranean Conference, pp. 539-547. Springer, Cham (2018)
7. Tzima, S., Styliaras, G., Bassounas, A.: Augmented reality applications in education: teachers’ point of view. Education Sciences, 9(2), 99 (2019)
8. Breuss-Schneeweis, P.: “The speaking celt”: augmented reality avatars guide through a mu-
seum - case study. In Proceedings of the ACM International Joint Conference on Pervasive
and Ubiquitous Computing: Adjunct, 1484-1491 (2016)
9. Alexander, J.: Gallery one at the Cleveland Museum of Art. Curator: The Museum Jour-
nal 57(3), 347-362 (2014)
10. Terracotta Warriors meet augmented reality at The Franklin Institute, last accessed 2020/09/13.
11. Serravalle, F., Ferraris, A., Vrontis, D., Thrassou, A., Christofi, M.: Augmented reality in the tourism industry: a multi-stakeholder analysis of museums. Tourism Management Perspectives, 32, 100549 (2019)
12. Damala, A., Stojanovic, N., Schuchert, T., Moragues, J., Cabrera, A., Gilleade, K.: Adaptive augmented reality for cultural heritage: ARtSENSE project. In: Euro-Mediterranean Conference, pp. 746-755. Springer, Berlin, Heidelberg (2012)
13. Pucihar, K.Č., Kljun, M.: ART for art: augmented reality taxonomy for art and cultural heritage. In: Augmented Reality Art, pp. 73-94. Springer, Cham (2018)
14. The augmented reality museum app homepage,
Museum.html, last accessed 2020/09/13.
15. The Virtual Cuneiform Tablet Reconstruction Project homepage, https://virtualcunei-, last accessed 2020/09/13.
16. Woolley, S.I., Ch’ng, E., Hernandez-Munoz, L., Gehlken, E., Collins, T., Nash, D., Lewis,
A., Hanes, L.: A collaborative artefact reconstruction environment. In: Proceedings of the
British HCI Conference. BCS. Sunderland, UK, 3rd-6th July, 2017.
17. Woolley, S.I., Gehlken, E., Ch’ng, E., Collins, T.: Virtual archaeology: how we achieved
the first long-distance reconstruction of a cultural artefact, The Conversation, UK (Art and
Culture), 28 Feb 2018.
the-first-long-distance-reconstruction-of-a-cultural-artefact-91725, last accessed
18. Gehlken, E., Collins, T., Woolley, S.I., Hanes, L., Lewis, A., Munoz L.H., Ch’ng, E.:
Searching the past in the future - joining cuneiform tablet fragments in virtual collections,
63rd Rencontre Assyriologique Internationale, 2017.
19. Collins, T., Woolley, S., Gehlken, E., Ch'ng, E.: Computational aspects of model acquisition
and join geometry for the virtual reconstruction of the Atrahasis cuneiform tablet. In 23rd
International Conference on Virtual System & Multimedia (VSMM), 1-6. (2017)
20. Steen, M., Manschot, M., De Koning, N.: Benefits of co-design in service design projects,
International Journal of Design, 5(2), (2011).
21. Ciolfi, L., Avram, G., Maye, L., Dulake, N., Marshall, M.T., van Dijk, D., McDermott, F.:
Articulating co-design in museums: Reflections on two participatory processes. In Proceed-
ings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social
Computing, 13-25 (2016).
22. Collins, T., Woolley, S.I., Gehlken, E., Ch’ng, E.: Automated low-cost photogrammetric acquisition of 3D models from small form-factor artefacts. Electronics, 8(12), 1441 (2019)
23. Cripps, E.L.: Sargonic and Presargonic Texts in the World Museum Liverpool. Archaeopress
24. Cripps, E.L.: Short history of the cuneiform collection in the World Museum Liverpool, last accessed 2020/09/13.
25. ARCore supported devices, last accessed 2020/09/13.
26. Apple Augmented Reality, last accessed
27. Apple: Xcode - Apple Developer, https://de-, last accessed 2020/09/14.
28. ARCore Overview,, last accessed 2020/09/13.
29. Processing for Android, last accessed
30. AR in Processing, last accessed 2020/09/13.
31. Dey, A., Billinghurst, M., Lindeman, R.W., Swan, J.: A systematic review of 10 years of
augmented reality usability studies: 2005 to 2014. Frontiers in Robotics and AI, 5, 37 (2018).
32. 3D interactive cuneiform tablet, last accessed 2020/09/13.
33. 3D interactive shabti model, last accessed 2020/09/13.
... Preferences for heritage experiences are increasingly more visually oriented, with virtual reality considered as an engaging medium for heritage learning (Ch'ng et al., 2020). However, stepping into connected contextual 'other world' experiences can be afforded by a variety of digital experiences. ...
... For example, a virtual 3D gallery tour and a virtual 3D environment experience may not connect with each other, nor with the physical experience. Figure 3 (middle) summarises the connected virtual experiences prototyped by the VCTR AR Museum apps (Woolley et al., 2020). As shown in the Android AR app screenshot in Figure 4, the app demonstrates the enablement of museum visitors (virtual and physical) to acquire, curate, exhibit, share and interact with eclectic 'takeaway' 3D artefact models. ...
Conference Paper
Full-text available
In this paper we reflect on the interplay and the disconnects between real and virtual heritage experiences, and the fragmented nature of digital experiences. We consider the important engagement potential that virtual interactions bring to small less visible artefacts, like clay cuneiform tablets, and, with case study examples, we imagine museums of the future where engagements unite, blend and reinforce rich heritage experiences.
... Additionally, prototyping an Augmented Reality (AR) app for Android smartphones can be a daunting undertaking when using the Android Studio IDE, but a very simple AR app can be achieved with just 20 lines of code using Processing for Android. For this reason, we used Processing to create the 'Augmented Reality Museum App' (Woolley et al. 2020) shown in Figure 3 that was developed as part of the Virtual Cuneiform Tablet Reconstruction Project (VCTR 2021). ...
Conference Paper
Full-text available
Art undergraduates increasingly engage in the design and coding of digital media and generative art, for example, using the open-source Processing library and Integrated Development Environment (IDE) created for visual artists. An equivalent cross-over, of art and creativity into traditional computer science programmes, is not so evident despite a shortfall of skilled graduates in the digital media sector. This paper considers art for computer scientists and outlines a Keele University Animation and Multimedia module that uses media programming not only as a vehicle for first-year undergraduates to practice new Java programming skills, but also as an opportunity for artistic exploration and creative expression.
... It involved mid-career and senior academics with a range of research experience from just out of early career stage to those who'd been involved in HCI research before direct manipulation interfaces and WYSIWYG (What You See Is What You Get). Committee members had excellent complementarity as can be seen from indicative publications: Raymond Bond (Torney et al., 2016, Bond et al., 2019, Alan Dix (1992, 2007), Tom Flint (2016, 2018), Lynne Hall (2016, 2020, Gavin Sim (2003, 2015, Sandra Woolley (2019Woolley ( , 2020. ...
Full-text available
Cultural Heritage (CH) refers to the representation of historical places and traditional customs of a specific city or country. Its principal aim is to transmit to future generations how their ancestors lived, and what their customs and buildings were like, etc. Nowadays, there are different technology systems and research investigations that are focused on CH education that use augmented reality (AR), virtual reality (VR), and mixed reality (MR). The aim of this document is to specifically identify if the use of AR improves students’ motivation to learn about topics related to CH. To this end, studies from different databases and specific journals, along with those concerning technology systems, were evaluated, and comparisons were made between them. Additionally, the aspects that should be considered in future research to improve student motivation and technology systems were identified.
Full-text available
The photogrammetric acquisition of 3D object models can be achieved by Structure from Motion (SfM) computation of photographs taken from multiple viewpoints. All-around 3D models of small artefacts with complex geometry can be difficult to acquire photogrammetrically and the precision of the acquired models can be diminished by the generic application of automated photogrammetric workflows. In this paper, we present two versions of a complete rotary photogrammetric system and an automated workflow for all-around, precise, reliable and low-cost acquisitions of large numbers of small artefacts, together with consideration of the visual quality of the model textures. The acquisition systems comprise a turntable and (i) a computer and digital camera or (ii) a smartphone designed to be ultra-low cost (less than $150). Experimental results are presented which demonstrate an acquisition precision of less than 40 µm using a 12.2 Megapixel digital camera and less than 80 µm using an 8 Megapixel smartphone. The novel contribution of this work centres on the design of an automated solution that achieves high-precision, photographically textured 3D acquisitions at a fraction of the cost of currently available systems. This could significantly benefit the digitisation efforts of collectors, curators and archaeologists as well as the wider population.
Full-text available
Augmented reality is a field with a versatile range of applications used in many fields including recreation and education. Continually developing technology spanning the last decade has drastically improved the viability for augmented reality projects now that most of the population possesses a mobile device capable of supporting the graphic rendering systems required for them. Education in particular has benefited from these technological advances as there are now many fields of research branching into how augmented reality can be used in schools. For the purposes of Holocaust education however, there has been remarkable little research into how Augmented Reality can be used to enhance its delivery or impact. The purpose of this study is to speculate regarding the following questions: How is augmented reality currently being used to enhance history education? Does the usage of augmented reality assist in developing long-term memories? Is augmented reality capable of conveying the emotional weight of historical events? Will augmented reality be appropriate for teaching a complex field such as the Holocaust? To address these, multiple studies have been analysed for their research methodologies and how their findings may assist with the development of Holocaust education.
A common conclusion of several studies is that augmented reality (AR) applications can enhance the learning process, learning motivation and effectiveness. Despite these positive results, more research is necessary. The current work aims to study the degree of diffusion of AR technology and teachers' opinions on the need for continuous training, the process of creating 3D models, and the feasibility of AR application development by teachers and students in school settings. Teachers are the common element in every educational system and play a key role in the integration and acceptance of technology in education. Qualitative research was conducted in February 2019 in rural and suburban areas of North-Western Greece with secondary education teachers of different specialties. The results showed that AR application development is feasible under certain conditions, with the constraints of the curriculum as the main negative factor, and the teacher's personality and the desire for co-operation among teachers of different specialties as positive factors.
Augmented Reality (AR) interfaces have been studied extensively over the last few decades, with a growing number of user-based experiments. In this paper, we systematically review 10 years of the most influential AR user studies, from 2005 to 2014. A total of 291 papers with 369 individual user studies have been reviewed and classified based on their application areas. The primary contribution of the review is to present the broad landscape of user-based AR research, and to provide a high-level view of how that landscape has changed. We summarize the high-level contributions from each category of papers, and present examples of the most influential user studies. We also identify areas where there have been few user studies, and opportunities for future research. Among other things, we find that there is a growing trend toward handheld AR user studies, and that most studies are conducted in laboratory settings and do not involve pilot testing. This research will be useful for AR researchers who want to follow best practices in designing their own AR user studies.
The epic of Atrahasis is one of the most significant pieces of ancient Babylonian literature. It describes a creation myth, a great flood and the building of an ark, and significantly pre-dates a similar account in the Bible. The epic has survived millennia on clay tablets inscribed with cuneiform script, but the third tablet of one of the most complete surviving copies is broken. A join between two of its fragments had been hypothesised for over 50 years, but never physically confirmed. Now, using 3D computational geometry, there is no longer a need to manoeuvre the physical fragments into the same room. Instead, we built 3D virtual models of the fragments and demonstrated that they join precisely. This is the first time that a long-distance virtual reconstruction of a cuneiform text has been achieved. (Virtual archaeology: how we achieved the first long-distance reconstruction of a cultural artefact, S. Woolley, E. Gehlken, E. Ch'ng and T. Collins, The Conversation, UK (Art and Culture), 28 Feb 2018.)
Existing augmented reality (AR) taxonomies act as generic classifiers of AR systems and do not provide any insights into technology adoption for a given use context. For this reason, this chapter proposes an alternative activity-based taxonomy method that is designed for technology adopters and aims to highlight how well the opportunities created by advances in technology are actually utilised in a specific context. The proposed method is evaluated on a case study of AR technology adoption in the context of art and cultural heritage. In this process, we built an AR taxonomy for art and cultural heritage, which was used to classify 86 AR applications in this domain. The results of the classification provided meaningful technology adoption insights, such as: (i) a general lack of support for the personalization and communication activities of visiting a museum, where applications fail to exploit AR's potential to provide context-aware bookmarking, artistic expression by the visitor (e.g. enabling visitors to curate augmentations for the exhibited artefacts), and sharing of the "I was here" visit experience; (ii) low-quality support for analytical activity, where applications fail to show interesting information such as information that is present but cannot be seen by the naked eye; and (iii) low-quality support for sensual activity, where the provided augmentations fall short of extending the art form or the artistic experience.
The epic of Atrahasis is one of the most significant pieces of ancient Mesopotamian literature. The account has survived millennia on sets of clay tablets inscribed with cuneiform script, a sophisticated early writing system comprising signs formed from wedge-shaped impressions. The third tablet belonging to one of the most complete copies of the Atrahasis epic is broken. For over fifty years, one fragment, held in Geneva, was believed to join with another held in London. However, due to their 1000 km separation, the join had never been physically tested. This paper contributes a technological account of the successful virtual joining of the fragments, the first ever long-distance virtual join of its type.
In places of tourist interest and attraction, such as museums, Augmented Reality (AR) is an emerging technology that enhances and leverages the visitor experience through additional digital content, creating opportunities for an array of immediate and peripheral stakeholders. However, to achieve this, both researchers and managers need to better understand how to effectively co-create value through the involvement of different stakeholders and their interconnected relationships. We therefore analysed three interrelated streams of literature (digital innovation, tourism management and stakeholder theory) and developed a conceptual paper that sheds light on AR in museums. An in-depth analysis of the topic allowed us to develop theoretical propositions and applications on the subject, in particular from a multi-stakeholder perspective. Finally, our research proposes a preliminary conceptual model that highlights the need to identify the roles and interactions of museum stakeholders in moving towards a more digitalised museum experience through AR.