Breaking New Ground: Innovation in Games, Play, Practice and Theory. Proceedings of DiGRA 2009
© 2009 Authors & Digital Games Research Association (DiGRA). Personal and educational classroom use of this paper is allowed,
commercial use requires specific permission from the author.
Emulation as a strategy for the preservation of games: the
KEEP project
Dan Pinchbeck, David Anderson, Janet Delve,
Getaneh Otemu, Antonio Ciuffreda
University of Portsmouth
School of Creative Technologies, Middle Street,
Portsmouth, PO1 2DJ
dan.pinchbeck@port.ac.uk
Andreas Lange
ComputerSpieleMuseum
Marchlewskistr. 27, D-10243 Berlin
lange@computerspielemuseum.de
ABSTRACT
Game preservation is a critical issue for game studies.
Access to historic materials forms a vital core to research
and this field is no different. However, there are serious
challenges for preservationists in developing a strategic
and inclusive programme to retain access to obsolete games.
Emulation, a strategy already applied by major developers
and the gaming community, is introduced, and the KEEP
project, designed to create an open emulation access
platform, is described.
Author Keywords
Games, preservation, emulation, archiving
WHY GAME PRESERVATION MATTERS TO GAME
STUDIES
The preservation of digital games is of vital importance to
game studies. As with any other field, the value of a record
of the medium's historical development, and of access to
specific artifacts within it, should not be underestimated.
This extends beyond classic or important titles to the vast
numbers of less well known or critically lauded games
released over the last thirty years.
Preservation of games tends to be piecemeal. National
libraries and archives do not currently have systematic
strategies for collection and certainly not for preserving
access and runtime functionality. Private collectors and
fansites have played a major role in providing access but
once again, these are not systematic and access remains a
problem. This paper splits the issue into two major
components: the necessity for creating archives of digital
games; and the barriers to successfully preserving access to
these.
Software development is rarely carried out from the ground
up, that is, without the re-use and adaptation of existing
technologies and techniques, and this is particularly evident
in the games industry. Dynasties of build engines,
middleware and plug-ins form an essential map of the
history of the medium. As an example of this, consider the
Source engine, used for titles such as Half Life 2 [24].
Source arose from the GoldSrc Engine, itself derived from
the original Quake engine [10]. It is thus linked into an
engine dynasty with the entire Quake series, dating back to
1996. Quake 4 [20], however, was built using the idTech 4
engine, part of another dynasty which originates in the
original Doom engine [11]. Adoption of both engines has
been limited in comparison to a third dynasty, the Unreal
engine, whose most recent ‘children’ include Bioshock [1],
Unreal Tournament 3 [5], Gears of War [6] and Turok [19].
What makes one engine more frequently adopted than
another should be an interesting question for scholars
interested in the development of games from a variety of
perspectives, from development processes, economics and
licensing to design, functionality and specific approaches
to graphics, audio, multiplayer options and artificial
intelligence.
To extend this example, now consider an approach a
scholar may take to addressing this question. On one hand,
a film researcher interested in the influence of A Trip to the
Moon [17] could rely on secondary data: interviews, pieces
written by other scholars or journalists or critics, examples
of derivative material. But it is a highly questionable
approach to not actually engage with the primary data itself:
to not examine A Trip to the Moon as a piece of media.
Likewise, an understanding of the Unreal engine is always
going to be limited if the actual objects built using the
technology are not examined. Recent publications have
argued that the playing of games constitutes an essential
aspect of game studies [2], and beyond this there is the
under-represented but equally vital process of reverse
engineering build tools and component parts to understand
how they function. A robust understanding of why and how
the Unreal 1.5 engine was selected, adapted and applied,
and what impact this process had upon the final object that
is Deus Ex [12] becomes extremely difficult without access
to this primary data. On one level, verifying secondary data
is critical to avoid error, on the other, we cannot assume a
comprehensive exposition of the object will be available –
indeed, this is a completely unrealistic assumption to make.
The requirement to preserve access to both games
themselves and their build tools (and raw data) extends far
beyond the technology driven research suggested by this. It
has been argued, for example, that any cultural
interpretation of game elements should rest within the ludic
and technological constraints and circumstances that
underpin the objects themselves – that there is room for
structuralism as well as culturalism in game studies,
regardless of the dubious outcomes of the so-called
ludology/narratology debates. For example, the
visualization of agents in games is impacted by the
difficulties in real-time graphical representation. Hard
armor is less processor intensive than soft, flowing cloth; a
leather catsuit is much easier to deal with than Jedi robes.
Games are, after all, products of economy, whether the
constraints are monetary or technological. When asking
why games look as they do, then, understanding what they
are capable of, in system terms, is at least as important as
any cultural reading.
Running alongside this is the content-driven evolution of a
medium, including archetypal mechanisms of play as well
as symbolic and semantic representations of game elements.
Searching for an explanation of why Birth of a Nation was
shot in black and white and not engaging with the brute fact
that only black and white film was available is self-
evidently ridiculous; a similar process of understanding
historical constraints upon design is critical for game
studies. Content and construction are fundamentally
interwoven in this medium and in order to properly
understand historical objects, we must preserve access to
both the primary data and build tools. This is therefore in
addition to the power of using historical games as tools in
game education. Assigning students the task of developing
according to the constraints of an older, simpler system and
giving them access to the objects developed in this period to
understand how solutions were reached has clear pedagogic
power. This should be especially relevant now, with the rise
of mobile gaming suddenly undercutting the graphics arms-
race and returning us to games that have more in common
with Manic Miner [4] than Assassin’s Creed [23]. Likewise,
in a medium where the shoulders of giants form the basic
building blocks of development, understanding how
evolution has occurred by assessing historical artifacts
allows us to not only trace clear problem/solution pathways,
but to question assumed methodologies by identifying their
origins and processes of adoption. In short, to move
forwards with any aspect of game studies without keeping
the preservation of, and access to, historical artifacts is
near-sighted, self-defeating and, considered against the core
activities of other fields of media, art and cultural studies,
palpably absurd.
SPECIFIC ISSUES IN GAME PRESERVATION
Having said that, games are particularly difficult to preserve
and it is perhaps no surprise that so little has been achieved
when the full complexities of what it means to archive a
game are considered, let alone the technical problems with
retaining runtime functionality. In this section, we will offer
an illustration of just how complex capturing a full game
actually is, and offer some indication of the quantity of
supplementary data that may be of interest surrounding an
archived game, before moving on to how the KEEP project,
in particular, is tackling the second problem.
Assuming for a moment, that the core technical issue of
platform obsolescence can be bypassed, what exactly does
it mean to archive a game such as S.T.A.L.K.E.R.: Shadow
of Chernobyl [7]? The game was notoriously buggy when
released and six separate patches were subsequently
released to fix most major issues and add additional
functionality that was omitted in the initial release. On top
of this, the game could be purchased in disc format, or
digitally downloaded, each of which required separate
patches. Localisation meant translation of the substantial
text-based dialogue trees. A Collector’s Edition box
contained not only the standard game manual but additional
print such as a Zone Map and Survival Guide, and a DVD
containing supplementary images, text and video files. The
multiplayer option meant the establishment of servers, both
official and unofficial, which contain data about the history
of the online aspect of the game, not to mention potential
information about how these online games have been
played and whether they fit any generalisable pattern of,
say, deathmatch behavior (itself an understudied and
important question in game studies). Alongside all of this
official data, S.T.A.L.K.E.R. has been adapted and altered
by the modding community, adding new assets, tweaking
and fixing code and, in the case of Kanyhalos’ Oblivion
Lost [13], subject to major revision. If all this wasn’t
enough, the community of gamers has also added reviews,
discussions, walkthroughs, forum arguments, cheats and
hacks, not to mention that the proprietary XRay engine
developed for the game has evolved along with the sequels,
Clear Sky [8] and the forthcoming Call of Pripyat [9]. This
is not only a vast body of data surrounding a single object
(which itself requires 10 GB of hard disc space to store),
but it raises profound questions of what should be
prioritized in terms of preservation. For games ported to
several platforms, the problem multiplies with each variation
on the game. Is it necessary, for example, to preserve all
versions of the release – digital and DVD-ROM based – in
their original form, or only the final, patched, version? If the patches
are deemed important, as they presumably should be for
any scholar interested in the shift in development practice
towards releasing clearly unfinished games, then how are
these to be stored and what relationship should they have to
the artifact itself? Archivists are faced with a stark choice:
collect and archive everything, including multiple versions
of the same object; or make decisions about excluding
material strategically to make the process more feasible,
and risk consigning what may one day be important data to
the rubbish bin. Finally, there is the question of the XRay
engine itself. Unlike many FPS games, which fall into
dynasties of engines, GSC Gameworld created a proprietary
engine for the game, including features that do not exist in
these others, such as the dynamic A-Life engine. The issues
with intellectual property will be covered in a later section,
but for the moment, it is worth noting that alongside final
products and supplementary data, the tools and build data
for games are of equal value to future researchers.
Understanding, for example, how the X-Ray engine
functions; issues and advantages in developing using it
compared to other build engines; its use of middleware and
plug-ins; its handling of AI and rendering; the relationship
between scripted sequences, sandbox design and diegetic
and gameplay construction, all have a profound value to
scholars as well as future educators and developers. Put
another way, being able to access pre-compiled game data
enables a far deeper, greater understanding of the game as a
media artifact than simply playing or studying the final
build. Thus, game preservation should not only aim to
capture games and their surrounding data, but, wherever
possible, the tools and assets used to create them.
Technically, the problems do not get any easier.
S.T.A.L.K.E.R., like any other game, is reliant upon an
operating system with the correct system specifications to
run. The issue here is self-evident: operating systems, along
with hardware components, are superseded and become
obsolete. This is as true for consoles as home computers.
Obsolescence is a major issue in the preservation of any
digital artifact. Historically, the primary solution has been
to migrate such artifacts to current platforms to enable
continuing access to them. Migration effectively means
altering the code of an object to enable it to be rendered on
a non-obsolete platform [25]. However, migration
inevitably accelerates the process of bit-rot, or data
degradation, meaning that the life-span of migrated objects
is generally reduced. Further, migration is highly
inefficient, as every time it is required, every object must be
migrated individually [21, 22]. Equally, the process of
migrating an artifact such as a game is substantially more
complicated than a simpler object such as an electronic
document, or even audiovisual data. Put simply, migration
is of limited value in digital preservation generally, and of
extremely limited value in the preservation of games.
EMULATION AND GAMES
The alternative to migration is emulation, meaning that the
environment used to run the game in its original format is
recreated virtually on a contemporary platform. The focus
therefore is on the platform or enabling technologies
required to run the object, not on the object itself. The
object is left untouched, which has the advantage of not
contributing to data degradation. It also means that dealing
with obsolescence becomes an issue of creating new
emulators for platforms as they become obsolete, so large
numbers of objects can be served by more generic
emulators, streamlining the process of preserving access.
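The principle can be illustrated with a minimal sketch: the obsolete platform's instruction set is interpreted in software on the current host, while the archived binary is never modified. The toy machine below is entirely hypothetical, invented here for illustration; it models no real console and is not the KEEP platform itself.

```python
# A deliberately tiny, hypothetical machine: an illustrative sketch of the
# fetch-decode-execute loop at the heart of any emulator. The emulated
# registers and program counter are plain host data; the program is the
# preserved artifact and is only ever read, never rewritten.

def emulate(program, steps=100):
    """Interpret a toy instruction set on a virtual register machine."""
    regs = {"A": 0, "B": 0}   # emulated registers
    pc = 0                    # emulated program counter
    while pc < len(program) and steps > 0:
        op, *args = program[pc]          # fetch and decode one instruction
        if op == "LOAD":                 # LOAD reg, value
            regs[args[0]] = args[1]
        elif op == "ADD":                # ADD dst, src: dst += src
            regs[args[0]] += regs[args[1]]
        elif op == "JNZ":                # jump to address if register non-zero
            if regs[args[0]] != 0:
                pc = args[1]
                steps -= 1
                continue
        pc += 1
        steps -= 1
    return regs

# The same instruction list can be re-interpreted, unchanged, on any future
# host capable of running this loop.
demo = [("LOAD", "A", 2), ("LOAD", "B", 3), ("ADD", "A", "B")]
```

When the host platform is in turn superseded, only this loop needs porting; the archived programs themselves remain untouched, which is precisely the streamlining advantage described above.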
A number of issues have been put forward as arguments
against emulation as a core strategy for preservation. For
example, Phelps & Watry [18] have contended that a major
block is that emulation prevents searching within
documents. Whilst this is a concern, it demonstrates the
slant in digital preservation towards textual material and has
far less problematic implications for games. The complexity
of emulation systems (particularly for users, where lack of
technical knowledge may be prohibitive) is a bigger
problem. Add to this the fact that the overwhelming majority
of current emulators rely upon specific platforms which are
equally vulnerable to obsolescence, and the potential issues
with stacking emulators to reach an object. In other words,
emulating an environment from which to emulate another
environment within which to run an object is theoretically
possible but remains largely untested. Finally, as Bearman
[3] notes, many of the target environments for emulation
are locked within copyrights even after obsolescence. This
is particularly true for games console emulators. It is clear
that emulation is certainly not a ‘magic bullet’ solution, as
Bearman caricatures Rothenberg’s approach. However, the
notion of migrating games, even without considering the
general problems with migration, is clearly not feasible.
Even the most cursory conversation with developers about
the technicalities of porting games would make that
explicit.
It is not surprising, therefore, that emulation has been
adopted by the games preservation community more widely
than for other digital objects, with a large number of
solutions already existing in the public domain. The
MAME architecture, which enables the emulation of large
numbers of arcade games, is well established and well-
known [16]. There are a substantial number of console
emulators also in existence, although the majority of these
rely on hacked BIOS to function and therefore infringe
copyright law. Emulation is also used actively by legitimate
platform developers: Sony’s PlayStation 3 contains a PSX
emulator, as does the PSP, and the Nintendo Wii Store
offers access to a wide selection of previous console titles
via emulation and the purchase of a bespoke hardware
controller. Ensuring backwards compatibility through
emulation makes sound financial sense as it extends the
shelf-life of intellectual property and it seems likely that
this policy will continue.
The biggest problem, however, with all current emulators is
their own obsolescence, as each is built for a specific
platform and thus vulnerable to this being superseded. It is
this issue that the KEEP project aims to tackle directly.
THE KEEP ARCHITECTURE
In January 2009, the KEEP (Keep Emulation Environments
Portable) project was launched. Funded through the
European Commission’s Framework 7 program, KEEP is
being developed by an international consortium: the
national libraries of France, Germany and the Netherlands;
Tessella (UK/NL) and Joguin SAS (FR), software
developers specializing in preservation; project consultants
CrossCzech (CZ); the ComputerspieleMuseum (DE); the
University of Portsmouth (UK) and the European Game
Developers Federation. The first phase of the project,
lasting until 2012, aims to develop a prototype of an
emulation access platform to enhance the preservation of
digital objects, with a particular focus on digital games.
Unlike current emulator systems, KEEP is not built upon a
specific platform technology, but a virtual machine. This
follows the conceptualization of such a system by
Rothenberg [22] and Lorie [14]. The OLONYS system,
developed by Joguin SAS, is a series of virtual machines
stacked in order of complexity that will interface with, and
support a modular emulation framework [15]. Thus, at root,
KEEP is far more future proof than current solutions. It also
has the advantage of offering multiple emulation solutions
to any given artifact within a single, user-friendly interface,
allowing both bespoke manual configuration of the
emulation process and a more automated and simple means
of accessing artifacts for users not requiring this. On one
hand then, KEEP benefits from less reliance upon any given
platform and a modular architecture that enables
independently developed emulators to function within its
framework (meaning that existing components can be
integrated). On the other, it enables archivists and
researchers to bypass the traditionally complicated process
of installing and running emulation software.
Alongside the emulation access platform, KEEP will also
develop a transfer tools framework to enable new objects to
be integrated with the system. The existing archives of the
Bibliothèque nationale de France, Koninklijke Bibliotheek,
Deutsche Nationalbibliothek and Computerspiele Museum
will form the initial core of the available KEEP archive, but
this framework will establish a means for new objects to be
added. Further, research is currently being carried out to
supplement current metadata standards for archiving with
emulation metadata to ensure high compatibility with
existing international archives. Part of this process is
evaluating and enhancing existing metadata systems to
ensure maximum compatibility with games.
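The kind of information such emulation metadata must capture can be sketched as a technical-environment record: which hardware platform an object requires, which operating system (if any), and which supporting dependencies. The field names and example values below are hypothetical illustrations; the actual KEEP metadata model is still under development and will differ.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical sketch of a technical-environment record for emulation
# metadata. This is NOT the KEEP data model; it only illustrates the
# category of information an emulation framework needs to select and
# configure an appropriate emulator for an archived object.

@dataclass
class TechnicalEnvironment:
    object_id: str                         # identifier of the archived object
    hardware_platform: str                 # e.g. "x86", "ZX Spectrum"
    operating_system: Optional[str] = None # None for bare-metal console titles
    dependencies: List[str] = field(default_factory=list)  # runtimes, middleware

# An invented example record for a PC title of the period:
env = TechnicalEnvironment(
    object_id="stalker-shoc-dvd-1.0",
    hardware_platform="x86",
    operating_system="Windows XP",
    dependencies=["DirectX 9.0c"],
)
```

Extending existing archival metadata standards with fields of this kind is what allows an emulation framework to act on a record automatically rather than requiring manual configuration for every object.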
KEEP’s focus is on the retention of existing emulation work
and on enabling archives and users to transfer objects to the
KEEP architecture as seamlessly as possible. Whilst
conceptually and technologically advanced, it is, at root, a
deeply pragmatic solution to an extremely difficult
problem.
If successful, the impact of KEEP upon game studies will
be profound. Although initially limited to localized access
in three European countries, a second phase of the project
(once proof of concept is established) may be to roll the
architecture out to other archives internationally, and to
explore potential public release of the system so individual
users can transfer their obsolete media into KEEP and
retain access. This is one means of circumventing the
normal copyright problems, as KEEP therefore functions as
an enabling technology for ‘home archiving’ to supplement
access to archived artifacts held in national storage.
In terms of both research and education, a modular and
open emulation platform goes some way to addressing the
difficulties of archiving the large bodies of material
surrounding most games. For example, the Dioscuri
emulator, which emulates x86 hardware and can run
Windows 98 [26], has the potential to enable game patches
or build engines to be run within this native environment. In
other words, emulating hardware to provide native access to
the functionalities of obsolete operating systems provides
access to many of the other tools and data surrounding a
game, rather than simply providing access to the final
object itself. Equally, a modular architecture means that a
variety of emulators, each with particular strengths and
functionalities may be selected to access a particular object.
So a user seeking to simply emulate S.T.A.L.K.E.R. to
access the game may opt for an emulator that sacrifices
additional O/S functionality for increased performance,
whereas another looking specifically at the codebase behind
the game may choose to dispose of advanced graphics
emulation in favour of alternate functionality.
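That selection step can be sketched as a simple registry lookup: each emulator advertises the platform it handles and a profile (performance versus fidelity), and the framework matches these against the object's requirements and the user's goal. The emulator names, profiles, and function below are invented for illustration and are not KEEP's actual API.

```python
# Hypothetical sketch of modular emulator selection. In a modular
# architecture, independently developed emulators register what they
# handle; the framework then picks one per request rather than binding
# each archived object to a single fixed emulator.

EMULATORS = [
    {"name": "fast-x86",     "platform": "x86",         "profile": "performance"},
    {"name": "cycle-x86",    "platform": "x86",         "profile": "fidelity"},
    {"name": "spectrum-emu", "platform": "ZX Spectrum", "profile": "fidelity"},
]

def select_emulator(platform, profile="performance"):
    """Return the first registered emulator matching platform and profile,
    falling back to any emulator for that platform."""
    matches = [e for e in EMULATORS if e["platform"] == platform]
    for e in matches:
        if e["profile"] == profile:
            return e
    return matches[0] if matches else None
```

So the player described above would receive the performance-oriented emulator by default, while the researcher could request the fidelity profile instead, with both choices made inside the same framework.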
There is a caveat in all of this, of course. The emulators
themselves need to be written, and the KEEP consortium is
working closely with the existing emulator community to
try and maintain a high level of adaptability with the
emulation access platform and current emulators. The
metadata extensions require a careful balancing act between
what is both pragmatic and compatible for archives to
implement, and the high-level information required for the
modular emulation architecture to function intelligently.
There are outstanding issues with copyright protection in
regard to games that a legal study is exploring. Nor does
KEEP directly solve the issue of the large potential body of
supplementary information surrounding each game.
However, it is the first systematic, large-scale attempt to
solve the technical problems with access retention for this
medium and, as such, presents a major potential benefit to
the games research and education community.
CONCLUSION
Game studies requires systematic archiving of historical
titles. Otherwise it runs a serious risk of data loss. Personal
collections, fan archives such as Abandonia or Home of the
Underdogs and the rerelease of old IP through digital
distribution such as Playstation Store cannot and should not
be counted on to ensure access is protected for obsolete
titles. The preservation of games is a very difficult issue,
partially due to the large bodies of secondary artifacts
surrounding each release; partially due to bug fixes and
patches; and partially due to the technological challenges of
ensuring run-time functionality. Emulation is the only real
solution to this challenge, but an open access architecture
based upon a virtual machine is the only means of future
proofing these emulators from the same cycle of
obsolescence as faces the original media. Hardware
emulation and a modular framework not only enable
original titles to be run, but offers access to codebases,
build engines, middleware and game assets, all of which, it
has been argued, are of potential equal value to future
scholars.
The KEEP project, like emulation in general, is not a
‘magic bullet’ solution. It cannot ease the secondary artifact
burden, nor can it resolve the problems with the
continuation of copyright post-obsolescence. What it does
offer, however, is the best current solution to retaining
access to obsolete games into the future. For this reason, it
should be of interest to any researchers interested in the
past, or the future, of our medium and field.
REFERENCES
1. 2K Boston + 2K Australia. Bioshock. 2K Games: PC.
2007
2. Aarseth, E. Playing Research: Methodological
approaches to game analysis, Melbourne DAC - the 5th
International Digital Arts and Culture Conference (2003).
3. Bearman, D. Reality and Chimeras in the Preservation of
Electronic Records. In D-Lib Magazine vol 5 no 4, (April
1999)
4. Bug-Byte. Manic Miner. Bug-Byte: ZX Spectrum, 1983
5. Epic Games. Unreal Tournament 3. Midway: PC, 2007
6. Epic Games. Gears of War. Microsoft Game Studios:
Xbox360, 2006
7. GSC Game World. S.T.A.L.K.E.R.: Shadow of
Chernobyl. THQ/GSC: PC, 2007
8. GSC Game World. S.T.A.L.K.E.R.: Clear Sky. Deep
Silver: PC, 2008
9. GSC Game World. S.T.A.L.K.E.R.: Call of Pripyat. PC,
Q4 2009
10. id Software. Quake. id Software: PC, 1996
11. id Software. Doom 3. Activision: PC, 2004
12. Ion Storm. Deus Ex. Eidos Interactive: PC, 2000
13. Kanyhalos. Oblivion Lost. Uncommercial modification
for S.T.A.L.K.E.R.: Shadow of Chernobyl. PC, 2007.
Available from www.oblivion-lost.net
14. Lorie, R.A. Long-term archiving of digital information.
IBM Research report, IBM Almaden Research Center, San
Jose, Almaden, 2000
15. Joguin, V. Emulating emulators for long-term digital
objects preservation: the need for a universal machine.
Emulation Expert Meeting 2006, (The Hague, The
Netherlands).
16. MAME: Multiple Arcade Machine Emulator (1997-
2009). Information available at http://mamedev.org.
Retrieved 21/7/2009.
17. Méliès, G. A Trip to the Moon. 1902
18. Phelps, T.A., & Watry, P. A no-compromises
architecture for digital document preservation. ECDL,
2005: 266-277.
19. Propaganda Games. Turok. Touchstone: PC, 2008
20. Raven Software. Quake 4. Activision: PC, 2005
21. Rothenberg, J. Avoiding Technological Quicksand:
Finding a Viable Technical Foundation for Digital
Preservation. A Report to the Council on Library and
Information Resources. Washington: Council on Library
and Information Resources, 1999
22. Rothenberg, J. Using Emulation to Preserve Digital
Documents. The Netherlands: Koninklijke Bibliotheek,
2000
23. Ubisoft Montreal. Assassin’s Creed. Ubisoft: PC, 2008
24. Valve. Half Life 2. VU Games: PC, 2004
25. van der Hoeven, J., van Wijngaarden, H. Modular
emulation as a long-term preservation strategy for digital
objects. In: Proceedings of the 5th International Web
Archiving Workshop (IWAW05), held in conjunction with
the 8th European Conference on Research and Advanced
Technologies for Digital Libraries (Vienna, Austria,
September 2005)
26. van der Hoeven, J.; Lohman, B.; and Verdegem, R.
Emulation for digital preservation in practice: The
results. International Journal of Digital Curation vol. 2, no.
2 (2007), pp. 123–132.
... Migrating the quantity of code required to ensure runtime viability of a modern game is simply impractical (Pinchbeck et al., 2009). Dondorp and van der Meer (2003) conclude that of Quake (id Software 1996) that " Rebuilding such a game is a gruesome operation that might easily compare to the complexity of emulation of the computing platform. ...
... Likewise, Media Molecule's Little Big Planet (2008) is less a game than an engine for the construction and sharing of user-generated content. More discussion of associated objects can be found in Pinchbeck et al. (2009) and Lowood et al. (2009). Barwick (2009), Lowood et al. (2009), Gieske (2002), and others have begun the process of understanding why games have been ignored by preservationists for so long, but at least the need is now generally recognized. ...
... They will be dealt with here in reverse order. Migrating the quantity of code required to ensure runtime viability of a modern game is simply impractical (Pinchbeck et al., 2009). Dondorp and van der Meer (2003) conclude that of Quake (id Software 1996) that " Rebuilding such a game is a gruesome operation that might easily compare to the complexity of emulation of the computing platform. ...
Article
Full-text available
The KEEP project is the first of its kind to seriously research the kind of emulation based on a virtual machine as put forward by Lorie (2002). In addition to creating such a virtual machine, a number of other supporting tools and techniques are also being developed as part of this EC FP7 project. One of these is an emulation metadata data model with a dual purpose: first, for use as the basis of a database that forms part of the Emulation Framework that will run on the KEEP Virtual Machine (KVM); and, second, for use as the core of an emulation metadata standard, envisaged to be taken up by the wider community. This paper is thus very much geared toward a practical discussion of emulation. However, before the digital preservation community will consider emulation as a viable option compared to migration; it is imperative that the polarized positions exemplified by Rothenberg and Bearman are carefully analyzed, deconstructed, and, where necessary, set aside. In this way, some options that have previously been dismissed out of hand can be allowed to resurface, and their relative merits be reconsidered. The second part of the article comprises a detailed investigation of the technical environment necessary to emulate a given digital object. The technical environment data, thus obtained, is then used to create the core of the emulation metadata model. The article concludes with a consideration of video games' metadata, as games represent the most complex digital objects planned to be emulated as part of the KEEP project.
... Over the past few years, the subject of digital game preservation has moved up the research agenda with the 'Preserving Virtual Worlds' project (see McDonough et al., 2010), the Independent Game Developers Association Game Preservation Special Interest Group's white paper (Lowood, 2009), and the European KEEP 1 project (see Pinchbeck et al., 2009) among a growing number of projects turning their attentions to matters of capturing the complexities of gaming environments, arresting media decay and "bit rot," and emulating obsolete gaming platforms. The UK's National Videogame Archive (NVA) is one such project and its work is the focus of much of this article. ...
... (Monnens, 2009b) Among game studies scholars, the idea that digital games are vulnerable and impermanent is one that has only comparatively recently begun to gain ground. Indeed, 2009's Digital Games Research Association annual conference was the first to include a panel of papers dedicated to matters of game preservation, (see Barwick et al., 2009;Lowood et al., 2009;Pinchbeck et al., 2009;Newman & Woolley, 2009). ...
Article
Full-text available
The subject of digital game preservation is one that has moved up the research agenda in recent years with a number of international projects, such as KEEP and Preserving Virtual Worlds, highlighting and seeking to address the impact of media decay and hardware and software obsolescence through different strategies, including code emulation. Similarly, and reflecting a popular interest in the histories of digital games, exhibitions such as Game On (Barbican, UK) and GameCity (Nottingham, UK) experiment with ways of presenting games to a general audience. This article focuses on the UK's National Videogame Archive (NVA) which, since its foundation in 2008, has developed approaches that both dovetail with and critique existing strategies for game preservation, exhibition and display. The article begins by noting the NVA's interest in preserving not only the code or text of the game, but also the experience of using it – that is, the preservation of gameplay as well as games. This approach is born of a conceptualisation of digital games as what Moulthrop (2004) has called "configurative performances" that are made through the interaction of code, systems, rules and, essentially, the actions of players at play. The analysis develops by problematising technical solutions to game preservation by exploring the way seemingly minute differences in code execution greatly impact on this user experience. Given these issues, the article demonstrates how the NVA returns to first principles and questions the taken-for-granted assumption that the playable game is the most effective tool for interpretation. It also encourages a consideration of the uses of non-interactive audiovisual and (para)textual materials in game preservation activity. In particular, the focus falls upon player-produced walkthrough texts, which are presented as archetypical archival documents of gameplay. The article concludes by provocatively positing that these non-playable, non-interactive texts might be more useful to future game scholars than the playable game itself.
... In the "Frequently Asked Questions" (FAQ) section of the MAME website, the developers state that "emulating another platform is, in itself, perfectly legal". [...] Studies analyse this legal grey area (Farrand, 2012; Conley et al., 2004; Pinchbeck et al., 2009), but despite this, more attention must be paid to the rhetorical and ideological consequences of the historical and archival practices prompted by projects like MAME. ...
Article
Full-text available
The paper is devoted to the analysis of the emulator called “Multiple Arcade Machine Emulator” (MAME), a piece of software developed with the intent of recreating the inner functioning of arcade video games. While MAME allows players to re-play extinct games, its rhetoric and positioning towards the history of the medium should be discussed. On the one hand, MAME’s claim to authenticity significantly downplays the role of materiality in the experience of playing arcade video games. Moreover, MAME’s focus on exact reproduction triggers a number of questions regarding the value and extent of archival, archeological and preservative practices within the context of arcade video games. The paper will eventually confront MAME’s implicit rhetoric of “video game code as an entity that speaks truth” (Murphy, 2013), with a more nuanced reading of the experience of playing a video game, encompassing materiality, hardware and the social context.
... This shift in understanding also enables preservation institutions and research organizations to begin to address the practical preservation requirements that need to be fulfilled in order to preserve such complex objects for future generations. The experience of maintaining access to old software like computer games using emulation [5,12] helps to understand the envisioned process. ...
Article
Full-text available
Digital objects are often more complex than their common perception as individual files or small sets of files. Traditional methods of preserving these complex objects, such as migration, may not be suitable for maintaining access to them in an economically and technically feasible way. Many of today's preservation scenarios would benefit from a change in our understanding of digital objects. Instead of focusing on single digital files or small groups of files as they are commonly conceived of, computer systems in full should be considered. The preservation community could benefit from widening its collecting scope to include complex objects such as scientific desktops, databases, machines running networked business processes, or the computers of famous people such as authors or politicians. Such objects are not just interesting in their own right but also have the potential to provide a more immersive and contextually rich experience than simpler digital objects. In this paper we describe a workflow to be used for replicating installed application environments onto emulated or virtualized hardware, we discuss the potential for automating steps in the workflow, and we conclude by addressing some of the possible issues with this approach. We focus on the x86 architecture but we also discuss considerations for generalising the workflow to work with a wider range of architectures such as Macintoshes or Android smartphones.
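The replicate-then-emulate workflow summarised in this abstract can be sketched as a small pipeline: capture the source disk as an archival image, record provenance metadata, then assemble an emulator invocation for the captured environment. This is a minimal illustration only, assuming QEMU's standard `qemu-img` and `qemu-system-*` tools; the paths, metadata fields and helper names are invented and do not reflect the project's actual tooling.

```python
# Illustrative sketch of a replicate-then-emulate workflow (not the paper's
# actual tooling): image a source disk, record provenance metadata, and
# assemble a QEMU command line for the emulated x86 environment.
from dataclasses import dataclass, field


@dataclass
class EnvironmentImage:
    source_device: str          # e.g. /dev/sdb on the acquisition machine
    image_path: str             # archival disk image (qcow2 assumed here)
    arch: str = "x86_64"
    ram_mb: int = 512
    metadata: dict = field(default_factory=dict)


def imaging_command(env: EnvironmentImage) -> list[str]:
    """Step 1: capture the installed environment as a disk image."""
    return ["qemu-img", "convert", "-O", "qcow2",
            env.source_device, env.image_path]


def emulation_command(env: EnvironmentImage) -> list[str]:
    """Step 2: boot the captured image on emulated hardware."""
    return [f"qemu-system-{env.arch}",
            "-m", str(env.ram_mb),
            "-drive", f"file={env.image_path},format=qcow2"]


env = EnvironmentImage(source_device="/dev/sdb",
                       image_path="author_desktop.qcow2",
                       metadata={"acquired": "2013-01-15"})
print(" ".join(imaging_command(env)))
print(" ".join(emulation_command(env)))
```

Separating image capture from emulator launch mirrors the paper's point that the archived environment, rather than the original hardware, becomes the preserved object.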
... It should be noted that Quake represents a relatively straightforward game with no network dependencies and straightforward I/O devices. There are various European-based digital preservation projects attempting to build viable models for emulation of complex digital interactive systems [27], but these projects are still in their early stages and hence difficult to evaluate. More generally, emulation is problematic in that its products are imprecise in ways that are difficult to measure (cf. [13]). ...
Conference Paper
Full-text available
Videogames and other new media artifacts constitute an important part of our cultural and economic landscape, and collecting institutions have a responsibility to collect and preserve these materials for future access. Unfortunately, these kinds of materials present unique challenges for collecting institutions, including problems of collection development, technological preservation, and access. This paper presents findings from a grant-funded project focused on examining documentation of the creative process in game development. Data includes twelve qualitative interviews conducted with individuals involved in the game development process, spanning a number of different roles and institution types. The most pressing findings are related to the nature of documentation in the videogame industry: project interviews indicate that the game development process does produce significant and important documentation as traditionally conceived by collecting institutions, ranging from game design documents to email correspondence and business reports. However, while it does exist, traditional documentation does not adequately, or even at times truthfully, represent the project or the game creation process as a whole. In order to represent the development process adequately, collecting institutions also need to seek out and procure numerous versions of games and game assets, as well as the game assets that are natural byproducts of the design process: gamma and beta versions of the game, for example, vertical slices, or different renderings of graphical elements.
Article
Full-text available
Video games have the particularity of involving not only creators but also, and especially, players, sometimes grouped into rigorously organized communities on the Internet. The remarkable devotion of these communities to their objects makes them privileged witnesses and core historiographical actors when it comes to documenting, preserving and writing the history of the medium. In that regard, player communities and their collective archives play a crucial role as curators and mediators of video game heritage. However, this role still needs to be better understood. What historical information is collected by video game fans and maintained in their archives? How do players' archival practices allow us to refine the historiography of video games? In order to explore and take up the challenges posed by a history of video games written from the players' perspective, this article consolidates a micro-historical method called a "history of gameplay".
Article
Full-text available
Recent attention to the question of preservation and exhibition of video games in cultural institutions such as museums indicates that this media form is moving from being seen as a contentious consumer object to cultural heritage. This empirical study examines two recent museum exhibitions of digital games: GameOn 2.0 at the National Museum of Science and Technology in Stockholm (TM), and Women in Game Development at the Museum of Art and Digital Entertainment, Oakland (MADE). The aim is to explore how games are appropriated within such institutions, and thereby how they are configured as cultural heritage and exhibitable culture. The study uses actor-network theory in order to analyse heterogeneous actors working in conjunction in such processes, specifically focusing on the translation of games and game culture as they are repositioned within museums. The study explores how games are selectively recruited at both institutions and thereby translated in order to fit exhibition networks, in both cases leading to a glossing over of contentious issues in games and game culture. In turn, this has led to a more palatable but less nuanced transformation of video games into cultural heritage. While translating video games into cultural heritage, the process of making games exhibitable lost track of games as culture by focusing on physical artefacts and interactive, playable fun. It also lost track of them as situated in our culture by skimming over or ignoring the current contentious nature of digital games, and finally, it lost track of games as being produced and experienced in a particular context, or games of culture.
Conference Paper
Until now, emulation of legacy architectures has mostly been seen as a tool for hobbyists and as technical nostalgia. However, in a world in which research and development is producing almost entirely digital artifacts, new and efficient concepts for preservation and re-use are required. Furthermore, a significant amount of today's cultural work is purely digital. Hence, emulation technology appeals to a wider, non-technical user group, since many of our digital objects cannot be re-used properly without a suitable runtime environment. This article presents a scalable and cost-effective Cloud-based Emulation-as-a-Service (EaaS) architecture, enabling a wide range of non-technical users to access emulation technology in order to re-enact their digital belongings. Together with a distributed storage and data management model, we present an implementation from the domain of digital art to demonstrate the practicability of the proposed EaaS architecture.
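The core EaaS idea sketched in the abstract above — a non-technical user asks a service to pair a digital object with a suitable emulated environment and receives a running session — can be caricatured with a toy session broker. The class, catalogue and field names below are hypothetical illustrations, not the actual EaaS API, which is a distributed cloud service with remote framebuffer access.

```python
# Toy sketch of an Emulation-as-a-Service session broker: map a digital
# object's required environment to an emulator configuration and hand the
# user back a session descriptor. Purely illustrative; names are invented.
import itertools

# Hypothetical catalogue: environment id -> emulator configuration
ENVIRONMENTS = {
    "win311": {"emulator": "qemu-system-i386", "ram_mb": 16},
    "amiga500": {"emulator": "uae", "ram_mb": 1},
}


class SessionBroker:
    def __init__(self):
        self._ids = itertools.count(1)
        self.sessions = {}

    def start_session(self, object_id: str, env_id: str) -> dict:
        """Allocate an emulation session for a digital object."""
        if env_id not in ENVIRONMENTS:
            raise KeyError(f"no emulation environment for {env_id!r}")
        session = {"id": next(self._ids),
                   "object": object_id,
                   **ENVIRONMENTS[env_id],
                   "state": "running"}
        self.sessions[session["id"]] = session
        return session


broker = SessionBroker()
s = broker.start_session("digital-artwork-042", "win311")
print(s["emulator"], s["state"])  # prints: qemu-system-i386 running
```

The design point the abstract makes is precisely this indirection: the user names an object, not an emulator, and the service resolves the technical environment on their behalf.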
Conference Paper
Preservation of complex, non-linear digital objects such as digital art or ancient computer environments has until now been a domain reserved for experts. Digital culture, however, is a broader phenomenon. With the introduction of the so-called Web 2.0, digital culture became a mass culture. New methods of content creation, publishing and cooperation lead to new cultural achievements. Therefore, novel tools and strategies are required, both for preservation and in particular for curation and presentation. We propose a scalable architecture suitable for creating a community-driven platform for the preservation and curation of complex digital objects. Further, we provide novel means for presenting preserved results, including technical metadata, thus allowing for public review and potentially further community-induced improvements.
Article
The problem for curators and archivists of digital games is that the games are inherently unstable. As a range of commentators have explored, gameplay in digital games often takes quite unexpected, unpredictable and emergent directions as players probe at the boundaries of rules and systems. For those engaged in the archiving, curation and exhibition of digital games, a clear challenge comes from the contingency of playing preferences, and that various ‘playings’ might be differently informed by prefigurative and regulatory materials such as walkthroughs, FAQs and reviews. However, before we get to considerations of the ways it is (re)configured through the performance of play, in fact regardless of whether it is ever played at all, we should recognise that a digital game is a potentially changing, unstable object. In fact, it is typically better thought of as a growing and mutating collection of many objects. This article centres less on ideas of performance or the changing nature of the game at play but rather concentrates on the instability of the fabric of the game to-be-played-with. As media that are routinely ported (transferred and translated) to different operating systems and platforms with differing hardware and software capabilities, and patched (updated to fix bugs or modify gameplay mechanics), digital games simply cannot be conceived of as static objects or texts. To demonstrate the impact of porting and patching on the instability of digital games, this article draws on an analysis of Sega’s Sonic the Hedgehog series.
Article
Full-text available
In recent years a lot of research has been undertaken to ascertain the most suitable preservation approach. For a long time migration was seen as the only viable approach, whereas emulation was looked upon with scepticism due to its technical complexity and initial costs. In 2004, the National Library of the Netherlands (Koninklijke Bibliotheek, [KB]) and the Nationaal Archief of the Netherlands acknowledged the need for emulation, especially for rendering complex digital objects without affecting their authenticity and integrity. A project was started to investigate the feasibility of emulation by developing and testing an emulator designed for digital preservation purposes. In July 2007 this project ended and delivered a durable x86 component-based computer emulator: Dioscuri, the first modular emulator for digital preservation.
Article
Full-text available
The study of game aesthetics is a recent practice, spanning less than two decades. Unlike theories of games in mathematics or the social sciences, which are much older, games became an object of study for the humanities only when video and computer games became popular. This sustained lack of interest may seem strange, but only if we assume that traditional games and computer games are intrinsically similar, which they are not. We can try to explain this absence by pointing out that the aesthetic and theoretical elites who cultivate the analysis of artistic media objects (literature, the visual arts, theatre, music) tend to regard games as something trivial and popular. But this does not explain why aesthetic studies of games are possible today and are even, in some academic settings, encouraged and supported with grants. What has happened to bring about this change?
Conference Paper
Full-text available
The Multivalent Document Model offers a practical, proven, no-compromises architecture for preserving digital documents of potentially any data format. We have implemented from scratch such complex and currently important formats as PDF and HTML, as well as older formats including scanned paper, UNIX manual pages, TeX DVI, and Apple II AppleWorks word processing. The architecture, stable since its definition in 1997, extends easily to additional document formats, defines a cross-format document tree data structure that fully captures semantics and layout, supports full expression of a format's often idiosyncratic concepts and behavior, enables sharing of functionality across formats thus reducing implementation effort, can introduce new functionality such as hyperlinks and annotation to older formats that cannot express them, and provides a single interface (API) across all formats. Multivalent contrasts sharply with emulation and conversion, and advances Lorie's Universal Virtual Computer with high-level architecture and extensive implementation.
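The central Multivalent claim — format-specific parsers feeding one shared document tree, so that behaviours are written once and work across formats — can be illustrated with a deliberately tiny sketch. All class names here are invented for illustration and do not mirror the real Multivalent implementation.

```python
# Illustrative sketch of the Multivalent idea: different formats are parsed
# by format-specific adapters into ONE shared document-tree structure, so
# cross-format behaviours (search, annotation, word extraction) are written
# once. Names are invented; this is not the actual Multivalent API.
from dataclasses import dataclass, field


@dataclass
class Node:
    kind: str                      # e.g. "doc", "para", "word"
    text: str = ""
    children: list = field(default_factory=list)


class PlainTextAdapter:
    def parse(self, data: str) -> Node:
        paras = [Node("para", children=[Node("word", w) for w in p.split()])
                 for p in data.split("\n\n")]
        return Node("doc", children=paras)


class CsvAdapter:
    """A second, very different format mapped onto the same tree."""
    def parse(self, data: str) -> Node:
        rows = [Node("para", children=[Node("word", c) for c in line.split(",")])
                for line in data.splitlines()]
        return Node("doc", children=rows)


def words(tree: Node) -> list:
    """One behaviour, written once, valid for every format."""
    if tree.kind == "word":
        return [tree.text]
    return [w for c in tree.children for w in words(c)]


doc1 = PlainTextAdapter().parse("digital preservation\n\nmatters")
doc2 = CsvAdapter().parse("digital,preservation\nmatters")
assert words(doc1) == words(doc2) == ["digital", "preservation", "matters"]
```

The sketch shows why the approach "contrasts sharply with emulation": the document's content is re-expressed in a common structure rather than re-run in its original environment.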
Article
There is as yet no viable long-term strategy to ensure that digital information will be readable in the future. Digital documents are vulnerable to loss via the decay and obsolescence of the media on which they are stored, and they become inaccessible and unreadable when the software needed to interpret them, or the hardware on which that software runs, becomes obsolete and is lost. This report explores the technical depth of the problem of long-term digital document preservation, analyzes the inadequacies of a number of ideas that have been proposed as solutions, and elaborates the emulation strategy. The central idea of the emulation strategy is to emulate obsolete systems on future, unknown systems, so that a digital document's original software can be run in the future despite being obsolete. Contents of this report are as follows: (1) Introduction (stating the digital preservation problem and introducing the emulation strategy); (2) The Digital Longevity Problem; (3) Preservation in the Digital Age; (4) The Scope of the Problem; (5) Technical Dimensions of the Problem; (6) The Inadequacy of Most Proposed Approaches; (7) Criteria for an Ideal Solution; (8) The Emulation Solution; (9) Research Required for the Emulation Approach; and (10) Summary. (Author/AEF)
Article
The use of emulation as a preservation strategy is often looked upon with skepticism. Although it may be the only way to render digital objects authentically in the future, emulation is thought to be too technically challenging and therefore too expensive and time-consuming. This line of thought has thus far prevented emulation from being developed for preservation purposes. However, the National Library of the Netherlands (KB) and the Nationaal Archief of the Netherlands are of the opinion that emulation-based preservation can be worthwhile and needs further development and testing. This paper presents the results of a preliminary phase of the emulation project conducted at the KB. It explains and evaluates existing emulators and proposes a new model for emulation called modular emulation. This model provides the basis for the development of a working prototype in the future.
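The modular-emulation model argued for here — an emulator assembled from swappable hardware modules rather than built as a monolith — can be sketched with a toy accumulator machine. The module interfaces and the instruction set below are invented for illustration and are not Dioscuri's actual design.

```python
# Minimal caricature of modular emulation: CPU and memory are separate
# modules behind small interfaces, so either can be swapped (e.g. a
# different memory implementation) without touching the other module.

class Memory:
    def __init__(self, data):
        self.data = list(data)

    def read(self, addr):
        return self.data[addr]

    def write(self, addr, value):
        self.data[addr] = value


class CPU:
    """Tiny accumulator machine: 0=halt, 1=load addr, 2=add addr, 3=store addr."""
    def __init__(self, memory):
        self.mem, self.pc, self.acc, self.halted = memory, 0, 0, False

    def step(self):
        op = self.mem.read(self.pc)
        if op == 0:
            self.halted = True
            return
        arg = self.mem.read(self.pc + 1)
        if op == 1:
            self.acc = self.mem.read(arg)
        elif op == 2:
            self.acc += self.mem.read(arg)
        elif op == 3:
            self.mem.write(arg, self.acc)
        self.pc += 2


class Emulator:
    """Composes the modules; swapping in a logging Memory needs no CPU change."""
    def __init__(self, program):
        self.memory = Memory(program)
        self.cpu = CPU(self.memory)

    def run(self):
        while not self.cpu.halted:
            self.cpu.step()
        return self.memory


# Program: load mem[7], add mem[8], store the result to mem[9], halt.
program = [1, 7, 2, 8, 3, 9, 0, 20, 22, 0]
mem = Emulator(program).run()
print(mem.read(9))  # prints 42
```

The preservation argument for modularity is durability: when one hardware component's emulation needs replacing or improving, only that module is rewritten, not the whole emulator.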
Article
This report considers the problem of how to preserve digital documents, given the fact that their formats quickly become obsolete, as do the programs that originally interpreted those formats and the computers on which those programs ran. This problem is investigated from the perspective of the deposit library community, though the issues and solutions discussed here also apply more broadly to the full range of digital data, documents, records, and other artifacts that are used by other kinds of libraries, archives, government agencies, commercial organizations, and individuals. The report attempts to identify and illuminate the root of this problem and, more specifically, discusses the theoretical and practical issues involved in using emulation (a proven computer science technique in which one computer is used to reproduce the behavior of another computer) as a way of preserving authentic, accessible, usable digital documents. Although it is aimed at the deposit library community, this discussion should also be of interest to members of the general library community, as well as archivists, government recordkeepers, and preservationists concerned with the great potential for loss of information that has emerged as an unexpected and unfortunate aspect of the digital age.

SUMMARY: The increasing use of digital technology to produce documents, databases, and publications of all kinds has led to an impending crisis resulting from the absence of available techniques for ensuring that digital information will remain accessible, readable, and usable in the future. Deposit libraries as well as other libraries, archives, government agencies, and organizations must find ways to ensure the longevity of digital artifacts or risk the loss of vast amounts of information and hu...