ORIGINAL ARTICLE
Archaeology of the Amsterdam digital city; why digital data are dynamic and should be treated accordingly
Gerard Alberts (a), Marc Went (b) and Robert Jansma (b)

(a) Korteweg-de Vries Institute, University of Amsterdam, Amsterdam, The Netherlands; (b) Graduate School of Computer Sciences, Vrije Universiteit/Universiteit van Amsterdam, Amsterdam, The Netherlands
ARTICLE HISTORY
Received 5 December 2016
Accepted 19 March 2017
ABSTRACT
One of the major initiatives in The Netherlands promoting the use of the Internet by private individuals was De Digitale Stad (DDS), which is the Amsterdam digital city. DDS was launched in January 1994 and soon evolved from an elementary bulletin-board-like system to a full-blown virtual city with squares, houses, post offices, cafés and a metro. Archaeology of the digital city makes it clear that there is no beaten track for preserving and, after two decades, unwrapping "born digital" material. During the research to reconstruct the digital city two routes were tried, one emulating the old system, another replicating it. The outcome, together with the harvest of two working systems, is a lesson, a concern and an appeal. From the experience of reconstructing digital heritage, we draw pragmatic lessons. Tools for digital archaeology are tried and contemplated. The lessons, however, do not unequivocally support the use of the notion "archaeology." The concern is one of the social responsibilities. Web archaeology, being part of contemporary history, confronts the researcher with such issues as privacy and the ethics of "young" data. A case is made for treating digital data dynamically.
KEYWORDS
Web archaeology; De Digitale Stad; emulation; replica; research ethics; privacy; digital heritage; dynamic treatment of digital archives
Introduction
Participants in the digital city had an avatar. DDS, De Digitale Stad, that is the Amsterdam digital city, was much more than an Internet server, if only because the community shaping it had not settled for a definite meaning. The system conveyed a sense of community building, and although there was not one but many communities, there was this one basic sense of being a "virtual citizen" expressed by the avatar. The DDS-team was self-conscious enough to adorn its second anniversary with a full backup of the system to be studied by archaeologists in a distant future: the FREEZE, 1996 (see note 1).
Two decades do not create a great distance. However, we do embark on the archaeology of DDS. Following the actors of 1996, the effort to read and resuscitate vintage digital material may be called "archaeology." The metaphorical expression does not come without repercussions. First, the lack of distance poses problems of contemporary history, hardly associated with the notion of archaeology. Second, notions like digital archaeology
or web archaeology suggest a kinship to a media archaeology or to an archaeology of knowledge in the sense of Foucault (1969) or media theory (Ernst, 2013; Presner, Shepard, & Kawano, 2014, p. 84ff), which is hardly explored here. The present contribution is about digital, but very material, old tapes. Getting hold of the vintage tapes is one thing, reading them and making sense of the content quite another. Different tools were tried and developed. In the perspective of accessing the data for historical research and possibly museum presentation, major issues arise. Ethical issues, not unusual for contemporary history, hit the digital archaeologist in the face. More specific questions related to the technological aspect of DDS impose themselves, issues of security and integrity of the data. The question of privacy, perhaps not strictly related to technology, poses itself with new urgency.
Archaeologies, emulation vs. replica
The effort to reconstruct the digital city and have it in operation almost naturally proceeds
along two routes: emulation and a replica (simulation). The two notions of emulation and
replica have a long evolution of shifting meanings in art history. Here they are taken after
their, equally unstable, meaning in computer science (Smith & Nair, 2005, Chapter 2). The
resulting contradistinction adds a nuance, informed by computer science, to the discus-
sion on emulation in electronic art (Jones, 2004).
Emulation is the effort to run the original code on a new platform. This does not come
without compromise. The DDS-system as preserved does not easily un-freeze. It will not
run, primarily because the physical systems supporting it are not readily available. Migra-
tion to present day systems is tedious, but feasible. Getting the legacy software opera-
tional requires adaptations. The authenticity-question, as to what system one is actually
running, always remains.
Replica, the look-alike remake by present day means, seemingly discards the authenticity-question (one is obviously not running the real system) and focuses on presence to the user, rather than on the system.
The difference is that the mimicking is on a different level and from a different perspec-
tive. The original sense of emulation is that of one artist out of admiration mimicking an
other. Here one computer system is thought to mimic the other. The agency is with the
system; it is placed on level with the artist in the original sense of emulation. The per-
spective is that of the system; the boundary is between hardware and software, or
between layers of software. In a replica, the perspective is that of the user. Irrespective of
what happens under the hood of the system, its surface performance for the senses is
what counts. The boundary is between the user and the system. No sharp distinction is
assumed between user and system, or between hardware and software. The crucial point
here is that considering something either an emulation or a replica is a matter of perspec-
tive and comes with diverging expectations. Criteria vary from reconstructing the opera-
tion of the system to recreating the user experience. Recreation implies making the
experience present, and is in that historiographical sense presentism. Does the archaeolo-
gist in her interpretation identify (with) the code or with the user? It seems that archaeolo-
gists by default choose the first; we are preoccupied with the authenticity of the code. We
are, but in fact we show the alternative route as well.
What approach to choose and which tools to develop is contingent upon the goals set
for such a project, which in turn depends on the context. Heritage has a community on
the donating end, a group strong and dedicated enough to not throw away the material
remnants, and on the receiving end, a group feeling strongly about the value of these
materials. In fact there are several groups caring for the DDS heritage and their goals vary.
For the joy of a one-time replay a system may be fired up and run with loose ends. For
public access, by contrast, for example a museum exhibit, integrity and security of the sys-
tem and privacy of personal data are key issues. The mere consideration of the latter pur-
pose was one of the motives inspiring the alternative approach of building a new system
from scratch with the same functionality, a replica.
Dynamic approach
If preserving and reconstructing data may be called web archaeology, what are the "scoops" and "brushes" in the digital practice? For the most part tools of digital archaeology require manual calibration and application; automated procedures are in their infancy. Working with legacy digital material brings home one crucial insight: whether the unearthed objects are data, scripts or full-blown software, their archaeology involves getting the code to work. Born digital material is dynamic. Executing a script may yield different outputs each time it runs. Static material does not react, let alone react differently. The archaeologist will not be satisfied with images or screenshots.
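To make the point concrete, consider a minimal, purely illustrative sketch (not DDS code) of a CGI-style script in Python: its output changes on every run, so any static snapshot captures only one of the many states the working code can produce.

#!/usr/bin/env python3
# Minimal illustration, not DDS code: a CGI-style script whose output
# changes on every run, so a single static snapshot records just one state.
import datetime
import random

def render_page():
    # Dynamic elements: the current time and a randomly chosen location.
    now = datetime.datetime.now().isoformat(timespec="seconds")
    square = random.choice(["central square", "cafe", "post office"])
    return ("Content-Type: text/html\n\n"
            f"<p>Generated {now}; you arrive at the {square}.</p>")

if __name__ == "__main__":
    print(render_page())

Archiving only the emitted HTML would preserve one arbitrary combination of time and location; archiving the script preserves the behaviour.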
This dynamic character is in stark contrast to the existing practices of Web Archiving, viz. to preserve snippets of the Internet as pages, as snapshots. The maturing of the field of Web Archiving has been well captured in the volume edited by Julien Masanès (2006). The pioneering work of the Internet Archive and its Wayback Machine from the late 1990s has been broadened and institutionalised on a national scale in many countries. National Archives and Media Archives have automated their harvesting with crawlers and filters. The materials they gather, however, are static, or rather, are treated statically. Even if websites contain code, they are saved simply as pictures of pages. Today, the Internet Archive does more. It replays the harvested pages as much as it can. Our plea is to reinstate the pages as they were born, not starting from the resulting page, but from the server. Given the dynamic character of born digital material the archaeological approach should be dynamic. To put it differently: such material is called "born digital" to emphasise that its symbols are not just text. The text is considered as working code. And for working code, emulation and replica offer themselves as feasible approaches, each with their own tools.
De Digitale Stad: a local history
De Digitale Stad, the Amsterdam digital city, exemplified the electronic social network. It facilitated the exploration of all the possibilities to "connect", including, almost incidentally, access to the Internet. It was designed with the city metaphor in mind. On 15 January 1994, the digital city opened its gates. Beyond the practice of earlier FreeNets, De Digitale Stad appealed to its users to adopt the metaphor and create a true community. It allowed the users to be "citizens" or "netizens" and enter the unknown world of the Internet.
The project was initially funded for 10 weeks by the city of Amsterdam, on the assump-
tion of bridging the gap between local politics and the ordinary citizens. The number of
subscriptions skyrocketed. After the first 10 weeks, with the project clearly growing bigger
than anyone had anticipated, DDS acquired further funding to continue beyond the initial
experiment (Castells, 2001; Lovink, 2002; Rustema, 2001, pp. 4267).
The first version of DDS was a bulletin board system (BBS), a static menu offering the
user a choice of line numbers to continue towards further pages. Imagination was an
essential asset for the user to walk the streets of the digital city.
The second version made a major step to change from the Gopher communication protocol, used in version 1, towards the newer HTTP protocol still in use today on modern web pages. Within weeks, yet another version of DDS was released. Through a major overhaul this version 3 had become a truly interactive system, embodying the metaphor of a city. The overarching metaphor was further detailed by such facilities as "post office", "city square" and "café". These allowed the "citizens" to navigate the city more intuitively. Users could fetch their mail at an email facility called post office, they could set up their own homepages called "houses" that were reachable by traveling across "squares", which were web pages linking to each other, or they could hang out in a "café", which we nowadays see as a chat room. The city metaphor was introduced and promoted the Internet as a common, a public space, which it hardly was in 1994/1995, when network services were mostly available in universities, libraries and as private facilities in large companies.
DDS grew amidst optimistic expectations, expectations of technology having a democratising effect. In the sense of spreading the technology itself among larger sections of society it certainly had this impact. Hope that the technical facilities by themselves would bridge the gap between politics and public, and thus solve the representation and legitimisation problems of politics, soon evaporated. Such hopes were certainly played out in acquiring the initial subvention of DDS by the city of Amsterdam. The idea of a push-button direct democracy is a recurring dream, also in DDS circles. In the same vein, but stronger and more specific for DDS, were the expectations of an emerging community and a new kind of sociability. Conceiving DDS as a commons, as a public sphere, deeply motivated many of the early actors. Their motivation was even strengthened when in those very years in the US the Internet rapidly evolved into a commercial connectivity with a new kind of economy (Aspray & Ceruzzi, 2008). Lovink (2002) observes that not one but many communities shaped around the digital city and created their own niches, subcultures. From recent research into the themes of the cafés and lists in DDS, one may infer that the electronic facilities did in cases serve as vehicles of emancipation (see note 2). Dennis Beckers and Peter van den Besselaar have in a series of publications shown the dynamics of the various groups involved in DDS-initiatives (Beckers & Besselaar, 1998; Besselaar & Beckers, 1998). Thus, while Lovink (2002) characterised the divergence between the communities as cultural differences, Beckers and Besselaar were more political in their analysis by pointing at divergence through conflicting interests.
At a more fundamental level, studies in sociology, STS and media studies have hinted at new kinds of sociability emerging in such connected commons as DDS. Castells, famous for his trilogy on the Information Age (1996), in The Internet Galaxy (2001) presented DDS and many other case studies as vistas on new forms of society. The more radical approach in this direction was Howard Rheingold's (1993) effort to continue where Durkheim and Weber have left us with "Gemeinschaft" and "Gesellschaft." Reinder Rustema (2001) emulates Rheingold's search for a "society" beyond Gesellschaft by the example of De Digitale
Stad. Sociology has not yet come to conclusions, but without any doubt DDS is part of the empirical material to be reflected upon. Media studies on their part show that the industrial society is superseded by the platform society (Van Dijck, Poell, & Waal, 2016), of which, then, DDS is not a forerunner or prime example.
Local frost and defrost
On 15–16 January 1996, the DDS servers were down for most of the night to allow for a full backup of De Digitale Stad. A full 1-on-1 disk copy of all the servers running DDS was created on 3 Digital Linear Tapes (DLT). DDS congratulated itself on a city frozen in time, preserved to be studied by archaeologists in a distant future: the FREEZE. Further heritage material was gathered at "gravediggers' parties" and the Amsterdam Museum installed a small exhibit on DDS.
In restoring old data, it soon came to light that the package would not simply unwrap, or defrost. The DLT tapes holding the FREEZE did not easily render their content. After a good deal of searching for auxiliary hardware, the tapes had been read and converted into the more common format of a compressed gzipped tarball (.tar.gz). Initial attempts to extract this tarball of 10 GB failed because, for no apparent reason, it exceeded the available storage. After several tries, each time with more storage available, the files were finally extracted to a network attached storage (NAS) with 12 TB of free space. The size of the completed extraction revealed why earlier attempts failed: the data filled little over 2.2 TB of storage space. When searching for the cause why a 10 GB .tgz file extracted to over 2.2 TB (220x its own size), we detected four corrupted files, each over 500 GB in size, quite possibly the effect of a "decompression bomb." Omitting these four files, the extraction returned to a reasonable size, approximately 35 GB. The project "DDS 3.0 operational" worked from these cleaned files.
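A first defence against this kind of surprise is to inspect the declared sizes of the archive's members before extracting anything. The following is a hedged sketch of such a check in Python; the archive name "freeze.tar.gz" and the 10 GB threshold are assumptions, not details of the actual project.

#!/usr/bin/env python3
# Hedged sketch: report tar members whose declared size looks implausible,
# so suspect entries (e.g. decompression bombs) can be skipped on extraction.
# The archive name and threshold below are assumptions for illustration.
import tarfile

THRESHOLD = 10 * 1024**3  # flag anything over 10 GB

with tarfile.open("freeze.tar.gz", "r:gz") as archive:
    declared_total = 0
    for member in archive:
        declared_total += member.size
        if member.size > THRESHOLD:
            print(f"suspect: {member.name} ({member.size / 1024**3:.1f} GB)")
    print(f"declared total: {declared_total / 1024**3:.1f} GB")

Comparing the declared total against the available disk space before extraction would have exposed the mismatch between the 10 GB archive and its 2.2 TB contents at the outset.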
Sockets
However, in an effort to understand what had gone awry in handing down the legacy files, and to make sure that no major parts of the original files were missing, we made a detour going back to the original servers, still extant in the Amsterdam Museum. Unlike the FREEZE, the content of these servers was not strictly dated, let alone of the same date as the FREEZE. Because of its historical significance, the original, but not necessarily operational, hardware is preserved at the store of the Amsterdam Museum. For the purpose of our research project, we were granted access. For just this one occasion the original hard drives were retrieved from these servers and put back.
In order to read the 20-year-old hard drives, vintage equipment was needed, with the sockets of the cables as major obstacles. The original server had eight hard drives, connected using three different cable sockets. The interface was SCSI, which is a parallel interface subject to different standards. The solution was found with the help of a former system administrator digging up the fitting connectors from an old drawer. To read out the content of the hard disks a Linux live USB was set up. The disks were connected one at a time. A full disk image was made and preserved carefully.
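In practice, a tool such as dd or ddrescue on the live system does this imaging; the sketch below only illustrates the principle in Python, copying a block device to an image file in fixed-size chunks and recording a checksum. The device and output paths are assumptions.

#!/usr/bin/env python3
# Hedged sketch of full-disk imaging: copy a block device to an image file
# in fixed-size chunks and record a checksum of the captured bytes.
# The device and output paths are assumptions for illustration only.
import hashlib

DEVICE = "/dev/sdb"        # assumed path of the attached legacy SCSI disk
IMAGE = "dds-disk1.img"    # assumed name of the output image
CHUNK = 4 * 1024 * 1024    # read 4 MiB at a time

digest = hashlib.sha256()
with open(DEVICE, "rb") as source, open(IMAGE, "wb") as target:
    while True:
        block = source.read(CHUNK)
        if not block:
            break
        target.write(block)
        digest.update(block)
print("sha256 of image:", digest.hexdigest())

Keeping the checksum alongside the image allows later copies to be verified against the first read of the ageing disk.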
This sidestep of the project greatly improved our understanding of the legacy material being studied, specifically the hardware. The newly retrieved content read from the original servers showed sufficient overlap with the FREEZE of 1996 to confirm the adequacy of the cleaned files but, being of a different date, was not included in our current reconstruction project.
Avatar generator
In January 2015, after the defrost and clean-up of the FREEZE had been achieved, first forays into the data were started. Since there was little indication of the structure of the stored file, except that the system was an old Sun SPARC system, investigations began with mapping the folder structures and sizes, and listing the installed software. The first observation was that the system was not as systematic as one might naively assume. In particular, there were no systematic locations for source code, if preserved at all, going with the installed code. Harsh lesson for the archaeologist: with the programs installed and running in 1996, there will not even be an indication of source code being preserved and, if so, where it might be stored in the system. Another harsh lesson: even the non-techies have to learn some jargon.
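The first orientation pass described above, mapping folder structures and sizes, can be scripted. The sketch below is a hedged Python example that reports the size of each top-level directory of the extracted backup; the mount point is an assumption.

#!/usr/bin/env python3
# Hedged sketch of a first orientation pass: report the total size of each
# top-level directory in the extracted FREEZE. The root path is an assumption.
import os

ROOT = "/mnt/nas/freeze"  # assumed mount point of the extracted backup

def tree_size(path):
    # Walk the subtree and sum file sizes, skipping unreadable entries.
    total = 0
    for dirpath, _dirnames, filenames in os.walk(path, onerror=lambda error: None):
        for name in filenames:
            try:
                total += os.lstat(os.path.join(dirpath, name)).st_size
            except OSError:
                pass
    return total

for entry in sorted(os.listdir(ROOT)):
    full = os.path.join(ROOT, entry)
    if os.path.isdir(full):
        print(f"{entry:20s} {tree_size(full) / 1024**2:10.1f} MB")

Such a listing gives a quick impression of where the bulk of the material sits (home directories, web content, system software) before any manual digging starts.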
Software appears in various modes, basically "source code" and "binary." Programs are written by the programmer, with the help of a whole factory of tools, in a programming language. This written version is called "source code." This is the version one usually refers to when discussing programs. Source code is just text; it does not work by itself. A program in source code is translated to run on a system, in our case on the Sun SPARC station. The result of translation is an "executable" or "binary" code. This is the version that does the work. Hence, the digital archaeologist looks for the "binaries" to see what a system can do, which engines are available.
The tool doing the translation from source code to binary is called a translator or "compiler." In some special cases, and only for specific languages, a tool exists which can do the reverse translation, back from binary to source code: a "de-compiler."
Memories
In exploring the files, it was of great help that former "citizens" vividly remembered the avatars. This clue from oral history set the challenge of locating the relevant software. With the introduction of DDS 3.0, the system had become truly interactive and logging in became more than having access. Every user would now have an avatar representing him or her whilst "walking" through the digital city. A program would generate a small icon-like image representing the user, an avatar. Somewhere on the server there must be such a piece of software performing that function, the avatar generator. In 1995, for lack of memory space and operational speed, the system would not allow for pictures or other complex images to be inserted. Therefore, the avatar generator created simple but effectively distinctive images for every user, varying on a pattern inspired by a character from the Muppet Show, Beaker.
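The underlying idea, a distinct but reproducible variation on one base pattern per user, can be illustrated with a small, hypothetical sketch. This is emphatically not the recovered generator; it only shows how a username can be hashed into a fixed set of visual features.

#!/usr/bin/env python3
# Hypothetical sketch, not the recovered DDS generator: derive a small set of
# visual features deterministically from a username, so every user gets a
# distinct but reproducible variation on one base pattern.
import hashlib

EYES = ["round", "wide", "sleepy"]
COLOURS = ["red", "green", "blue", "yellow"]
MOUTHS = ["smile", "neutral", "surprised"]

def avatar_features(username):
    digest = hashlib.md5(username.encode("utf-8")).digest()
    return {
        "eyes": EYES[digest[0] % len(EYES)],
        "colour": COLOURS[digest[1] % len(COLOURS)],
        "mouth": MOUTHS[digest[2] % len(MOUTHS)],
    }

# The same username always yields the same avatar features.
print(avatar_features("burgemeester"))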
Exploration of the frozen data yielded no such file as avatar or avatar generator. A further clue was to search for the programs governing registration to the system, because that was the procedure including the creation of an avatar. This led to locating Apache and its original configuration file, and from there the original registration page was traced. The registration scripts had been written in CGI Perl, easily readable, and thus the steps of registration could be traced.
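A hedged sketch of this kind of tracing is given below: scan an Apache configuration for ScriptAlias directives and list the CGI scripts in the directories they point to. The configuration path is an assumption about the FREEZE layout, and paths inside the config refer to the original server's filesystem, so they may need to be re-rooted under the mount point.

#!/usr/bin/env python3
# Hedged sketch: locate CGI directories via ScriptAlias lines in an Apache
# configuration and list the .cgi scripts they contain. The configuration
# path is an assumption; paths found inside it belong to the original
# server's filesystem and may need re-rooting under the extracted tree.
import os
import re

HTTPD_CONF = "/mnt/nas/freeze/usr/local/etc/httpd/httpd.conf"  # assumed path

with open(HTTPD_CONF, errors="replace") as config:
    for line in config:
        match = re.match(r'\s*ScriptAlias\s+(\S+)\s+"?([^"\s]+)"?', line)
        if not match:
            continue
        url_prefix, directory = match.groups()
        print(f"{url_prefix} -> {directory}")
        if os.path.isdir(directory):
            for name in sorted(os.listdir(directory)):
                if name.endswith(".cgi"):
                    print("    ", name)

From such a listing, a script like Dodo.cgi stands out immediately among the registration machinery.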
Further recollections from oral history suggested that these icon images were never called avatars, but DoDoS, in an apparent play on the name of the system DDS. And "Dodo" is the extinct bird from Mauritius. So, the avatar generator would revive the extinct bird. The script, now located, had the name Dodo.cgi. This shows the development of terminology over the past 20 years: where nowadays "avatar" is the appropriate jargon, the term was not as pervasive back in the day.
Soft lesson for the archaeologist: follow the challenges set by oral history.
The reconstruction project
With the avatar generator resuscitated in 2015, and several other chunks of software
brought to life in 2016, we could not resist the temptation of trying to get the whole sys-
tem back into operation, project "DDS 3.0 Operational." The project group soon split up and worked in two opposite directions. One part of the research focused on what seemed to be the most obvious thing to do, viz. to try and run the original software again. The other part focused on the idea of reinstating the user experience, regardless of the machinery behind the screen. This second route, to replicate the digital city, leads to rebuilding the system from scratch and working according to modern technology and standards.
Preservation of the digital city
The goal of the FREEZE had been to preserve De Digitale Stad as it had existed and run two years after its introduction, to ensure for future generations the possibility to experience and to study the early days of the web. However, simply backing up one's data does not automatically result in preservation. A 1-on-1 copy may be historically accurate, but such a copy loses much of its attraction if it cannot be run, if the context of the original production server is missing. DDS ran on Sun SPARC hardware under the Solaris operating system. Because Sun SPARC has become proprietary software and hardware of Oracle, with prohibitively high licensing costs, virtualisation seemed the road towards reconstructing
an adequate replacement context. It proved to be not that easy, since emulation of the SPARC architecture on a different system usually has low performance, particularly on the most common architecture of the 2010s, the x64 architecture.
Emulation
The one research direction was to revive De Digitale Stad in its original state, from the perspective of the software running it. Like a hyena going for the innards, the software archaeologist goes looking for the software that was run, the software that was executed to create the performance of the system. The search is for the executable files, or "binaries." Being familiar with the operating system, mostly Unix, and knowing how to identify and search for these files, binaries, archaeologists use such instructions as "grep" and "find" to trace the location of the executables in the FREEZE.
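The same hunt can be expressed as a script. The following hedged Python sketch walks the extracted tree, flags regular files carrying an execute bit, and uses leading magic bytes to separate compiled binaries from interpreted scripts; the root path is an assumption.

#!/usr/bin/env python3
# Hedged sketch of the binary hunt: walk the extracted tree, flag files with
# an execute bit, and use magic bytes to tell compiled binaries from scripts.
# The root path is an assumption for illustration.
import os
import stat

ROOT = "/mnt/nas/freeze"  # assumed mount point of the extracted FREEZE

def classify(path):
    try:
        with open(path, "rb") as handle:
            head = handle.read(4)
    except OSError:
        return "unreadable"
    if head == b"\x7fELF":
        return "ELF binary"
    if head[:2] == b"#!":
        return "script"
    return "other executable"

for dirpath, _dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        full = os.path.join(dirpath, name)
        try:
            mode = os.lstat(full).st_mode
        except OSError:
            continue
        if stat.S_ISREG(mode) and mode & 0o111:
            print(classify(full), full)

On a Solaris-era tree most compiled programs will show the ELF signature, while the CGI scripts announce their interpreter with the "#!" line.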
Lesson of exclusion: only close familiarity with the legacy system and its operating sys-
tem will allow one to do the work of a web-archaeologist. To a large degree this is tacit
knowledge.
Knowing which programs did in fact do the job is not enough to run them again. Binaries are executable only in a specific context, in this case the SPARC context.
At this point the web archaeologist, and in general the software archaeologist, bereft of straightforward automatic tools like a Virtual Machine doing the work, must revert to more subtle and more individual methods. For the DDS 3.0 Operational project, a pragmatic decision was made with far-reaching consequences. In spite of the dependencies (i.e. the points where the programmer had created constructions particular to the specific machine), we chose to emulate on an x64 architecture. The consequence is that the binaries on the old system (SPARC) are of no use and one has to create new binaries for the new system (x64). We had to go back to the source code, the programs as written, and, in that form of the program, deal with each of the dependencies individually. The task at hand was now to find the original versions of the software running in 1996 and compile these programs anew for the operating systems coming with the x64 architecture. Fortunately for this project a good deal of the source code, even if not systematically preserved, was retrieved scattered throughout the FREEZE. For future archaeology there is a use for systematic tools of de-compilation, in cases where no source code is available at all.
The positive side of emulating on an x64 architecture is that this system is so common that one may expect it to survive for the foreseeable future. We are good, not for archaeological stretches of time, but at least for one or two generations. One may hope that the present work of getting a version of DDS running need not be repeated from scratch 20 years from now.
The downside is that the corners of DDS with the heaviest dependencies are hard to restore. In particular, the system for authentication, logging in, was most specific to the SPARC architecture and has therefore been left out of the present emulation. By consequence the avatar generator, Dodo.cgi, was found and brought back into operation, but the programs constituting its context, logging in, are not. So, we can play and make avatar wallpaper today, but we cannot have our DoDo walk through the system for us. When the former systems administrator was consulted on how to emulate this feature (which he had in fact programmed), his answer was: "consider not to." In terms of DDS 3.0 as it ran in 1996, one can only visit the city
in the "guest" mode. In other parts of the DDS program, functionality was restored by pragmatic patches. Thus, this emulation comes with compromise.
Along the route of emulation, the digital city has been restored as close to the original project as possible. In as far as it functions, it does revive with a feel of authenticity, so say DDS's former inhabitants. As an extra benefit the emulation preserves code of historical interest and allows comparison to modern standards. It reveals the challenges as the DDS developers perceived them and the answers they chose when designing the early pieces of the Internet. It makes the look and feel of born digital media accessible, including its inner workings and some of the thoughts behind it.
Replica
The other approach was to revive the digital city from the point of view of the end user, replicating the original system as closely as possible using current technologies of 2016. This was feasible because, compared to today's projects, the size of the digital city was relatively small and quite straightforward. Although providing static images only, the Wayback Machine did show the appearance of DDS to the end user. The programming for the replica was done in parallel to, and strongly inspired by, the emulation. While missing out on historical accuracy in the back-end software, the replica proves the feasibility of creating a user interface very near to the original, and in practice indistinguishable from the emulation, so say again DDS's former inhabitants. The user will "walk" through the city without noticing it is not the original software. In that sense the experience is effectively preserved.
A major advantage of the replica, beyond its technical maintainability and sustainability, is its security. If one were to consider the creation of a publicly accessible version of DDS 3.0, for example as a museum exhibit, a replica would offer a doable solution; whether feasible in terms of museum practice remains to be seen. As long as maintenance is kept up a replica can be secured, and it could be filled with part of the legacy content upon authorisation by the (former) users.
Tools
The archaeologist's brushes
If preserving and reconstructing data may be called web archaeology, what are its brushes and spades? Digging up the digital city has in large part been manual labour. In fact, it relied heavily on the tacit knowledge typical of craftsmanship. It required familiarity with Unix and other operating systems. A key element in the tacit knowledge is practical insight into the way server systems were usually built up, preferably joined with expertise in today's systems. It took the joint forces of former system administrators who had not forgotten their trade and were willing to share, and archaeologists who are able to absorb. The latter are talented computer science graduates using some of their academic lessons and heavily relying on their experience of working as system administrators to pay for their studies. Their concerted effort has allowed for a handmade "reconstruction" of DDS.
But in a digital environment, should one not expect systematic and automated tools?
Some tools do exist in the realm of software archaeology. For example, source code is
compiled into executable code. For some situations, e.g. programs written in high-level programming languages, tools do exist for the reverse process: decompilation, to reconstruct source code from executable files. Further automated tools are dearly wanted, like excavation tools to help recognise the various types of files.
Beyond the archaeological metaphor
As far as craftsmanship goes, the metaphor of archaeology sounds attractive and serves well. But in fact, what we are describing here has little to do with archaeology proper, and everything with handing over, i.e. heritage through living tradition. It was not Pompeii but Amsterdam 1996, with its inhabitants still around today.
Once access is gained, from 2015 through unfreezing the FREEZE and now by its dynamic reconstruction, research is not solely focused on software. The floor is open for an analysis of the content. Approach and tools are quite different from the above. Whether historical, sociological, anthropological or phrased by media studies, the further research questions involve a completely different set of tools. Technologies of searching through data, of filtering and of visualisation are available in the computer sciences and the data sciences. Some of the more sophisticated tools are being developed in a branch of data science going under the name of forensics.
In every part of the research on DDS, even in the most technical niches of the reconstruction process, the "archaeological" research is mingled with dialogue. Oral history helps. More than that, the intermingling reminds the researcher that in fact the historical approach is the umbrella underneath which it all makes sense: the dusting and scooping and fitting fragments together. Our archaeology of the web has all the benefits and the pitfalls of contemporary history. The takeaway message is that in spite of what the reconstructed operationality of the software may suggest to some, the historical distance remains. It is in the very process of reconstruction that the archaeologist is reminded of the ineradicable historical distance.
Personal data
Manipulating such "young data" on people acting as inhabitants of DDS two decades ago, and existing as fellow citizens today, may well produce a moment of shivers. The archaeologist finds herself swimming in a pond of personal information. Applied to the study of "young data" the metaphor of archaeology is brutally misleading. The work is social science or contemporary history and carries with it the social responsibilities tied to such sciences. Accessing young data poses major privacy issues.
In 1994 a major shift occurred that allowed people to share intimate experiences with the click of a button. Privacy was a difficult question and people did not, nor could they, predict the long-term implications of posting their information. Google, Facebook, Amazon and many other tech companies use these data to their advantage. Where users used to be proud "citizens" or stakeholders in the commons, today they are seen as natural resources for those companies having evolved into platforms. The meaning of data has shifted dramatically in the past 20 years, from information bearing to monetary gain. The notion of privacy has changed even more.
Purpose limitation
As a researcher in web archaeology, one will quickly find oneself dealing with legal and ethical concerns regarding the privacy of historical subjects. The web was first and foremost a communication medium and as such is filled with personal information of its users. Personal information is not just the information directly linked to one individual person. Any piece of information that could possibly be linked to an individual counts as personal information. Linking data to a username would link all these data to an individual, once that username has been linked to a person. By consequence, most of the web is subject to privacy laws, as most of the content is about, or created by, individuals, regardless of whether they can be directly identified through this content.
The ensuing legal considerations may vary greatly depending on national or state laws. And since the Internet hardly stops at national borders, legal considerations are complicated even further. In the Netherlands, when handling personal data, the concept of "purpose limitation" (Dutch: doelbinding) is a core concept of the privacy legislation. Its purport is that personal data should only be used for the purpose for which they were acquired (Ketelaar, 2000). Medical data are to be used for health care only and not by insurance companies; income data gathered for taxes should not be shared beyond the tax administration, etc. An exception to this rule is made if the purpose is historical research. However, leaning on this exception will only turn the legal consideration into an ethical one, which it already was.
By publishing web archaeological finds, the researcher will encroach on the privacy of the group being studied. And due to the freshness of the sources in the FREEZE, chances are that the group being studied is still alive and might object to the personal data being spread. Therefore, researchers must weigh the potential harm their research might cause against the potential gain for society of including the personal data under consideration. The more so since, as researchers, we are inherently biased towards publishing. Therefore, other researchers must be consulted, and in an ideal world ethical committees should be installed overseeing projects of web archaeology (Markham & Buchanan, 2012). In the DDS case, our provisional measure has been that any access to the retrieved data for the purpose of research is given under an agreement of confidentiality between the researcher and the Amsterdam Museum, procuring the source material.
Security
If, for historical research, privacy issues may be addressed in similar ways as in other research, with an extra caveat because of the rapid changes in Internet practices, the discourse takes a different turn in a museum context. Suppose the purpose were to create a public exhibit out of legacy web sources: not only should the above reticence towards the publication of personal information be taken into account, but technical matters need to be considered as well.
Systems built in the 1990s were not created with today's practices of collecting data in mind. Technologies of protection have evolved accordingly. The legacy systems, even if one wanted dearly, could not possibly be made secure and safe to the standards of the twenty-first century. This thought has considerably strengthened the inspiration to build a replica next to the emulation, which in terms of safety and security must be judged hopeless.
Treating digital heritage dynamically
The established practice of archiving the legacy of the web is to store snapshots, that is, to take a momentary image of a website and download it. This can be done automatically by so-called crawlers and yields enormous haystacks of information on the history of the Internet. Not only the Internet Archive and the Wayback Machine operate like this; national libraries and archives have adopted and standardised this approach (Mason, 2007). The Internet Archive will replay the pages thus harvested. In doing so, they are reducing websites to mere pages. But the Internet is not a book; its sites are generated dynamically, be it at a pace of once per day or ten thousand times per second. Upon return the visitor will find a new thing. Michel Serres (2015) reminds us what a parochial way of organising our knowledge it is to put it page by page, now that we could liberate ourselves from that format thanks to the very Internet. As early as 2003 Helen Tibbo (2003, p. 16) observed the dynamic nature of the web:
A related problem is the Web's dynamic nature. Web archiving initiatives can only preserve "snapshots" of sites or domains at the expense of their dynamism, rather like insects trapped in amber. Once snapshots of Web content are located outside the active Web, it is arguably missing one of its most characteristic properties.
The appeal was picked up by Michael Day (2006, p. 193). We urge taking the dynamic nature to its consequence and call for an adequate, dynamic approach. Far from incidental, the complexity and dynamism of the web reflect its digital nature. And the way we conceive of this heritage should change accordingly. The web should be seen not just as text and image. It is working text: code or software. Preserving the web, thus conceived, may well burst the frame of archiving. In that sense web archaeology is a truly new field, an extension of archiving proper.
The dynamic character of a website's content expresses the underlying code. Whether simple CGI scripts, markups or complicated software, executable files lend the web its dynamic character. This code is the "working text." Without it the documents, as the user sees them, are different, static. The dynamic approach to web heritage implies taking the text on the surface inclusive of the underlying code, its context. To be able to contain the context of a web page one cannot assume static snapshots as a solution. To properly preserve web content for future research, it must be stored with its dynamic elements in mind.
Notes
1. In Dutch: "gedeponeerd in een archief ter bestudering voor archeologen in een verre toekomst" [deposited in an archive to be studied by archaeologists in a distant future], in "De digitale stad bestaat (bijna) twee jaar" [The digital city is (almost) two years old]. Post on the DDS, 1996. https://hart.amsterdam/nl/page/37138/gevondenfreeze and http://web.archive.org/web/20100830120819/http://www.almedia.nl/DDS/Nieuws/freeze.html
2. Unpublished reports of student work.
Acknowledgments
"History of Digital Cultures" is a regular graduate course by Gerard Alberts in the joint MSc Computer Sciences programme of University of Amsterdam and Free University Amsterdam. "DDS 3.0 operational" was a special course in 2016 taken by Marc Went, Robert Jansma, Ronald Bethlehem, Tim
Veenman, Kishan Nirghin, Millen Mortier and Thomas Koch. Reports on this work are available on
[Re:DDS 1.0]. The authors further extend their gratitude to Tjarda de Haan and the team at Amster-
dam Museum for support throughout the project and hosting the presentation, to Waag Society for
facilitating our meetings, to Jesse de Vos and Dennis Beckers for guest lectures, to Theun van den
Doel for tireless moral support, and most of all to former system administrators Michael van Eeden
and Paul Vogel for immediate help and for sharing their implicit knowledge of the systems. We
gratefully acknowledge the Digital Preservation Coalition for distinguishing The Digital City Revives,
of which our DDS 3.0 operational is a part, with the 2016 National Archives Award for Safeguarding
the Digital Heritage.
Disclosure statement
No potential conflict of interest was reported by the authors.
Notes on contributors
Gerard Alberts is an associate professor for history of computing and history of
mathematics at the Korteweg-de Vries Institute for Mathematics at the Uni-
versity of Amsterdam. He is editor of the Springer Series History of Computing
and a member of the editorial board of IEEE Annals of the History of Comput-
ing and of this journal.
Marc Went is a master's student at the joint MSc Programs Computing Sciences
and Information Sciences of the Vrije Universiteit Amsterdam and University
of Amsterdam.
Robert Jansma is a master's student at the joint MSc Program Computing Scien-
ces of the Vrije Universiteit Amsterdam and University of Amsterdam. He is
also a research assistant at the Amsterdam Museum involved in sustainably
preserving De Digitale Stad in the project "DDS herleeft" [DDS revives].
ORCID
Marc Went http://orcid.org/0000-0002-1133-5805
Robert Jansma http://orcid.org/0000-0003-1121-4189
References
Aspray, W., & Ceruzzi, P.E. (Eds.). (2008). The internet and American business. Cambridge, MA: MIT
Press.
Beckers, D., & Besselaar, P. (1998). Sociale interactie in een virtuele omgeving: De Digitale Stad [Social interaction in a virtual environment: The Digital City]. Informatie & Informatiebeleid, 16(4).
Besselaar, P., & Beckers, D. (1998). Demographics and sociographics of the digital city. In T. Ishida (Ed.), Community computing and support systems, Lecture Notes in Computer Science (Vol. 1519, pp. 108–125). Berlin: Springer Verlag.
Castells, M. (1996). The information age: Economy, society and culture (Vols. 1–3). Cambridge, MA: Blackwell.
Castells, M. (2001). The internet galaxy: Reflections on the internet, business, and society. Oxford: Oxford University Press.
Day, M. (2006). The long-term preservation of web content. In J. Masanès (Ed.), Web archiving (pp. 177–199). Berlin: Springer Verlag.
Ernst, W. (2013). Digital memory and the archive. Minneapolis: University of Minnesota Press.
Foucault, M. (1969). L'archéologie du savoir [The archaeology of knowledge]. Paris: Gallimard.
Jones, C. (2004, June). Seeing double: Emulation in theory and practice. The Erl King case study. Paper presented at the Electronic Media Group, Annual Meeting of the American Institute for Conservation of Historic and Artistic Works, Portland, OR.
Ketelaar, F. C. J. (2000). Elke handeling telt. Archiefdiensten en de Wet bescherming persoonsgegevens [Every act counts. Archival agencies and the law on the protection of personal data]. Nederlands Archievenblad, 104, 18–23.
Lovink, G. (2002). Dark fiber: Tracking critical internet culture. Cambridge, MA: MIT Press.
Markham, A., & Buchanan, E. (2012). Ethical decision-making and internet research. Recommendations from the AoIR ethics working committee (version 2.0). AoIR. Retrieved from https://aoir.org/reports/ethics2.pdf
Masanès, J. (Ed.). (2006). Web archiving. Berlin: Springer Verlag.
Mason, I. (2007). Virtual preservation: How has digital culture influenced our ideas about permanence? Changing practice in a national legal deposit library. In M. V. Cloonan & R. Harvey (Eds.), Preserving cultural heritage, Library Trends, 56(1) (pp. 198–215) [Special issue].
Presner, T., Shepard, D., & Kawano, Y. (2014). HyperCities: Thick mapping in the digital humanities. Cambridge, MA: Harvard University Press.
Rheingold, H. (1993). The virtual community: Homesteading on the electronic frontier. Reading, MA: Addison-Wesley.
Rustema, R. (2001). The rise and fall of DDS: Evaluating the ambitions of Amsterdam's digital city (Unpublished master's thesis). University of Amsterdam, Amsterdam. Retrieved from http://reinder.rustema.nl/dds/rise_and_fall_dds.pdf
Serres, M. (2015). Thumbelina: The culture and technology of millennials. London: Rowman & Littlefield International.
Smith, J., & Nair, R. (2005). Virtual machines: Versatile platforms for systems and processes. San Francisco, CA: Elsevier.
Tibbo, H. R. (2003). On the nature and importance of archiving in the digital age. Advances in Computers, 57, 1–67.
Van Dijck, J., Poell, T., & Waal, M. (2016). De platformsamenleving. Strijd om publieke waarden in een online wereld [The platform society. The struggle on public values in an online world]. Amsterdam: Amsterdam University Press.
... As noted from the literature, while the aspect of urban smartness may have been conceived earlier, it became vivid in 1974 in Los Angeles when the local government of the day initiated the use of data to influence certain aspects of its urban administration [5]. Later, in 1994, Amsterdam attempted to install smart government structures by adopting a 'virtual city' concept [95]. However, in the 2000's, as noted in Figure 2 (above), notable possibilities of incorporating digital solutions in cities began to gain traction due to gradual growth of the ICT sector, catalysed by the fourth industrial revolution. ...
... However, it was noted in different fora [94], that cities that had embraced the smart agenda had some relief. This was especially evident in cities where the aspect of smart urban governance had gained some roots, as some services continued to be offered virtually albeit in limited scopes compared to normal [95]. Conversely, Kitchin [52] questioned the technical and practical efficacy of surveillance technologies used for contact tracing, quarantine enforcement, social distancing/movement monitoring, and symptom tracking (e.g., smartphone apps, facial recognition cameras, biometric wearables, smart helmets, drones, and predictive analytics), and examined their implications for civil liberties, governmentality, surveillance capitalism, and public health. ...
Article
Full-text available
The concept of smart cities peaked in 2015, bringing an increased influx of ‘smart’ devices in the form of the Internet of Things (IoT) and sensors in cities. As a result, interest in smart urban governance has become more prevalent in administrative, organisational, and political circles. This is sustained by both local and global demands for an increased contribution to the goals of sustainability through urban governance processes in response to climate change urgencies. Cities generate up to 70% of global emissions, and in light of societal pressures for more inclusivity and democratic processes, the need for sound urban governance is merited. Further knowledge on the theme of smart urban governance is required to better understand the trends and knowledge structures and better assist policy design. Therefore, this study was undertaken to understand and map the evolution of the concept of smart urban governance through a bibliometric analysis and science mapping techniques using VOSviewer. In total, 1897 articles were retrieved from the Web of Science database over 5 decades, from 1968 to 2021, and divided into three subperiods, namely 1978 to 2015, 2016 to 2019, and 2020 to early 2022. Results indicate that the overall emerging themes across the three periods highlight the need for citizen participation in urban policies, especially in relation to smart cities, and for sustained innovation for e-participation, e-governance, and policy frameworks. The results of this study can aid both researchers exploring the concept of urban governance and policy makers rendering more inclusive urban policies, especially those hosting technological and digital domains.
... Later, as the fourth industrial revolution emerged (in the 2000s), different technologies started to be developed, allowing technology service providers to collaborate with municipalities to use data generated and collected from different urban fabrics, finding solutions for diverse urban challenges (Allam, 2020d). This was for instance witnessed in Amsterdam, where the first virtual digital city is argued to have been created (Alberts et al., 2017). However, it was the work of IBM and Cisco in the mid-2000s that brought real breakthroughs in the use of technology in cities to render them 'smart' (Allam & Newman, 2018). ...
Chapter
The history of cities dates back to the agrarian revolution, where due to an increase in agricultural productivity, people started to be attached to specific locations near food production areas, the main aim being to increase liveability, reduce transportation cost and allow for storage of surplus products. As a result, those locations became centres for trade and settlement for people offering different goods and services. During this period, technologies were rudimentary and not as advanced as they are today, but still prompted people to pool together in specific locations, which eventually grew to become towns and cities (Maisels, 1993). After the emergence of the Industrial Revolution, the role of cities continued to be more prominent, where they became centres for economic development, marketplaces for diverse products, homes for factories and manufacturing and others. This prompted an increase in opportunities – economic, social and tourism activities – and attracted a large influx of people (Allam, 2020c). As a result, existing towns transformed into cities, while cities grew and became even larger. But, just like in the case of the cities during the agrarian revolution, the common denominator was the presence of technology
... Referring to the city management strategies that yield technology-based solutions for more efficient execution of public services, the concepts digital city introduced by Dutton (1987) and wired city coined by Osborne and Gaebler (1992) have underpinned today's smart city terminology. Data and information extracted via digital technologies to be efficient governance tools in Amsterdam, accepted as the world's first city to adopt smart city practices (Alberts, Went and Jansma, 2017), have brought a diverse dimension to the technology-city interaction. The exemplary success of Amsterdam and IBM's Smarter Planet Strategy (2008) has contributed significantly to the interest in smart city research since the early 2000s. ...
Article
Full-text available
The present study sought an answer to the question, “What kind of challenges do local governments in Turkey confront while implementing data- and knowledge-driven smart city strategies?”. It seems noteworthy to explore tacit links between such implementation challenges through a field study employing an exploratory design. Thanks to the original theoretical framework enriched with empirical findings, this research is expected to bring practical and theoretical contributions to the smart city literature. The data were gathered through face-to-face, semi-structured interviews with 23 personnel of Sakarya Metropolitan Municipality (SMM), which has become the very first local government in Turkey to have introduced a smart city strategy and action plan. In this field research employing a single case, the content analysis technique was utilized to interpret the findings. Accordingly, SMM is faced with basically data-driven difficulties such as data security, poor technological and physical infrastructure, insufficient budget and high costs, dubious legal regulations and bureaucracy, resistance to change, lack of human resources and high turnover, and digital divide while implementing its smart city strategy.
... Kota Amsterdam juga turut akan dibuat versi digitalnya dengan digitalisasi kota beserta segala bentuk bangunan heritage yang menyertainya agar jika dibutuhkan rekonstruksi ulang dapat dihasilkan ukuran dan bentuk replika yang serupa dengan aslinya. Proyek ini sudah mulai dirancang dengan memulainya secara parsial untuk membangun data digital yang dinamis dengan target akhir produk kota Amsterdam digital yang memiliki perbandingan data yang serupa dengan aslinya (Alberts et al., 2017). ...
Article
Full-text available
Nusantara terkenal memiliki sejarah peradaban yang agung. Penemuan-penemuan arkeologi dan kajian yang membuktikannya banyak dijumpai baik dalam museum, artikel ilmiah, buku, maupun media digital. Informasi dan pengetahuan tentang kebudayaan masa lampau tersebut sangat penting sebagai pelajaran bagi bangsa Indonesia dalam memahami karakter budaya dan nilai luhur yang ditinggalkan oleh nenek moyang di bumi nusantara ini. Generasi saat ini lebih menyukai mengkonsumsi informasi secara digital dibandingkan dengan membaca buku. Informasi visual menjadi penting saat alih wahana media informasi. Kurangnya informasi sejarah, pengetahuan mengenai karakter budaya, dan nilai-nilai luhur bangsa Indonesia akan memicu timbulnya bencana identitas bangsa. Bangsa Indonesia akan menjadi bangsa yang tidak memiliki ciri khas atau budaya yang unik lagi. Oleh karena itu strategi pencegahan kurangnya pemahaman identitas bangsa perlu dirancang dan dikaji formulasinya yang sesuai dengan karakteristik bangsa Indonesia dengan keragamannya. Penelitian ini akan memformulakan kajian literatur secara sistematis mengenai perkembangan strategi adaptasi media digital dan basis data arkeologi di dunia yang akan dipaparkan dan ditelaah kesesuaiannya bagi informasi yang ada di Indonesia. Media digital yang tak lekang waktu dan dapat diakses dimana saja sangat strategis sebagai solusi. Sehingga hasil dari penelitian ini dapat menjadi acuan penerapan strategi digital bagi pengembangan informasi, media, dan basis data arkeologi di Indonesia.
Article
Introduction. Regardless of the perturbations and transformations of the external, a person realizes himself and presents himself to others in the direction of the so-called subjectivity. The topic of subjectivity, including the social identity of the subject, is widely represented in modern academic discourses. However, already in the first decade of the XXI century in the context of social and humanitarian reflections on the subject, the idea of the so-called digital subject was raised. In comparison with its biosociocultural counterpart, the digital subject occupies a specific position, a position between the real world of physical objects and the information space. Therefore, reflections on the social identity of a modern person are brought into the digital space, into those of its variants where the digital subject “dwells” and creates narratives about himself. The definitions of “social identity” and “digital subject” proposed in the article will be taken as basic. The purpose of the article is to identify and analyze the key variants of social identity that are presented today in the Internet. Methods. The study is based on general scientific methods, analysis and synthesis, induction, deduction, abstraction. The analysis of variants of social identity in the Internet was carried out primarily on the basis of content analysis, the method of interpretation and a systematic approach. Scientific novelty of the study. The article presents options for designing social identity in the digital space, e-mail and instant messengers, contextual advertising, social networks; the specifics of designing in each of the named options are analyzed. Results. The exposition of options for designing social identity on digital material opens up a spectrum of different forms of being a subject in the Internet. To the greatest extent, these forms are associated with the communicative nature of a person. Increasingly, the social subject is turning to digital communication options, which include instant messengers, social networks and e-mail. It seems that in the 21st century, when the Internet has its own specific signs, including emoji, the so-called literacy qualification, the ability to communicate via the Internet, is available to anyone. At the same time, the real subject not only leaves its unique imprint on the network; the information he receives from the Internet leaves its mark on his decisions, in a series of diverse elections, from a grocery store to a political leader. Conclusion. In modern academic reflections on the specifics of human existence in the Internet, the idea of a digital subject has taken shape. In the discourses of domestic and foreign researchers, the digital subject is usually represented as a prefabricated construct. It has a connection with the real subject and social groups, but its origin is ideological. According to its purpose, it carries some idea that inclines the real subject to make just such a decision, not another. Hence, digital research acquires epistemological value, more precisely, the search and analysis of options through which a person realizes his identity in the network.
Article
De Digitale Stad was a pioneering website of the early Dutch web. Historical study of its remnants has required extensive system knowledge, which may pose a new challenge for digital humanities researchers. The study of remnants is covered by the metaphor of “archaeology.” In view of the software character of web artefacts, the metaphor “web archaeology” is implicitly limiting. An approach is proposed that takes the working character of the software into account, “software archaeology.” A case study shows, by the example of “metadata dating,” how the software archaeological approach can be brought to bear on born-digital artefacts. The case illustrates the historical value of extracting time-related metadata from digital artefacts, “metadata dating.” Metadata dating is performed on three archives related to De Digitale Stad. By aggregating time-related metadata, new historical insights into De Digitale Stad have been gained. Whereas the early internet has often been presented as ever changing, novel and revolutionary, the results bring to light strong continuities in the usage patterns found in De Digitale Stad. Contrary to expectations, the citizens of De Digitale Stad turn out to have been primarily interested in preserving the traditional content of their files, and less in the results of their own coding efforts.
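To make the idea concrete, the following is a minimal sketch of what metadata dating can amount to in practice; it is not the authors' tooling, and the archive path /data/dds-archive is hypothetical. Assuming a filesystem copy of an archive, it collects file modification timestamps and aggregates them per year:

import os
import datetime
from collections import Counter

def metadata_dating(root):
    """Count files under root by the year of their last-modification timestamp."""
    years = Counter()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                mtime = os.path.getmtime(os.path.join(dirpath, name))
            except OSError:
                continue  # unreadable entry; skip rather than abort the walk
            years[datetime.datetime.fromtimestamp(mtime).year] += 1
    return years

if __name__ == "__main__":
    # Hypothetical location of a restored archive copy.
    for year, count in sorted(metadata_dating("/data/dds-archive").items()):
        print(f"{year}: {count} files")

A real study would also cross-check dates embedded inside the files themselves (for instance e-mail headers), since filesystem metadata can be altered by copying or restoring an archive.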
Article
Purpose Rural tourism facilities in Poland relied heavily on amateur websites to promote their hospitality services from 2000 to 2018. In most cases, the websites were non-professional, hosted on free servers and made by family members or friends of the holding. After search engine algorithms changed in 2015–2019, the websites started to go extinct on a large scale; they were deleted and often replaced with a more modern design and a commercial domain. These resources offered a rare opportunity to gain insight into rural tourism, rural change and socioeconomic and cultural phenomena. Design/methodology/approach The paper’s objective is to demonstrate, through an analysis of archived Polish rural tourism websites, that digital cultural artefacts are generated in rural areas. The study analysed selected development attributes of rural tourism websites found in the Internet Archive, focusing on those attributes that are important for determining whether a website or its content can be considered a digital cultural heritage asset. Findings The conclusions demonstrate that rural digital cultural heritage is a set of digital artefacts created in rural areas and bearing their characteristics. Rural digital artefacts are records of ICT, infrastructural, environmental, cultural and socioeconomic change. Originality/value The “digital assets” of rural areas are yet to be discussed in the context of rural cultural heritage, as a set of artefacts created in these areas and characteristic of them.
Chapter
Research on smart cities has focused on areas such as embedded computing technologies in physical spaces, Information Communication Technology (ICT), and big data analytics. At an international forum on the Internet of Things (IoT), a presenter from Cisco showed that 60% of IoT initiatives failed at the proof-of-concept phase because they did not seriously consider the human factors of organizational culture and leadership. A myopic approach that focuses solely on technology does not address the complex and multi-dimensional needs of cities. This chapter examines an early classical model of citizen participation, Arnstein's ladder model, which has served as the foundation for scholarly discourse and planning practice for decades. Today, unprecedented technological innovation has changed the nature and extent of citizen engagement, and through the use of advanced ICT, cities are broadening citizen participation. The specific research questions asked are: (1) Does the traditional ladder of citizen participation apply to digital placemaking? (2) What are the commonalities and differences among digital placemaking initiatives around the world, and what lessons can be learned from these practical experiences of digital citizenship? (3) Is digital citizenship participation inclusive and equitable for all citizens?
Article
During the COVID-19 pandemic, restrictions on public access prompted an expansion of historical and cultural education services, particularly among library and museum institutions in Indonesia. Public access was virtualised by offering open access to digital documents and historical records delivered over the internet. The aim of this research is to re-present the condition of a site as it was in its heyday, using 3D visualisation methods. The 3D reconstruction was carried out semi-manually, drawing on archaeological field finds and historical records or related findings concerning the dimensions and surface texture of the artefacts. The reconstruction process was assisted by archaeological researchers of Balai Arkeologi DIY, so that the objectivity and validity of the data were safeguarded according to the standards of the discipline of archaeology. The site chosen as the object of this digital art creation is Situs Liyangan, Temanggung, Jawa Tengah. The creation of the 3D reconstruction of the Situs Liyangan landmark began with the collection of archaeological data at Balai Arkeologi DIY and at the site itself, followed by asset modelling, texturing and layout design. The 3D reconstruction was produced with 3ds Max software and the Unreal game engine, which presents the access interface to users and handles their interaction. The result was then evaluated through testing and publication of the work. On this basis, the final outcome is a prototype 3D reconstruction of the Situs Liyangan landmark.
Chapter
Full-text available
Web archiving initiatives exist to collect ephemeral Web content for use by current and future generations of users. To date, most such initiatives have concentrated on the development of strategies and software tools for the collection of Web content and for providing current access to this content through interfaces like the Internet Archive's Wayback Machine. The International Internet Preservation Consortium (IIPC) is currently building on this legacy with the collaborative development of a set of tools that can be used for the capture of websites and for the navigation and searching of Web archives. The focus on collection strategies and tools is a response to what is perhaps the most significant challenge of the Web from an information management perspective. Its dynamic nature means that pages, sites and even whole domains are continually evolving or disappearing.
Conference Paper
Full-text available
During the last decade, various systems have been created to support local communities and shared-interest groups. Knowledge about the use, users and effects of these new systems is needed to inform design and implementation. In this paper we present the results of a survey among inhabitants of the Digital City, a large infrastructure for 'virtual communities'. The number of users, the range of facilities offered in the Digital City, and the mutual interaction between users are all increasing. At the same time, the original local (Amsterdam) base of the system has disappeared, and today's users live all over the Netherlands. The population of the Digital City is fairly homogeneous, and therefore does not reflect the heterogeneous nature of a 'real' city. Use of the Digital City is mainly recreational, and not yet integrated with other aspects of daily life.
Book
In Ernst’s media theory, archaeology becomes archivological analysis that refuses to stay at the interface level. Instead, it reveals the technological conditions of our contemporary techniques of memory and time. The archivological approach focuses on the infrastructure of media-historical knowledge. With an extended concept of the archive, a media archaeological and archivological approach to the past means that media cannot be made into “historical” objects of research only. Different media systems, from library catalogues to micro-filming, have influenced the content as well as the understanding of the historical remains of the archive itself. Alphabetic writing, which dominated the archive for centuries, has been dramatically challenged by signal recording (photography, the phonograph, cinematography) and puzzled archivists at the beginning of the age of media reproduction. Now, in the digital age, we face further challenges concerning cultural memory, remembering and forgetting. Time is registered not only through historical writing but also through the microtemporality of the machines themselves. Instead of narrative and historical accounts of media history, Archives, Media and Cultural Memory argues that we need a more medium-specific account of the interaction of past and current media cultures. Media studies is extended into an analysis of the scientific and technological roots of media, while combining such specificity with exciting insights into contemporary philosophy and media theory.
Article
Digital preservation and archiving stand as grand challenges of the first decade of the 21st century. Our cultural heritage, modern scientific knowledge, and everyday commerce and government depend upon the preservation of reliable and authentic electronic records and digital objects. Issues such as software and hardware obsolescence, media fragility, expensive metadata creation, and intellectual property rights place all of these materials at risk. The archival and computer science professions must come together to solve the complex technical, conceptual, social, legal, and economic challenges that endanger the longevity of all digital objects. Basic archival principles must be built into information creation and management software and information creators must recognize the need to exert responsible custody over the digital information objects they manage. Archiving, and the preservation tools to facilitate it, must become ubiquitous if society is to preserve its memory in the digital age.