Open source intelligence and privacy dilemmas: Is
it time to reassess state accountability?
Quirine Eijkman 1 and Daan Weggemans 2
Introduction
Providing for safety and security is a core task of the state. The rapid development
of technology has, in many ways, affected the dynamics of this responsibility.
Intelligence and security agencies and the police increasingly rely on information
technology that facilitates the collection of Open Source Information (OSINF). OSINF
forms the basis of Open Source Intelligence (OSINT), which is gathered from
publicly available, unclassified sources ranging from (foreign) newspapers,
governmental reports, public data, maps and academic sites to blogs, social
networking sites, apps and web-based communities.
With the evolution of the internet, a vast array of information has become
retrievable with the click of a mouse. In addition to this accumulation of valuable
data, the internet also contains large quantities of personal information, often posted
online by people themselves through social networking sites, blogging or apps.
Individuals regularly share personal information online, which is stored as digital
data in databases or in the cloud. This, in turn, has led to new perceptions of
how this personal data may be used for security and safety purposes. In many areas
the use of OSINF, for example the monitoring of social networking sites, blogs or
apps, is growing significantly. Several research centres and think tanks, both
public and private, have been established with the sole aim of studying, coordinating or
developing new approaches to the gathering of OSINF and the acquisition of OSINT.
To quote one of the main themes of the 2010 'Naked Intelligence' conference, 'the
gathering of knowledge in a see through world' has become a prominent aspect
of the security and intelligence industry.3
New strategies for using OSINF are also designed to anticipate national
security threats such as international terrorism. Although the chances are slim that a
terrorist will post his or her selected target location online, these measures are
helpful in monitoring violent extremist views. This was confirmed in 2012 by the
Dutch General Intelligence and Security Service (AIVD) report 'Jihadism on the Web',
in which the internet was labelled the 'most important medium for the
dissemination of these (jihadist) ideologies'.4
1 Quirine Eijkman (PhD) is a senior researcher / lecturer at the Centre for Terrorism and
Counterterrorism (CTC), Leiden University-Campus The Hague. Email:
q.a.m.eijkman@cdh.leidenuniv.nl.
2 Daan Weggemans (MSc) is a researcher / lecturer at the Centre for Terrorism and
Counterterrorism (CTC), Leiden University-Campus The Hague. Email:
d.j.weggemans@cdh.leidenuniv.nl.
3 The conference Naked Intelligence, 'Gathering Knowledge in a See Through World', originated
from the collaboration of two private corporations in the field of open source intelligence
production, Sandstone s.a. and Infosphere AB, 2010. Retrieved 27 January 2013,
http://www.telestrategies.com/ni_10/index.htm.
Returning to the state's core task of
providing safety and security for its citizens, we argue that gathering Open Source
Information is a legitimate tool for security governance. As Ben Hayes stated, '(…)
security services would be negligent if they didn't utilize information in the public
domain to inform their work'.5
However, the legitimacy of the growing use of OSINF cannot be derived
solely from the pursuit of security or safety concerns. Its (side) effects on human
rights should also be considered. In this article we therefore discuss challenges
associated with the use of OSINF, mainly in relation to internet freedom and
the rights to privacy and data protection. From a human rights perspective, the
gathering of OSINT demands proper checks and balances. This is especially
important when security and intelligence agencies, as well as private companies,
use and exchange information. In this article, we (re)assess the checks and balances
for the use and sharing of open source intelligence by, and between, security and
intelligence agencies and law enforcement agencies. Even though laws, regulations
and policies in relation to OSINF may recognise the need for checks and balances,
including the value of the right to privacy, data protection or a fair trial, it is
nevertheless important to review whether the gathering of OSINF online requires
more (state) accountability. Hence, in this article we focus on the dilemmas that
arise with the collection of open source information by security and intelligence
agencies and, to a lesser extent, the police. We thus address the following questions:
Have state accountability mechanisms been able to keep up with the rapidly
increasing practice of open source information gathering and exchange? Are
sufficient safeguards provided to protect human rights?
Open source information in context
After the 9/11 attacks the National Commission on Terrorist Attacks upon the
United States recommended a greater role for OSINT within security agencies.6 The
subsequent institutionalization of open source collection has been called 'one of the
most high profile reforms (..) aimed at the preventing of terrorists attacks and
avoiding intelligence failures'.7 Meanwhile it is often stated that 'ninety percent of
intelligence comes from open sources'.8
The definition of open source information, which lies at the basis of this form
of intelligence, is ambiguous. In this section we review some of the definitions and
developments concerning the use of open sources for national and public
security purposes.
4 General Dutch Intelligence and Security Service, 'Jihadism on the Web', 2012, p. 3. Retrieved
27 July 2012, https://www.aivd.nl/@2872/jihadistisch.
5 B. Hayes, 'Spying in a see through world: The 'Open Source' Intelligence Industry', in
Statewatch Bulletin, 2010, no. 1, p. 2.
6 National Commission on Terrorist Attacks Upon the United States, The 9/11 Commission
Report, 2004. Retrieved 25 August 2012, http://govinfo.library.unt.edu/911/report/index.htm.
7 H. Bean, No More Secrets: Open Source Information and the Reshaping of U.S.
Intelligence, Praeger, Santa Monica, CA, 2011, p. 12.
8 R.A. Best and A. Cumming, 'Open Source Intelligence (OSINT): Issues for Congress',
Congressional Research Service, 2007, p. 7. Retrieved 2 July 2012,
http://www.au.af.mil/au/awc/awcgate/crs/rl34270.pdf.
Open source information is information that is publicly available.
In other words, it is what is not 'confidential' and is out there in the (digital) public
domain. It is the information that anyone can 'lawfully obtain by request, purchase,
or observation'.9 Examples of open information sources include the media (e.g.
radio, television, newspapers, websites, blogs), official (governmental) reports,
academic sources (papers, conferences, seminars), commercial data and so-called
'gray literature' such as working papers, unofficial government documents and
surveys.10 In this article we focus on the increased availability of personal open
source information on the World Wide Web ('www'). Not only online news pages
but also 'weblogs', 'chat rooms', 'social networking sites' such as Facebook,
Twitter or Skype and 'apps' such as Whatsapp or WeChat are perceived as
potentially valuable sources for intelligence and security services and the police.
Here, one can, through information technology, find unique information about the
lives of millions of citizens worldwide.11 Open Source Centre director Doug Naquin
once said 'We're looking at YouTube, which carries some unique and honest-to-
goodness intelligence'.12
To acquire this Open Source Intelligence, OSINF (raw data) in the form of an
interview, a photograph, a tweet, etc. is 'analyzed, edited, filtered and validated'.
Furthermore, the data is linked with different media sources (e.g. the internet,
academic journals, official reports, newspapers, radio and television) in order to
verify, complement and contextualize the collected information.13 As described
above, an advantage of this data is that it has become widely available.
Especially with the 'information explosion' resulting from the rapid
development of the internet, obtaining OSINF has become significantly cheaper.
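To make the pipeline described above more concrete, the following minimal sketch illustrates one way raw OSINF items could be filtered and cross-checked against a second source before being passed to an analyst. It is purely illustrative and not drawn from this article or from any agency's actual tooling; the data structures, field names and the crude corroboration rule are our own assumptions.

# Illustrative sketch only: a toy OSINF-to-OSINT step that filters raw items and
# cross-references them against a second source, as the text describes in general terms.
# All field names and the corroboration rule are hypothetical.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class OsinfItem:
    source: str      # e.g. "twitter" or "news_site"
    author: str
    text: str
    timestamp: str   # ISO 8601 date-time, e.g. "2012-11-05T14:30:00Z"

def corroborated(item: OsinfItem, reference: List[OsinfItem]) -> bool:
    """Crude validation: a different source reports overlapping wording on the same day."""
    day = item.timestamp[:10]
    for other in reference:
        if other.source != item.source and other.timestamp[:10] == day:
            shared = set(item.text.lower().split()) & set(other.text.lower().split())
            if len(shared) >= 5:
                return True
    return False

def to_osint(raw: List[OsinfItem], reference: List[OsinfItem]) -> List[Dict]:
    """Filter empty items and attach a corroboration flag for the analyst."""
    records = []
    for item in raw:
        if not item.text.strip():   # filter step: drop empty or unusable items
            continue
        records.append({"item": item, "corroborated": corroborated(item, reference)})
    return records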
Simultaneously, the frequent use of open sources by security and
intelligence agencies and law enforcement agencies has been facilitated by legislation. The
Dutch Intelligence and Security Services Act, for instance, states that open
sources first need to be checked before any other methods can be applied.14 Other
countries, though sometimes implicitly, often demand similar approaches to
information gathering.15
9 National Open Source Enterprise, Intelligence Community Directive 301, July 2006.
10 A. Sands, 'Integrating Open Sources into Transnational Threat Assessments', in J. Sims and
B. Gerber (eds.), Transforming US Intelligence, Washington, D.C., 2005, pp. 64-65.
11 For instance, at the end of March 2012 Facebook had 901 million active users. In the first
quarter of 2012 its users generated an average of 3.2 billion likes and comments each
day. See: Facebook, 'Key Facts', 2012. Retrieved 2 July 2012,
http://newsroom.fb.com/content/default.aspx?NewsAreaId=22.
12 D. Naquin, 'CIRA Newsletter, Remarks by Doug Naquin', in CIRA Luncheon, 2007, p. 7.
Retrieved 2 July 2012, http://www.fas.org/irp/eprint/naquin.pdf.
13 NATO, NATO Open Source Intelligence Handbook, 2001, p. 2. Retrieved 4 July 2012,
http://www.oss.net/dynamaster/file_archive/030201/ca5fb66734f540fbb4f8f6ef759b258c/NATO%20OSINT%20Handbook%20v1.2%20-%20Jan%202002.pdf.
14 Wet op de inlichtingen- en veiligheidsdiensten 2002 (The Intelligence and Security Services Act
2002), 2002. Retrieved 3 July 2012,
http://www.st-ab.nl/wetten/0662_Wet_op_de_inlichtingen-_en_veiligheidsdiensten_2002.htm.
In addition to the growth of available information, the
number of public and private organizations concerned with OSINF and OSINT has
increased substantially. In the United States, the National Open Source Center
(NOSC) was opened on 1 November 2005 with the goal of effectively collecting
available open source information. In Europe, the EUROSINT Forum was established in
2007 to exchange knowledge and experiences among professional OSINF
analysts. Moreover, initiatives by universities, including the Open Source
Intelligence Exchange at Fairmont State University, USA, or think tanks, such as
the RAND Corporation, respond to the constant demand for new knowledge in this
area.16 But it is not only new public institutes, training courses and expertise that have been
created. In addition, the increased outsourcing of information gathering, data
mining and analysis to private companies has been a major development within
the security industry. This also affects intelligence gathering.17 Various private
entities are becoming more involved in national and personal security. For
instance, the privately funded OSINT-Group was founded in 2007 and
'utilizes in each case the best and most relevant sources to respond to established
client needs with sensitive yet unique and important 'open source
intelligence' rather than just 'information''.18 Several other private companies, such as
Stratfor, Infosphere AB and Sandstone AB, have gained substantial OSINF market
shares as well.19 Simultaneously, the rise of websites like Wikileaks or OpenLeaks,
which facilitate the mass publication of classified information by whistleblowers,
further reflects this development.
Open source intelligence dilemmas
The benefits of OSINF are emphasized by security consultants, scientists, the media
as well as the intelligence community. OSINF is cheap and more widely available
than the information traditionally acquired by clandestine services. Moreover,
it provides additional information that sometimes cannot be gained from other
intelligence sources (e.g. human intelligence). In addition, as a result of the wide
availability of (local) news coverage throughout the internet, the use of online open
sources enables security and intelligence agencies to be more up to date.20
Simultaneously, online open sources may in times of crisis, such as a war, be a
more reliable and safer way of acquiring intelligence than polarized human
intelligence.
15 See for instance: The United Kingdom Intelligence Services Act 1994 or the Australian
Intelligence Services Act 2001.
16 Hayes, p. 5 (see note 5 above)
17 S. Chesterman, ‘We can’t Spy… If we can’t Buy!: The Privatization of Intelligence and the
Limits of Outsourcing ‘Inherently Governmental Functions’’, in The European Journal of
International Law, 2008, no. 5, p. 1057.
18 OSINT-Group, Overview, 2012. Retrieved 2 July 2012, http://www.theosintgroup.com/
overview.html.
19 Hayes, pp. 3-5 (see note 5 above)
20 L. Pouchard, J. Dobson and J. Trien, ‘A Framework for the Systematic Collection of Open
Source Intelligence’, 2009, p.1. Retrieved 29 July 2012,
http://info.ornl.gov/sites/publications/files/Pub13152.pdf.
The large-scale usage of (online) open sources has created new
contexts and perspectives that help intelligence and security agencies better
understand the complexity of certain security developments within local or national
contexts. It enables intelligence and security agencies to verify (classified)
information against various open media sources and data. Finally, it has been argued
that because online information is, in comparison to other sources, more widely
available and less secretive, the use of OSINF for intelligence purposes has lowered
the threshold for sharing information between intelligence and security
agencies.21
Apart from information security, the (side) effects of the use of OSINF for
security and safety purposes have received little attention in the literature. In this
section we therefore introduce some dilemmas regarding the collection,
processing, mining and sharing of open source information. Of course there
are many other dilemmas in relation to the use of OSINF by security and
intelligence agencies or the police. These include the construction of virtual
personal identities by others22 or the facilitation of more social control by the
state.23 In this article, however, we focus primarily on the collection, processing,
mining and sharing of open source information retrieved from social networking
sites, blogs or apps.
Collection
Whether the wider range of available digital information will be considered a
blessing or a curse remains to be seen. As a result of information overload, the
need for intelligence and security agencies and the police to be more critical
about information has become more evident. As Hamilton Bean states, 'the amount
of available, and potentially useful, information for analysts to consider (…) (is)
increasing to nearly unmanageable levels'.24 The search by intelligence analysts
through the information jungle that is the internet has become more difficult and requires
new skills from open source analysts. This kind of expertise needs to be developed
by intelligence and security agencies and the police.25
A second obstacle is the multiplication of individual sources. This so-called
'echo' effect occurs when a news item that appears in one source (e.g. a website) is
spread among a considerable number of other media sources.26
21 M.D. Cross, ‘EU Intelligence Sharing & The Joint Situation Centre: A Glass Half-Full’,
Meeting of the European Union Studies Association, 2011, p.10. Retrieved 25 July 2012,
http://www.euce.org/eusa/2011/papers/3a_cross.pdf.
22 E. Morozov, The Net Revolution: How not to liberate the world, London, 2010.
23 M. Hildebrandt, B. Koops and K. De Vries, ‘Where Idem-Identity meets Ipse-Identity.
Conceptual Explorations’, in Future of Identity in the Information Society, 2008. Retrieved 26
July 2012, http://www.fidis.net/fileadmin/fidis/deliverables/fidis-WP7-del7.14a-
idem_meets_ipse_conceptual_explorations.pdf.
24 Bean, p.7 (see note 7 above)
25 See e.g. H. Minas, 'Can the open source intelligence emerge as an indispensible discipline for
the intelligence community in the 21st century?', Research Institute for European and
American Studies, 2010.
The risk arises
when a story is framed in differing ways by a variety of sources, many of which
reflect only a certain part of the story. This may mislead OSINF analysts. Consider,
for example, a two-minute YouTube film of one man hitting another man on a street.
A second film of the incident, which for the sake of the argument lasted five minutes,
was not posted online but was recorded by someone else on a mobile phone; it
showed that the man who was hit on the head had first kicked his girlfriend
numerous times while she begged for help. If journalists were to write about this
incident, they would probably find the YouTube film and several references on
social networking sites and might be tempted to conclude that the first man was at
fault. In turn this creates a reality of its own, and the echo effect could lead to the
first man being labelled as the 'villain', whereas in reality the second man carries
primary responsibility for the turn of events.
Personal data: Processing, sharing, mining and storage
Further dilemmas arise with the collection, processing and storage of intelligence
by security and intelligence agencies and the police. In 2009 the American
Federal Bureau of Investigation (FBI) invested in a private company that specialized
in monitoring social networking sites.27 Similar developments have been reported
about Europol.28 It is not widely known that information from social networking
sites is being gathered and monitored by intelligence and security agencies and
shared among national and international actors, but there are many incidents
that suggest this is the case. For example, when in 2012 two British tourists were
detained and deported for tweeting that they were going to 'destroy America'
during their holiday29, it became clear that Twitter accounts were being monitored.30 In
the same year Saudi journalist Hamza Kashgaru31 was deported from Malaysia.
With the alleged support of Interpol he had been located there after he fled Saudi
Arabia because of an 'insulting' tweet about the Prophet Muhammed.32
26 Best and Cumming, p.7 (see note 8 above)
27 N. Shachtman, 'Exclusive: U.S. spies buy stake in firm that monitors blogs, tweets', 2009.
Retrieved 29 July 2012,
http://www.wired.com/dangerroom/2009/10/exclusive-us-spies-buy-stake-in-twitter-blog-monitoring-firm.
28 European Parliament, 'Parliamentary Questions, Subject: Wikileaks Global Intelligence files,
Generalised data mining by the US and EU, profiling EU citizens', 2012. Retrieved 25 July
2012, http://www.europarl.europa.eu/sides/getDoc.do?type=WQ&reference=E-2012-002428&format=XML&language=EN#def4.
29 Huffingtonpost, 'British Tourists Detained, Deported For Tweeting 'Destroy America'',
2012. Retrieved 2 July 2012,
http://www.huffingtonpost.com/2012/01/30/british-tourists-deported-for-tweeting_n_1242073.html.
30 M. Hosenball, 'Homeland Security Watches Twitter, social media', 2012. Retrieved 10
August 2012,
http://www.reuters.com/article/2012/01/11/us-usa-homelandsecurity-websites-idUSTRE80A1RC20120111;
For other examples please see Huffingtonpost, 'Arrested over Twitter: 8 Tweets that got people
BUSTED', 25 August 2010. Retrieved 20 August 2010,
http://www.huffingtonpost.com/2010/08/25/arrested-over-twitter-8-t_n_693866.html#s130765&title=Man_Arrested_Fined
31 Not his real name.
A further
expansion of this monitoring of social networking sites was indicated by an
article in February 2012 that proclaimed the 'US seeks to mine social media to
predict future'. More specifically, it discussed the development of new software
that enables security and intelligence agencies to mine information
from social networking sites.33 In response, Open Source Centre director Patrick
O'Neill stated that 'we need to see social media as intelligence gathering very
similar to spying'.34
An important dilemma with the processing of the information that is collected
from social media relates to the storage of large datasets that contain quantities
of digital personal information. Subsequently, 'data analysis tools (are used) to
discover previously unknown, valid patterns and relationships'.35 Data mining tools
applied to information collected from, for example, social networking sites can be used by
law enforcement and security and intelligence agencies to develop risk profiles
and label individuals as potential security risks. In most cases this profiling takes
place without the data subject even knowing that he or she is being profiled.36 This
development has led to significant concerns about privacy and data protection as
well as the right to a fair trial. What if an angry ex-girlfriend posts a Facebook
message that her former boyfriend is a 'hi-tech terrorist' and he is subsequently
barred from entering the USA: will he ever find out why?
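To illustrate how such risk profiling can go wrong, the sketch below implements a deliberately naive keyword-based scorer of the kind the concerns above point to. It is our own hypothetical example, not a description of any actual system; the keywords, weights and threshold are invented, and the point is simply that decontextualized posts (a joke about 'destroying' a city, an angry ex's accusation) can push people over a flagging threshold.

# Illustrative sketch only: a naive keyword-based risk profiler built on scraped posts.
# The terms, weights and threshold are hypothetical; the example shows how
# decontextualized content produces false positives.
from collections import defaultdict
from typing import Dict, List, Tuple

RISK_TERMS = {"terrorist": 5, "destroy": 3, "attack": 3}   # invented weights
FLAG_THRESHOLD = 3

def score_posts(posts: List[Tuple[str, str]]) -> Dict[str, int]:
    """posts: (profiled person, post text) pairs taken from a social networking site."""
    scores: Dict[str, int] = defaultdict(int)
    for person, text in posts:
        for term, weight in RISK_TERMS.items():
            if term in text.lower():
                scores[person] += weight
    return scores

posts = [
    ("tourist", "Free this week before I go and destroy America on holiday!"),  # hyperbole
    ("ex-boyfriend", "my ex is a hi-tech terrorist, stay away from him"),        # third-party claim
]
flagged = [p for p, s in score_posts(posts).items() if s >= FLAG_THRESHOLD]
print(flagged)  # both are flagged, although neither post reflects an actual threat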
By stating ‘just because data is accessible doesn’t mean that using it is
ethical’ Dana Boyd raises one of the major concerns regarding large scale
personal data storage.37 Consider, for example, a youngster who has expressed
radical views about animal rights online and is confronted with this information
ten years later during a job interview for an administrative position in a toy store.
Would it be considered ethical if the employer asked her about it? Points of view
may differ about this: What is in ‘the public domain’ has been redefined to
‘accessible and available for any purpose under any circumstance‘. According to
Boyd we have ‘stripped content out of context, labelled it data, and justified our
actions by the fact that we had access to it in the first place’.38
People who are concerned about privacy and data protection in relation to
the data mining of open source information are not afraid, at least initially, of the loss
of ownership over their digital personal data. They are primarily concerned
that their data is disconnected from the context in which they intended it to be shared.
32 Al Jazeera, 'Malaysia arrests Saudi blogger over tweets', 2012. Retrieved 5 February 2012,
http://www.aljazeera.com/news/asia-pacific/2012/02/20122105349670993.html.
33 M. Wohlsen, 'US seeks to mine social media to predict future', Associated Press, 2012.
Retrieved 1 December 2012,
http://www.news.com.au/technology/us-seeks-to-mine-social-media-to-predict-future/story-e6frfro0-1226269477144.
34 P. O'Neill, 'Spies give way to 'sexy' social media', Federal News Radio, 2012.
35 J.W. Seifert, 'Data mining and Homeland Security: An overview. CRS Report for Research',
Congressional Research Service, 2007.
36 Cf. Hildebrandt et al., p.24 (see note 23 above); Pouchard et al., p.2 (see note 20 above)
37 D. Boyd, 'Privacy and Publicity in the Context of Big Data', 2010. Retrieved 27 July 2012,
http://www.danah.org/papers/talks/2010/WWW2010.html.
38 Boyd (see note 37 above)
To
quote Helen Nissenbaum39, 'privacy is about expectations about the environment,
and the norms that accompany this environment, in which your information is
shared'. When people share information (e.g. about their health), they share it with
a certain audience (a doctor), within a certain environment (the hospital),
where certain norms apply (e.g. doctor-patient confidentiality). Different from this
example is the environment of social networking sites, which deceptively appear
to address a selected audience. In reality such sites are often fully transparent,
with many people listing and reproducing your pursuits. We want to update our
statuses and 'like' certain pictures on Facebook so others can see who we are, but
this information is usually intended only for a specific audience. Boyd therefore
rightly stresses that 'Making content publicly accessible is not equal to asking for
it to be distributed, aggregated, or otherwise scaled'.40 The data mining of social
networking sites for security and intelligence purposes is therefore a violation of
privacy.
In addition, the Council of Europe has highlighted the risks of profiling based on the
automatic processing of data.41 When data about someone is produced on the basis of the data of
others, the data subject 'a priori cannot suspect the existence of correlation
processes that might result in certain characteristics of other individuals being
attributed to him or her on the basis of a probability calculation'.42 Consider, for instance,
a person who has been detained at an airport and deported because of his
alleged violent radical views. Other people who have perhaps updated their
Facebook statuses with words that relate to the profile of the deported radical
may subsequently be treated differently by customs and border control or be
blacklisted as risk passengers, even though this does not correspond to their
actual 'threat level'. In other words, in addition to its benefits, profiling also
dilutes valuable information concerning implicit personal characteristics that may
be crucial for detecting potential violent extremists. Stories and lives that take
place outside the realm of social networking sites are not taken into account, and
this may lead to the profiling of persons who in reality are no threat (false
positives), as in the case of the British tourists, or to a failure to identify those who
actually pose a threat (false negatives).
Hildebrandt, Koops and De Vries43 also focus on the side effects of
profiling. They state that the process of identity construction might be affected by
profiling based on data mining techniques. Accordingly, profiling may lead to
different treatment that could affect real-life opportunities. When people are
unaware of being profiled or of where their personal data is stored, they may change
their behaviour after experiencing the negative consequences of this profiling.
39 H. Nissenbaum, ‘Privacy as Contextual Integrity’, in Washington Law Review, 2004, no. 1,
pp. 101-139.
40 Boyd (see note 37 above)
41 Council of Europe, The Protection of Individuals with Regard to Automatic Processing of
Personal data in the Context of Profiling, Recommendation CM/REC(2010)13 and explanatory
memorandum, 2010, pp.28-32.
42 Ibid.
43 Hildebrandt et al., p.8 (see note 23 above)
This
can be considered a privacy violation, at least when privacy is defined as the
'freedom from unreasonable constraints on the construction of one's identity'.44
Open source accountability
Open source information has increased the range of security tools at the disposal of
security and intelligence officials or police officers. Nonetheless, the side effects
of this new method of intelligence gathering should be balanced by a form of
accountability that is sufficient both in theory and in practice. To illustrate, in
most Western societies there is strict legislation on phone or internet tapping;
for social networking sites or apps this is far less evident. Furthermore, many
security officials probably do not see the need for more accountability for OSINT.
They may argue, for example, that social networking sites are part of the public
domain and that anyone is therefore able to access them. So what is the issue
with monitoring by public or private security analysts? And why should they
keep track of what kind of information they collect? The difference, however,
between just anyone and a security official is that OSINT can directly or indirectly
affect someone's private life or future opportunities. As mentioned before, the use
of OSINF for intelligence purposes has real-life consequences. From a human rights
perspective these side effects should be balanced. Hence, state accountability
for the use of OSINF is reviewed in this section.
As a concept, accountability has a normative aspect intertwined with notions
of justice, responsibility, integrity, fairness and democracy.45 At the same time,
accountability is concrete and 'value free' and focuses on the 'obligations to
evidence management or performance imposed by law, agreement or regulation'.46
Blind distinguishes between 'accountability as the philosophy of government' and
accountability as the 'means' of government.47 In this article we recognize this
difference and differentiate between, on the one hand, the process through which politicians or
heads of security and intelligence agencies or the police inform society about
their plans and actions in terms of open source collection and justify the need to do
so, and, on the other, the process through which the actual behaviour of officials and the resulting
analyses by security and intelligence agencies or the police are subjected to review or sanctions.48
44 P.E. Agre and M. Rotenberg (eds.), Technology and Privacy: The New Landscape,
Cambridge, Massachusetts, 2001, p.7.
45 P.K. Blind, 'Accountability in Public Service Delivery: A multidisciplinary review of the
concept', Expert Group Meeting Engaging Citizens to Enhance Public Sector Accountability
and Prevent Corruption in the Delivery of Public Services, Vienna, 2011. Retrieved 21
August 2012,
http://unpan1.un.org/intradoc/groups/public/documents/un-dpadm/unpan046363.pdf.
46 E.L. Kohler, A Dictionary for Accountants, New Jersey, 1975.
47 Blind, p.4 (see note 45 above)
48 J.M. Ackerman, 'Social Accountability in the Public Sector: A conceptual discussion', in
Social Development Papers: Participation and Civic Engagement, 2005, no. 82, p.6.
Retrieved 26 August 2012,
http://siteresources.worldbank.org/INTPCENG/214574-1116506074750/20542263/FINALAckerman.pdf.
This form of accountability is characterised by its focus on the rule
of law and good governance, as well as on the inclusion of civil society or ordinary
people.49 It could imply that the head of a security and intelligence agency, or
those politically responsible, not only publicly announce the purpose of the
collection, processing, mining or sharing of OSINF, but also limit its use to
predefined threats to national security (e.g. cyber espionage, international
terrorism). Furthermore, international and/or national lawmakers should
determine what the boundaries are (the rule of law) and how data subjects can seek
redress (internal or external accountability mechanisms). Finally, the software
that enables OSINT should be designed so that it simultaneously supports accountability (data
protection by design). This includes privacy-enhancing technologies (PETs) and
transparency-enhancing technologies (TETs).50
These measures may be considered a form of good governance in relation to
balancing security officials' use of OSINT.
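As a purely hypothetical illustration of what 'data protection by design' could look like in OSINT tooling, the sketch below pseudonymizes account identifiers before storage (a privacy-enhancing measure) and writes an audit record of each collection and its stated purpose (a transparency-enhancing measure). The field names, key handling and log format are our own assumptions, not a description of any existing system.

# Illustrative sketch only: pseudonymization plus an audit trail as minimal
# 'data protection by design' measures in an OSINT collection step.
# The salt handling, record fields and purposes are hypothetical.
import hashlib
import hmac
import json
import time

SECRET_SALT = b"managed-outside-the-dataset"   # hypothetical key management

def pseudonymize(account_id: str) -> str:
    """Replace a raw account identifier with a keyed hash before storage."""
    return hmac.new(SECRET_SALT, account_id.encode(), hashlib.sha256).hexdigest()[:16]

def store_item(account_id: str, text: str, purpose: str, audit_log: list) -> dict:
    """Store a collected post under a pseudonym and log why it was collected."""
    audit_log.append({
        "collected_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "purpose": purpose,        # should map to a predefined threat category
    })
    return {"subject": pseudonymize(account_id), "text": text}

audit_log: list = []
record = store_item("@some_user", "example post text", "counter-terrorism", audit_log)
print(json.dumps({"record": record, "audit": audit_log}, indent=2))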
Apart from the necessity to protect national security interests, which may
interfere with transparency efforts, there are other accountability dilemmas in
relation to OSINF. Ensuring accountability is more complex if the information is not
collected by the security agency itself but by other public or private entities. It is
not uncommon for intelligence and security agencies to share information at the
(inter)national level, and since 9/11 it has become more common for law
enforcement agencies to do so as well. This is a recent development because
'traditionally a distinction exists between collecting intelligence for national
security purposes and gathering evidence for criminal investigations, as they serve
different purposes'.51 Security and intelligence agencies prefer to keep their
sources confidential, whereas law enforcement agencies ultimately have to share
the case file with the defence counsel. Nonetheless, OSINT used by intelligence
and security agencies is also collected and processed by other state agencies and
sometimes shared with international partners, and this has affected accountability in
practice. Consider, for instance, whether security analysts know what the original
source of a piece of information containing personal data was, let alone whether those
affected will ever have the opportunity to access or correct it. What if the
personal data is used by the security analyst's agency for a security clearance
investigation of refugees?
Likewise, OSINF and OSINT are also collected by private companies. As
discussed above, there has been major growth in the number of private companies involved in
the collection of OSINF and the acquisition of OSINT.
49 Blind (see note 45 above); Ackerman (see note 48 above)
50 M. Hildebrandt, 'Privacy Enhancing Technologies', Hide Project, Pets 2nd Focus Group,
2011. Retrieved 26 August 2012,
http://ebookbrowse.com/hide-fg-privacy-enhancing-technologies-minutes-20091016-pdf-d113363089;
J.J.F.M. Borking, 'Privacyrecht is een Code: Over het gebruik van Privacy Enhancing
Technologies' (Privacy Law is a Code: About the use of Privacy Enhancing Technologies), Deventer, 2010.
51 Q.A.M. Eijkman and B. van Ginkel, ‘Compatible or Incompatible: Intelligence and human
rights in terrorist trials’, in Amsterdam Law Forum, 2011, no. 4, p.4.
The Wikileaks Global Intelligence Files, for example, reflect how the company Stratfor provides
intelligence to public and private entities, including the US Defense Intelligence
Agency.52 One of the 5.5 million released hacked Stratfor emails reveals the
existence of TrapWire, a predictive software system by TrapWire Inc. that combines
CCTV images and automatic number plate recognition (ANPR) data collected in the public
domain of two American cities.53 Another multinational security company, Raytheon, has
developed a program known as Riot that mines social networking sites and, on the basis of the
outcome, is able to track people's locations.54 Evidently, private entities are
able to sell or share such software or the open source information with security
and intelligence agencies or the police, who probably use it for investigation
purposes.55 These developments take place while questions of accountability for
the use of the personal data stored on social networking sites remain unaddressed.
As Ben Hayes concludes, 'we must (then) develop the tools and communities
needed to bring them under democratic control'.56
In response to these developments, civil society is in the best position to
hold the state accountable. Regarding the use of open social networking data, one of the
challenges in holding security officials or their agencies accountable is that
in most cases the data subject has no idea that his or her online behaviour has been
monitored. When this monitoring leads to differential treatment, accountability
issues become more pressing. But who will file a complaint? What if you are
not aware that your 'denial of an opportunity' is the result of a private company or
an intelligence agency keeping track of your digital pursuits? To some extent civil
society can hold security agencies accountable by blogging, writing reports,
informing the public or petitioning public authorities. Nonetheless, OSINT is a
growing business and challenging for outsiders to monitor. Consequently, some
take more drastic steps, such as hacking (governmental) websites or developing
encryption software to communicate 'without anyone watching'.
52 Wikileaks, ‘Stratfor Emails: Wikileaks impact is Stratfor’s bottom line’, The Global
Intelligence Files, 2012a. Retrieved 1 March 2012, http://wikileaks.org/WikiLeaks-Impact-is-
Stratfor-s.html.
53 Parent company Cubic Cooperation. Previously owned by Abraxas Applications, who had
created it under Abraxas Applications Inc.. It is a private company that employs several
former public officials of the Central Intelligence Agency (CIA) and other agencies. See: RT,
‘TrapWire Tied to Anti-Occupy Internet-spy-program’, 2012. Retrieved 28 August 2012,
http://rt.com/usa/news/trapwire-abraxas-cubic-surveillance-251/; C. Arthur, ‘Trapwire
surveillance system exposed in document leak’, in The Guardian Online, 2012. Retrieved 13
August 2012, http://www.guardian.co.uk/world/2012/aug/13/trapwire-surveillance-system-
exposed-leak
54 R. Gallagher, ‘Software that tracks people on social media created by defense firm’, in The
Guardian Online, 2012. Retrieved 10 February 2013, http://m.guardian.co.uk/world/2013/feb
/10/software-tracks-social-media-defence
55 Public Intelligence, ‘Unraveling TrapWire: The CIA-connected global suspicious activity
surveillance system’, 2012. Retrieved 29 August 2012, http://publicintelligence.net/
unravelling-trapwire; Wikileaks, ‘Stratfor Emails’, Wikileaks: The Global Intelligence Files,
2012b. Retrieved 9 August 2012, http://www.wikileaks .org/gifiles/releasedate/2012-08-
09.html.
56 Hayes, p.8 (see note 5 above)
Reflections
In this article we argue that the increased use of Open Source Information (OSINF)
and Open Source Intelligence (OSINT) for safety and security purposes needs to be
balanced by assessing what state accountability in a digital world should entail.
Even though security and intelligence agencies and the police are tempted to
increase their use of social networking sites, tweets, blogs or apps, state
accountability mechanisms have struggled to adapt to the online open source
culture. Consider, for instance, the dilemma that intelligence is frequently collected,
processed, mined and stored by external entities, including foreign security and
intelligence agencies or private companies. Consequently, it is reasonable to
assume that the security officials or analysts who are the end users are oblivious
to the original source or its specific context. Even though ordinary people are usually
unaware of being profiled, or of where and how their digital personal data is stored in
databases or 'the cloud', the use of OSINT requires that the public at large begins to ask
more questions. The most important reason is that, despite the safety and security benefits,
this information may have real-life consequences.
State accountability for OSINT should at least require that it is used on the
basis of a law. Furthermore, politicians, heads of security agencies or security
officials should proactively inform society about their plans and actions in relation
to OSINF and justify the need to acquire OSINT. This should preferably happen
online and also, if possible, at the (public) entity where data subjects are
confronted with the outcome. Even though the security official who collects
information about a person who potentially poses a threat may be aware that this
particular information needs to be verified, it is reasonable to assume that in real
life no risks are taken.
Furthermore, state accountability should entail that the actual process and
outcome of data collection, processing, mining and sharing are subjected to
review or sanctions. In practice, however, this is challenging: which entity or who
carries the responsibility? And what about accountability? On an individual level:
is it the analyst, the risk profiler, the executive security official, etc.? Is individual
accountability in this context realistic, or is it simply the case that when everybody
in the whole chain of security officials who collect, mine, process and store data is
responsible, nobody is accountable? Formally, those entities that collect and store
OSINT are accountable, but in practice it is unlikely that a data subject will ever be able
to access or correct information (probably for security reasons). Therefore, before any
form of new state accountability mechanism is considered, an informed public
and political debate about the desirability of OSINF accountability by the state
should take place. The use of OSINT for safety and security purposes is a reality, but
(re)assessing state accountability is necessary for its legitimate use by security
and intelligence agencies and the police.