‘Losing track of morality’: Understanding online forces and dynamics
conducive to child sexual exploitation
Hanson, E. (2019). In J. Pearce (Ed.), Child sexual exploitation: Why theory matters, chapter five, pp. 87-116. Bristol: Policy Press.
1. Introduction
It is commonly understood that the internet and digital technologies confer both positive
opportunities and risks to children and young people. This chapter is concerned with the
latter, specifically exploring how the evolution, design and control of the internet and digital
technology have been conducive to child sexual exploitation (CSE) – both CSE involving online elements and CSE that does not directly involve them.1 Theorising and understanding this opens up
avenues for more effective prevention and intervention.
Because CSE closely entwines and overlaps with other harms, such as familial sexual abuse,
sexist sexual objectification and sexual harassment, this chapter is also relevant to
understanding the influence of technology on these problems. Indeed one argument made
here is that online factors have amplified CSE in part via their influence on ‘lesser’ issues such
as harassment and objectification. These may contribute to CSE by implicitly reinforcing
harmful gender norms and signalling permission (Thomae and Pina, 2015), whilst also being
pernicious in and of themselves, their impact akin to a dripping tap.
Numerous utopian and dystopian visions have been projected onto the continuing evolution
of digital technology (Naughton, 2012). What these visions often have in common is a sense
of inevitability, and this sense is also shared by those claiming the digital future is impossible
to predict. This chapter is built on, and evidences, a contrasting position: currently technology
(tech) contributes to significant harm to children (notwithstanding its positives), but there is
no reason that this has to continue. Technology has been developed by humans, and so
humans can further shape it, our understanding of it, and our interactions with it, to now
prioritise the rights of children and young people. More specifically, we have agency to make
the online space hostile to sexual exploitation.
The rising awareness of the range and severity of problems amplified or brought into motion
online (including disinformation and online hatred) is currently provoking calls for fundamental
change and a greater determination by governments to more effectively regulate the online
space (see for example the UK government’s recent Online Harms white paper, and the
development of an Age Appropriate Design Code by the Information Commissioner’s Office).
All of this may further lead to increased self-reflection within the tech industry (Zunger, 2018).
This has created a ripe moment in which to understand and more confidently challenge online
practices and dynamics conducive to CSE and to chart new pathways.
The central theory presented in this chapter is as follows: the ideology of cyberlibertarianism, combined with organizational social processes and the impact of power, has contributed to tech corporations acting in ways that facilitate CSE (both directly and indirectly), and, relatedly, has contributed to online spaces and processes being understood and approached as freer
from social and moral concerns than others. Four key (interrelated) online routes to increased CSE are highlighted, involving: a) online sex offending psychology, b) the online porn industry, c) online ‘escort’ agencies, and d) the interaction of social media and gaming platforms with adolescent developmental proclivities. Practice and policy implications of this ‘big picture’ perspective of online contributors to CSE are explored in the final section.

1 For theories regarding how online factors affect the impact (versus the prevalence and dynamics) of CSE, see Hanson (2017a).
2. Some definitions, principles and prevalence data
Child sexual exploitation
As has been discussed elsewhere (Coy, this volume; Beckett and Phoenix, this volume;
Melrose, 2013; Kelly and Karsna, 2017), there are fundamental difficulties in defining CSE, in
part related to its history within the UK as a replacement for the term ‘child prostitution’.
Recent guidance from the Department for Education (2017) states that CSE is a form of child sexual abuse (CSA) that is distinguishable from other forms by the centrality of exchange dynamics, and by the status or financial advantage the abuse affords the perpetrator. However, forms of CSA thought not to be CSE also often have exchange or status as part of their dynamics (for
example many cases of intrafamilial sexual abuse), and even if this were not the case, exchange
or status dimensions to abuse are often hard to identify, limiting the definition’s practical utility
and increasing the potential for confusion and ‘error’.
When it comes to online abuse, typically ‘exploitation’ and ‘abuse’ are used synonymously.
For example, research articles variously discussing ‘child sexual exploitation material’, ‘child
sexual abuse images’, and ‘child pornography’ are generally discussing the same thing. In the
United States, child sexual abuse involving financial exchange is termed ‘commercial sexual exploitation of children’ (CSEC), a useful term which clearly identifies the ‘gain’ involved in this form of CSE.
For the purpose of this chapter, these definitional issues are not too problematic, because the
factors and dynamics discussed here are generally relevant to all forms of child sexual abuse,
including what might be thought of as CSE. At the same time particular focus is placed on
sexual abuse involving financial exchange; this appears to have been relatively neglected in
discussions of online sexual abuse, which is often seen as predominantly the viewing and
exchange of child abuse images, and online grooming and blackmail.
Types of online child sexual abuse
Table 5.1 offers a typology of (digital) technology-assisted child sexual abuse.2 This typology
does not include child sexual abuse or exploitation that is only distally influenced by online
factors (such as via the influence of online pornography). However such influence is
considered in discussions below.
[Table 5.1 sits here – see end of this document]
‘Online’, ‘internet’ and ‘digital technology’
For pragmatic purposes, these terms are used interchangeably in this chapter. If a person is
online they are using the internet (whether via the world wide web, an app or some other
service) and to do so, they are using digital technology. Via smartphones, this technology has
brought together communication via online and telephone networks. It is continuing to
evolve, creating ever more uses of the internet.
Both, and
A dialectic can be thought of as the insights or ‘truths’ that emerge from keeping together
two positions that might otherwise be placed in binary opposition. Dialectics critical to the theories advanced in this chapter are: a) both individual responsibility on the part of offenders and enablers and systemic factors have a part to play in CSE – and they closely entwine; b) both children and young people’s rights to participate and their rights to protection are paramount; and c) both adolescent agency and adolescent blameless victimization co-exist, and are most usefully understood together (Hanson and Holmes, 2014). The overlap between on- and offline abuse is also emphasised – categories that, although merged in reality, have often been seen as separate.

2 This is a revised version of those that appear in Hamilton-Giachritsis et al. (2017) and Hanson (2017a).
Prevalence of technology-assisted CSE
The rise of the internet has enabled a proliferation in child abuse image offending, as well
as other forms of sexual abuse, such as grooming and sexual blackmail. However,
methodologically robust prevalence studies are lacking. The Interpol database of online
abuse images contains data relating to approximately 15,000 identified victims and between
60,000 and 70,000 unidentified victims, the majority of whom are pre-pubescent in the
images (Moran, 2017). Between 100,000 and 590,000 adults in the UK are estimated to have viewed child abuse imagery (Jütte, 2016).
Turning to online grooming, a 2010 U.S. household telephone survey of children aged 10 to 17 found that 9% had received one or more online sexual solicitations that were either from an adult or, if not, were unwanted (Mitchell et al., 2013).
Even less is known about the prevalence of internet-facilitated commercial CSE. Jonsson et
al. (2015) found that 0.9% of boys and 0.6% of girls in a representative sample of Swedish
16- to 22-year-olds reported selling sex online. Mitchell et al. (2011) estimated that 8% of
arrests in the U.S. for internet child sex offending in 2006 involved financial exchange, with
36% of these involving ‘direct offences against victims’, i.e. the purchase of or profit from
either sexual activity with a child or child sexual abuse material (CSAM) produced by an
offender. The demographics of the profiteers captured in this study, such as high levels of
non-sexual offending, differ from those of other sex offender groups and ‘suggest more
seasoned offenders who are leading or involved with larger, more organized networks of criminals’
(p. 63). It seems likely that these are pimps involved in wider sexual exploitation of children
and adults who are making the most of the internet’s affordances for their business.
3. The perfect storm: Cyber-libertarianism, rapid innovation, power,
and ethical drift
‘Civilisation is the process of setting man free from men’ (Ayn Rand, 1943)
‘Technology is not neutral but serves as an overwhelming positive force in human culture… we have
a moral obligation to increase technology because it increases opportunities’ (Kelly, 2010)
People’s experiences online are largely mediated by tech corporations. Many of these have
not given sufficient priority to the rights of children and young people (Kidron et al., 2018),
and one consequence of this has been increased child sexual exploitation. Before providing
evidence and examples of this impact, this chapter discusses theories that help us understand
why this deprioritisation of children’s rights and best interests has occurred.
Some might argue that tech-assisted harms can simply be explained by the profit motive:
attending to children’s rights and best interests may lower financial profits in the short-term,
and so, in a self-serving fashion, these interests are downplayed, ignored and denied by those who stand to gain the most. This is likely a major part of the picture, but it does not satisfy as a full explanation. In other contexts, people would baulk at increasing their wealth at the expense of increased abuse; nor does the profit motive account for governments’ comparatively weak regulation of the tech sector (at least until recently). Alternatively it might be argued that
many of these harms (such as those from pornography) have only recently been substantiated
via research, and so they could not be factored into decision-making. Again this does not
satisfy, given that they are all plausible and therefore could have prompted a precautionary,
research-first approach along the lines of ‘first do no harm’.
I argue here that the libertarian philosophy that has powerfully shaped the internet and
perceptions of it, combined with the disruptive innovation enabled by the fundamental
architecture of the web, the accruing power of tech corporations, and the processes within
organisations that lead to ethical drift within them, have coalesced to produce a perfect
storm.3
Cyber-libertarianism and disruptive innovation
Libertarianism is a distinct political stance and moral psychology whose guiding principle is the
freedom of individuals, in particular from interference by the state. The concern for this ‘right’
trumps all others: moral values based on altruism, and connection to others, are rejected or
downplayed (Iyer et al., 2012). Individuals attracted to these beliefs, compared to both conservatives and liberals, tend to have a more cerebral cognitive style and feel less empathy, social connection and interdependence – as Iyer et al. (2012) summarise in their large-scale comparison study, ‘libertarians have a lower degree of the broad social connection that typifies liberals as well as a lower degree of the tight social connections that typify conservatives’ (p. 19).
Despite libertarianism being a minority stance across society, it has played a disproportionately large part in shaping both economics (championing the ‘free market’) and cyberspace. To
understand its role in the latter we must first appreciate the unique features of the internet,
in contrast to other systems and networks.
The internet is a network ideally suited to disruptive innovation, a phenomenon perhaps best
captured by Mark Zuckerberg’s (2012) exhortation to ‘move fast and break things; if you are
not breaking stuff, you are not moving fast enough’. This is because, unlike most systems, there
is no central control, and it does not aspire to optimize any particular application (Naughton,
2012). These features have allowed innovation to progress as fast as inventors can create new
uses of the internet, rather than at the speed decided upon by ‘controllers’, whether
governmental or corporate.4
The individual ‘freedom’ therefore afforded by the internet, bound up with absent or
ineffectual control by the state, was understandably particularly attractive to those with
libertarian leanings from the early days of the internet onwards, disproportionately attracting
them to both its development and narration – how it grew and how it was understood. In the
1990s a lot was written about the freeing potential of the internet, and it was constructed as
a place that could and should not be bound by the same rules and restrictions governing the
rest of life. This ‘cyberlibertarianism’ (also dubbed the ‘Californian Ideology’; Barbrook and
Cameron, 1996) persists, albeit expressed with more pragmatism and less idealism (Dahlberg, 2010) – narrating a dominant discourse for every new iteration of the internet as it stays true to its pattern of disruptive innovation.

3 As with any broad social phenomenon involving numerous groups, organisations and subcultures, there are likely to be a large number of contributory factors. Therefore this does not aim to be an exhaustive explanation, but a distillation of those factors thought to be particularly germane to this space and sector.

4 However, this feature may be in decline with the increasing dominance of a small number of large tech corporations (Naughton, 2012).
The broad idea that ‘computerization will set you free’ (Golumbia, 2013) has now spread
extensively into media, government and lay thinking, via its promotion by many who narrate
the digital technology sphere, and many with considerable power because of it (see Dahlberg,
2010 for examples). Assisting this process is the co-option of seductive words and phrases
such as ‘open’, ‘opportunity’, ‘creativity’, ‘connection’ and ‘freedom of speech’ to champion
what is in fact a narrow version of liberty that does not include the ‘positive liberty’ afforded
to people and communities through greater equality and material conditions, and through
protection from harm. The ‘negative liberty’ (Berlin, 1958) aspired to by libertarians enables
those with some financial power to accrue more and more, enabling corporate oppression
and exploitation. By promoting consumerism and ‘pacifying modes of existence’, and by
increasing economic inequality, this results in most people having far less freedom (Golumbia,
2013; Dahlberg, 2010). Grand abstractions are used to hide these contradictions and what, to most, would be unpalatable aspects.
As a result of this dissemination, many across society at least loosely adhere to a libertarian view of the online, despite not applying these principles elsewhere, and despite the contradictions that inevitably ensue.
inevitably ensue. For example, in many countries pornography (porn) (whose influence on
CSE is discussed later in the chapter) is illegal to sell offline to those under 18 years old, yet
online it is freely available to children of all ages – the UK has only recently moved to limit
access and is the only country to do so. Similarly, offline in the UK it is illegal to sell
pornography containing either adults role-playing as children; real or apparent lack of consent,
infliction of pain or lasting physical harm; or threats, humiliation or abuse (BBFC, 2014). Online
there is a copious amount of such material and there has been no move to restrict access.
Some would argue the online space is too huge, complex and fast-moving to effectively direct
towards prosocial goals and regulate where necessary. Again there is a contradiction:
technological innovation continues apace when it comes to extending highly complex forms
of automation and the ‘freedom’ discussed above, but innovation in the service of protecting
the rights and welfare of the more vulnerable is deemed impossible to develop or apply, and
labelled as naïve wishful thinking.
In short, the moral principles that it is generally wrong to harm others, and right to both
protect the vulnerable and move towards greater equality, are relatively absent from
libertarian philosophy (Iyer et al., 2012; Haidt, 2012). As a result, the relative dominance of cyberlibertarian narratives has given permission and encouragement to individuals and
corporations to design and evolve online sites, applications and tools without due regard to
the protection, welfare and rights of children, and indeed adults. They have also weakened
the impetus of governments to regulate in line with these principles, reducing their
involvement to pleas to industry to self-regulate. As John Naughton (2018) argues, this
represents ‘the outsourcing to private corporations of public-interest and free-speech decisions that
we would normally expect to be made by democratically accountable institutions’. Such a situation
has resulted in various pressing social issues such as the proliferation of fake news, hate speech
and online abuse; surreptitious mass data collection; easy online access to child abuse images;
easy connections between young people and exploitative persons; and tech corporations’
exploitation of young people’s developmental proclivities including the ‘soft entrapment’ of
children and young people into social media and gaming (Kidron et al., 2018). Sexual exploitation is one of the various harms to which children and young people are consequently more exposed.
Beyond cyberlibertarianism and the profit motive, there are other major, interacting
contributors to this state of affairs. The fast pace of innovation enabled by the core qualities
of the internet5 contrasts markedly with the slower pace of societal thinking and debate. This
means that things can be done that society has not had time to form a collective judgement
on or response to. Within this time gap, if enough people take up the new behaviour, system
justification psychological processes can take hold, then influencing the societal view. System
justification is the tendency to justify something simply because it is part of the status quo:
what is perceived as normal is judged to be acceptable simply because it is normal (Jost et al.,
2004). Furthermore, there are processes (psychological, social and organizational) that can
often operate within organisations to pull people away from their moral compasses, and to
these this chapter now turns.
Ethical drift within and at the helm of organisations
When people enter an organization, they are invited to identify with it, and to take on its
values and norms. In technology corporations these often emphasise the good of ‘freedom’,
‘connection’, ‘creativity’ (defined in a cyberlibertarian fashion), as well as of course company
profit. As a consequence, other values, such as equality and protection from exploitation and manipulation, are demoted. Value systems are typically conveyed through a variety of explicit and implicit means (for example, what colleagues focus their conversations on), and over time people can come to internalize them, limiting their exposure to, and awareness of, alternative ways of thinking (Moore and Gino, 2013).
Roles and goals can be powerful tools in narrowing people’s moral fields of vision, emphasising
certain foci of attention at the expense of others (e.g. Ordóñez et al., 2009). Layering onto
this is the diffusion of responsibility people frequently feel when part of an organisation – we
can act less ethically when we feel less responsible, as the avoidance of guilt becomes a less
salient motivator. Related to this are the natural human proclivities to socially conform and
to submit to those higher in (organisational or societal) status and power. Classic psychology studies by Milgram in the early 1960s and Zimbardo in the early 1970s, as well as many that have followed, highlight how people can quickly sacrifice moral concerns to social conformity and obedience
when these are pitted against one another (see Bandura, 2016). Smoothing the way,
euphemistic language can help enable unethical behaviour by blurring the harms being wrought
(Moore and Gino, 2013). An example of this in the tech industry is the use of vague, ambiguous
terms such as ‘dwell features’ to describe tricks and incentives used to manipulate children
into greater and more dependent use of various apps and games (Kidron et al., 2018).
Much of the psychology discussed here would not lead to less ethical behaviour within
organisations with strong moral leadership. However, research finds that the power and
money often associated with leadership can skew judgement and action away from ethics and
towards selfishness, confirming and adding nuance to the old adage ‘power corrupts’. Various
experimental and naturalistic studies have found that a sense of power can lead people to be
less empathic and sensitive, more hypocritical (promoting a self-serving rule for oneself and a
different rule for others), and more egocentric (e.g. Lammers et al., 2010; Van Kleef et al.,
2008). Power also biases people to see others more in terms of their instrumental value,
rather than as fully human (Gruenfeld et al., 2008). Similarly, a focus on money can lead people
to behave more selfishly and less altruistically (Vohs, 2015). These effects may be relevant
both to leaders with significant money and power, and to those otherwise working within
organisations with high status and worth. All of this means that, in societies that allow for relatively unchecked power and wealth, individuals and organisations with influence are vulnerable to less ethical decision-making than the average person.

5 None of this discussion should be taken as arguing that innovation powered by the internet is inherently negative – it has shaped human life in numerous positive ways. The challenge is finding ways to ensure that ongoing innovation is human- versus technology-centred (Carr, 2016).
The big picture of narratives and dynamics discussed here helps us to understand why many
tech companies have acted in ways which have either directly or indirectly facilitated CSE –
examples of which are discussed in sections five and six. It also helps to explain why, in interaction with offender psychology, the internet is perceived by many online sex offenders to be a place where their usual morals do not apply – a phenomenon described in brief next.
4. Online sex offending and perceived freedom
Research suggests that many online sex offenders adopt a libertarian view of the internet
which increases the permission they give themselves to offend.6 Offenders have described the
internet seemingly being governed by different rules and norms, clearly boundaried from the
offline, and this perspective enabling them to develop a new persona that, freed from the
ethical codes and social surveillance of the ‘real world’, can without compunction explore
‘taboos’. An offender in Rimer’s (2017) anthropological study summarises the perceptions of
many in his comments:
‘This frontier [the internet] that’s just wild and lawless. It’s like being, you know, in the Wild West…
when you’re in that bubble it’s like a very strange virtual world that you’re in, and, and it becomes,
you lose track of time, you lose track of place, but most importantly I think you lose track of morality.’
(p. 41)
Online, offenders may easily find or come across others who, holding similar views of online
freedom, celebrate child sexual abuse, argue that it is acceptable, and share CSAM (Prichard
et al., 2011). These views will likely influence and amplify an individual’s own permission-giving thoughts, and reduce their awareness of the positive values, identity and aspirations
that they might hold in other moments. The internet also provides the distance and narratives
many offenders may need to escape the empathy for victims that might otherwise inhibit their
offending, for example by helping them falsely construct abused children online as ‘actors’
(Kettleborough and Merdian, 2017).
It is important to note the offender’s active role in perceiving the online space in this way,
and finding many of its offence-facilitating qualities. For example, the demarcation between a
‘free’ online sphere and a restricted offline one appears to be strengthened by some offenders
choosing to view abuse online only using specific computers and only during the night, when
the perspective of others may be more easily evaded psychologically (Rimer, 2017).
Cyberlibertarianism is also posited to increase CSE offending via its influence on the nature
of online content and spaces, most saliently online pornography and ‘escort’ services. The
role of this online sex industry is now examined, with due attention paid to state responses
and omissions.
6 This is one of a number of ways in which online dynamics and affordances facilitate sex offending; others (which
overlap and interact) include the affordability, anonymity and accessibility of the online – the Triple A engine of
online offending articulated by Cooper (1998) in the internet’s early days.
5. Technology corporations playing a key role in CSE: Online porn and
online ‘escort’ services
People’s experiences online are largely mediated by tech corporations, many of which have
deprioritised the best interests and rights of children and young people (Kidron et al., 2018).
This section argues that within this context, online pornography and ‘escort’ services operate
in ways which directly amplify CSE, increasing both the motivation for it and the easy means.
Big Porn and child sexual exploitation
The nature of online pornography7 is critical to understanding its impact. Klaassen and Peter (2015) analysed the content of 400 of the most popular online pornography films and found that 41% of professional videos8 depicted violence9 towards women. Women’s responses to this violence were for the most part neutral or positive. Depictions of men dominating women were also common, as were depictions of women being instrumentalized (i.e. used as a sexual object whose own sexual pleasure is not important). A concerning minority of films depicted non-consensual or
manipulative sex. ‘Teen’ is one of the most common search terms on porn sites (Pornhub
Insights, 2017), and common themes within this genre are cheerleaders and ‘girls’ in school
uniform. Depictions of sex as an expression of love or emotional intimacy are rare. Women
and girls are routinely depicted enjoying acts that in reality most would find distressing (for example, one woman being penetrated by numerous men sequentially). Pejorative terms for
females are frequent (such as ‘slut’ and ‘whore’), as is the labelling of them by their ethnicity
(Paasonen, 2010). Videos from a range of genres are presented alongside one another, so that
users, whilst sexually aroused, are easily exposed to more taboo-breaking sex than they had
first sought (for example, involving family members, aggression or child-like actors). This
feature appears central to the porn industry’s business model (D’Orlando, 2011) and to its
negative impact on users.
These features and content help to explain the influence that pornography has on sexual
harassment and aggression. Longitudinal studies find that male adolescent pornography usage
significantly predicts sexual harassment perpetration over time (Brown and L’Engle, 2009;
Ybarra et al., 2011), and longitudinal, experimental and meta-analytic studies indicate that
pornography also increases the risk of adult sexual aggression (Wright et al., 2015). It is most
likely to contribute to sexual aggression in males who already equate masculinity with
dominance (Malamuth et al., 2012; Kingston et al., 2008).
Pornography seems to contribute to sexual hostility, at least in part, because it develops,
strengthens, and entrenches certain sexual scripts, and conceptions of people, at the expense
of others (Wright, 2014). It narrates sex as a quest for sexual pleasure via the viewing and
use of other people’s bodies. Sex is impersonal and other people (generally women) are tools
for one’s own gratification. In the process of promoting this version of sex (both to be held
internally, and perceived in others), other versions of sex are side-lined, such as those that
place relational connection (e.g. ‘chemistry’) or intimacy as central. These porn-inspired
scripts increase both motivation and perceived permission for sexual coercion; in relation to
the former, arousal may become centred on the use of others, and become untethered from
reciprocity, mutuality, equality, and open communication. Consistent with sexual script
theory, research finds that pornography10 use is linked to sexism, objectified notions of women, and impersonal sexual scripts which demote communication – these in turn are associated with an increased proclivity for sexual coercion (e.g. Forbes et al., 2004; Peter and Valkenburg, 2009; Wright and Tokunaga, 2016; Tomaszewska and Krahe, 2016; Hald et al., 2013).11

7 In this chapter the terms pornography and porn are used to denote online sexually explicit material involving adults and with some form of commercial element. Big Porn refers to the online pornography industry.

8 This percentage dropped to 37% when films labelled ‘amateur’ were also included in the analysis.

9 The two most common forms depicted were spanking and gagging (inserting a penis very far into the mouth).
All of this suggests that pornography is likely to be a potent contributor to the frequent sexual
harassment of girls in many adolescent and young adult social groups where sexism and
pornography use are common features (Ringrose et al., 2012); indeed the links are readily
made by young people (Coy et al., 2013). And, as noted, ‘lower level’ sexual hostility, such as
the use of words such as ‘slut’, rating girls on their body parts and sexist sexual jokes, can
give implicit permission for abuses such as CSE.
There are also grounds to suspect a major role for pornography in adults’ sexual exploitation
of children (indeed many CSAM offenders have described it as influential in their pathway to
offending, e.g. Rimer, 2017). Once a person has been inculcated into viewing people on screen
as categorised (such as teen or Asian) masturbatory aids, children may be relatively easily
integrated as simply another categorised object. Some pornography users describe developing
sexual interests that they did not have previously (Wood, 2013); this seems to follow from
frequent viewing leading users to become desensitized to previously arousing material
(D’Orlando, 2011), and new ‘taboo-breaking’ (read: ethical boundary breaking) material being
shown during arousal. In short, pornography may both elicit a sexual interest in children, and
normalise its enactment.
There is far less research on the content and impact of pornography aimed at gay users; however, porn aimed at gay men also appears to promote sex that is impersonal and often involves dominance and submission (Kendall, 2011), and so may have similar effects contributing to the sexual exploitation of boys – although this hypothesis is more tentative.
The porn industry does not tend to acknowledge its risks or seek to limit them, even though
this would normally be expected of an industry whose products can harm (for example, the
tobacco or alcohol industries).
Backpage, online ‘escort’ services, and child sexual exploitation
The online sex industry is a context ripe for child sexual exploitation. In the United States,
online classified websites such as Craigslist and Backpage.com have been primary vehicles for
the buying and selling of sex, and implicated in facilitating CSE. Backpage appears to have been
the most significant direct online abettor of CSE, and to have received the most attention. Its
story is highlighted here to illustrate how this online sector has contributed to CSE, and also
to raise questions about the actions and omissions of governments and justice systems, all
arguably influenced by cyberlibertarian ideology.
Up until April 2018, Backpage.com operated as an online advertisement website, with a
lucrative business model constructed around sex advertising.12 In recent years, over 70% of
child sex trafficking reports submitted by the public to the central sexual abuse tip-off line in the U.S. concerned Backpage (Souras, 2015).

10 Some of these studies have explored the impact of online pornography specifically; others have considered offline and online porn together.

11 Some of these studies are cross-sectional; further longitudinal mediation studies are now required to tease out causality.

12 The California Attorney General’s office, based on internal Backpage documents, reported that 99% of Backpage’s internal revenue between January 2013 and March 2015 was directly attributable to its ‘adult’ section (cited in Halverson, 2018). Its founders and CEO each reportedly received bonuses of $10 million in 2014 (Kelly, 2017).
Backpage-facilitated CSE, and/or their families, came forward to disclose their abuse in the
hope of challenging the continued exploitation of children through the site. A common
reported modus operandi comprised adolescent girls being identified by pimps, then groomed,
plied with drugs and/or terrorised (for example through rape), in turn leading to their
entrapment in abuse which comprised rape by consecutive men who had bought them from
pimps via Backpage ads – the perpetrators of this CSE therefore being both pimps and buyers.
For many years, lawsuits against Backpage repeatedly failed, mainly due to Section 230 of the
U.S. 1996 Communications Decency Act (CDA) which states that providers of interactive
computer services will not be liable for content posted through their services by someone
else: they will not be treated as its publisher or speaker (Halverson, 2018).13 However
survivors and their advocates argued that Backpage was not only tolerating CSE, it was actively
facilitating it. It selectively removed from its ‘escorts’’ section postings by victim support
organisations and law enforcement sting adverts; it allowed people to post adverts without
verified contact details; and it removed meta-data from images, making the identification of
victims and perpetrators difficult. Perhaps most disturbingly, if, when reviewing adverts,
moderators came across those including terms indicative of child abuse (such as ‘lolita’ and
‘schoolgirl’), company policy was not to report, but instead, retain the adverts, remove the
key indicative terms, and allow their replacement with more subtle, code terms (Kelly, 2017).
Events finally took a turn when a Senate investigation in early 2017 concluded that Backpage
deliberately facilitated child trafficking for profit. Then, in 2018, the U.S. Congress passed, and the President signed into law, a bill which created an exception to Section 230 of the CDA, removing websites’ protection from liability if they knowingly facilitate sex trafficking.
This bill was passed despite months of lobbying against it by Big Tech, including by the Internet
Association, a trade group representing companies such as Amazon, Google and Facebook.
In April 2018 Backpage.com (including its international operations) was finally shut down by
U.S. law enforcement.
In the UK, similar legislation has come into force which criminalises websites knowingly
facilitating human trafficking. However, not only does there appear to be more debate and reticence about its enforcement, enforcement itself also looks to be more difficult. Whereas in the U.S.
Backpage.com appeared to account for approximately 80% of online sex trafficking, in the
U.K. the industry is thought to be much more diverse (Guilbert, 2018).
The journey so far around the role of the online sex industry in CSE gives cause for both
optimism and ongoing concern. Acknowledging the role this industry has played and still does
is an important and continuing process. As observed by a survivor of British sexual
exploitation: ‘Technology has created a new age of pimping; it’s easy for pimps, profits are lucrative,
and the likelihood of being caught is minimal’ (Guilbert, 2018). As we have seen, this is no less
true when it is children who are the exploited. Recent moves to increase the responsibility
of tech corporations and demand their moral action around CSE are welcome; an example
being the UK Government’s Online Harms white paper (Gov.uk 2019a). Further steps will
likely be assisted by an understanding of the financial gain, ideologies and psychology behind
Big Tech’s resistance, and the confident articulation of different narratives that emphasise
moral values online.
13 Similar laws exist in other jurisdictions, including the UK.

Cyberlibertarianism holding sway, combined with the power held by the technology sector and the ethical drift within it, has arguably led to a situation in which some tech corporations have developed and profited from practices which directly fuel children’s sexual exploitation.
However, these forces may have amplified CSE just as powerfully in less direct ways; section
four highlighted how cyberlibertarian thinking enables sex offenders, and the next section
argues that many social media and gaming platforms have exploited adolescent developmental proclivities in ways which can also increase the risk of CSE. The common thread that weaves
throughout is the departure from a moral stance in which children’s rights and best interests
are prioritised.
6. Online features and adolescent psychology
‘Young girls generally, of a certain age, who don’t have anyone to listen to them, who are alone, who
see that their Mums and Dads are disinterested, minding their own business… will naturally think it’s
good to look for people who seem to be interested and care on the internet’ (victim of online
grooming, quoted in Quayle et al., 2012, pg. 34)
‘You feel the need to use social media all the time in order to be social or popular’ (14-year-old
quoted in Kidron et al., 2018, pg. 21)
As explored by Coleman in this volume, adolescence is a transformational life stage in which
autonomy, identity and self-esteem, intimacy with peers (through friendships and romance),
and sexuality all undergo substantial development (Steinberg, 2016). To assist in this,
adolescents have specific developmental proclivities compared to those younger or older than
them – for example, they are more attuned to emotional cues, more likely to take risks, and
attach more importance to how they are viewed by their peers (Steinberg, 2010; Brown et
al., 2008). Our focus here is on the ways in which these adolescent tasks and tendencies may
interact with features of the internet to increase risk of CSE (often also in combination with
cultural norms and sometimes the impact of earlier abusive experiences).14 One example of
this is internet platforms that seamlessly ease young people, acting in line with their desire for social connection and their differential weighting of risk, into connecting with deceptive individuals. In a recent survey of UK 14-24 year olds (72% of whom were aged 14-17), 6% of the 2,008 participants reported meeting an online contact offline and discovering that they were not who they had said they were (McGeeney and Hanson, 2017).
A second example concerns adolescent young women’s search for others’ approval and
validation in order to build a positive sense of self (as noted, a key focus for both sexes at this
life-stage). Social norms (conveyed through various media, and the actions and words of
others) teach girls that their status is inextricably bound to perceptions of their physical
attractiveness. Whilst social media can help to boost self-esteem (Best et al., 2014), when it leads to greater comparison between one’s own and one’s peers’ perceived attractiveness it can instead undermine it (Ferguson et al., 2014; Vandenbosch and Eggermont, 2016). The combination
of feeling unattractive whilst being (implicitly) told that attractiveness equals significance, can
propel girls towards people online who praise their appearance as part of a strategy to exploit
them. And it can also push girls to seek this praise through ‘initiating’ sexually explicit
conversation or images. When a girl’s self-esteem has already been undermined by earlier childhood experiences, such as neglect or abuse, her need for affirmation is likely to be further heightened, which may lead to more serious or risky attempts to meet it, such as offering sex for money, which is made much easier online (Hanson, 2016; Jonsson et al., 2014; Svensson et al., 2013). More generally, a variety of mainstream social media platforms make
it straightforward for children and adolescents to have sexual communication with people they have met or become familiar with online, placing no or only minor obstacles in the way of vulnerability meeting exploitation. Young people’s consent in these circumstances is constrained and abused (Pearce, 2013).

14 See Hanson and Holmes (2014), p. 20, for a schematic of interacting contributors to CSE, including to risk behaviours, albeit without the role of the internet delineated.
Young people have often been blamed for their exploitation, seen as making ‘lifestyle choices’ and unthinkingly putting themselves at risk. This victim-blaming may be particularly common
in cases of online CSE (Hamilton-Giachritsis et al., 2017), where the victim’s actions are often
more visible than those of the perpetrator, the company hosting the relevant platform, or
indeed those who promote conducive social norms. The argument here, that adaptive
developmental proclivities and needs interact with societal norms, internet affordances, and,
at times, responses to earlier life experiences, to increase vulnerability, offers a challenge to
victim-blaming, whilst also rejecting the alternative narrative often offered, that of a passive
victim (Hanson and Holmes, 2014). It places attention on the factors that lead to young
people’s needs being left unmet, and the factors that constrain their actions, encouraging them
to strive to meet their needs in the short-term in ways that undermine them in the long-term.
7. Conclusions and implications for practice
‘This is not simply a war of ideas; it is a real struggle over real issues with direct impact on real human
lives, and unless we develop more critical attitudes towards digital technologies and in particular
toward cyberlibertarianism, we will find much more than our ‘digital freedom’ – whatever exactly that is supposed to mean – put at risk’ (Golumbia, 2013, p. 23)
The central thesis of this chapter is that online features, dynamics and narratives interact with
the psychology of various groups to increase and change the nature of CSE both off- and
online. Cyberlibertarian ideology has shaped the internet, and our perceptions, use, and
(under-) regulation of it, so that it has become a space in social life where unethical behaviour,
including sexual exploitation, has far greater license and freedom than offline. Within at least
some tech corporations, libertarian understandings are likely to have blended with
organizational social processes and the blinkering effects of power, money and status to cause
significant ethical drift. This drift coupled with relative inaction by others such as governments
and society (influenced by cyberlibertarianism and system-justification) has led to the
development of online content, narratives, and processes that facilitate child sexual
exploitation – the clearest examples of this being the online pornography industry and online
‘escort’ services; other examples include peer-to-peer file-sharing services that allow searches using child abuse terms, the online accessibility of CSAM, the general sense of the online as a ‘bubble’, and social media that allow anonymous posting and/or seamless connection between young people and those looking to exploit them. Human agency plays a part in all of this, as does the
moral responsibility of those whose actions and omissions harm others and violate their
rights.
At a fundamental level, what is required is a paradigm shift in which a new understanding of the internet is narrated: one that retains the aspiration that the internet will increase freedom, but aspires to a very different version of that freedom. Notions of ‘good freedoms’ should be
demonstrably tied to other rights and ethics (such as fairness, protection, equality and
welfare); young people should be able to enjoy the online, and draw on it in their development,
free from abuse, harassment, prejudice and exploitation. And in the words of Gail Dines
(2017, p. 8), they should ‘have the right to author a sexuality that is authentic and rooted in respect,
intimacy and connection’. These rightful freedoms are contrasted with individuals using their
freedom and power to act in ways which hurt or oppress others. Aspiring to these freedoms
that intersect with other rights and ethics leads to different emphases and actions:
concentrations of power are prevented and challenged, rather than encouraged and seen as
the de facto result of freedom; regulation and public health initiatives are seen as a duty rather
than an embarrassment or last resort; and the myth of online fantasies and actions operating
in a vacuum is cast away, in the recognition that we are all interdependent.
In short, it is hoped that reflection on, and understanding of, cyberlibertarianism and its influence will enable the growth of confident and effective counter-arguments and action. Everyday conversations, campaigns, public pressure on tech corporations through selective use of sites, and informed government regulation and its support can all play a part in this process.
Online industries that significantly harm others should not escape the scrutiny, regulation, and
public health strategies that would otherwise be expected offline. The advent of age verification for major online pornography sites should make it more difficult for young people to access them, in turn reducing (whilst not eliminating) usage and exposure. Next, there
should be enforced rules against online content that is already effectively banned offline, such
as sexual films with adults role-playing as children, or involving physical harm, humiliation or
apparent lack of consent (BBFC, 2014). Alongside this, users should be educated on online
porn’s risks to self and others before using it (analogous to health warnings on cigarette
packets) and have mandated ‘spaces’ to reflect on this. Similar public health strategies,
alongside more robust law enforcement, could be applied to online prostitution and escort
services. It may be that much of this could be achieved through the new regulation system
proposed by the UK government in the aforementioned Online Harms white paper (Gov.uk,
2019a), and an enforced age appropriate design code which prioritises protection from harm
as well as children’s privacy rights (Information Commissioner’s Office, 2019).
This accords with what I suggest is a further fundamental principle for practice: positive change
will generally follow from people having greater awareness of a) themselves (in particular their
values, aspirations, rights, and positive identities), b) others (their thoughts, feelings, rights),
and c) the invitations of various powers (such as certain corporations and mainstream media)
to act against these aspects of self and others. This awareness principle can be applied
creatively in many different areas of practice to reduce sexual exploitation and other harms
– examples and suggestions of such an approach with the general population, young people,
and tech corporations are summarised in Table 5.2.
[Table 5.2 sits here, see end of this document]
Turning to direct work with young people who have experienced CSE or are at risk,
practitioners should recognise and harness young people’s agency and work collaboratively
with them and those around them to help meet their unmet needs, build their positive sense
of self, and identify and work towards their deep aspirations.15 This approach avoids both
victim-blaming and victim-passivity narratives, which can often be conveyed implicitly. As an
example of implicit blame, in recent research interviewing young people who had experienced sexual abuse (much of it technology-assisted), good PSHE (personal, social, health and economic education) was highly valued and sought after by participants. However, online safety education (a part of PSHE) offered as a stand-alone intervention to a victim
of online CSE was experienced as blaming and compounded the impact of the abuse
(Hamilton-Giachritsis et al., 2017).
15 Further discussion of adolescent-centred practice, that understands and works with their agency, can be found
in Hanson and Holmes (2014). Also for discussion of therapeutic work with victims of CSE where there is an
online element, see Hanson (2017b), and where there might be risk of further revictimization see Hanson (2016).
There is currently more thinking about the online and its role in our lives than ever before.
The time is ripe for new iterations of technology and new relationships with it, which place
children and young people’s rights, including freedom from exploitation, centre-stage.
References
Bandura, A. (2016). Moral disengagement: How people do harm and live with themselves. New York,
NY, US: Worth Publishers.
Barbrook, R., and Cameron, A. (1996). The Californian ideology. Science as Culture, 6(1), 44-72.
BBFC (British Board of Film Classification) (2014). BBFC Guidelines: Age ratings you trust. www.bbfc.co.uk
Beckett, H., Brodie, I., Factor, F., Melrose, M., Pearce, J., Pitts, J., Shuker, L., and Warrington, C. (2013).
‘It’s wrong… but you get used to it’: A qualitative study of gang-associated sexual violence towards, and
exploitation of, young people in England. Bedfordshire: University of Bedfordshire.
Berlin, I. (1958). Two concepts of liberty. An inaugural lecture delivered before the University of Oxford, 31 October.
Best, P., Manktelow, R., and Taylor, B. (2014). Online communication, social media and adolescent
wellbeing: A systematic narrative review. Children and Youth Services Review, 41, 27-36.
Brown, B. B., Bakken, J. P., Ameringer, S. W., and Mahon, S. D. (2008). A comprehensive
conceptualization of the peer influence process in adolescence. In M. J. Prinstein and K. Dodge (Eds),
Understanding peer influence in children and adolescents, 17-44. New York: Guilford Press.
Brown, J. D., and L'Engle, K. L. (2009). X-rated: Sexual attitudes and behaviors associated with US
early adolescents' exposure to sexually explicit media. Communication Research, 36(1), 129-151.
Carr, N. (2016). The Glass Cage: Who Needs Humans Anyway?. London: Vintage.
Cislaghi, B., and Heise, L. (2018). Theory and practice of social norms interventions: Eight common pitfalls. Globalization and Health, 14, 83. https://doi.org/10.1186/s12992-018-0398-x
Coy, M. (this volume). What’s gender got to do with it? Sexual exploitation of children as patriarchal
violence.
Coy, M., Kelly, L., Elvines, F., Garner, M., and Kanyeredzi, A. (2013) ‘Sex without consent, I suppose that
is rape’: How young people in England understand sexual consent. London: Office of the Children’s
Commissioner.
Cooper, A. (1998). Sexuality and the Internet: Surfing into the new millennium. CyberPsychology and
Behavior, 1(2), 187-193.
Davidson, J., and Bifulco, A. (2019) Child Abuse and Protection: Contemporary Issues in Research, Policy and
Practice. London: Routledge.
Dahlberg, L. (2010). Cyber-libertarianism 2.0: A discourse theory/critical political economy
examination. Cultural politics, 6(3), 331-356.
Department for Education (2017) Child sexual exploitation: Definition and guide for practitioners.
https://www.gov.uk/government/publications/child-sexual-exploitation-definition-and-guide-for-
practitioners
Dines, G. (2017). Growing Up With Porn: The Developmental and Societal Impact of Pornography on
Children. Dignity: A Journal on Sexual Exploitation and Violence, 2(3), Article 3.
D’Orlando, F. (2011). The demand for pornography. Journal of Happiness Studies, 12(1), 51-75.
Ferguson, C. J., Muñoz, M. E., Garza, A., and Galindo, M. (2014). Concurrent and prospective analyses
of peer, television and social media influences on body dissatisfaction, eating disorder symptoms and
life satisfaction in adolescent girls. Journal of youth and adolescence, 43(1), 1-14.
Forbes, G. B., Adams-Curtis, L. E., and White, K. B. (2004). First-and second-generation measures of
sexism, rape myths and related beliefs, and hostility toward women: their interrelationships and
association with college students’ experiences with dating aggression and sexual coercion. Violence
against women, 10(3), 236-261.
Gohir, S. (2013). Unheard voices: The sexual exploitation of Asian girls and young women. Birmingham:
Muslim Women's Network UK.
Golumbia, D. (2013). Cyberlibertarianism: The extremist foundations of ‘digital freedom’. Clemson
University Department of English (September). http://www.uncomputing.org.
Gov.uk (2019a) Online Harms white paper, www.gov.uk/government/consultations/online-harms-
white-paper
Gruenfeld, D. H., Inesi, M. E., Magee, J. C., and Galinsky, A. D. (2008). Power and the objectification
of social targets. Journal of Personality and Social Psychology, 95(1), 111-127.
Guilbert, K. (April 9th, 2018) Online sex slavery more complicated in Britain than in the United States with
Backpage just the tip of the iceberg. London: Thomson Reuters Foundation.
Haidt, J. (2012) The Righteous Mind: Why good people are divided by politics and religion. New York:
Vintage.
Hald, G. M., Malamuth, N. N., and Lange, T. (2013). Pornography and sexist attitudes among
heterosexuals. Journal of Communication, 63(4), 638-660.
Halverson, H. C. (2018). The Communications Decency Act: Immunity for Internet-Facilitated
Commercial Sexual Exploitation. Dignity: A Journal on Sexual Exploitation and Violence, 3(1), article 12.
Hamilton-Giachritsis, C., Hanson, E., Whittle, H. C., and Beech, A. R. (2017). “Everyone deserves to be
happy and safe”: A mixed methods study exploring how online and offline child sexual abuse impact young
people and how professionals respond to it. London: NSPCC.
Hanson, E. (2017a). The impact of online sexual abuse on children and young people. In J. Brown (Ed.),
Online risk to children: Impact, Protection and Prevention, pp. 97-122. London: Wiley.
Hanson, E. (2017b). Promising therapeutic approaches for children, young people and their families
following online sexual abuse. In J. Brown (Ed.), Online risk to children: Impact, Protection and Prevention,
pp. 123-142. London: Wiley.
Hanson, E. (2016) Exploring the relationship between neglect and child sexual exploitation: Evidence Scope
1. Dartington: Research in Practice, NSPCC and Action for Children.
Hanson, E., and Holmes, D. (2014) That Difficult Age: Developing a more effective response to risks in
adolescence. Research in Practice. https://www.rip.org.uk/news-and-views/latest-news/evidence-scope-
risks-in-adolescence/
Information Commissioner’s Office (2019) Age appropriate design: A code of practice for online services.
Draft document for consultation. https://ico.org.uk/about-the-ico/ico-and-stakeholder-
consultations/age-appropriate-design-a-code-of-practice-for-online-services/
Iyer, R., Koleva, S., Graham, J., Ditto, P., and Haidt, J. (2012). Understanding libertarian morality: The
psychological dispositions of self-identified libertarians. PloS one, 7(8), e42366.
Jonsson, L. S., Svedin, C. G., and Hydén, M. (2014). ‘Without the Internet, I never would have sold
sex’: Young women selling sex online. Cyberpsychology: Journal of Psychosocial Research on
Cyberspace, 8(1). Article 4. http://dx.doi.org/10.5817/CP2014-1-4
Jonsson, L. S., Bladh, M., Priebe, G., and Svedin, C. G. (2015). Online sexual behaviours among Swedish
youth: Associations to background factors, behaviours and abuse. European Child & Adolescent
Psychiatry, 24(10), 1245-1260.
Jost, J. T., Banaji, M. R., and Nosek, B. A. (2004). A decade of system justification theory: Accumulated
evidence of conscious and unconscious bolstering of the status quo. Political psychology, 25(6), 881-919.
Jütte, S. (2016). Online child abuse images: Doing more to tackle demand and supply. London: NSPCC.
Kelly, A. (2nd July, 2017). Small ads sex trafficking: the battle against Backpage. The Guardian.
Kelly, K. (18th October, 2010). What technology wants. Cool Tools. kk.org/cooltools/archives/4749
Kelly, L. and Karsna, K. (2017). Measuring the scale and changing nature of child sexual abuse and child sexual exploitation. London: Centre of Expertise on Child Sexual Abuse.
Kendall, C. N. (2011) The harms of gay male pornography. In M. T. Reist and A. Bray (Eds), Big Porn
Inc.: Exposing the harms of the global pornography industry. Australia: Spinifex.
Kettleborough, D. G., and Merdian, H. L. (2017). Gateway to offending behaviour: permission-giving
thoughts of online users of child sexual exploitation material. Journal of sexual aggression, 23(1), 19-32.
Kidron, B., Evans, A., and Afia, J. (2018) Disrupted childhood: The cost of persuasive design. 5Rights
Foundation. https://5rightsframework.com/resources.html
Kingston, D. A., Fedoroff, P., Firestone, P., Curry, S., and Bradford, J. M. (2008). Pornography use and
sexual aggression: The impact of frequency and type of pornography use on recidivism among sexual
offenders. Aggressive Behavior: Official Journal of the International Society for Research on Aggression, 34(4),
341-351.
Klaassen, M. J., and Peter, J. (2015). Gender (in) equality in Internet pornography: A content analysis
of popular pornographic Internet videos. The Journal of Sex Research, 52(7), 721-735.
Lammers, J., Stapel, D. A., and Galinsky, A. D. (2010). Power increases hypocrisy: Moralizing in
reasoning, immorality in behavior. Psychological science, 21(5), 737-744.
Malamuth, N. M., Hald, G. M., and Koss, M. (2012). Pornography, individual differences in risk and
men’s acceptance of violence against women in a representative sample. Sex Roles, 66(7-8), 427-439.
Martin, J., and Alaggia, R. (2013). Sexual abuse images in cyberspace: Expanding the ecology of the child. Journal of Child Sexual Abuse, 22, 398-415.
Mazzio, M. (2017). I am Jane Doe. Documentary film, 50 Eggs Films.
McGeeney, E. and Hanson, E. (2017) Digital Romance: A research project exploring young people’s use of
technology in their romantic relationships and love lives. London: CEOP and Brook.
Melrose, M. (2013). Young people and sexual exploitation: A critical discourse analysis. In M. Melrose
and J. Pearce (Eds.), Critical perspectives on child sexual exploitation and related trafficking (pp. 9-22).
Palgrave Macmillan, London.
Mitchell, K. J., Finkelhor, D., and Wolak, J. (2005). The internet and family and acquaintance sexual
abuse. Child Maltreatment, 10, 49-60.
Mitchell, K. J., Jones, L. M., Finkelhor, D., and Wolak, J. (2011). Internet-facilitated commercial sexual
exploitation of children: Findings from a nationally representative sample of law enforcement agencies
in the United States. Sexual Abuse, 23(1), 43-71.
Mitchell, K. J., Jones, L. M., Finkelhor, D., and Wolak, J. (2013). Understanding the decline in
unwanted online sexual solicitations for US youth 2000–2010: Findings from three Youth Internet
Safety Surveys. Child abuse and neglect, 37(12), 1225-1236.
Moore, C., and Gino, F. (2013). Ethically adrift: How others pull our moral compass from true
North, and how we can fix it. Research in organizational behavior, 33, 53-77.
Moran, M. (2017). Online child sexual exploitation – a global problem with a local solution. Talk given at the European Society for Trauma and Dissociation (ESTD) conference, Bern, Switzerland, 11th November, 2017.
National Crime Agency (NCA) (2014). National Strategic Assessment of Serious and Organised Crime
2014. UK: National Crime Agency.
Naughton, J. (2012). What you really need to know about the internet. From Gutenberg to Zuckerberg.
London: Quercus.
Naughton, J. (2018). Theresa May thinks Facebook will police itself? Some hope. The Guardian, February
11th.
O’Leary, R., and D’Ovidio, R. (2007). Online sexual exploitation of children. The International Association
of Computer Investigative Specialists.
Ordóñez, L. D., Schweitzer, M. E., Galinsky, A. D., and Bazerman, M. H. (2009). Goals gone wild: The
systematic side effects of overprescribing goal setting. Academy of Management Perspectives, 23(1), 6-
16.
Paasonen, S. (2010). Repetition and hyperbole: The gendered choreographies of heteroporn.
In Everyday pornography (pp. 75-88). Routledge.
Peachey, P. (2013). Paedophiles blackmail thousands of UK teens into online sex acts. The Independent.
Pearce, J. (2013). A social model of ‘abused consent’. In Critical Perspectives on Child Sexual Exploitation
and Related Trafficking (pp. 52-68). Palgrave Macmillan, London.
Peter, J., and Valkenburg, P. M. (2009). Adolescents’ exposure to sexually explicit internet material
and notions of women as sex objects: Assessing causality and underlying processes. Journal of
Communication, 59, 407-433.
Peter, J., and Valkenburg, P. M. (2014). Does exposure to sexually explicit Internet material increase
body dissatisfaction? A longitudinal study. Computers in Human Behavior, 36, 297-307.
Pornhub Insights (2017). 2017 year in review. https://www.pornhub.com/insights/2017-year-in-review
Prichard, J., Watters, P. A., and Spiranovic, C. (2011). Internet subcultures and pathways to the use of
child pornography. Computer Law and Security Review, 27(6), 585-600.
PSHE Association (2016). Key principles of effective prevention education. London: PSHE Association.
Quayle, E., Jonsson, L., and Lööf, L. (2012). Online behaviour related to child sexual abuse: Interviews
with affected young people. ROBERT, Risk-taking online behaviour, empowerment through research and
training. European Union and Council of the Baltic Sea States.
Rand, A. (1943). The Fountainhead. United States: Bobbs-Merrill.
Rimer, J. R. (2017). Internet sexual offending from an anthropological perspective: analysing offender
perceptions of online spaces. Journal of sexual aggression, 23(1), 33-45.
Ringrose, J., Gill, R., Livingstone, S. and Harvey, L. (2012) A qualitative study of children, young people and
'sexting': a report prepared for the NSPCC. National Society for the Prevention of Cruelty to Children,
London, UK.
Souras, Y. G., Senior Vice President and General Counsel for the National Center for Missing and
Exploited Children: NCMEC (2015). Human trafficking Investigation: Hearing before the permanent
subcommittee on Investigations of the Senate Committee on Homeland Security and Governmental
Affairs. Senate Hearing no 114-179, 114th Congress, 39.
Steinberg, L. (2016). Adolescence (11th edition). McGraw-Hill Education.
Steinberg, L. (2010). A dual systems model of adolescent risk-taking. Developmental Psychobiology, 52(3), 216-224.
Svensson, F., Fredlund, C., Svedin, C. G., Priebe, G., and Wadsby, M. (2013). Adolescents selling sex:
Exposure to abuse, mental health, self-harm behaviour and the need for help and support—a study of
a Swedish national sample. Nordic journal of psychiatry, 67(2), 81-88.
Thomae, M., and Pina, A. (2015). Sexist humor and social identity: The role of sexist humor in men’s
in-group cohesion, sexual harassment, rape proclivity, and victim blame. Humor, 28(2), 187-204.
Tomaszewska, P., and Krahé, B. (2016). Attitudes towards sexual coercion by Polish high school
students: links with risky sexual scripts, pornography use, and religiosity. Journal of sexual
aggression, 22(3), 291-307.
Van Kleef, G. A., Oveis, C., Van Der Löwe, I., LuoKogan, A., Goetz, J., and Keltner, D. (2008). Power,
distress, and compassion: Turning a blind eye to the suffering of others. Psychological science, 19(12),
1315-1322.
Vandenbosch, L., and Eggermont, S. (2016). The interrelated roles of mass media and social media in
adolescents’ development of an objectified self-concept: a longitudinal study. Communication
Research, 43(8), 1116-1140.
Vohs, K. D. (2015). Money priming can change people’s thoughts, feelings, motivations, and behaviors:
An update on 10 years of experiments. Journal of Experimental Psychology: General, 144(4), e86-
e93. http://dx.doi.org/10.1037/xge0000091
Whittle, H. C., Hamilton-Giachritsis, C. E., and Beech, A. R. (2015). A comparison of victim and
offender perspectives of grooming and sexual abuse. Deviant Behavior, 37(7), 539-564.
Wolak, J., and Finkelhor, D. (2011). Sexting: A typology. New Hampshire, US: Crimes Against Children Research Center.
Wood, H. (2013). Internet pornography and paedophilia. Psychoanalytic Psychotherapy, 27(4), 319-338.
Wright, P. J. (2014). Pornography and the sexual socialization of children: Current knowledge and a
theoretical future. Journal of Children and Media, 8(3), 305-312.
Wright, P. J., and Tokunaga, R. S. (2016). Men’s objectifying media consumption, objectification of
women, and attitudes supportive of violence against women. Archives of sexual behavior, 45(4), 955-
964.
Wright, P. J., Tokunaga, R. S., and Kraus, A. (2015). A meta-analysis of pornography consumption and
actual acts of sexual aggression in general population studies. Journal of Communication, 66(1), 183-205.
Ybarra, M. L., Mitchell, K. J., Hamburger, M., Diener-West, M., and Leaf, P. J. (2011). X-rated material and perpetration of sexually aggressive behavior among children and adolescents: Is there a link? Aggressive Behavior, 37(1), 1-18.
Zuckerberg, M. (2012). Letter to investors. Available here: https://www.sec.gov/Archives/edgar/data/1326801/000119312512034517/d287954ds1.htm#toc287954_10 ; discussed here: https://mashable.com/2014/03/13/facebook-move-fast-break-things/?europe=true
Zunger, Y. (2018). Computer science faces an ethics crisis. The Cambridge Analytica scandal proves
it. Boston Globe, March 22. https://www.bostonglobe.com/ideas/2018/03/22/computer-science-faces-
ethics-crisis-the-cambridge-analytica-scandal-proves/IzaXxl2BsYBtwM4nxezgcP/story.html
Table 1 Ten types of technology-assisted child sexual abuse

1. Offline abuse shared with and viewed by unknown others via technology
Description: This is often the abuse depicted in child abuse images, also termed 'child pornography' and 'child sexual exploitation material'.
Example: Sexual abuse perpetrated by a victim's father shared via images and video with others online.
References: Martin and Alaggia (2013); Mitchell, Finkelhor and Wolak (2005)

2. Abuse (committed via technology or offline) shared with others in the victim's peer group
Example: A young person (or persons) filming their abuse of a peer and sharing this with their friends for approval or status, or to shame.
References: Beckett et al. (2013)

3. Sexual images created consensually shared non-consensually
Example: A young person shares images with a romantic partner; the relationship ends and those images are shared by their ex-partner with peers.
References: Wolak and Finkelhor (2011); Ringrose et al. (2012)

4. Live streaming: contact abuse being ordered or directed by those watching online
Description: Perpetrators online direct other perpetrators who are physically with the child to commit abusive acts.
Example: Individuals in the UK watching and directing live-streamed sexual abuse of children by individuals in another country.
References: National Crime Agency (NCA; 2014)

5. Offline sexual blackmail (imagery as leverage)
Description: A child or young person is abused offline; images are taken and used as leverage in the continuation of the (offline) abuse.
Example: 'If you tell, I will share this image with your friends and family.'
References: Gohir (2013)

6. Online sexual blackmail (imagery as leverage)
Description: Sexual imagery of a child or young person is obtained online and then used as leverage in sexual abuse, which may be online, offline or both.
Example: A child shares a sexual image with a person online; this person (the abuser) then threatens to share this imagery if the child does not produce further sexual images.
References: Peachey (2013)

7. Online grooming
Description: This term can include online sexual blackmail, but is more commonly used to describe perpetrators forging a close relationship with a victim online in order to gain the child's compliance in, and secrecy around, subsequent sexual abuse.
Example: A perpetrator shows care and interest in a young person online, who subsequently 'falls in love' and consents to online and/or offline sexual activity, despite the perpetrator being an adult and/or coercive, and/or the young person feeling uncomfortable.
References: Whittle et al. (2015)

8. Offline or online sexual activity with a child or young person bought online from 'pimps'
Description: People controlling a young person sell sexual activity with her or him using digital technology, for example by advertising them on 'escort' platforms. Both seller and buyer are abusing the young person. When the sexual activity takes place through technology (e.g. via webcam), this abuse is similar to the live streaming described above, although it will typically involve the buying abuser and victim in direct interaction.
References: Mazzio (2017); Mitchell et al. (2011)

9. Internet-assisted 'child sex tourism'
Description: Offenders use 'sex tour operators' online to organise trips to abuse children – these trips are often to countries where protections against child sexual abuse are weaker.
References: O'Leary and D'Ovidio (2007)

10. Sexual activity bought from a young person online
Description: A person buys and undertakes sexual acts with a young person who has, of their own 'volition', advertised these sexual services online – typically the young person has pre-existing vulnerabilities, low self-esteem and constrained choices.
References: Svensson et al. (2013)
Notes:
• Many of these forms of online CSA overlap and frequently co-occur. Distinctions have been drawn with victims' experiences in mind – in other words, some forms that might otherwise have been grouped together are separated because victims' experiences of them are likely to differ in important ways.
• Exposure of children to online pornography and to sexual harassment or solicitations (for example, a person starting up a conversation about sex online) has not been included in this typology, as it was felt that the definition of online sexual abuse would otherwise become too wide (whilst recognising that exposure to such material and comments can be abusive and harmful).
Table 2 Examples of self- and other-awareness raising

Young people
Example: Personal, social, health education (PSHE) that, through a variety of methods including role-play, discussion and guided self-reflection, supports young people in becoming more aware of their values, aspirations and positive identities. These can act to counter invitations in various parts of society to ignore their values and to buy into identities and narratives (such as traditional gender norms) that can hurt them in the long term. A bedrock of awareness, knowledge and skills teaching can replace the approach of bombarding young people with various risks and harms.
Further information: Such an approach fits with the principles of good PSHE promoted by the PSHE Association (2016).

General population
Example: Campaigns that invite people to consider the influence of the online environment on their behaviour, and encourage them to match their online actions to their 'best selves', inviting moral equivalence between on- and offline. Such campaigns should be informed by the developing literature on effective social norm campaigns.
Further information: Such campaigns would usefully follow a recent campaign led by the Home Office and partners aimed at educating young adults about the law regarding viewing sexual images of those under 18. Cislaghi and Heise (2018) discuss pitfalls to avoid in social norm interventions.

Tech corporations
Example: Close user involvement, for example panels of young people advising and sharing their experiences with social media companies. Mandatory training that teaches more holistic versions of freedom (connecting it to other rights and social goods) and develops ethical decision-making skills.
Further information: Wider approaches to combating organizational ethical drift are discussed in Moore and Gino (2013).