Salvatore Iaconesi
Constrained Cities
Abstract
What is the space for transgression in the smart city?
Given the current trends and developments in the worldwide discussion about the future of cities, this question seems progressively harder to address.
While extremely useful and effective, Big Data, sensors, algorithms, bottom-up initiatives, services and systems, artificial intelligences, robotics, domotics, and systems and designs for health, education and security create service, data, knowledge and relation bubbles whose effect is to reduce imperfection, unexpectedness, unpredictability, surprise, weirdness and chance.
Yet these modalities are necessary for innovation, inclusion, co-existence and evolution.
Data create spaces, physically and perceptively, through algorithms and their effects on the interfaces (whether in an app or in a physical space) which we use and which mediate our access to them and, in some cases, even our ability to perceive them, to recognize them as existing spaces meant for us.
A review, an algorithmic suggestion, a $$$$ price range for a restaurant, computational
recommendations for real estate, healthcare services, entertainment, education, work may cause
certain people to be completely excluded from certain parts of the city.
To not perceive them. To not desire to go to those places. To erase them from their perceptive map of the city.
This can work according to several different logics: financial, aesthetic, racial, migratory, gender-based, and more.
In this contribution, we interweave narrative techniques with solid, evidence-based research to describe a few custom-made case studies which explore this concept: the Constrained Cities.
Love
The pain was unbearable.
It was like millions of sharp needles poking your skin from the inside, freezing cold, filling all of your neural bandwidth with pain and obstructing any other sensation.
But still, I had to go on, I had to see him.
When they appeared – in wearable computing experiments in the 1970s and then, diffusely, in the 2000s1 – nobody would have thought that they would be used in this way.
Pornography, Health, Insurance and Work2. That's how it all started. Wearable devices and, then, chips, first subcutaneous, like evolved tattoos, then implants3. The next level of Augmented and Virtual Reality, they called them: hardware and software connected to your central nervous system and to your data profiles in the cloud, which would allow you to feel impossible, outrageously interactive sensations in connection with people and bots. Then came the insurance companies, which used the
1 As can be seen in Consolvo et al (2008), McClusky (2009), Rafferty et al (2002), where there is documented
evidence of the first experiments and, then, of their evolutions.
2 Lupton (2014) provides a wonderful overview of the Quantified Self scenario; Silverstone (2017) and Tziallas (2015) describe the rise of the sex robots, including the evolution from both software agents and sex dolls; Prelert (2014) focuses on wearable IoT devices and on the evolution of data usage in medical domains; Whitson (2013) points out various forms of quantified-self gaming scenarios which have evolved to address work and insurance functionalities.
3 See Andreu-Perez (2015) for an account of these evolutions, beginning from healthcare.
same technologies to monitor you 24/7 and to provide you with enormous discounts and, then, with completely free insurance and medical coverage4. Then new types of jobs came along, in which you just installed the implant, subscribed to the service, and you wouldn't have to look for a job anymore5: the job found you6, by interpreting your physical, emotional, cultural and philosophical data, captured 24/7 directly from your body and all of your digital manifestations, to understand which jobs you were suited for7. You received a message each day: be here, at this time, to do this job.
By 2021 they were the norm, everybody had one.
Pain fades away, while I recover from the intense stimulation.
The map had become very dynamic over the past few years8. Up until a while ago it had been more static in nature: there were places you could go, places you couldn't, and places that, in many ways, you did not even know existed. Some of them really were outside your perceptive and informational landscape: you just did not find them when you searched for a pub in which to hang out, a restaurant in which to eat, or a neighborhood in which to buy or rent your next home9. Algorithms made sure that they were not there in the list of results10, or that their reviews were chosen in ways tailored for you, to prevent you from choosing them, from finding them attractive. For reasons that were financial, political or cultural11, algorithms and those who controlled them had strategies about how much money people in a certain neighborhood should have, what level of education, what styles, or what sort of relational network they should be in.
4 Kaur (2014) shows multiple implications of embodied chips, starting from daily scenarios. Schwartz (2006) analyzes the implications for privacy and control.
5 What happens when my boss is an algorithm? This is what O'Connor asks in her article (2016).
6 See Townsend (2017) for yet more analysis of the implications of algorithmic recruiting.
7 Zuboff (2015) asks: what happens when the world you experience starts depending on the information which algorithms have about you?
8 Dillahunt et al (2015) show the evolution of filter bubbles over time and across services.
9 Chang (2016) explores multiple cases in which information bubbles become geographic, keeping disadvantaged or
diverse people in their neighborhoods.
10 Graham, Zook and Boulton (2012) ask: how does conflict show up in maps?
11 Iaconesi and Persico (2015) show multiple types of digital maps which represent complex financial, physical,
economic, cultural, commercial, political geographies, and their implications on our perception and understanding of
the world.
The combination of the strategies of the algorithms of governments, city administrations, corporations, and institutions and organizations for health, education, finance, work, entertainment and culture created a map for each of us.
I back away and try another street, on the left. I have a printout of the latest version of my CAM (Citizen Accessibility Map), which I hacked yesterday, but it is already out of date. I only hope that it hasn't changed that much, and that our plan is still valid.
You can't connect in zones which are not in your CAM. Or, at least, not directly.
When I met him, we had compatible CAMs, for a brief moment. But he was different from me.12
At that time, our city was being torn apart by migrations. It seems that the algorithms took a while to adjust to the systematic invasion of the city by the hundreds of thousands of people who arrived13. In those days, entire sections of the city disappeared for each of us, to accommodate where the migrants were being hosted14. “Stored”, some said: and that is an interesting choice of words, as whole parts of the city became black boxes to contain incoming migrants, with corridors for them to come in, and other ones to leave the city when it was time. These areas vanished from our maps from one day to the next. It was difficult to adjust: fake news about building collapses and terrorist attacks; commercial venues that instantly started receiving systematically negative reviews; map directions that simultaneously started to avoid certain areas of the city15. Some people even got stuck and could not get home for a few days, as there were no taxis, trains, buses, or any information and news available about what was happening.
This was when the Cross-CAM Inhibition Act was launched: if your CAM did not include a certain
part of the city, you could not send messages to individuals or organizations that were in it.
12 Flaxman, Goel and Rao (2016) show how filter bubbles and echo chambers affect our information consumption.
13 Flaxman, Goel and Rao (2014) demonstrate various forms of segregation originating through social networking data and information.
14 Tufekci (2015) introduces Computational Agency, and its impact on freedoms and rights.
15 Cain's article (2017) describes how the city of Quebec suffered from false representations, and their impact on presences in the city.
Figure 1: Constrained Cities: view from the exhibit, the personal map of the city, describing the
places in which you cannot go or reside.
In this way, places disappeared16. It took about one month for people to get used to this. Then it became progressively normal: you would just follow the directions of the digital map, as before, or the reviews, to find the best deal, restaurant, school to attend or neighborhood to live in, not asking too many questions17.
I arrive at a boundary. The map beyond this intersection, for me, is red. There are no physical barriers. I can see what is going on on the other side: there are people just like me (maybe not... who knows what's different in their data profiles), performing their daily routines. For them it's green, for me it's red: our profiles are different18.
I have to cross this section of the city to get to him, to where we arranged to meet.
We've been separated for over a year.
While the migration crisis was going on in the city, while the algorithms were adjusting, we found ourselves together. We were actually data-locked in a place. It was a dark zone: data about that zone had momentarily disappeared, while the algorithms figured out what to do with it with regard to the changing situation19. There was no information about mobility, energy, commerce, nothing. People there were stuck, because missing data also meant missing energy, transport, schools and stores.
We realized that we were stuck while waiting for a bus: he would have taken it to get to his green home area; I would have taken it just for a few stops, to get to my school.
Some people started walking away from that zone. And we were among them.
I was richer than him. I lived in an area which was completely invisible to him.
It was nothing violent, at the time. There wasn't the pain, yet. It was just that the algorithms made sure (or made it highly likely) that you never, for example, got off at the wrong bus stop20. There would be a positive or negative review, a “suggested for you” route, a certain event, a “something” that would make sure that you only went from A to B, that this was all that was in your perception, with all the Cs, Ds, Es and Fs disappearing from your desires and imaginations21.
16 Datta, Tschantz and Datta (2015) describe multiple forms of data-driven disappearances.
17 Wabash (2015) wrote one of the most famous articles about the accidents which happened when people trusted Google Maps directions more than what they saw with their own eyes.
18 Iaconesi and Persico (2015) discuss the social separations which can be induced through data and algorithmic control.
19 Begley (2013) and Paglen (2009) use commonly available digital maps to show secret locations which become visible because of their obfuscation (for example because they appear pixellated on the map).
Figure 2: Constrained Cities: places in the city disappear from your perception through data.
But that afternoon during the migrant crisis there was no data and no information: we started to
walk, and we fell in love.
We didn't realize that we were different.
When we separated for the night, we discovered that we couldn't get back together and we couldn't
even communicate.
Pain arrived after that.
Pain started with healthcare and work. Limited stimulation was used to communicate dangerous situations, such as incoming strokes and seizures predicted by the algorithms implanted in your body22, and tactile notifications were used to ensure streamlined workflows and to construct fluid interaction patterns between human workers and their robot counterparts23: a small tactile or electrical signal was created on the skin and communicated as data to other people and systems, so that actions could be triggered, such as a medication being ingested or injected24, or some work-related information being addressed.
20 Do online advertising platforms allow advertisers to exclude ad viewers on a racial basis? That is what Angwin and Parris ask in their article (2016).
21 Iaconesi and Persico (2015) and Gatica-Perez et al (2016) describe how large platforms are able, first, to crowdsource whole information ecosystems about territories and, then, to algorithmically control what is available to individuals.
22 Sanfilippo and Pettersen (2015) and Shull (2015) describe interesting and progressively existing scenarios in which haptic body augmentations provide additional senses and capabilities to people, also in scenarios of disability and pre-existing impairment.
23 Starner et al (1997), Milgram, Zhai and Drascic (1993), and Kortum (2008) all describe scenarios in which technological augmentations of the body allow establishing communication patterns with robots and artificial intelligences.
Figure 3: Constrained Cities: data-segregation in the city during a migrant crisis.
Both the street and governments found uses for these types of technologies25. While these devices and implants, together with encrypted communication, were being used for sensual stimulation, to arrange drug smuggling and dealing, and as novel forms of gambling, they were also used by the police for safety monitoring in crowds, for criminal control and, then, to manage crowds as they moved through cities26.
Right, Right, Left.
Then the corridor. People have started providing corridors. There is a peer-to-peer application called AirCAM, in which people rent safe, protected passages to people who want to traverse the city through places they wouldn't otherwise have access to. They create a path through their homes, condos, back yards and halls, and they connect their paths with their neighbors' to create corridors: they wrap them in isolating materials so that there is no network coverage inside, so that you can't be detected and monitored, and so that the pain trigger cannot be sent to your implant.
At any time you can use the app to indicate locations A and B, and check whether there is a corridor which takes you from one to the other. You pay in Bitcoin, anonymously, to gain access and to get the directions.
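Stepping out of the fiction for a moment: the corridor lookup imagined here is, at its core, a path search over a graph of connected private passages. A minimal sketch, with all segment names and connections invented purely for illustration:

```python
from collections import deque

# Hypothetical corridor segments: each maps to the segments it connects onward to.
corridors = {
    "condo_hall": ["garage_ramp"],
    "garage_ramp": ["neighbor_hall"],
    "neighbor_hall": ["ground_floor_flat"],
    "ground_floor_flat": ["street_balcony_drop"],
    "street_balcony_drop": [],
}

def find_corridor(start, goal):
    """Breadth-first search: return the chain of segments from start to goal, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in corridors.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no unmonitored route exists between the two points

print(find_corridor("condo_hall", "street_balcony_drop"))
```

A real service of this kind would add pricing, availability and anonymity layers, but the underlying question it answers is the same one the app in the story answers: is there a connected chain of passages from A to B?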
This time it's through the hall of this condo building, then down to the garage floor, leaving from the ramp, then right into the neighboring building and through the apartment on the ground floor, leaving from their balcony and onto the street with a 1.5 meter drop. Not really accessible, but at least it's cheap.
If everything goes as planned, I will see him at the end of the corridor, in a blind spot.
Blind spots started emerging where the algorithms' structure did not match reality27. Blind spot hunters and connoisseurs appeared, and they created guides and businesses which sat in a regulatory grey area, enabling romantic meetings, illegal trade, gatherings, rave parties and more28.
In the same way, there were individuals whose data profiles were personified blind spots, meaning that they could either achieve incredible feats (since their profiles were not mapped and, thus, they had access to everything) or suffer incredible troubles (as their profiles corresponded to CAMs that were all, completely, red).
24 Like in the scenarios proposed by Breimesser and Reitz (2003), in which cyber medications are ingested or injected.
25 Knibbs (2015) describes the "hustler origins of wearable computing".
26 Rigg (2016), Bogard (2007) and Mitleton-Kelly et al (2013) all deal with existing scenarios in which haptic technologies are used to control movement and behavior.
27 Many studies observe the implications which emerge when organizations become able to control access to knowledge and to deal aggressively with users' privacy; see, for example, Lee (2011), Dwork and Mulligan (2013), and Epstein and Robertson (2014).
28 Still on blind spots: Young's beautiful dystopian architectural fictions, described by Pangburn (2017), show multiple ways in which they could generate reappropriation patterns in the city.
Then, it happened. As I started jumping from the balcony of the last section of the corridor, I felt it: the pain.
As I started suffering, I saw him in the distance. He was turning from the opposite corner of the street, crossing the roadway to arrive on the same sidewalk I was supposed to land on.
He started to collapse as I hit the ground, my senses completely saturated by the white noise of the pain. As I fainted, I saw him reach under his arm, struggling as if to rip the implant off his skin, trying to elude the agony.
The map had changed again.
The Constrained Cities
This short story is a Design Fiction, part of a Near Future Design: a speculative design technique and methodology which we use in our practice to produce scenarios that are the result of systemic research on the topics we deal with.
It is based on our research on the Constrained Cities, a dystopian vision of the near future of cities which we use to investigate potential risks and implications that derive directly from the concepts we are all designing and implementing, as a global community of engineers, designers, technologists, policy makers, entrepreneurs, researchers, practitioners, artists and citizens.
As seen in the references, everything mentioned in the story exists now: maybe in prototypal form, but existing, possible and actively developed. This is a Near Future Design research effort, in which current research trends are interpreted in terms of evolutive tensions. The story is a “What If?” type of interrogation of these evidence-based findings.
The future does not have to be scary. As authors we could have invented a completely different
story: a happy one, fun, and with a great, wonderful, positive ending. Maybe we will create such a
story for our next article.
Here, we wanted to explore an issue which, in our opinion, is in great need of discussion: the ways in which technologies control us, our bodies, intentions and perceptions.
Figure 4: Constrained Cities: separated from someone you love
As we design, develop and achieve wonderful, effective, sustainable services and infrastructures for
our cities, we are also locking ourselves up in knowledge, relational and philosophical bubbles.
Here, we are facing risks which profoundly affect our ability to positively confront diversity and what is unexpected and unforeseen.
What is clear is how these bubbles, on the one hand, reduce (or eliminate altogether) the space for transgression in the city and, on the other hand, reduce our perceptive space and landscape, up to the point in which, as in the story, concepts, places and relations may disappear, leaving us with a biased, egocentric, consumeristic, controlled world.
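The filtering logic behind such bubbles can be made concrete with a deliberately naive sketch. Everything below is hypothetical (the profiles, places, scoring rule and threshold are invented for illustration): a recommender quietly drops any place whose predicted affinity with a user's profile falls below a threshold, so that place never enters that user's perceptive map of the city.

```python
# Illustrative toy model (hypothetical data): profile-driven filtering
# that makes parts of the city invisible to certain users.

def affinity(profile, place):
    """Naive 'compatibility' score between a user profile and a place."""
    score = 0.0
    if place["price_level"] <= profile["budget_level"]:
        score += 0.5
    # Fraction of the place's tags that match the user's recorded interests.
    overlap = len(set(profile["interests"]) & set(place["tags"]))
    score += 0.5 * overlap / max(len(place["tags"]), 1)
    return score

def visible_places(profile, places, threshold=0.5):
    """Return only the places the platform chooses to show this user.
    Everything below the threshold simply disappears from the map."""
    return [p["name"] for p in places if affinity(profile, p) >= threshold]

places = [
    {"name": "Harbor Diner", "price_level": 1, "tags": ["food", "casual"]},
    {"name": "Vault Club", "price_level": 3, "tags": ["music", "exclusive"]},
    {"name": "Open Library", "price_level": 0, "tags": ["books", "casual"]},
]

low_income = {"budget_level": 1, "interests": ["food", "books", "casual"]}
high_income = {"budget_level": 3, "interests": ["music", "exclusive"]}

print(visible_places(low_income, places))
print(visible_places(high_income, places))
```

Run with these invented profiles, the low-budget user never sees the exclusive venue, while the high-budget user sees every place: precisely the asymmetry the story dramatizes, produced by a few lines of unremarkable ranking code.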
Furthermore, this condition is one of remarkable asymmetry in power or, more precisely, of Biopower29: a Biopower which resides in data and interfaces, in their closedness, controlled affordances, opaqueness, lack of interoperability and transparency, and in the constant trade-off between comfort, convenience and availability on one side, and the possibility for critique, complexity and responsibility on the other.
A story – and, thus, research – of this kind may bring about different reactions. Our reaction, as researchers, artists and free (libre) citizens, is to dedicate precise efforts to making sure that these issues do not remain a science fiction tale, but awaken other people's desire, imagination and intelligence, becoming items for active discussion and agency.
In our opinion and understanding, there are both enormous implications and opportunities in this, whether we approach them from the perspective of Design education and practice, of Engineering theory and practice, of Culture and cultural production, or across all the technologies, research, artworks, conferences, workshops, client commissions and research projects we use and conduct in our practice and in our daily lives.
References
Andreu-Perez, Javier et al. 2015. "From Wearable Sensors to Smart Implants – Toward Pervasive
and Personalized Healthcare" IEEE Transactions on Biomedical Engineering Volume: 62, Issue:
12, 2750 – 2762.
Angwin, Julia, Parris, Terry Jr. 2016 "Facebook Lets Advertisers Exclude Users by Race."
ProPublica. Accessed April 28 2017: https://www.propublica.org/article/facebook-lets-advertisers-
exclude-users-by-race
Begley, Josh. 2013. "How do you measure a military footprint?" Accessed April 28 2017:
http://empire.is/about
Bogard, William. 2007. "The Coils of a Serpent: Haptic Space and Control Societies". Ctheory:
1000 days of theory. Accessed April 28 2017: http://ctheory.net/ctheory_wp/the-coils-of-a-serpent-
haptic-space-and-control-societies
Breimesser, Fritz, Reitz, Arno. 2003. "Pocket monitor for patient cards". Patent No: US6626358 B1
Cain, Patrick. 2017. "Fake news: Meet the alternate-reality version of the Quebec City shooting."
Global News. Accessed April 28 2017: http://globalnews.ca/news/3211421/fake-news-meet-the-
alternate-reality-version-of-the-quebec-city-shooting/
Chang, Alvin. 2016. "How the internet keeps poor people in poor neighborhoods." Vox. Accessed
April 28 2017: https://www.vox.com/2016/12/12/13867692/poor-neighborhoods-targeted-ads-
internet-cartoon
Consolvo, Sunny et al. 2008. "Flowers or a robot army?: encouraging awareness & activity with personal, mobile displays." Proceedings of the 10th international conference on Ubiquitous computing, pages 54–63. ACM, 2008.
29 Foucault's "The History of Sexuality" (1998) introduces this concept.
Datta, Amit, Tschantz, Michael C., Datta, Anupam. 2015. "Automated Experiments on Ad Privacy
Settings: A Tale of Opacity, Choice, and Discrimination." Proceedings on Privacy Enhancing
Technologies, Volume 2015, Issue 1 (Apr 2015). https://doi.org/10.1515/popets-2015-0007
Dillahunt, Tawanna R., Brooks, Christopher A., Gulati, Samarth. 2015. "Detecting and visualizing
filter bubbles in Google and Bing." CHI'15 Extended Abstracts, ACM 978-1-4503-3146-3/15/04,
http://dx.doi.org/10.1145/2702613.2732850.
Dwork, Cynthia, Mulligan, Deidre K. 2013. "It's Not Privacy, And It's Not Fair." 66 Stan. L. Rev.
Online 35 (2013-2014)
Epstein, Robert, Robertson, Ronald E. 2014. "The search engine manipulation effect (SEME) and
its possible impact on the outcomes of elections." PNAS, vol. 112 no. 33, doi:
10.1073/pnas.1419828112
Flaxman, Seth, Goel, Sharad, Rao, Justin M. 2014. "Ideological Segregation and the Effects of Social Media on News Consumption." Accessed Apr 28 2017:
https://bfi.uchicago.edu/research/working-paper/ideological-segregation-and-e%EF%AC%80ects-
social-media-news-consumption
Flaxman, Seth, Goel, Sharad, Rao, Justin M. 2016. "Filter Bubbles, Echo Chambers, and Online
News Consumption." Public Opin Q (2016) 80 (S1): 298-320.
Foucault, Michel. 1998 The History of Sexuality Vol. 1: The Will to Knowledge. London: Penguin
Gatica-Perez, Daniel, Correa, Salvador R., Santani, Darshan. 2016. "What TripAdvisor Can’t Tell:
Crowdsourcing Urban Impressions for Whole Cities." Digital Polis. Paris: L'Oeil d'Or.
Graham, Mark, Zook, Matthew, Boulton, Andrew. 2012. "Augmented reality in urban places:
contested content and the duplicity of code." Transactions of the Institute of British Geographers,
Volume 38, Issue 3, 464–479. DOI: 10.1111/j.1475-5661.2012.00539.x
Iaconesi, Salvatore, Persico, Oriana. 2015. "Il Terzo Infoscape. Dati, informazioni e saperi nella
città e nuovi paradigmi di interazione urbana" In: I Media Digitali e l’Interazione Uomo-Macchina
edited by Simone Arcagni, p. 139-168. Rome: Aracne Editore.
Kaur, Satwant. 2014. "How are the Embedded Chips Going to Affect Our Lives?" IETE Technical
Review volume 29/2012, Issue 2, 101-104.
Knibbs, Kate. 2015. "The Hustler Origins of Wearable Computers." Gizmodo. Accessed April 28
2017: http://gizmodo.com/casinos-and-con-men-the-hustler-origins-of-wearable-co-1718085809
Kortum, Philip. 2008. HCI Beyond the GUI: Design for Haptic, Speech, Olfactory, and Other
Nontraditional Interfaces. Burlington, Massachusetts: Morgan Kaufmann.
Lee, Micky. 2011. "Google ads and the blindspot debate." Media, Culture & Society, Vol 33, Issue
3, pp. 433 - 447.
Lupton, Deborah. 2014. "Self-tracking cultures: towards a sociology of personal informatics"
Proceedings of the 26th Australian Computer-Human Interaction Conference on Designing
Futures: the Future of Design, 77-86.
McClusky, Mark. 2009. "The nike experiment: How the shoe giant unleashed the power of personal
metrics." Wired, 17(07). Accessed April 28 2017: https://www.wired.com/2009/06/lbnp-nike/
Milgram, Paul, Zhai, Shumin, Drascic, David. 1993. "Applications of augmented reality for human-robot communication." Intelligent Robots and Systems '93, IROS '93, DOI: 10.1109/IROS.1993.583833
Mitleton-Kelly, Eve, Deschenaux, Ivan, Maag, Christian, Fullerton, Matthew, Celikkaya, Nihan.
2013 "Enhancing Crowd Evacuation and Traffic Management Through AmI Technologies: A
Review of the Literature." Co-evolution of Intelligent Socio-technical Systems. DOI 10.1007/978-3-
642-36614-7_2
O'Connor, Sarah. 2016. "When your boss is an algorithm." Financial Times. Accessed April 28
2017: https://www.ft.com/content/88fdc58e-754f-11e6-b60a-de4532d5ea35
Paglen, Trevor. 2009. Blank Spots on the Map: The Dark Geography of the Pentagon's Secret
World. London: Dutton Adult.
Pangburn, DJ. 2017. "New Subcultures Surface in the Future-Dystopian Films of Liam Young."
Vice Creators. Accessed April 28 2017: https://creators.vice.com/en_us/article/new-subcultures-
surface-in-the-future-dystopian-films-of-liam-young
Prelert, Mark J. 2014. "IoT Won't Work Without Artificial Intelligence". Wired. Accessed April 28
2017: https://www.wired.com/insights/2014/11/iot-wont-work-without-artificial-intelligence/
Rafferty, A. P., Reeves, A. J., McGee, H. B., Pivarnik, J. M. et al. 2002. "Physical activity patterns
among walkers and compliance with public health recommendations." Medicine and science in
sports and exercise, 34(8):1255–1261.
Rigg, Jaime. 2016. "Teslasuit does full-body haptic feedback for VR." Engadget. Accessed April 28
2017: https://www.engadget.com/2016/01/06/teslasuit-haptic-vr/
Sanfilippo, Filippo, Pettersen, Kristin Y. 2015. "A Sensor Fusion Wearable Health-Monitoring
System with Haptic Feedback." Innovations in Information Technology (IIT), 2015 11th
International Conference on, DOI: 10.1109/INNOVATIONS.2015.7381551
Schwartz, Paul M. 2006. "Privacy Inalienability And Personal Data Chips." Privacy and
Technologies of Identity, 93-113.
Shull, Peter B., Damian, Dana D. 2015. "Haptic wearables as sensory replacement, sensory
augmentation and trainer – a review." Journal of NeuroEngineering and Rehabilitation, DOI:
10.1186/s12984-015-0055-z
Silverstone, Tom, Kleeman, Jenny, Tait, Michael. 2017. "Rise of the sex robots". The Guardian.
Accessed April 28 2017: https://www.theguardian.com/technology/video/2017/apr/27/rise-of-the-
sex-robots-video
Starner, Thad et al. 1997. "Augmented Reality through Wearable Computing." Presence, Vol. 6, No.
4, Pages: 386-398, doi:10.1162/pres.1997.6.4.386.
Townsend, Adam. 2017. "Algorithmic Recruiting of Laborers on a Gig platform.". Accessed April
28 2017: https://medium.com/@adamscrabble/algorithmic-recruiting-of-laborers-on-a-gig-
platform-19213dac00d8
Tufekci, Zeynep. 2015. Algorithmic Harms beyond Facebook and Google: Emergent Challenges of
Computational Agency. Colorado Technology Law Journal 13: 203.
Tziallas, Evangelos. 2015. "Gamified Eroticism: Gay Male “Social Networking” Applications and
Self-Pornography." Sexuality & Culture, Volume 19, Issue 4, pp 759–775.
Wabash, Robert. 2015. "9 Car Accidents Caused by Google Maps & GPS." The Ranker. Accessed
Apr 28 2017: http://www.ranker.com/list/9-car-accidents-caused-by-google-maps-and-gps/robert-
wabash
Whitson, Jennifer R. 2013. "Gaming the Quantified Self" Surveillance & Society 11.1/2 (2013):
163-176.
Zuboff, Shoshana. 2015. "Big other: surveillance capitalism and the prospects of an information
civilization." Journal of Information Technology, Volume 30, Issue 1, pp 75–89