Conference Paper

Constrained Cities: Filter Bubbles in the Physical Space of the City

HER: She Loves Data
Constrained Cities
What is the space for transgression in the smart city?
Given the analysis of current trends and developments in the worldwide discussion about the
future of cities, this question seems progressively harder to address.
While extremely useful and effective, Big Data, sensors, algorithms, bottom-up initiatives,
services and systems, artificial intelligences, robotics, domotics, and systems and designs for health,
education and security create service, data, knowledge and relation bubbles whose effect is to reduce
imperfection, unexpectedness, unpredictability, surprise, weirdness and chance.
Yet these modalities are necessary for innovation, inclusion, co-existence and evolution.
Data create spaces: physically and perceptively, through algorithms and their effects on the
interfaces (whether in an app or in a physical space) which we use and which mediate our access to
those spaces and, in some cases, even our ability to perceive them, to recognize them as spaces
which exist and are meant for us.
A review, an algorithmic suggestion, a $$$$ price range for a restaurant, or computational
recommendations for real estate, healthcare services, entertainment, education and work may cause
certain people to be completely excluded from certain parts of the city: to not perceive them, to not
desire to go there, to erase them from their perceptive map of the city.
This can work according to several different logics: from financial ones, to aesthetic ones, to racial,
migratory, gender ones, and more.
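To make the mechanism concrete, a deliberately toy sketch (the place data, scoring function and cut-off are our invention, not any actual platform's algorithm) shows how profile-based ranking plus truncation already produces different perceptive maps of the same city for different users:

```python
# Toy sketch (hypothetical data and scoring, not any real platform's
# algorithm): a recommender ranks places by affinity to a user's data
# profile and keeps only the top-k results. Everything below the
# cut-off simply vanishes from that user's perceptive map of the city.

places = {
    "downtown_bistro": {"price": 4, "vibe": "upscale"},
    "suburb_diner":    {"price": 1, "vibe": "casual"},
    "harbor_club":     {"price": 3, "vibe": "upscale"},
    "market_cafe":     {"price": 2, "vibe": "casual"},
}

def visible_city(profile, places, top_k=2):
    """Return the subset of the city this profile gets to see."""
    def affinity(p):
        # Closer price match and a matching "vibe" score higher.
        return -abs(p["price"] - profile["budget"]) + (p["vibe"] == profile["vibe"])
    ranked = sorted(places, key=lambda name: affinity(places[name]), reverse=True)
    return set(ranked[:top_k])

# Two different data profiles receive two different cities:
rich = {"budget": 4, "vibe": "upscale"}
poor = {"budget": 1, "vibe": "casual"}
print(visible_city(rich, places))  # the two upscale venues only
print(visible_city(poor, places))  # the two casual venues only
```

The point is not the scoring rule, which is arbitrary, but the truncation: places below the cut-off are not down-ranked, they are absent, and their absence is invisible to the user.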
In this contribution, we interweave narrative techniques with solid, evidence-based research to
describe a few custom-made case studies which explore this concept: the Constrained Cities.
The pain was unbearable.
It was like millions of sharp needles poking your skin from the inside, freezing cold, filling all
of your neural bandwidth with pain, obstructing any other sensation.
But still, I had to go on, I had to see him.
When they appeared – in wearable computing experiments in the 1970s and then, diffusely, in the
2000s1 – nobody would have thought that they would be used in this way.
Pornography, health, insurance and work2. That's how it all started. Wearable devices and, then,
chips: first subcutaneous, like evolved tattoos, then implants3. The next level of Augmented and
Virtual Reality, they called them: hardware and software connected to your central nervous system
and to your data profiles in the cloud that would allow you to feel impossible, outrageously
interactive sensations in connection with people and bots. Then came the insurance companies, which used the
1 As can be seen in Consolvo et al (2008), McClusky (2009), Rafferty et al (2002), where there is documented
evidence of the first experiments and, then, of their evolutions.
2 Lupton (2014) provides a wonderful overview of the Quantified Self scenario; Silverstone (2017) and Tziallas
(2015) describe the rise of sex robots, including the evolution from both software agents and sex dolls; Prelert
(2014) focuses on wearable IoT devices and on the evolution of data usage in medical domains; Whitson (2013)
points out various forms of quantified-self gaming scenarios which have evolved to address work and insurance.
3 See Andreu-Perez (2015) for an account of these evolutions, beginning from healthcare.
same technologies to monitor you 24/7 and to provide you with enormous discounts and, eventually, with
completely free insurance and medical coverage4. Then new types of jobs came along, in which you
just installed the implant, subscribed to the service, and never had to look for a job
again5: the job found you6, by interpreting the physical, emotional, cultural and philosophical data
captured 24/7 directly from your body and all of your digital manifestations, to understand which
jobs you were suited for7. You received a message each day: be here, at this time, to do this job.
By 2021 they were the norm, everybody had one.
Pain fades away, while I recover from the intense stimulation.
The map had become very dynamic over the past few years8. Until a while ago it was more static
in nature: there were places you could go, places you couldn't, and places that, in many ways, you did
not even know existed. Some of them really were out of your perceptive and information
landscape: you just did not find them when you searched for a pub in which to hang out, a
restaurant in which to eat, or a neighborhood in which to buy or rent your next home9. Algorithms
made sure that they were not in the list of results10, or that their reviews were chosen in ways
tailored for you, to prevent you from choosing them, from finding them attractive. For
reasons that were financial, political or cultural11, algorithms and those who controlled them had strategies
about how much money people in a certain neighborhood should have, what level of education,
what styles, or what sort of relational network they should be in.
4 In his article, Kaur (2014) shows multiple implications of embodied chips, starting from daily scenarios. Schwartz
(2006) analyzes implications for privacy and control.
5 What happens when my boss is an algorithm? This is what O'Connor asks in her article (2016).
6 See Townsend (2017) for yet more analysis of the implications of algorithmic recruiting.
7 Zuboff (2015) asks: what happens when the world you experience starts depending on the information which
algorithms have about you?
8 Dillahunt (et al, 2015) show the evolution of filter bubbles over time and across services.
9 Chang (2016) explores multiple cases in which information bubbles become geographic, keeping disadvantaged or
diverse people in their neighborhoods.
10 Graham, Zook and Boulton (2012) ask: how does conflict show up in maps?
11 Iaconesi and Persico (2015) show multiple types of digital maps which represent complex financial, physical,
economic, cultural, commercial, political geographies, and their implications on our perception and understanding of
the world.
The combination of the strategies of the algorithms of governments, city administrations,
corporations, institutions and organizations for health, education, finance, work, entertainment and
culture, created a map, for each of us.
I back away and try another street, on the left. I have the printout of the latest version of my CAM
(Citizen Accessibility Map), which I hacked yesterday, but it is already out-of-date. I only hope that
it didn't change that much, and that our plan is still valid.
You can't connect in zones which are not in your CAM. Or, at least, not directly.
When I met him, we had compatible CAMs, for a brief moment. But he was different from me.12
At that time, our city was being torn apart by migrations. It seems that the algorithms took a while to
adjust to the systematic arrival in the city of the hundreds of thousands of people who came13. In
those days, entire sections of the city disappeared for each of us, to accommodate where migrants
were being hosted14. “Stored”, some said: an interesting choice of words, as whole parts of the
city became black boxes to contain incoming migrants, with corridors for them to come in, and
others to leave the city when it was time. These areas vanished from one day to the next from
our maps. It was difficult to adjust: fake news about building collapses and terrorist attacks;
commercial venues that instantly started systematically receiving negative reviews; map directions
that simultaneously started to avoid certain areas of the city15. Some people even got stuck and
could not get home for a few days, as there were no taxis, trains, buses or information and news
available about what was happening.
This was when the Cross-CAM Inhibition Act was launched: if your CAM did not include a certain
part of the city, you could not send messages to individuals or organizations that were in it.
12 Flaxman, Goel and Rao (2016) show how filter bubbles and echo chambers affect our information consumption.
13 Flaxman, Goel and Rao (2014) demonstrate various forms of segregation originated through social networking data
and information.
14 Tufekci (2015) introduces Computational Agency, and its impact on freedoms and rights.
15 Cain's article (2017) describes how the city of Quebec suffered from false representations, and describes the impacts
on presences in the city.
Figure 1: Constrained Cities: view from the exhibit, the personal map of the city, describing the
places in which you cannot go or reside.
In this way, places disappeared16. It took about one month for people to get used to this. Then it
became progressively normal: you would just follow the directions of the digital map, as before, or
the reviews, to find the best deal, restaurant, school to attend or neighborhood to live in, without asking too
many questions17.
I arrive at a boundary. The map beyond this intersection, for me, is red. There are no physical
barriers. I can see what is going on on the other side: there are people just like me (maybe not... who
knows what's different in their data profiles), performing their daily routines. For them it's green, for
me it's red: our profiles are different18.
I have to cross this section of the city to get to him, to where we arranged to meet.
We've been separated for over a year.
While the migration crisis was going on in the city, while the algorithms were adjusting, we found
ourselves together. We were actually data-locked in a place. It was a dark zone: data about that
zone had momentarily disappeared while the algorithms figured out what to do with it in the
changing situation19. There was no information about mobility, energy, commerce, nothing. People
there were stuck, because missing data also meant missing energy, transport, schools, stores.
We realized that we were stuck while waiting for a bus: he would have taken it to reach his
green home area; I would have taken it just for a few stops, to get to my school.
Some people started walking away from that zone. And we were among them.
I was richer than him. I lived in an area which was completely invisible for him.
It was nothing violent, at the time. There wasn't the pain, yet. It was just that algorithms made it impossible
16 Datta, Tschantz and Datta (2015) describe multiple forms of data driven disappearances.
17 Wabash (2015) has written one of the most famous articles about the accidents which happened when people trusted
Google Maps directions more than what they saw with their own eyes.
18 Iaconesi and Persico (2015) talk about the social separations which can be induced through data and algorithmic means.
19 Begley (2013) and Paglen (2009) use commonly available digital maps to show secret locations which become
visible because of their obfuscation (for example because they appear as pixellated on the map).
Figure 2: Constrained Cities: places in the city disappear from your perception through data.
(or highly unlikely) that you, for example, got off at the wrong bus stop20. There would be a
positive or negative review, a “suggested for you” route, a certain event, a “something” that would
make sure that you only went from A to B, that this was all there was in your perception, all the Cs, Ds,
Es and Fs disappearing from your desires and imaginations21.
But that afternoon during the migrant crisis there was no data and no information: we started to
walk, and we fell in love.
We didn't realize that we were different.
When we separated for the night, we discovered that we couldn't get back together and we couldn't
even communicate.
Pain arrived after that.
Pain started with healthcare and work. Limited stimulation was used to communicate dangerous
situations, such as incoming strokes and seizures predicted by the algorithms implanted in your
body22, and tactile notifications to ensure streamlined workflows, and to construct fluid interaction
patterns between human workers and their robot counterparts23: a small tactile or electrical signal
20 Do online advertising platforms allow advertisers to exclude ad viewers on a racial basis? That is what Angwin and Parris ask in
their article (2016).
21 Iaconesi and Persico (2015) and Gatica-Perez (et al, 2016) describe how large platforms are able, first, to
crowdsource whole information ecosystems about territories and, then, to algorithmically control what is available
to individuals.
22 Sanfilippo and Pettersen (2015) and Shull (2015) describe interesting and increasingly real scenarios in which
haptic body augmentations provide additional senses and capabilities to people, including in scenarios of disability
and pre-existing impairments.
23 Starner (et al, 1997), Milgram, Zhai and Drascic (1993), and Kortum (2008), all describe scenarios in which
technological augmentations of the body allow establishing communication patterns with robots and artificial
Figure 3: Constrained Cities: data-segregation in the city during a migrant crisis.
was created on the skin and communicated as data to other people and systems, so that actions could
be triggered, such as a medication being ingested or injected24, or some work-related information
being addressed.
Both the street and governments found uses for these types of technologies25. While these devices and
implants, together with encrypted communication, were being used for sensual stimulation, to arrange
drug smuggling and dealing operations, and as novel forms of gambling, they were also used by
police for safety monitoring in crowds, criminal control and, then, to manage crowds as they moved
through cities26.
Right, Right, Left.
Then the corridor. People have started providing corridors. There is a peer-to-peer application
called AirCAM, through which people rent safe, protected passages to people who want to
traverse the city through places they wouldn't otherwise have access to. They create a path in their homes, condos,
back yards, halls, and they connect paths with their neighbors' to create corridors: they wrap them
in isolating materials so that there is no network coverage in them, so that you can't be detected
and monitored in them, and so that the pain trigger cannot be sent to your implant.
At any time you can use the app to indicate locations A and B, to check if there is a corridor which
takes you from one to the other. You pay in Bitcoin, anonymously, to gain access and to get the route.
This time it's through the hall of this condo building, then down to the garage floor, leaving from
the ramp, then right inside the neighboring building and through the apartment on the ground floor,
leaving from their balcony and onto the street with a 1.5 meter drop. Not really accessible, but at
least it's cheap.
If everything goes as planned, I will see him at the end of the corridor, in a blind spot.
Blind spots started emerging where the algorithms' structure did not match reality27. Blind spot hunters
and connoisseurs emerged, creating guides and businesses which sat in a regulatory
grey area, enabling romantic meetings, illegal trade, gatherings, rave parties and more28.
In the same way, there were individuals whose data profiles made them personified blind spots, meaning
that they could either achieve incredible feats (since their profiles were not mapped and, thus, they
had access to everything) or suffer incredible troubles (as their profiles corresponded to CAMs that
were all, completely, red).
24 Like in the scenarios proposed by Breimesser and Reitz (2003), in which cyber medications are ingested or injected.
25 Knibbs (2015) describes the "hustler origins of wearable computing".
26 Rigg (2016), Bogard (2007) and Mitleton-Kelly (et al, 2013), all deal with existing scenarios in which haptic
technologies are used to control movement and behavior.
27 Many studies exist that observe the implications which emerge when organizations become able to control access to
knowledge and to aggressively deal with users' privacy, for example Lee (2011); Dwork and Mulligan (2013);
Epstein and Robertson (2014).
28 Still on blind spots: Young's beautiful dystopian architectural fictions, described by Pangburn (2017), show
multiple ways in which they could generate reappropriation patterns in the city.
Then, it happened. As I jumped from the balcony of the last section of the corridor, I felt it:
the pain.
As I started suffering, I saw him in the distance. He was turning from the opposite corner of the
street, crossing the roadway to arrive on the same sidewalk I was supposed to land on.
He started to collapse as I hit the ground, my senses completely saturated by the white noise
of the pain. As I fainted, I saw him reach under his arm, struggling as if to rip the implant off his
skin, to try and elude the agony.
The map had changed again.
The Constrained Cities
This short story is a Design Fiction which is part of a Near Future Design: a speculative design
technique and methodology which we use in our practice to produce scenarios that are the result
of systematic research on the topics we deal with.
This one is based on our research on the Constrained Cities, a dystopian vision of the near future of
the cities which we use to investigate potential risks and implications which derive directly from the
concepts we all are designing and implementing, as a global community of engineers, designers,
technologists, policy makers, entrepreneurs, researchers, practitioners, artists, citizens.
As seen in the references, everything mentioned in the story exists now: maybe in
prototype form, but existing, possible, and actively developed. This is Near Future Design research,
in which current research trends are interpreted in terms of evolutionary tensions. The story is a "What
If?" type of interrogation of these evidence-based findings.
The future does not have to be scary. As authors we could have invented a completely different
story: a happy one, fun, and with a great, wonderful, positive ending. Maybe we will create such a
story for our next article.
Here, we wanted to explore an issue which, in our opinion, is in great need of discussion: the ways
in which technologies control us, our bodies, intentions and perception.
Figure 4: Constrained Cities: separated from someone you love
As we design, develop and deploy wonderful, effective, sustainable services and infrastructures for
our cities, we are also locking ourselves up in knowledge, relational and philosophical bubbles.
Here, we are facing risks which profoundly affect our ability to positively confront diversity
and what is unexpected and unforeseen.
What is clear is how these bubbles, on the one hand, reduce (or eliminate altogether) the space for
transgression in the city and, on the other hand, reduce our perceptive space and landscape,
up to the point at which, as in the story, concepts, places and relations may disappear, leaving us with a
biased, egocentric, consumeristic, controlled world.
Furthermore, this condition is one of remarkable asymmetry in power or, more precisely, of
Biopower29. A Biopower which resides in data and interfaces, in their closedness, controlled
affordances, opaqueness, lack of interoperability and transparency, and in the constant trade-off
between comfort, convenience and availability, and the possibility for critique, complexity and
transgression.
A story – and, thus, research – of this kind may provoke different reactions. Our reaction, as
researchers, artists and free (libre) citizens, is to dedicate precise efforts to making sure that these
issues do not remain a science fiction tale, but awaken other people's desire, imagination and
intelligence, to become items for active discussion and agency.
In our opinion and understanding, there are both enormous implications and opportunities here,
whether we approach them from a Design education and practice perspective, or in Engineering
theories and practice, in Culture and cultural production, and in all the technologies, research,
artworks, conferences, workshops, client commissions and research projects we use and conduct in our
practice, and across our daily lives.
Andreu-Perez, Javier et al. 2015. "From Wearable Sensors to Smart Implants – Toward Pervasive
and Personalized Healthcare." IEEE Transactions on Biomedical Engineering, Volume 62, Issue
12, 2750–2762.
Angwin, Julia, Parris, Terry Jr. 2016. "Facebook Lets Advertisers Exclude Users by Race."
ProPublica. Accessed April 28, 2017.
Begley, Josh. 2013. "How do you measure a military footprint?" Accessed April 28, 2017.
Bogard, William. 2007. "The Coils of a Serpent: Haptic Space and Control Societies." Ctheory:
1000 days of theory. Accessed April 28, 2017.
Breimesser, Fritz, Reitz, Arno. 2003. "Pocket monitor for patient cards". Patent No: US6626358 B1
Cain, Patrick. 2017. "Fake news: Meet the alternate-reality version of the Quebec City shooting."
Global News. Accessed April 28, 2017.
Chang, Alvin. 2016. "How the internet keeps poor people in poor neighborhoods." Vox. Accessed
April 28, 2017.
Consolvo, Sunny et al. 2008. "Flowers or a robot army?: encouraging awareness & activity with
personal, mobile displays." Proceedings of the 10th international conference on Ubiquitous
computing, pages 54–63. ACM, 2008.
29 Foucault's "The History of Sexuality" (1998) introduces this concept.
Datta, Amit, Tschantz, Michael C., Datta, Anupam. 2015. "Automated Experiments on Ad Privacy
Settings: A Tale of Opacity, Choice, and Discrimination." Proceedings on Privacy Enhancing
Technologies, Volume 2015, Issue 1 (Apr 2015).
Dillahunt, Tawanna R., Brooks, Christopher A., Gulati, Samarth. 2015. "Detecting and visualizing
filter bubbles in Google and Bing." CHI'15 Extended Abstracts, ACM 978-1-4503-3146-3/15/04,
Dwork, Cynthia, Mulligan, Deidre K. 2013. "It's Not Privacy, And It's Not Fair." 66 Stan. L. Rev.
Online 35 (2013-2014)
Epstein, Robert, Robertson, Ronald E. 2014. "The search engine manipulation effect (SEME) and
its possible impact on the outcomes of elections." PNAS, vol. 112 no. 33, doi:
Flaxman, Seth, Goel, Sharad, Rao, Justin M. 2014. "Ideological Segregation and the Effects of
Social Media on News Consumption." Accessed April 28, 2017.
Flaxman, Seth, Goel, Sharad, Rao, Justin M. 2016. "Filter Bubbles, Echo Chambers, and Online
News Consumption." Public Opin Q (2016) 80 (S1): 298-320.
Foucault, Michel. 1998 The History of Sexuality Vol. 1: The Will to Knowledge. London: Penguin
Gatica-Perez, Daniel, Correa, Salvador R., Santani, Darshan. 2016. "What TripAdvisor Can’t Tell:
Crowdsourcing Urban Impressions for Whole Cities." Digital Polis. Paris: L'Oeil d'Or.
Graham, Mark, Zook, Matthew, Boulton, Andrew. 2012. "Augmented reality in urban places:
contested content and the duplicity of code." Transactions of the Institute of British Geographers,
Volume 38, Issue 3, 464–479. DOI: 10.1111/j.1475-5661.2012.00539.x
Iaconesi, Salvatore, Persico, Oriana. 2015. "Il Terzo Infoscape. Dati, informazioni e saperi nella
città e nuovi paradigmi di interazione urbana" In: I Media Digitali e l’Interazione Uomo-Macchina
edited by Simone Arcagni, p. 139-168. Rome: Aracne Editore.
Kaur, Satwant. 2014. "How are the Embedded Chips Going to Affect Our Lives?" IETE Technical
Review volume 29/2012, Issue 2, 101-104.
Knibbs, Kate. 2015. "The Hustler Origins of Wearable Computers." Gizmodo. Accessed April 28, 2017.
Kortum, Philip. 2008. HCI Beyond the GUI: Design for Haptic, Speech, Olfactory, and Other
Nontraditional Interfaces. Burlington, Massachusetts: Morgan Kaufmann.
Lee, Micky. 2011. "Google ads and the blindspot debate." Media, Culture & Society, Vol 33, Issue
3, pp. 433 - 447.
Lupton, Deborah. 2014. "Self-tracking cultures: towards a sociology of personal informatics"
Proceedings of the 26th Australian Computer-Human Interaction Conference on Designing
Futures: the Future of Design, 77-86.
McClusky, Mark. 2009. "The Nike experiment: How the shoe giant unleashed the power of personal
metrics." Wired, 17(07). Accessed April 28, 2017.
Milgram, P., Zhai, S., Drascic, D. 1993 "Applications of augmented reality for human-robot
communication." Intelligent Robots and Systems '93, IROS '93, DOI: 10.1109/IROS.1993.583833
Mitleton-Kelly, Eve, Deschenaux, Ivan, Maag, Christian, Fullerton, Matthew, Celikkaya, Nihan.
2013 "Enhancing Crowd Evacuation and Traffic Management Through AmI Technologies: A
Review of the Literature." Co-evolution of Intelligent Socio-technical Systems. DOI 10.1007/978-3-
O'Connor, Sarah. 2016. "When your boss is an algorithm." Financial Times. Accessed April 28, 2017.
Paglen, Trevor. 2009. Blank Spots on the Map: The Dark Geography of the Pentagon's Secret
World. London: Dutton Adult.
Pangburn, DJ. 2017. "New Subcultures Surface in the Future-Dystopian Films of Liam Young."
Vice Creators. Accessed April 28, 2017.
Prelert, Mark J. 2014. "IoT Won't Work Without Artificial Intelligence." Wired. Accessed April 28, 2017.
Rafferty, A. P., Reeves, A. J., McGee, H. B., Pivarnik, J. M. et al. 2002. "Physical activity patterns
among walkers and compliance with public health recommendations." Medicine and science in
sports and exercise, 34(8):1255–1261.
Rigg, Jaime. 2016. "Teslasuit does full-body haptic feedback for VR." Engadget. Accessed April 28, 2017.
Sanfilippo, Filippo, Pettersen, Kristin Y. 2015. "A Sensor Fusion Wearable Health-Monitoring
System with Haptic Feedback." Innovations in Information Technology (IIT), 2015 11th
International Conference on, DOI: 10.1109/INNOVATIONS.2015.7381551
Schwartz, Paul M. 2006. "Privacy Inalienability And Personal Data Chips." Privacy and
Technologies of Identity, 93-113.
Shull, Peter B., Damian, Dana D. 2015. "Haptic wearables as sensory replacement, sensory
augmentation and trainer – a review." Journal of NeuroEngineering and Rehabilitation, DOI:
Silverstone, Tom, Kleeman, Jenny, Tait, Michael. 2017. "Rise of the sex robots." The Guardian.
Accessed April 28, 2017.
Starner, Thad et al 1997 ."Augmented Reality through Wearable Computing." Presence, Vol. 6, No.
4, Pages: 386-398, doi:10.1162/pres.1997.6.4.386.
Townsend, Adam. 2017. "Algorithmic Recruiting of Laborers on a Gig platform." Accessed April
28, 2017.
Tufekci, Zeynep. 2015. Algorithmic Harms beyond Facebook and Google: Emergent Challenges of
Computational Agency. Colorado Technology Law Journal 13: 203.
Tziallas, Evangelos. 2015. "Gamified Eroticism: Gay Male “Social Networking” Applications and
Self-Pornography" Sexuality & Culture Volume 19, Issue 4, pp 759–775.
Wabash, Robert. 2015. "9 Car Accidents Caused by Google Maps & GPS." The Ranker. Accessed
April 28, 2017.
Whitson, Jennifer R. 2013. "Gaming the Quantified Self" Surveillance & Society 11.1/2 (2013):
Zuboff, Shoshana. 2015. "Big other: surveillance capitalism and the prospects of an information
civilization." Journal of Information Technology, Volume 30, Issue 1, pp 75–89
Conference Paper
Full-text available
Despite the pervasiveness of search engines, most users know little about the implications of search engine algorithms and are unaware of how they work. People using web search engines assume that search results are unbiased and neutral. Filter bubbles, or personalized results, could lead to polarizing effects across populations, which could create divisions in society. This preliminary work explores whether the filter bubble can be measured and described and is an initial investigation towards the larger goal of identifying how non-search experts might understand how the filter bubble impacts their search results.
Conference Paper
Full-text available
A wearable integrated health-monitoring system is presented in this paper. The system is based on a multi-sensor fusion approach. It consists of a chest-worn device that embeds a controller board, an electrocardiogram (ECG) sensor, a temperature sensor, an accelerometer, a vibration motor, a colour-changing light-emitting diode (LED) and a pushbutton. This multi-sensor device allows for performing biometric and medical monitoring applications. Distinctive haptic feedback patterns can be actuated by means of the embedded vibration motor according to the user's health state. The embedded colour-changing LED is employed to provide the wearer with an additional intuitive visual feedback of the current health state. The pushbutton provided can be pushed by the user to report a potential emergency condition. The collected biometric information can be used to monitor the health state of the person involved in real-time or to get sensitive data to be subsequently analysed for medical diagnosis. In this preliminary work, the system architecture is presented. As a possible application scenario, the health-monitoring of offshore operators is considered. Related initial simulations and experiments are carried out to validate the efficiency of the proposed technology. In particular, the system reduces risk, taking into consideration assessments based on the individual and on overall potentially-harmful situations.
Full-text available
Significance We present evidence from five experiments in two countries suggesting the power and robustness of the search engine manipulation effect (SEME). Specifically, we show that ( i ) biased search rankings can shift the voting preferences of undecided voters by 20% or more, ( ii ) the shift can be much higher in some demographic groups, and ( iii ) such rankings can be masked so that people show no awareness of the manipulation. Knowing the proportion of undecided voters in a population who have Internet access, along with the proportion of those voters who can be influenced using SEME, allows one to calculate the win margin below which SEME might be able to determine an election outcome.
Full-text available
Gamification combines the playful design and feedback mechanisms from games with users' social profiles (e.g. Facebook, twitter, and LinkedIn) in non-game applications. Successful gamification practices are reliant on encouraging playful subjectivities so that users voluntarily expose their personal information, which is then used to drive behavioural change (e.g. weight loss, workplace productivity, educational advancement, consumer loyalty, etc.). The pleasures of play, the promise of a 'game', and the desire to level up and win are used to inculcate desirable skill sets and behaviours. Gamification is rooted in surveillance; providing real-time feedback about users' actions by amassing large quantities of data and then simplifying this data into modes that easily understandable, such as progress bars, graphs and charts. This article provides an introduction to gamification for surveillance scholars. I first provide brief definitions of gamification, games and play, linking the effectiveness of gamification to the quantification of everyday life. I then explain how the quantification in gamification is different from the quantification in both analog spaces and digital non-game spaces. Next, I draw from governmentality studies to show how quantification is leveraged in terms of surveillance. I employ three examples to demonstrate the social effects and impacts of gamified behaviour. These examples range from using self-surveillance to gamify everyday life, to the participatory surveillance evoked by social networking services, to the hierarchical surveillance of the gamified call-centre. Importantly, the call-centre example becomes a limit case, emphasizing the inability to gamify all spaces, especially those framed by work and not play. This leads to my conclusion, arguing that without knowing first what games and play are, we cannot accurately respond to and critique the playful surveillant technologies leveraged by gamification.
To partly address people’s concerns over web tracking, Google has created the Ad Settings webpage to provide information about and some choice over the profiles Google creates on users. We present AdFisher, an automated tool that explores how user behaviors, Google’s ads, and Ad Settings interact. AdFisher can run browser-based experiments and analyze data using machine learning and significance tests. Our tool uses a rigorous experimental design and statistical analysis to ensure the statistical soundness of our results. We use AdFisher to find that the Ad Settings was opaque about some features of a user’s profile, that it does provide some choice on ads, and that these choices can lead to seemingly discriminatory ads. In particular, we found that visiting webpages associated with substance abuse changed the ads shown but not the settings page. We also found that setting the gender to female resulted in getting fewer instances of an ad related to high paying jobs than setting it to male. We cannot determine who caused these findings due to our limited visibility into the ad ecosystem, which includes Google, advertisers, websites, and users. Nevertheless, these results can form the starting point for deeper investigations by either the companies themselves or by regulatory bodies.
Online publishing, social networks, and web search have dramatically lowered the costs of producing, distributing, and discovering news articles. Some scholars argue that such technological changes increase exposure to diverse perspectives, while others worry that they increase ideological segregation. We address the issue by examining web-browsing histories for 50,000 US-located users who regularly read online news. We find that social networks and search engines are associated with an increase in the mean ideological distance between individuals. However, somewhat counterintuitively, these same channels also are associated with an increase in an individual’s exposure to material from his or her less preferred side of the political spectrum. Finally, the vast majority of online news consumption is accounted for by individuals simply visiting the home pages of their favorite, typically mainstream, news outlets, tempering the consequences—both positive and negative—of recent technological changes. We thus uncover evidence for both sides of the debate, while also finding that the magnitude of the effects is relatively modest.
Sensory impairments decrease quality of life and can slow or hinder rehabilitation. Small, computationally powerful electronics have enabled the recent development of wearable systems aimed at improving function for individuals with sensory impairments. The purpose of this review is to synthesize current haptic wearable research for clinical applications involving sensory impairments. We define haptic wearables as untethered, ungrounded body-worn devices that interact with skin directly or through clothing and can be used in natural environments outside a laboratory. Results of this review are categorized by degree of sensory impairment. Total impairment, such as in an amputee, blind, or deaf individual, involves haptics acting as sensory replacement; partial impairment, as is common in rehabilitation, involves haptics as sensory augmentation; and no impairment involves haptics as trainer. This review found that wearable haptic devices improved function for a variety of clinical applications including rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss and hearing loss. Future haptic wearables development should focus on clinical needs, intuitive and multimodal haptic displays, low energy demands, and biomechanical compliance for long-term usage.
The miniaturization of embedded chips has made them implantable in the animate and inanimate things around us, connecting the animate and inanimate worlds. This has triggered a cycle in which technology connects human beings to the things around them, converts those connections into data, information, knowledge, and actionable intelligence, and returns the result to human beings in the form of much-needed services. The emerging future brought about by embedded chips holds a symbiosis of human beings, computers, and things, placing an invisible army of miniaturized service robots at our service.
It is taken for granted that face-to-face contact is the ultimate goal of gay male social networking applications such as Grindr and Scruff. I, however, challenge this assumption and argue that these applications have succeeded not because they fulfill their tacit promise to connect gay men, but by doubling as do-it-yourself (DIY) amateur porn platforms. Gay male social networking applications are screening tools that facilitate self-pornification through a process of gamified surveillance. I contend that the rewards for playing the game are often not the sanitized ones promoted by application creators and their public relations departments but the erotic exchanges and byproducts produced during the screening process these applications ambivalently disavow—nude images and erotic chat.