An Ethical Discussion About the Responsibility for
Protection of Minors in the Digital Environment:
A state-of-the-art review
Charles Alves de Castro, Dr Isobel O’Reilly, Dr Aiden Carthy
Technological University Dublin, School of Business
1st Disrupting Thinking Conference
Covid-19 Global Challenges - The Economic and Financial Dimensions
Dublin (Ireland), January 17-18th, 2022 (15 minutes)
The Research Centre for Psychology, Education and Emotional Intelligence
Irish Research Council - IRC
01 Government Policies from the European Union (EU) Perspective
02 Parental Control
03 An Overview of Companies and the Private Sector
04 Ethical Dilemma
Introduction
•Many ethical questions have been raised regarding the use of social media and the
internet, mainly related to the protection of young people in the digital environment.
•E.g., data protection, regulations, policies, addiction, exposure to harmful content, unhealthy eating habits, mental health issues, increased alcohol consumption, and sexual and violent content spread online.
•In order to critically address the research question “who is responsible for ethically protecting minors in the digital environment?”, this review analyses the main literature available to understand the roles of parents, the government, and companies in protecting children within the digital environment.
Review 1
Government Policies from the
European Union (EU) Perspective
Literature Review
“The European Commission laid the foundations for several programmes to protect minors in the digital environment, increasing awareness of this problem” (European Commission, 1996).
The primary goals
•Content filtering systems
•Parental control software
•An age-based labelling system
According to the European Audiovisual Observatory (IRIS plus, EAO, 2012), the European Commission has significantly contributed to increasing awareness of the need to evaluate and rethink the regulatory and legal framework protecting minors in the digital environment, especially in keeping with the dynamism and changing nature of the media landscape.
Action plan for a safer internet
The main pillars of the “Action plan for a safer internet”
1) the development of content classification and filtering systems;
2) safety through a European network of hotlines for reporting illegal content;
3) the development of self-regulation initiatives;
4) initiatives to raise awareness and educate through media literacy programs.
(O’Neill, 2018; Labio-Bernal, Romero-Domínguez and García-Orta, 2020)
The final stage of the programme, which began in 2014, is still underway today and focuses on specific areas of action, namely:
1) To increase awareness and empowerment, which includes teaching digital literacy and
online safety, in all schools within the EU;
2) To encourage the production of educational and creative online content for children
and promote positive online experiences for children;
3) To combat online child sexual abuse material, as well as sexual exploitation of children;
4) To create a safe environment for children, mainly through age-appropriate privacy
configurations, age-based and content-based classifications, and broader use of parental
controls.
(IRIS plus EAO, 2015)
What is quality content?
•The concept of “quality content” for minors refers to content that increases their skills, knowledge, and competencies, emphasising creativity while being reliable and safe.
•Furthermore, the same document also recognises that this type of content can encourage better web use, particularly when children are involved in creating such content.
Protection of Minors
•The combined efforts of parents, national and international organisations, educators, civil society, and public authorities, in a global approach, guarantee the protection of minors in the digital environment (The alliance to better protect minors online, 2017).
Literature Review
Review 2
Parental Control
A new parenting style has emerged that is distinct from the traditional styles (i.e., authoritative, laissez-faire, authoritarian, and permissive parenting): the “internet parenting style” (Darling and Steinberg, 1993; Lau and Yuen, 2013).
Parenting styles refer to the context in which parents raise and socialise their children, comprising two dimensions: responsiveness/warmth (the involvement, acceptance, and affect that parents express towards the child’s needs) and demandingness/control (the rules, control, and maturity expectations applied to the child’s socialisation) (Darling and Steinberg, 1993).
Key dimensions of the internet parenting style (Valcke et al., 2010, p. 89)

Parental control
•Supervision: “I’m around when my child surfs on the Internet”
•Stopping internet usage: “I stop my child when he/she visits a less suitable website”
•Internet usage rules: “I limit the time my child is allowed on the Internet”

Parental warmth
•Communication: “I talk with my child about the dangers related to the Internet”
•Support: “I show my child ‘child friendly’ websites.”
Main parental mediation strategies and examples of common practices (Coyne et al., 2017)

•Enabling or active mediation: parents engage in different activities to enhance the child’s appropriate use of digital technologies (e.g., explaining how to use the device and/or talking about the content).
•Co-use or co-viewing mediation: parents are present while the child uses the media but do not discuss the content.
•Restrictive mediation: strict rules and control over the child’s digital activities (e.g., rules on when the child can use digital technologies, or time restrictions).
•Technical restriction: adoption of software applications or other technical tools to control the child’s activities (e.g., filters on the PC for the child’s safety).
Parental mediation strategies
❑Enabling or active mediation is the most common approach in European families with children aged 9-16, while restrictive mediation is more frequent in families with younger children (Livingstone et al., 2017).
❑It is therefore possible to conclude that parental mediation of digital technologies changes with children’s age, aiming to better suit their needs and protect them from online harm.
Risks and opportunities for children in digital technologies (De Haan and Livingstone, 2009, p. 5)

OPPORTUNITIES
•Education, learning and digital literacy
  Content (child as recipient): educational resources
  Contact (child as participant): contact with others who share one’s interests
  Conduct (child as actor): self-initiated or collaborative learning
•Participation and civic engagement
  Content: global information
  Contact: exchange among interest groups
  Conduct: concrete forms of civic engagement
•Creativity and self-expression
  Content: diversity of resources
  Contact: being invited/inspired to create or participate
  Conduct: user-generated content creation
•Identity and social connection
  Content: advice (personal/health/sexual etc.)
  Contact: social networking, shared experiences with others
  Conduct: expression of identity

RISKS
•Commercial
  Content: advertising, spam, sponsorship
  Contact: tracking/harvesting personal info
  Conduct: gambling, illegal downloads, hacking
•Aggressive
  Content: violent/gruesome/hateful content
  Contact: being bullied, harassed or stalked
  Conduct: bullying or harassing another
•Sexual
  Content: pornographic/harmful sexual content
  Contact: meeting strangers, being groomed
  Conduct: creating/uploading pornographic material
•Values
  Content: racist, biased info/advice (e.g., drugs)
  Contact: self-harm, unwelcome persuasion
  Conduct: providing advice, e.g., suicide/pro-anorexia
Parental Control
❑Even though there are several studies about parental controls, the literature has not yet reached a conclusive answer regarding their effectiveness in reducing children’s online risks.
❑Some research supports the effectiveness of preventive software, especially filtering, blocking, and monitoring software, in reducing unwanted exposure to online sexual material for children aged 10-15.
❑Nonetheless, this evidence cannot be generalised to all ages (Ybarra et al., 2009).
❑In a separate study, the results showed that parental controls failed to reduce online risks for children, which highlights the need for further research in this area (Duerager and Livingstone, 2012).
Review 3
Companies and Private Sector
Self-Regulation Sector
Literature Review
It is important to address two specific initiatives within the self-regulation sector, namely the “Alliance to Better Protect Minors Online” and the “ICT Coalition for Children Online”.
Both initiatives involve more than 40 companies.
The Alliance aims to tackle three types of risk:
1) Harmful content, for instance violent or sexually exploitative
content;
2) Harmful conduct, such as cyberbullying;
3) Harmful contact, such as coercion, “grooming”, or sexual
extortion.
(The alliance to better protect minors online, 2017; European Commission, 2019).
•Further research in this area is recommended to understand the real effectiveness of these initiatives, the main actions taken by social media companies to protect young people (such as content analysis and minimum subscription ages), and their outcomes.
•For example, one of the most challenging aspects for social media companies is ensuring that children under the minimum subscription age do not sign up to their platforms (O’Neill, 2013).
•This issue might be addressed by a face recognition system in which, prior to subscription, the user submits a document showing their age and picture for evaluation (O’Neill, 2013); a minimal sketch of such a flow is given below.
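The sketch below is purely illustrative and shows one way such a pre-subscription check could be structured. The helper functions extract_date_of_birth and faces_match and the MINIMUM_AGE threshold are hypothetical assumptions introduced here for illustration; they are not part of O’Neill (2013) or of any real platform’s API.

```python
# Illustrative sketch only: extract_date_of_birth and faces_match are
# hypothetical placeholders, not real library calls.
from datetime import date

MINIMUM_AGE = 13  # assumed threshold; real minimum ages vary by platform and jurisdiction


def extract_date_of_birth(id_document_image) -> date:
    """Hypothetical OCR step: read the date of birth from the submitted ID document."""
    raise NotImplementedError


def faces_match(id_document_image, selfie_image) -> bool:
    """Hypothetical face-matching step: compare the document photo with a live selfie."""
    raise NotImplementedError


def age_in_years(dob: date, today: date) -> int:
    """Return the number of full years elapsed between the date of birth and today."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))


def may_subscribe(id_document_image, selfie_image) -> bool:
    """Allow subscription only if the document holder meets the minimum age
    and the selfie matches the document photo."""
    dob = extract_date_of_birth(id_document_image)
    old_enough = age_in_years(dob, date.today()) >= MINIMUM_AGE
    return old_enough and faces_match(id_document_image, selfie_image)
```

Any such design raises its own data protection questions, since it involves processing children’s identity and biometric data, which is itself part of the ethical dilemma discussed in this review.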
Literature Review
Review 4
Ethical Dilemma
The concept of ethics is defined as the
“systematic exploration of questions
about how we should act concerning
others” (Rhodes, 1986).
Ethical sensitivity refers to the individual’s awareness that their actions might affect the welfare of others (Rest, 1982).
Digital Environment
Ethics involves four different aspects:
1) Determining whether the technological options either
directly or indirectly affect another person negatively;
2) Developing an ideal plan of action;
3) Identifying the important values that are associated with
each specific situation;
4) Implementing a solution/plan of action to be monitored
and evaluated (Rest, 1982).
New areas of child online protection issues that emerge due to the absence of effective standards (Milovidov, 2019)

Emerging issues and their impact on children and young adults:
•Artificial intelligence: impact on children’s development, behaviours, and ability to learn new skills
•Algorithms: bias and discrimination on social media sites and websites
•Inclusion and access: unequal inclusion and access to online environments
•LGBTQIA: expressing identity and sharing it with others while facing discrimination, hate speech, and apps promoting conversion therapy
•Disability: children with special needs face more online harms
•Ethnic minorities: bias and discrimination continue online, with more toxic hate speech
•Cybersecurity in family homes: hacking and surveillance of webcams and home assistants
Final Remarks
•The responsibility to protect minors in the digital environment lies with us all and with all institutions in our society, as the internet and social media are present in our daily lives.
•Although parents, government, and the private sector have been acting directly to ensure the protection of young people, further research is recommended to evaluate the current programmes, to understand parental control, and to establish how to empower and effectively train parents for better results. Companies and government are also expected to constantly update and review their programmes, policies, and legislation in order to achieve the best results.
Thank you!
© Charles Alves de Castro
PhD Student, M.Sc. in Marketing
B00139249@mytudublin.ie
Research Gate: https://www.researchgate.net/profile/Charles_Alves_De_Castro
LinkedIn: https://www.linkedin.com/in/charlescastro/
Technological University Dublin
The research presented in this conference was funded by the Irish Research Council under award number GOIPG/2021/360
Certificate of Attendance
1st Disrupting Thinking Research Event, TU Dublin
“Covid-19 Global Challenges - The Economic and Financial Dimensions”
Dublin (Ireland), January 17-18th, 2022
Awarded by the Disrupting Thinking Organising Committee to Charles Alves de Castro for participating at the event.