Lëttëra web platform: A
game-based learning approach
with the use of technology for
reading competence
Emilia Fernanda Leal Uhlig1, Carolina Garza León1, Xóchitl Cruz Vargas1, Sheila Hernández Franco1 and May Portuguez-Castro2*
1 Prepatec Eugenio Garza Lagüera, Tecnologico de Monterrey, Monterrey, Mexico, 2 Institute for the
Future of Education, Tecnologico de Monterrey, Monterrey, Mexico
Introduction: This study explores the potential of technology, metacognition,
and game-based learning to improve reading literacy in upper secondary school
students. The focus is on the Lëttëra educational innovation, a web-based platform
that uses game-based learning and technology to develop reading literacy.
Methods: This is a quantitative, exploratory, descriptive, and quasi-experimental
study that reviewed 149 responses from high school students who took the
standardized test Planea 2017. The study aimed to analyze whether using the
Lëttëra platform brought a change in the students’ reading competence. The
authors also examined students’ motivation toward technology, the platform
interface, and the game. The data was analyzed both descriptively and inferentially.
Results: The results showed that using the Lëttëra platform significantly improved
students’ competencies in literary text, information construction, and argumentative
text. It also increased their motivation toward the proposed activities.
Discussion: This study demonstrates that integrating technology and game-based
learning into reading instruction can lead to improved reading competencies and
increased motivation among students. These findings are useful for educators,
curriculum developers, and policymakers who aim to enhance reading instruction
by integrating technology into their teaching practices.
Conclusion: Overall, this study highlights the potential of technology and game-
based learning to improve reading literacy in upper secondary school students.
The Lëttëra platform provides a promising approach for enhancing reading
instruction, and its integration into teaching practices can benefit students,
educators, and policymakers alike.
KEYWORDS
reading competence, game-based learning, metacognition, educational innovation,
learning for life, professional education, higher education
OPEN ACCESS

EDITED BY
Antonio Palacios-Rodríguez, University of Seville, Spain

REVIEWED BY
Soheil Hussein Salha, An-Najah National University, Palestine; Denok Sunarsi, Pamulang University, Indonesia

*CORRESPONDENCE
May Portuguez-Castro, may.portuguez@tec.mx

SPECIALTY SECTION
This article was submitted to Digital Education, a section of the journal Frontiers in Education.

TYPE Original Research
RECEIVED 05 March 2023
ACCEPTED 27 March 2023
PUBLISHED 24 April 2023
DOI 10.3389/feduc.2023.1180283

CITATION
Leal Uhlig EF, Garza León C, Cruz Vargas X, Hernández Franco S and Portuguez-Castro M (2023) Lëttëra web platform: A game-based learning approach with the use of technology for reading competence. Front. Educ. 8:1180283. doi: 10.3389/feduc.2023.1180283

COPYRIGHT
© 2023 Leal Uhlig, Garza León, Cruz Vargas, Hernández Franco and Portuguez-Castro. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

1. Introduction

Nowadays, we live in a society where human beings are bombarded daily with visual and auditory information through various forms of media. This is precisely why reading competence for young people is vital, as they are confronted with multiple textual or graphic issues in which
their reading level is the basis for their decision-making. Furthermore,
reading is a fundamental skill for success in adult life (OECD, 2013).
However, it is observed that acquiring these skills is challenging for young people. This situation generates disinterest and frustration in them, compounded by the fact that the media used are not attractive for their age and that their interests have been changing due to the availability of new technologies.
The emergence of e-books, audiobooks, and other digital reading platforms has brought about a considerable change in the way individuals read. Technology has advanced our reading habits and sparked interest in both young people and adults (U.S. Department of Education, National Center for Education Statistics, 2018). Sun et al. (2021) indicate that interactive experiences, such as online book clubs and digital storytelling apps, can increase curiosity toward reading for all ages. Additionally, during a pandemic like COVID-19, where social distancing impacts education, teachers are using educational projects that incorporate technology to develop students' reading skills while bringing them closer together (Grynyuk et al., 2022).
According to the most recent results of the Organization for Economic Cooperation and Development (OECD) Programme for International Student Assessment (PISA), applied in 2018 (the 2021 round was postponed to 2022 due to the COVID-19 pandemic), Mexican students scored below the international average. According to Salinas et al. (2018), "in Mexico, only 1% of students performed at the highest proficiency levels (level 5 or 6) in at least one area (OECD average: 16%) and 35% of students did not obtain a minimum level of proficiency (Level 2) in all 3 areas (OECD average: 13%)" (p. 3).
Although several efforts have been made at the national level to increase reading literacy among students aged 15 and older, average performance has remained stable in reading, mathematics, and science throughout most of Mexico's participation in PISA (Salinas et al., 2018). Since 2000, when Mexico first participated in this test, there has been no progress in any of the assessed areas.
Assessment is one of the essential features to ensure the quality of education systems in developed countries. Efforts have been made in Mexico over the last two decades to assess primary and secondary education through three standardized tests (EXCALE, Planea, and PISA; Caracas Sánchez and Ornelas Hernández, 2019), and the National Institute for the Evaluation of Education (INEE) has published studies and literature on the subject to reinforce practical exercises that have an impact on increasing reading comprehension; nevertheless, students' results have remained the same. Another important aspect is support for teachers regarding motivation, knowledge, and tools, to make them aware of their transformative power and of the need to use ICTs to achieve learning (UNESCO, 2017).
For this reason, it is important to encourage reading competence and to develop educational innovations based on games and technology that help improve this skill in high school students. In this sense, the development of reading literacy during the high school period is crucial for students to become actively involved in society, considering that deficiencies in reading skills can limit their potential for the future (Sucena et al., 2022). Reading literacy is defined as understanding, evaluating, reflecting on, and engaging with texts to achieve one's goals, develop knowledge and personal potential, and participate in society (OECD, 2018). However, the scientific literature related to the use of ICT in reading processes is still scarce (Fernández Batanero et al., 2021), even though the use of digital tools and active pedagogical strategies favors the development of these competencies (Neira-Piñeiro, 2015; Badillo-Jiménez and Iguarán-Jiménez, 2020).
This study seeks to analyze how technology used in an entertaining way (game-based learning) helps to visualize metacognition and makes the development of the critical thinking necessary to foster reading competence user-friendly and self-manageable. Therefore, the research objectives were: (1) analyze how the use of technology in the form of games can improve the reading competence of high school students in Mexico; (2) evaluate the impact of the use of the web platform Lëttëra on the reading competence of students; and (3) identify how students perceive the use of technology in reading learning and whether this affects their motivation and satisfaction with the learning process. These objectives were addressed through a comparative study of the results obtained from applying the Planea 2017 test before and after using Lëttëra, a web platform designed according to PISA's proposed reading comprehension processes and performance levels. The variables measured were academic performance, assessed with the Planea 2017 test, and student satisfaction, assessed through a survey.
Although some studies demonstrate the benefits of using digital tools and active pedagogical strategies to promote reading, there has not been enough research on how game-based learning can improve metacognition and critical thinking in secondary school students, which is considered fundamental for the development of reading competence. In addition, many studies conducted in Mexico have shown that students' results in standardized reading tests have been insufficient, indicating the need to implement new pedagogical strategies and tools to improve the quality of education and the development of reading competence. This study can be useful for educators, educational technology designers, researchers in the field of education, and decision-makers in educational policy who are interested in improving the reading competency of secondary school students in Mexico using technology, particularly online educational platforms. Additionally, the study's findings can be helpful for parents and students interested in using technology effectively to enhance reading learning.
2. Literature review
2.1. Programme for international student
assessment test
The design of the reading comprehension exercises of the Lëttëra web platform is based on the PISA test, on the rationale that the current assessments in Mexico are aligned with it. In addition, PISA represents a commitment by the governments of OECD countries to regularly monitor the performance of education systems in terms of student achievement within a common, internationally agreed framework (OECD, 2018).
The PISA test aims to assess whether 15-year-old students have acquired the knowledge and skills to participate fully in a knowledge-based society. The test is developed and coordinated by the OECD and evaluates students' abilities in reading, mathematics, science, collaborative problem-solving, financial literacy, and global competence (U.S. Department of Education, National Center for Education Statistics, 2018). The tests are administered to a sample of
students from each participating country, and the results provide an opportunity for countries to compare their educational systems with those of other countries (Civini, 2019). It is therefore an important tool for policymakers, educators, and researchers to evaluate the effectiveness of educational systems in different countries. It is administered every 3 years and focuses on three core areas: reading, mathematics, and science (Bohrnstedt and Stancavage, 2016).
The definition of reading has changed over time, mainly as the internet has generated new ways of reading, and it is now seen "as an expanding body of knowledge, skills and strategies that individuals construct over a lifetime in diverse contexts, through interaction with peers and the wider community" (OECD, 2018, p. 9). In this way, PISA organizes reading literacy into three dimensions: text, situations, and processes.
The way content is presented in texts determines how they are managed. Texts can be continuous, where sentences and paragraphs form broader structures like articles, essays, or stories; non-continuous, organized from non-sequential information such as diagrams, infographics, or advertisements; or mixed, when combining the two forms (OECD, 2017). According to their target audience and purpose of creation, texts are categorized into personal, public, educational, and occupational situations. The interaction between readers and texts determines the cognitive processes involved, which include locating information, comprehension, evaluation, and reflection (OECD, 2018).
From 2018 onwards, this assessment has included texts presented digitally. These texts are classified according to (1) how information is accessed (static or dynamic); (2) the amount of information displayed (a single source or multiple sources, i.e., two or more); (3) their format (continuous, non-continuous, and mixed); and (4) their discursive purposes (narrative, expository, argumentative, prescriptive, and transactional).
The Lëttëra web platform uses continuous, non-continuous, and mixed texts. It is based on performance levels 1c to 3 suggested by PISA (National Commission for the Continuous Improvement of Education, 2018). These levels guided the underlining and commenting included in the readings, which uses different colors to metacognitively direct reading competence, as well as the design of the reading challenges or exercises and the immediate feedback provided to users, relating the performance levels to PISA reading competence as shown in Table 1.
The most basic levels are the first five: 1c, 1b, 1a, 2, and 3; the highest performance levels are 4, 5, and 6. This first edition of the Lëttëra web platform is designed to help users develop their skills at the first five levels.
TABLE 1 Reading literacy performance levels, PISA 2018 (National Commission for the Continuous Improvement of Education, 2018, p. 44).

Performance level 3:
- Identify the literal meaning of single or multiple texts without explicit content or organizational clues.
- Integrate content and generate basic and more advanced inferences.
- Integrate several parts of a text to identify the main idea, understand a relationship, or interpret the meaning of a word or phrase when the necessary information is presented on a single page.
- Search for information based on indirect cues and locate target information that is not prominently displayed or is accompanied by distractors.
- In some cases, recognize the relationship between various pieces of information based on multiple criteria.
- Reflect on a text fragment or a small set of texts and compare and contrast the points of view of various authors based on explicit information.

Performance level 2:
- Identify the main idea in a text of moderate length.
- Understand relationships or interpret meaning within a limited part of the text when the information is not prominent and the reader must make basic inferences, or when some distracting information is present.
- Select and access a page in a set based on explicit, but sometimes complex, indications, and locate one or more pieces of information based on multiple, partially implicit criteria.
- Reflect on the general purpose, or the purpose of specific details, in texts of moderate length when explicitly asked to do so.
- Reflect on simple visual or typographic features.
- Compare claims and assess their reasons based on short and explicit statements.

Performance level 1a:
- Understand the literal meaning of sentences or short passages.
- Recognize the central theme or author's purpose in a text on a familiar topic and make a simple connection between several adjacent pieces of information, or between the information given and their prior knowledge.
- Select a relevant page from a small set based on simple prompts and locate one or more independent pieces of information within short texts.
- Reflect on the overall purpose and on essential and accompanying information in simple texts containing explicit clues.

Performance level 1b:
- Evaluate the literal meaning of simple sentences.
- Interpret the literal meaning of texts by making simple connections between adjacent pieces of information in the question or text.
- Look for and locate a piece of information highlighted and explicitly placed in a single sentence, a short text, or a simple list.
- Access a relevant page from a small set based on simple prompts when there are explicit signals.

Performance level 1c:
- Understand and state the meaning of short, syntactically simple sentences on a literal level.
- Read with a clear and simple purpose in a limited time.

2.2. Assessing reading literacy through standardized tests in Mexico

According to Caracas Sánchez and Ornelas Hernández (2019), in Mexico, evaluation played a relevant role in the 1980s. Evaluation
through standardized tests was a strategy used to improve the country's education system, given educators' high attrition rate and low efficiency. In that decade, Mexico created the National Evaluation Centre for Higher Education (CENEVAL) to regulate the evaluation of education in the country, as the International Monetary Fund and the World Bank assigned economic resources for teaching in Mexico as support to settle the country's foreign debt (Aranda Izguerra, 2005).
One of the conditions for allocating resources was the commitment to improve education; through the design and application of standardized tests such as Planea or PISA, the country would substantially improve its education system. Starting in 2000, the OECD, through PISA, began applying international tests to 15-year-old students. Mexico requested that the PISA test be applied in educational institutions to improve education and fulfill its commitment. The PISA test focused on finding performance indicators in mathematics, science, and reading. This assessment does not focus on the curricula of the participating countries but on the progress of young people coping with the knowledge society (Jiménez Moreno, 2016).
2.3. Self-directed learning
A self-directed learner identifies and achieves goals through effective learning strategies, understanding, monitoring, directing, evaluating, and reflecting on their own process, ultimately taking the control that enables them to decide which methods to use. Students must be self-directed learners to experience effective education and lifelong learning (Bagheri et al., 2013).
The Lëttëra web platform promoted the students' self-directed use of the exercises. The platform had an immediate feedback system that allowed students to learn about the cognitive process required to answer each question and the justification that determined the correct answer for each item. Another factor that eased the self-directed use of the platform was the challenge map, which visually displayed the students' progress toward completing all the activities. The last factor was the freedom to determine when to complete the activities, since, during the semester, the students had several weeks to complete all the reading exercises.
2.4. Game-based learning
The design of the Lëttëra web platform considered the components that provide an attractive learning environment for high school students. These components were the use of technology, gamification, self-directed learning, and metacognition through the underlining of PISA cognitive processes in the texts. Gamification refers to the incorporation of game design elements such as point systems, leaderboards, and rewards into non-gaming contexts (Høiseth et al., 2021). The goal of gamification is to increase motivation, engagement, and participation in activities that are not inherently fun or interesting by making them more enjoyable through game-like elements (Bicen et al., 2022). The Lëttëra web platform is based mainly on the gamification methodology. Games have long been known in the educational world for their effectiveness in combining goals such as having fun, socializing, and learning with consequences such as winning and losing, according to a specific rule system (Baran et al., 2018). In recent years, gamification has gained more popularity among teachers due to the gamified designs and game mechanics added to non-game processes (Wong et al., 2022). This approach has been embraced across a range of fields, including education, healthcare, marketing, and customer service.
In general terms, a game is an application, while gamification is a process in which game components are integrated into a non-game environment (Attali and Arieli-Attali, 2015; Abdul Ghani et al., 2022). The main objective of gamification is to effectively implement the positive effects of gaming in educational environments to increase student engagement, stimulate educational participation, and improve outcomes (Deterding et al., 2011). Gamification seeks to harness the power of games to solve real-world problems (Hüseyin et al., 2020), making it the most appropriate approach for the Lëttëra web platform.
Different studies have analyzed the use of gamification in the school environment. In the last 2 years, due to the COVID-19 pandemic, initiatives have emerged that seek to improve student motivation in virtual environments shaped by social distancing (Chans and Portuguez Castro, 2021). Singh et al. (2021) identified that online activities require greater self-regulation and motivation for students to participate, so technology plays a fundamental role in improving methodologies; they also suggest that creating gamified environments enhances interaction and collaborative learning.
Another study by Chans and Portuguez Castro (2021) was conducted in the Mexican context, where students carried out gamification activities in chemistry classes. The results showed that gamification elements such as autonomy and feedback allowed for the development of learners' intrinsic motivation. In the case of language teaching, Alharbi and Khalil (2022) mention that gamification through digital platforms gives the student more time to interact, and receiving immediate feedback engages them more to continue learning.
Game-based learning is another differentiator proposed by Lëttëra. This methodology has an impact on students' motivation, making the didactic model more meaningful and positioning the young person as the protagonist of the learning (Cueva Gaibor, 2020).
The gamification in the Lëttëra web platform consisted of a process governed by game rules, in which the goal is to complete the challenges and obtain the highest number of badges. Each student can engage with each challenge twice. Progress in each challenge (reading exercise) is displayed on a map, encouraging continuous improvement through the feedback presented at the end of each challenge. The texts of the reading exercises were selected from different areas of knowledge and in different formats, as established by PISA: continuous and non-continuous texts (INEE, 2018). The various topics presented in the texts allowed students to learn about different social, economic, and cultural issues according to the type of text, thus linking reading with real-life contexts in a gamified environment.
2.5. Metacognition
The concept of metacognitive monitoring emerged in the 1970s. Pioneering work on metacognitive monitoring by John Flavell laid the groundwork for this construct by describing how a person reflects or thinks about their own cognition (Crespo, 2000). Flavell defined the concept of metacognition as
"thinking about thinking." He divided the idea of metacognition into four main aspects: metacognitive knowledge, metacognitive experience, objectives or goals, and strategies.
At a broad level, the basis of metacognition lies in the individual's mind. Metacognition has been positioned within what Moshman (2008) classified as endogenous constructivism: it relates to abstract reflection on new or existing cognitive structures. In this sense, metacognition emphasizes learning development rather than the learner's interaction with the environment (Dinsmore et al., 2008).
The Lëttëra web platform prioritizes contributing to the development of students' metacognition in order to equip them as critical readers who can transfer these skills to other learning areas or situations. It does so by visually providing different colored underlining according to the cognitive processes proposed by the PISA test, which evaluates locating information, understanding, and evaluating and reflecting. For each of these three classifications of cognitive processes, a different underline color was assigned, allowing the student to visualize the cognitive process needed to respond to the reading comprehension exercise in the text.
The design of each item considers a predominant cognitive process that determines the correct answer. Once the students had completed the entire reading exercise, feedback was provided for each item, explaining how the cognitive process was a determining factor in selecting the correct answer. In this way, the students could monitor their cognitive process and reflect on how they read, building their metacognition as they progressed through the Lëttëra challenges. The cognitive processes exposed visually through the underlining were a constant that familiarized students with aspects of reading that we sometimes perform automatically.
3. Methodology
This research corresponds to a quantitative, exploratory, descriptive, and quasi-experimental study in which a technological innovation based on games is used to develop reading skills. A standardized test was used to analyze the study variables, and a descriptive analysis was carried out of the results obtained before and after using the educational innovation.
This project was part of the Novus initiative and was selected for funding and support to develop the platform for reading skills. Novus is an initiative of the Institute for the Future of Education that seeks to strengthen the culture of evidence-based educational innovation among the professors of the Tecnológico de Monterrey (Portuguez-Castro et al., 2022). As part of its impact measurement strategy, the faculty is trained and mentored to submit a research protocol for approval. Due to the number of projects supported by the initiative, Novus has worked closely with the ethics committees to ensure that its protocols follow federal and international regulations with regard to research subjects and their integrity. During training and follow-up, the Novus mentorship team ensures that everything from methods to ethics is considered in the submission. If this is the case, the proposal is approved by Novus and faculty may begin to work on their project.
The technological innovation of the Lëttëra web platform aims to create a virtual space in which students develop autonomy through a gaming environment, read asynchronously, and receive immediate feedback. This innovation uses a user-friendly interface with stimulating aspects that optimally favor teaching and learning. In addition, the teacher benefits from a reduction of review work, since the tool monitors each student's level, administers the exercises, evaluates, and provides feedback; thus, the student manages their own learning progress.
The research seeks to analyze how technology used in game-based learning helps to visualize metacognition and makes the development of the critical thinking necessary for reading literacy friendly and self-manageable. This analysis was based on a comparative analysis of the Planea 2017 test results to assess the learning and development of a life skill such as reading literacy, specifically from the OECD's perspective through the PISA test: "reading should therefore be considered through the various ways in which citizens interact with texts on various devices and how reading is part of lifelong learning" (OECD, 2018, p. 8).
3.1. Procedure
The experiment consisted of several phases. The first was administering the Planea 2017 test as a diagnostic for students before using the platform. In the second, Lëttëra was implemented in the classroom as an autonomous, flexible, and enjoyable learning technology. In the third phase, the results were evaluated through a second application of the test, and the data obtained were assessed comparatively and by cognitive process. The data were collected through an online form before and after completing the activities on the platform. Once collected, the data were anonymized and analyzed as a whole to protect the identity of the participants. Data collection was done prospectively during the semester in which the educational intervention was carried out.
3.2. Participants
The participants in the sample were 149 first-semester students of the Eugenio Garza Lagüera High School of the Tec de Monterrey. This institution is part of a system of 26 high schools and professional and postgraduate campuses distributed throughout Mexico. Three types of baccalaureate are offered: bicultural, multicultural, and international. The inclusion criteria of this study were: (1) students currently enrolled in a high school program; (2) students willing to participate in the study and provide informed consent (and, if under 18, with parental/guardian consent); and (3) students who meet the specific demographic or academic requirements of the study (e.g., age range, grade level, specific courses taken). The exclusion criteria were: (1) students who are not fluent in the language of instruction (if applicable), and (2) students who did not complete the informed consent. The participants had previously been enrolled in private junior high schools in the metropolitan area of Monterrey, Mexico, and had upper-middle-class backgrounds. Of these 149 students, 78 are male and 71 are female, aged between 15 and 17 years.
3.3. Instruments
The Planea 2017 test is an objective, standardized test aligned to the Common Curriculum Framework, particularly in the fields associated with the Language and Communication and Mathematics competencies for students in Mexico. It is a validated and standardized multiple-choice instrument and consists of 100 items: 50 for Language and Communication and 50 for Mathematics (Planea, 2022). Its objective is "to determine the extent to which students achieve mastery of a set of essential learning at the end of the different levels of compulsory education" (Planea, 2022, p. 9), covering reading comprehension and reading literacy through the formative field of Language and Communication, as well as mathematics.
The Planea test evaluates two cognitive processes. The first is reading competence, which comprises (a) the extraction of information, (b) the development of a global comprehension, (c) the development of an interpretation, (d) the analysis of content and structure, and (e) the critical evaluation of the text. The second process assesses reflection on language, consisting of (a) semantic reflection, (b) syntactic and morphosyntactic reflection, (c) linguistic conventions, and finally (d) knowledge of sources of information. The validity of the test has been established through a thorough review of the test's content and structure by experts in the field of education. As for reliability, the test has been subjected to statistical analysis that has demonstrated adequate internal consistency and acceptable inter-rater reliability, suggesting that the test provides consistent and accurate results (INEE, 2018).
The Language and Communication competence assesses learning related to the cognitive processes and knowledge required for the selection, comprehension, and interpretation of texts with different characteristics, purposes, and thematic axes: argumentative, expository, and literary texts, in their continuous (text) and non-continuous (text and image) modalities, in order to determine mastery of the set of essential learning in this competence. The test comprises four categories: expository text, argumentative text, literary text, and construction of information, measured at four levels of achievement. These levels are described in Table 2.
The development of reading literacy in students was determined through the comparative analysis of the quantitative results of the Planea 2017 test between the initial and final applications.
3.3.1. Programme for international student
assessment and Planea achievement levels
The PISA test presents eight levels of achievement, from 1c to 6, in three different processes: locating information, understanding, and evaluating and reflecting. The Planea test presents four processes for reading literacy: extracting information, overall understanding, developing an interpretation, and analyzing content and structure.
Planea Level IV correlates with PISA levels 3 and 4, specifically concerning interpreting the meaning of the nuances of language in a section of the text, demonstrating understanding in interpretive tasks, and comparing perspectives and drawing inferences based on diverse sources (National Commission for the Continuous Improvement of Education, 2018). The Lëttëra web platform focuses on the first PISA achievement levels (1c to 3), which correspond to level IV (the highest) of the Planea test.
3.3.2. Satisfaction survey
A survey was designed to determine the students' motivation to use the Lëttëra platform. The questionnaire consisted of 17 closed questions in which they answered "true," "false," or "doubtful" according to their perception of the exercises and their experience with the tool. Two questions were included to identify the age and gender of the student. Fifteen closed questions assessed whether they felt an impact on their confidence in reading; whether using the platform motivated them to read; whether the interface and the methodology used made reading easy; and whether they considered it appropriate for improving their reading comprehension.
The student satisfaction survey was validated by Spanish language teachers and an education researcher to ensure the quality and reliability of the obtained results.
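Because every item is a closed question with only three possible answers, the percentages reported later (Figures 9-11) reduce to simple frequency counts per item. The following is a minimal, illustrative sketch in Python of how such a tally could be computed; the file name lettera_survey.csv and the column naming convention are hypothetical assumptions, not the instrument's actual data format or the tool used by the authors.

import pandas as pd

# Hypothetical file: one row per student, one column per closed question
# (e.g., q1..q15), with cell values "true", "false", or "doubtful".
responses = pd.read_csv("lettera_survey.csv")
item_columns = [c for c in responses.columns if c.startswith("q")]

# Percentage of each answer option per item, as reported in the results figures.
summary = {
    item: responses[item].value_counts(normalize=True).mul(100).round(1)
    for item in item_columns
}
print(pd.DataFrame(summary).fillna(0.0))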
3.4. Description of the Lëttëra platform
Lëttëra aims to develop reading skills in young people in upper secondary education. It comprises 12 reading comprehension exercises containing literary texts and academic, journalistic, and popular science articles. These texts allow users to explore different topics, expand their vocabulary, and, above all, understand, interpret, analyze, and extract information from texts that students are likely to encounter at school and in their daily lives. Figure 1 shows the homepage of the platform.

TABLE 2 Achievement levels of the Planea 2017 test (INEE, 2018).

Level IV: They select and organize relevant information from an argumentative text; identify the author's position; interpret information from argumentative texts (such as critical reviews and opinion pieces); and infer paraphrasing from an expository text (such as a popular article).

Level III: They recognize in an opinion article its purpose, argumentative connectors, and constituent parts (thesis, arguments, and conclusion); identify the differences between factual information, opinion, and the author's assessment; identify the different ways in which written language is used according to the communicative purpose; and use strategies to understand what they read.

Level II: They identify the main ideas that support the proposal of a short opinion article, discriminate and relate timely and reliable information, and organize it based on a purpose.

Level I: They do not identify the author's position in opinion articles, essays, or critical reviews, nor do they explain the information in a simple text in words other than those used in the reading.
3.4.1. Exercise examples
"Explore reading" is a core component completed before starting the exercises, in which students are guided through highlighting and comments designed to enhance their critical reading skills, helping to foster metacognition and to form critical readers, according to the PISA descriptors. These processes are color-coded: green represents the cognitive skills of locating information, blue highlights those of understanding, and orange identifies skills related to reflection and evaluation, as well as the achievement levels of the Planea test.
Several game-based strategies were used to design a more entertaining environment within the platform; for example, learners can choose their profile picture or avatar.
The texts are divided into three modules (mountaineering challenge, aquatic challenge, and countryside challenge), each with four challenges containing reading comprehension exercises, and students can visualize their progress on an interactive map.
3.4.2. Rewards system
The user's effort is rewarded through a game system, which is explained in the "Explorer's Guide" section (see Figure 2). Each player has two opportunities to tackle each reading comprehension exercise. If they get 100 points when they engage in the challenge at the first opportunity, they obtain two badges and the Lëttëra logo, which are then colored in the "Rewards" section. If students obtain fewer than 100 points but more than 70 on the first opportunity, they can (1) engage in the reading again or (2) keep the score achieved and get two badges; if they do not pass on the first opportunity, they can try again, and if they pass (score higher than 70) they get one badge; if they make the second attempt and still do not pass, they obtain no badges and move on to the next challenge with the highest score achieved.

FIGURE 1 Home page of the Lëttëra web platform.

FIGURE 2 Explorer's guide, explaining the rules of the game.
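The badge rules described in this section amount to a small decision procedure. The sketch below, in Python, is only an illustrative reading of those rules as stated above (100-point scale, 70 as the passing mark, two attempts per challenge); the function name and return structure are assumptions and do not reflect the platform's actual implementation.

from typing import Optional

def badges_for_challenge(first_score: int, second_score: Optional[int] = None,
                         pass_mark: int = 70) -> dict:
    """Illustrative summary of the badge rules described in the text.

    - 100 points on the first attempt: two badges plus the Lëttëra logo.
    - Passing score (above pass_mark) on the first attempt, kept without
      retrying: two badges.
    - Passing only achieved on the second attempt: one badge.
    - Neither attempt passes: no badges; the highest score is kept.
    """
    if first_score == 100:
        return {"badges": 2, "logo": True, "final_score": 100}
    if first_score > pass_mark and second_score is None:
        return {"badges": 2, "logo": False, "final_score": first_score}
    best = max(first_score, second_score or 0)
    if best > pass_mark:
        return {"badges": 1, "logo": False, "final_score": best}
    return {"badges": 0, "logo": False, "final_score": best}

For example, badges_for_challenge(85) keeps the first score and yields two badges, while badges_for_challenge(60, 75) yields one badge after the second attempt.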
Once each challenge or reading comprehension exercise has been completed, Lëttëra provides the user with feedback on each item, which indicates the process involved (locating information, understanding, and reflecting and evaluating). It also indicates the correct answer, explaining why each option has that status, with the aim of inducing personal reflection so that learners understand what an optimal reader needs to focus on and can self-direct their learning.
The platform also has a menu that allows the learner to view the Challenge Map, Rewards, Explorer's Guide, Info Kiosk, Calendar, Edit Profile, and Analytics sections, where they can check the score obtained in each of the challenges.
3.5. Data analysis
The data collected with the instruments were analyzed using descriptive statistics, graphs, and tables to present the results for each category. The total data from the Planea test were reviewed to examine whether there was a change in reading literacy results after using the Lëttëra platform. For this, the correct answers for each question in each of the applications were identified. The mean difference was then determined to establish whether there was a significant difference in the test results. The results were analyzed using Excel and Minitab v.21. The paired samples t-test was applied, given that the data came from the same subjects measured before and after the treatment (Johnson and Kurby, 2016).
The other analysis performed was for each sub-competence measured in the test, for which frequency distributions and percentages were computed (Hernández Sampieri et al., 2014). Finally, the results for each participant group were compared to identify whether there were significant differences in the results obtained in the post-test after using the platform. The satisfaction questionnaire was analyzed with descriptive statistics to identify the most relevant aspects of the students' opinions of their experience with Lëttëra.
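To make the comparison procedure concrete, the sketch below shows how the paired-samples t-test and the per-group summaries could be reproduced in Python. The authors report using Excel and Minitab v.21; this code is only an equivalent illustration, and the file name and column names (student_id, group, pre_score, post_score) are assumed for the example rather than taken from the actual dataset.

import pandas as pd
from scipy import stats

# Hypothetical layout: one row per student with the total Planea score
# before (pre_score) and after (post_score) using the Lëttëra platform.
scores = pd.read_csv("planea_scores.csv")  # columns: student_id, group, pre_score, post_score

# Descriptive statistics for each application of the test.
print(scores[["pre_score", "post_score"]].describe())

# Paired-samples t-test: the same students answered both applications,
# so the observations are paired rather than independent.
t_stat, p_value = stats.ttest_rel(scores["post_score"], scores["pre_score"])
mean_gain = (scores["post_score"] - scores["pre_score"]).mean()
print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.3f}, p = {p_value:.3f}")

# The same comparison repeated for each class group, mirroring Table 3.
for group, g in scores.groupby("group"):
    t, p = stats.ttest_rel(g["post_score"], g["pre_score"])
    print(group, len(g), round(g["pre_score"].mean(), 2),
          round(g["post_score"].mean(), 2), round(p, 3))

At a 95% confidence level, a p-value below 0.05 from this paired test corresponds to the significance criterion used in the results that follow.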
4. Results
The results of the study aimed to answer the research objectives. For the first objective, "Analyze how the use of technology in the form of games can improve the reading competence of high school students in Mexico," the Planea test was applied as a pre- and post-test to identify the learning gain of students due to the intervention with the platform. For the second objective, "Evaluate the impact of the use of the web platform Lëttëra on the reading competence of students," the results of the different categories of the test were analyzed, and the ones that showed the greatest changes were determined. Finally, for the third objective, "Identify how students perceive the use of technology in reading learning and whether this affects their motivation and satisfaction with the learning process," a satisfaction survey was applied to learn the students' opinions on the impact of the educational innovation on their motivation.
4.1. Planea test
The Planea test is a standardized test consisting of 50 items for the subject of Spanish. The research included using this test as a diagnostic tool before students engaged with the Lëttëra platform. It was also used after their experience to analyze the changes in reading skills that could be attributed to this educational innovation. These results aimed to contribute to the objective of analyzing how the use of technology on the platform improves the reading competencies of students.
The results of the first application of the test are shown in Figure 3. The number of students that completed the test was 149. The maximum possible score for the test was 50 points; the lowest scores fell in the range of 13-17.5 correct answers (3 students), and the highest in the range of 44.5-49 (7 students). Most of the correct answers ranged from 26.5-31 (40 students) to 31.5-36 (40 students). The mean was 31.436, with a standard deviation of 6.890.
In the second test implementation, with the same number of responses as the previous application, the minimum scores were in the 18-22.2 range (8 students), and the maximum score was 50 (1 student) (Figure 4). The highest number of correct answers was concentrated within the 34.8-39 range (46 students). The mean was 33.255, with a standard deviation of 6.396. The number of scores above 30 correct answers increased from 91 in test 1 to 104 in test 2.
When comparing the means of the two applications, a difference of 1.82 was identified, showing that overall results were higher after using the Lëttëra platform. Applying the paired samples t-test with a confidence level of 95%, a p-value of 0.004 (p < 0.05) was obtained, indicating that there was a significant difference in the students' scores after using the platform.
4.1.1. Analysis by student group
The results analyzed correspond to 149 student responses divided into six groups. The distribution of the groups is shown in Table 3. When comparing means for groups A, B, C, and E, there was a decrease in the score between P1 and P2, ranging from −0.11 to −3.05. When applying the statistical test, it was determined that there were no significant differences in these groups, so there was no evidence of an effect of the educational intervention in this sample. On the other hand, there was a significant difference of 9.09 (p = 0.000) in group D and of 6.8 (p = 0.000) in group F.

FIGURE 3 Distribution of Planea 1 test scores.
In group D, there was a significant improvement in the mean score of 9.09, with p = 0.000 (p < 0.05). As shown in Figure 5, most of the students in this group improved their scores after the educational intervention (87% of the students increased their number of correct answers).
Group F also showed a significant improvement in their results, with an increase in their scores of 6.8 and p = 0.000 (p < 0.05). Students also improved in most of the questions in the second test application, as shown in Figure 6 (87% of the students increased their number of correct answers).
When looking at the results per student, it was found that the students with the greatest improvement between the two tests belong to these groups (D and F). Student 1 improved by 26 points, student 2 by 24, and student 3 by 21 (these three students belong to group D); student 4, from group F, improved by 17 points between P1 and P2.
4.1.2. Analysis according to the categories of the
Planea test 2017
As mentioned above, the results of the Planea 2017 test were analyzed according to the four categories: information construction, argumentative text, expository text, and literary text. These results aimed to contribute to the objective of evaluating the impact of the use of the web platform Lëttëra on the reading competence of students.
In the first test administration, the data allowed for a diagnosis of students' prior knowledge. Students were most successful in the questions related to identifying expository texts (72.67%), followed by identifying literary texts (64.90%), information construction (64.32%), and finally, identifying argumentative texts (63.03%).
In the second application of the test, students scored the highest percentage of correct answers in the identification of literary texts (69.60%), followed by expository text (67.42%), construction of information (65.25%), and argumentative text (62.90%).
The difference between the two administrations of the test was highest in the literary text category, with an improvement of 4.70%, followed by information construction (0.93%). However, there was no improvement in the argumentative and expository text categories; in fact, there was a decrease in students' performance in these skills, as shown in Figure 7.
For each level of achievement according to the Planea 2017 test, it was possible to identify that, for test 1, students' performance was higher in the expository text category, reaching 85.6% at level II and 79.19% at level III of the same category, followed by 70.28% at level III of information construction. The lowest levels in this first diagnosis were level IV of information construction (43.46%), level IV of literary text (57.94%), and level IV of argumentative text (58.17%).
In the second test, the students continued to show higher results at level II of the expository text category (75.50%), followed by level III of literary text (70.18%) and level III of argumentative text (69.80%). In this last category, there was again a low result for level IV, with a percentage of 44.52%.
FIGURE 4 Distribution of Planea 2 test scores.
TABLE 3 Differences between study groups.

Group | N | P1 (mean) | P2 (mean) | Difference (P2 − P1) | p-value
Total | 149 | 31.436 | 33.255 | 1.82 | 0.004
111 | 29 | 30.86 | 30.59 | −0.27 | 0.819
112 | 26 | 29.88 | 29.77 | −0.11 | 0.922
114 | 19 | 35.37 | 32.32 | −3.05 | 0.148
116 | 23 | 26.87 | 35.96 | 9.09 | 0.000
414 | 32 | 35.88 | 35.72 | −0.156 | 0.843
419 | 20 | 28.70 | 35.50 | 6.8 | 0.000

The values in bold are the significant results in the study.
FIGURE 5 Differences in scores obtained in group D.

FIGURE 6 Differences in scores obtained in group F.
When comparing both results, it was observed that students improved by 10.29% at level III of the literary text category (L-III) when retaking the test after the educational intervention, as well as at level IV of information construction (IC-IV, 7.89%). There was also an improvement of 4.95% at level III of the argumentative text category (A-III). The results are shown in Figure 8.
4.2. Satisfaction survey
The satisfaction survey was analyzed descriptively to identify the highest response rates for the following categories: self-confidence, motivation, interface, methodology, and reading comprehension. These results aimed to contribute to the objective of identifying how students perceive the use of technology in reading learning and whether this affects their motivation and satisfaction with the learning process. The survey was conducted among 149 students who used the Lëttëra web platform in the August-December 2021 semester, of whom more than 93% were aged 15-17, 53.6% were male, and 46.4% were female.
The survey asked about students' self-confidence in the reading comprehension process. Most of the students considered that they could do the exercises (82.4%), that they were good at answering questions about the readings (62.1%), and that they would understand the exercise questions (66%). The results are shown in Figure 9.
The interface makes it easy for the students to engage in the reading exercises (75.8%), and most considered entering Lëttëra straightforward because it is not complicated (62.7%). When asked about the methodology, we identified that the most positively rated feature of the platform was the support it provides whenever students felt lost: reading the highlighted parts and the text indications helped them understand better (80.30%). They liked the design, the rewards system, the possibility of selecting a profile, and the map (73%). They also acknowledged a challenge in the questions, as only 28.70% answered them correctly on the first try (see Figure 10).
Of the respondents, 46.4% considered working on the platform's activities motivating; 39.2% indicated that they would learn how to read with this platform; and the majority had high expectations of how the platform can help them improve their reading (57.90%). A high percentage of the students considered that their reading comprehension level increased with Lëttëra (77%), as shown in Figure 11. Overall, 60.10% rated Lëttëra as "good," 28.80% as "excellent," and 11.10% as "fair."
FIGURE 7 Differences in results between P1 and P2 by category (expository texts: −5.25%; argumentative texts: −0.13%; information construction: 0.93%; literary texts: 4.70%).

FIGURE 8 Differences between the P1 and P2 tests by level of attainment (categories L-IV, IC-IV, A-III, and L-III; improvements ranging from 2.30% to 10.29%).

FIGURE 9 Responses related to self-confidence towards reading processes ("I seem to be good at doing these types of exercises": 62.10%; "I will understand what I am asked in the exercises": 66.0%; "I can perform this type of reading exercise": 82.40%).
5. Discussion
Developing reading competence is crucial for high school students, as critical readers must possess a considerable amount of knowledge and skills, such as observation, identifying details, relating ideas, comparing, contrasting, and inferring. The cultural context also influences the reading process. Therefore, it is essential to innovate and bring about cultural change in Mexico (INEE, 2018, p. 6). One critical way to promote this cultural change is through implementing reading programs in schools that focus on providing students with the tools they need to become better readers.
In this sense, this research aimed to propose an educational innovation based on gamification that would allow students to develop reading competencies through exercises that facilitated their achievement. The study's first objective was to analyze how the use of technology in the form of games can improve the reading competence of high school students in Mexico. It was observed that, after using the platform, students had higher scores on the test. As in other studies, this suggests that incorporating technology in the form of games into reading learning can be an effective way to enhance students' reading competence (Hüseyin et al., 2020). The use of interactive and engaging activities can capture students' attention and motivate them to learn, leading to better performance on assessments (Chans and Portuguez Castro, 2021).
For the second proposed objective, evaluating the impact of the use of the web platform Lëttëra on the reading competence of students, the study found that using the platform had a positive impact on students' reading competence. Specifically, it was observed that students showed improvement in literary text comprehension and information construction skills. This could be attributed to the platform's accompanying comments being directly related to PISA processes and the established levels for each skill. Another element to note is that, in the "Explore reading" section, prior knowledge about continuous texts (argumentative, expository, literary) and mixed texts (posters, receipts, infographics, among others) is strengthened. Overall, this finding suggests an enhancement in metacognition among students, which can help them understand and interpret different types of texts in various situations while promoting meaningful lifelong learning, as pointed out by Dinsmore et al. (2008).
For the objective of identifying how students perceive the use of technology in reading learning and whether this affects their motivation and satisfaction with the learning process, it was found that most students considered that using the platform improved their level of reading comprehension and found it friendly and easy to understand. Most of the students responded positively to the platform's design, rewards system, profile selection, and progress map, and mentioned that their educational experience on the platform was favorable. This is considered to be due to another differentiator proposed by Lëttëra, which is game-based learning; this has various repercussions on students' perception of the task to be performed and on their motivation, making the didactic model more meaningful and stronger in less time, and positioning the young person as the protagonist of learning (Cueva Gaibor, 2020).
FIGURE 10 Responses on the methodology used in the Lëttëra platform ("The underlined parts and directions help me when I get lost in reading": 80.30%; "I love the design, the rewards, the ability to select a profile and the map": 73.00%; "I always pass the exercises on the first try, and it's easy for me": 28.70%).

FIGURE 11 Students' motivation to use the Lëttëra platform ("I find it motivating to do the activities on the platform": 46.40%; "As always, I believe that on this platform I will learn how to read": 39.20%; "I have aspirations as to how Lëttëra can help me improve my reading": 57.90%; "I think my reading comprehension level increases with Lëttëra": 77.00%).
6. Conclusion
In conclusion, the study shows that incorporating technology into reading instruction, particularly using the Lëttëra web platform, can have a positive impact on students' reading competence and motivation. The novelty and significance of this finding lie in the platform's ability to make the didactic model more meaningful and stronger in less time, by presenting comments related to the established levels for each skill and by learning through games, positioning the student as the protagonist of their own learning.
Another aspect is that the game-based learning elements, such as offering each exercise twice, the reward system, the immediate feedback, the progress map, and other playful components, increase motivation for learning. They foster an active environment for developing skills, making the reader optimally equipped to handle reading. In this sense, even though much remains to be done, it is necessary to continue innovating and applying the knowledge acquired over time in projects that use technology and game-based learning, in order to respond to the needs of the new generations who interact with these tools in other contexts daily and to promote education that synthesizes the reading process through technology and serves as a means of self-managed practice to develop reading skills. However, further research is needed to determine the long-term impact of using technology in reading learning and to identify the most effective types of games and activities for improving reading competence.
For future studies, it may be interesting to investigate the long-term effects of incorporating technology in reading instruction on students' lifelong learning and their ability to transfer acquired skills to other areas of their academic and personal lives. We also suggest conducting more qualitative studies that allow for a deeper understanding of these learnings. Furthermore, future research could explore how the use of technology in reading instruction can be adapted for students with different learning styles or those who may require additional support or accommodations. This study can be useful to educators, curriculum developers, and policymakers who seek to improve reading instruction through the integration of technology in their teaching practices.
Data availability statement
The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.
Ethics statement
Ethical review and approval was not required for the study on
human participants in accordance with the local legislation and
institutional requirements. Written informed consent to participate in
this study was provided by the participants’ legal guardian/next of kin.
Author contributions
EL, CG, XC, and MP-C contributed to conception and design of the
study. EL, CG, and XC organized the database. MP-C performed the
statistical analysis. EL, CG, XC, and MP-C wrote the first draft of the
manuscript. All authors wrote sections of the manuscript, contributed to
the manuscript revision, read, and approved the submitted version.
Funding
The authors acknowledge the financial support of NOVUS (Grant number: N20-153), Institute for the Future of Education, Tecnologico de Monterrey, Mexico, in the production of this work. The authors would also like to acknowledge the financial support of the Writing Lab, Institute for the Future of Education, Tecnologico de Monterrey, Mexico, in the production of this manuscript.
Acknowledgments
We acknowledge Professor María del Carmen Benítez for her help with the spelling and grammar revision of this manuscript.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher’s note
All claims expressed in this article are solely those of the authors
and do not necessarily represent those of their affiliated organizations,
or those of the publisher, the editors and the reviewers. Any product
that may be evaluated in this article, or claim that may be made by its
manufacturer, is not guaranteed or endorsed by the publisher.
References
Abdul Ghani, A. S., Abdul Rahim, A. F., Bahri Yusoff, M. S., and Hanim Hadie, S. (2022). Developing an interactive PBL environment via persuasive gamify elements: a scoping review. Res. Pract. Technol. Enhanc. Learn. 17:21. doi: 10.1186/s41039-022-00193-z
Alharbi, K., and Khalil, L. A. (2022). Descriptive study of EFL teachers' perception
toward e-learning platforms during the COVID-19 pandemic. Electron. J. e-Learn. 20:4.
doi: 10.34190/ejel.20.4.2203
Aranda Izguerra, J. (2005). Las relaciones de México con el Fondo Monetario
Internacional. Carta de políticas públicas. Available at: http://www.economia.unam.mx/
publicaciones/carta/06.html
Attali, Y., and Arieli-Attali, M. (2015). Gamification in assessment: do points
affect test performance? Comput. Educ. 83, 57–63. doi: 10.1016/j.compedu.2014.12.012
Badillo-Jiménez, V. T., and Iguarán-Jiménez, A. M. (2020). Uso de las TIC en la
enseñanza-aprendizaje de la comprensión lectora en niños autistas. Praxis 16:1. doi:
10.21676/23897856.3406
Bagheri, M., Wan Ali, W., Chong Binti, M., and Mohd Daud, S. (2013). Effects of
project-based learning strategy on self-directed learning skills of educational
technology students. Contemp. Educ. Technol. 4:1. doi: 10.30935/cedtech/6089
Baran, M., Maskan, A., and Yaşar, Ş. (2018). Learning physics through project-based
learning game techniques. Int. J. Instr. 11, 221–234. doi: 10.12973/iji.2018.11215a
Bicen, H., Demir, B., and Serttas, Z. (2022). The attitudes of teacher candidates towards the gamification process in education. BRAIN. Broad Res. Artif. Intell. Neurosci. 13, 39–50. doi: 10.18662/brain/13.2/330
Bohrnstedt, G., and Stancavage, F. (2016). TIMSS, PISA, and NAEP: what to know before digging into the results. Available at: https://www.air.org/resource/blog-post/timss-pisa-and-naep-what-know-digging-results (Accessed March 23, 2023).
Caracas Sánchez, B., and Ornelas Hernández, M. (2019). The assessment of reading comprehension in Mexico. The case of the EXCALE, PLANEA and PISA tests. Perfiles Educ. 41:164. doi: 10.22201/iisue.24486167e.2019.164.59087
Chans, G. M., and Portuguez Castro, M. (2021). Gamification as a strategy to increase
motivation and engagement in higher education chemistry students. Computers 10:10.
doi: 10.3390/computers10100132
Civini, C. (2019). What is the Pisa test and what does it measure? Available at: https://
www.tes.com/magazine/archive/what-pisa-test-and-what-does-it-measure.pdf
Crespo, N. (2000). La metacognición: las diferentes vertientes de una teoría. Rev. Signos 33, 97–115. doi: 10.4067/S0718-09342000004800008
Cueva Gaibor, D. (2020). Educational technology in times of crisis. Conrado 16, 341–348.
Deterding, S., Dixon, D., Khaled, R., and Nacke, L. (2011). From game design elements
to gamefulness: defining gamification. In Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments (MindTrek '11), New York, NY, USA.
Dinsmore, D. L., Alexander, P. A., and Loughlin, S. M. (2008). Focusing the conceptual
lens on metacognition, self-regulation, and self-regulated learning. Educ. Psychol. Rev.
20, 391–409. doi: 10.1007/s10648-008-9083-6
Fernández Batanero, J. M., Montenegro Rueda, M., Fernández Cerero, J., and
Román Gravan, P. (2021). Impacto das TIC nas habilidades de escrita e leitura: uma
revisão sistemática (2010-2020). Texto Livre 14, 1–12. doi: 10.35699/1983-3652.
2021.34055
Grynyuk, S., Kovtun, O., Sultanova, L., Zheludenko, M., Zasluzhena, A., and
Zaytseva, I. (2022). Distance learning during the Covid 19 pandemic: the experience
of Ukraine’s higher education system. Electron. J. e-Learn. 20:3. doi: 10.34190/
ejel.20.3.2198
Hernández Sampieri, R., Fernández Collado, C., and Baptista Lucio, P. (2014).
Metodología de la investigación. Mexico City: Mcgraw Hill.
Høiseth, M., Alsos, O. A., Holme, S., Ek, S., and Tendenes Gabrielsen, C. (2021).
Serious game design to support children struggling with school refusal. Int. J. Serious
Games 8, 109–128. doi: 10.17083/ijsg.v8i2.416
Hüseyin, Y., Mübin, K., and Karatas, A. (2020). The views and adoption levels of primary school teachers on gamification, problems, and possible solutions. Particip.
Educ. Res. 7:3. doi: 10.17275/per.20.46.7.3
INEE (2018). Planea 2017 national results. Available at: http://planea.sep.gob.mx/
content/general/docs/2017/ResultadosNacionalesPlaneaMS2017.PDF
Jiménez Moreno, J. A. (2016). El papel de la evaluación a gran escala como política de
rendición de cuentas en el sistema educativo mexicano. Rev. Iberoam. Eval. Educ. 9,
109–126. doi: 10.15366/riee2016.9.1.007
Johnson, R., and Kurby, P. (2016). Estadística Elemental. Mexico City: Cengage.
Moshman, D. (2008). Adolescent psychological development: Rationality, morality, and
identity. London: Lawrence Erlbaum Assoc Inc.
National Commission for the Continuous Improvement of Education (2018).
Repensar La Evaluación Para La Mejora Educativa. Resultados de México en PISA 2018.
Mexico City: Mejoredu.
Neira-Piñeiro, M. (2015). Reading and writing about literature on the internet. Two
innovative experiences with blogs in higher education. Innov. Educ. Teach. Int. 52:5. doi:
10.1080/14703297.2014.900452
OECD (2013). OECD skills outlook 2013: First results from the survey of adult skills.
Paris: OECD Publishing
OECD (2017). PISA assessment and analysis framework for development: reading,
mathematics, and science. Paris: OECD Publishing.
OECD (2018). Sample items used in the PISA 2000 assessment: reading literacy,
mathematics, and science. Available at: https://www.oecd.org/education/school/
programmeforinternationalstudentassessmentpisa/33692793.pdf
Planea (2022). Plan nacional para la evaluación de los aprendizajes. Available at:
http://planea.sep.gob.mx/ms/
Portuguez-Castro, M., Hernández-Méndez, R. V., and Peña-Ortega, L. O. (2022).
Novus projects: innovative ideas to build new opportunities upon technology-based
avenues in higher education. Educ. Sci. 12, 1–22. doi: 10.3390/educsci12100695
Salinas, D., De Moraes, C., and Schwabe, M. (2018) Programa para la Evaluación
Internacional de Alumnos (PISA) PISA 2018-Resultados. Available at: https://www.oecd.
org/pisa/publications/PISA2018_CN_MEX_Spanish.pdf (Accessed December 26, 2022).
Singh, P., Duggal, K., and Gupta, L. (2021). “Intrinsic and extrinsic motivation for
online teaching in COVID-19: applications, issues, and solution” in Emerging
Technologies for Battling COVID-19. Studies in Systems, Decision and Control. eds. F.
Al-Turjman, A. Devi and A. Nayyar (Cham: Springer)
Sucena, A., Silva, A. F., and Marques, C. (2022). Reading skills intervention during the
Covid-19 pandemic. Humanit. Soc. Sci. 9:45. doi: 10.1057/s41599-022-01059-x
Sun, B., Loh, C. E., O’Brien, B. A., and Silver, R. E. (2021). The effect of the COVID-19
lockdown on bilingual Singaporean children’s leisure reading. AERA Open 7, 1–21. doi:
10.1177/23328584211033871
UNESCO (2017). E2030: education and skills for the 21st century. Available at: https://unesdoc.unesco.org/ark:/48223/pf0000250117 (Accessed September 18, 2022).
U.S. Department of Education, National Center for Education Statistics (2018). The condition of education 2018 (NCES 2018-144). Available at: https://nces.ed.gov/programs/coe/indicator/cns
Wong, R., Rao, Y., Seong, L., Abd, K., Von, W., Ismail, R., et al. (2022). Gamifying
education for classroom engagement in primary schools. Int. J. Eval. Res. Educ. 11,
1360–1367. doi: 10.11591/ijere.v11i3.21918