A SYSTEMATIC REVIEW OF THE PERUSALL APPLICATION:
EXPLORING THE BENEFITS AND CHALLENGES OF SOCIAL
ANNOTATION TECHNOLOGY IN HIGHER EDUCATION
C.D. Craig, R. Kay
Ontario Tech University (CANADA)
Abstract
In this rapid systematic review of the social annotation platform Perusall, we sought to provide an
overview of the context, benefits, and challenges of use in higher education. Traditionally, course
readings and asynchronous videos are solitary passive learning experiences. Social annotation
technologies afford students the ability to actively engage in the same materials with social support and
insight. We conducted the primary article search through a search consortium consisting of 174
databases. Of the 266 articles identified, 16 explicitly focus on using Perusall in higher education and
are included in this review. We used an integrated mixed research analysis and synthesis to analyze,
code, and synthesize the collective insight from qualitative, quantitative, and mixed-method studies. The
emergent themes were engagement and learning outcomes, while time, system scoring, and technology
glitches were the primary challenges. The findings provide a starting point for future implementation;
however, a greater volume of high-quality research will further improve our understanding. We provide
insight into the context, benefits, and challenges in the results section; and a summary of findings,
educational recommendations, and directions for future research in the discussion.
Keywords: Social annotation, Perusall, higher education, university, college, tertiary education, innovation,
technology, online learning, learning.
1 INTRODUCTION
Social engagement through learning experiences in higher education is a critical part of student success
as interaction can cultivate meaningful relationships with peers and the learning materials ([1], [2], [3]).
To reinforce the value of completing course readings—which is often a challenge ([4])—and increase
active learning opportunities, Harvard University-based educators in the early 2010s developed a social
annotation platform ([5]). The outcome of the effort is a digital social annotation platform entitled
Perusall, which is the focus of this rapid systematic review. The platform builds on empirical research
on behaviour and learning, causal inference, data collection, technological innovation, and theoretical
models ([5]). The three principles guiding early development were that social connections can motivate
students, that teaching peers is itself a valuable learning activity, and that rapid feedback improves
learning ([5]).
Perusall increases educational accessibility by reducing geographic location and scheduling barriers
while promoting active student engagement, self-efficacy, and conceptual understanding ([6]).
Educators can use Perusall to embed streamlined access to open or licensed digital course materials
such as audio, readings, or video for fully online, hybrid, and flipped learning ([5], [6]). Once students
sign up for the platform, they engage with the learning materials alongside their peers through social
annotation. The annotation process includes using the learning materials—and potentially prompts—to
stimulate and mediate critical thinking and community building through personal responses, which
further develop students’ conceptual knowledge ([5], [7]).
The peer feedback process can result in extended discussions, which may not always focus on the
learning at hand but can increase social connectedness ([5]). The process also allows students to teach
one another and provide timely insight that would not have been available in traditional learning
environments. In addition, traditional individual activities, such as reading articles or chapters, become
a form of social interaction ([6]). Readings are a common practice for sharing insight on a particular
subject, but they can also result in mind-wandering that reduces comprehension ([8]). The
interactive process and increased access to learning materials afford students engaging opportunities
to gain prior knowledge of a subject, allowing them to leverage their understanding and participate more
readily during live discussions or seek further clarification from the educator ([6]).
To help educators process the responses and insight, Perusall runs an automated assessment focusing
on student participation and areas of confusion ([6]). Additionally, the platform uses a grading algorithm
that combines distribution, quality, quantity, and timeliness to construct an overall score. Guided by
machine learning, the platform can stimulate thoughtful reflection and success through alignment with
timelines and response thoroughness ([6]). Also, if the program perceives a section as confusing, it will
flag the area and prompt educators to review it.
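Perusall does not publish the exact weighting behind this score, so the following is only a minimal sketch, in Python, of how a composite grade could combine the four factors named above (quantity, quality, timeliness, and distribution). Every weight, target, and field name in the sketch is an assumption made for illustration, not Perusall's actual algorithm.

from dataclasses import dataclass

@dataclass
class Annotation:
    quality: float           # 0-1 rating of the annotation (hypothetical rubric value)
    hours_before_due: float  # positive if posted before the deadline
    page: int                # page of the reading the annotation sits on

def composite_score(annotations, target_count=7, total_pages=20,
                    weights=(0.4, 0.3, 0.2, 0.1)):
    """Hypothetical composite of quantity, quality, timeliness, and distribution.
    All weights, targets, and scaling are invented for illustration."""
    if not annotations:
        return 0.0
    w_qty, w_qual, w_time, w_dist = weights
    quantity = min(len(annotations) / target_count, 1.0)               # enough annotations?
    quality = sum(a.quality for a in annotations) / len(annotations)   # average rated quality
    timeliness = sum(a.hours_before_due > 0 for a in annotations) / len(annotations)
    distribution = len({a.page for a in annotations}) / total_pages    # spread across the reading
    return round(100 * (w_qty * quantity + w_qual * quality
                        + w_time * timeliness + w_dist * distribution), 1)

# Five on-time annotations clustered on three of twenty pages.
sample = [Annotation(quality=0.8, hours_before_due=12, page=p) for p in (1, 1, 2, 3, 3)]
print(composite_score(sample))  # 74.1 with the example weights

In practice, educators align the automated scoring with their course inside the platform itself; the sketch only shows how several partial measures can be folded into a single score.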
1.1 Previous Research
While Perusall is a recognizable digital tool for social annotation, we can gain greater insight from the
field in general. The process of digital social annotation can include actions such as shared commenting,
highlighting, notes, or questions with the intent of discussion and interaction ([9]). As technology
progressed, so did the diffusion of social annotation technologies in higher education ([9]). A literature
review of 16 empirical studies with over 1,000 participants found that social annotation in higher
education improves critical thinking, meta-cognition, reading comprehension, student motivation and
positive emotional states ([9]). Challenges at the time included the cost of the technology, and students
did not always recognize that their learning outcomes had improved, a perception that ran counter to the
researchers' pre- and post-intervention assessments ([9]). A systematic qualitative review of 71 studies ([10]) found
that digital social annotation is increasingly being used and studied in higher education, primarily in
education studies (24%) and computer technology (21%). The research mainly focused on
implementing social annotation, typically with undergraduate-level students in blended learning
environments (76%).
1.2 Objectives
The use and impacts of Perusall in higher education have yet to be reviewed, prompting this rapid
systematic review of the literature to gain pragmatic insight to guide educator use. Using the Population,
Exposure, Outcome (PEO) framework to guide the development of answerable questions in systematic
reviews regarding the predictors and prevalence of key concepts ([11], [12], [13]), this review aims to
gain the following insight: Within higher education learning environments (E), what are the context,
benefits, and challenges (O) of using Perusall for educators and students (P)?
2 METHOD
To provide insights to educators who are considering Perusall, we conducted a rapid systematic review
as it provides timely and informed details for decision-making and reports low potential research bias
([14], [15], [16]). We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses
2020 (PRISMA) to support the development of clear and transparent findings [17], and followed three
primary steps. First, we purposefully documented the search process. Second, we constructed and
applied inclusion criteria for articles. Third, the analysis and synthesis followed an integrated mixed-
method approach to build broad insights while minimizing methodological differences ([18]).
2.1 Search Process
The search process focused on the benefits and challenges of using Perusall in higher education. We
performed two searches. The first search used a paired consortium of the Ontario Tech University
Library and Omni Search tool, which draws from 174 databases. The tool employs Ex Libris’s Alma
library software system and the Primo VE discovery system, which supports a streamlined and
comprehensive search of institutionally licensed databases ([19]). Using the search term “Perusall,” we
applied filters for online availability, article format, English language, and publication since 2015, when
Perusall became publicly available; this first search yielded 17 candidate articles. The second search included manual article
scraping from the Perusall website using the same criteria. The initial searches identified 266 total
records, refined with the above-mentioned search filters and a review of the article titles and overviews
for applicability. Following screening, 16 articles were included in the final review. Figure 1 outlines the
search flow, and article summaries are presented in Appendix A.
Figure 1. PRISMA Flow Diagram
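To make the screening step concrete, the following minimal sketch applies the stated filters (online availability, article format, English language, publication in 2015 or later) to a list of candidate records and removes duplicates by title. The record fields and sample entries are hypothetical and do not represent the actual exports from Omni or the Perusall website.

# Hypothetical search records; field names and values are assumptions for illustration only.
records = [
    {"title": "Use of a Social Annotation Platform for Pre-Class Reading Assignments",
     "year": 2018, "language": "English", "type": "article", "online": True},
    {"title": "An unrelated pre-2015 study",
     "year": 2012, "language": "English", "type": "article", "online": True},
]

def passes_filters(record):
    """Apply the review's stated filters: online, article, English, published 2015 or later."""
    return (record["online"] and record["type"] == "article"
            and record["language"] == "English" and record["year"] >= 2015)

seen_titles = set()
screened = []
for record in records:
    key = record["title"].strip().lower()
    if passes_filters(record) and key not in seen_titles:  # drop duplicates by title
        seen_titles.add(key)
        screened.append(record)

print(f"{len(screened)} of {len(records)} records retained for title and abstract review")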
2.2 Analysis and Coding
We used an integrated mixed research analysis and synthesis design to provide a qualitative
assimilation of our findings, communicating collective insights from qualitative, quantitative, and mixed
methods studies ([18]). The analysis occurred in three steps. The first step included gathering article
summaries, including location, database, resource, title and purpose (Appendix A). The second step
included summarizing the context of each study outlined in Appendix B. The third step followed
deductive and emergent coding from thematic analysis to develop primary and secondary themes in our
integrated synthesis, which consists of four phases ([18], [20], [21], [22]). Phase one explored the
benefits and challenges—deductive coding—of using Perusall in higher education. Phase two included
summarizing the outcomes in emergent themes columns. Phase three involved recoding and
categorizing insights through concept mapping to develop subthemes. Finally, phase four focused on
refinement through a review of the quality and quantity of each emergent subtheme.
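As a simplified picture of how phases one through four can be operationalized, the sketch below tags hypothetical excerpts with a deductive code (benefit or challenge) and an emergent subtheme, then tallies the subthemes to see which are best supported. The excerpts and labels are invented for illustration and are not data from the reviewed studies.

from collections import Counter

# Hypothetical coded excerpts: (deductive code, emergent subtheme).
coded_excerpts = [
    ("benefit", "connectedness"),
    ("benefit", "course materials"),
    ("benefit", "connectedness"),
    ("challenge", "educator time management"),
    ("challenge", "system scoring"),
    ("challenge", "educator time management"),
]

# Phase one: deductive split into benefits and challenges.
by_code = {"benefit": [], "challenge": []}
for code, subtheme in coded_excerpts:
    by_code[code].append(subtheme)

# Phases two to four: tally emergent subthemes and review which are best supported.
for code, subthemes in by_code.items():
    counts = Counter(subthemes)
    print(code, counts.most_common())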
3 RESULTS
3.1 Research Context
Across the 16 studies, the research spans a variety of countries and disciplines. The articles in this review
included eight peer-reviewed journal articles, seven conference papers, and one thesis. The average
certainty of findings was medium or moderate. Studies were primarily conducted in the USA (n = 9
articles, 56%), while the Netherlands produced two, and Canada, China, Portugal, Singapore, and Spain
offered one each.
Fifteen out of 16 studies identified the number of participants (n = 1134). Seven studies identified
participant gender, of which 61% (n = 366) were female. Twelve studies focused on undergraduate
students, while three targeted graduate students. Seven articles reported the impact of Perusall on final
grades; however, the weighting varied considerably, from 5% to 60%. Perusall was used most often in
Engineering and Sciences (n = 7), followed by Education (n = 3) and Business and Legal Studies (n =
2). Full contextual details are outlined in Appendix B.
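The headline percentages can be recomputed directly from the counts in Appendix B; the short check below reproduces the share of USA-based studies and the proportion of female participants across the seven studies reporting gender.

# Gender counts (female, male) from the seven Appendix B studies that report them.
gender_counts = {"[30]": (67, 17), "[32]": (2, 8), "[36]": (15, 0), "[6]": (162, 138),
                 "[33]": (17, 8), "[27]": (28, 10), "[28]": (75, 50)}

female = sum(f for f, _ in gender_counts.values())
total = sum(f + m for f, m in gender_counts.values())
print(f"Female participants: {female} of {total} ({female / total:.0%})")  # 366 of 597 (61%)
print(f"USA share of studies: {9 / 16:.0%}")                               # 56%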
3.2 Benefits (n = 15)
Perusall offers two primary benefits for use in higher education. First is increased student engagement,
including connectedness, course materials, and comfort level. Second is enhanced learning outcomes,
including cognition and scoring.
3.2.1 Engagement (n = 12)
Connectedness (n = 8). Students felt more connected with their peers and a stronger sense of
community when using Perusall ([23], [24], [25], [26], [28]). The asynchronous but direct connection
with peers was considered an asset by students, and they used it for both academic and social purposes
([25], [27], [28], [31], [34]). Students appreciated being able to support their peers by providing their
unique and diverse points of view on the content, thereby increasing the likelihood of content
engagement and decreasing boredom ([23], [24], [25], [26]). Tian ([27]) also found that the community-
based reading experience helped students co-construct meaning with the learning materials beyond the
classroom. McFarlin ([34]) added that Perusall afforded a productive opportunity to connect with peers
and educators beyond physical spaces.
Course Materials (n = 7). Educators using Perusall found that the platform increased student
engagement with course materials ([6], [23], [29], [30], [31], [32], [33]). For example, students using
Perusall were more likely to produce posts about the learning content than controls ([6], [29], [31]),
leading to learning engagement beyond the outlined course requirements ([30]). The increased
information sharing and socially-mediated critical analysis of the content also enhanced student
understanding ([23]).
Comfort (n = 4). Student engagement was also reflected in student comfort level with using Perusall
([23], [24], [25], [31]). Two studies—one reflecting the social impacts of social annotation in media
studies, the other focusing on motivation to complete readings in an English course—found that students
felt more comfortable offering their insights through the platform than in person ([23], [24]). Some
students indicated greater comfort levels by freely participating in online discussions, and international
students—who often refrained from in-class discussions—voluntarily engaged more often in Perusall
([25], [31]).
3.2.2 Learning Outcomes (n = 7)
Students experienced enhanced learning outcomes in 44% of the studies, reflected in two primary areas:
cognition ([7], [23], [25], [34]) and scoring ([6], [28], [35]). Cognition refers to acquiring knowledge, while
scoring refers to the marks or grades received in an assignment.
Cognition (n = 4). Using Perusall enhanced students' cognitive engagement with the learning materials
([7], [23], [25], [34]). Students found that the process helped them develop a deeper understanding of
the learning content than in-class discussion or reading independently ([7], [23], [25], [34]). The finding
was further supported by two studies which found that students were more likely to engage with learning
content in Perusall when compared to a control ([29], [31]).
Scoring (n = 3). The Perusall platform’s affordances—such as asynchronous learning and
engagement—and scoring were positively associated with student grades and cognitive engagement
([6], [28], [35]). Students using Perusall performed significantly better on the final exam than control
groups, and the Perusall group referenced the learning materials more often in the final assessment
([6], [28]).
3.3 Challenges (n = 12)
The three primary challenges of using Perusall were time, system scoring, and technology glitches.
Each is outlined in further detail below.
3.3.1 Time (n = 10)
Educator Time Management (n = 7). For educators to leverage Perusall’s benefits for students, they
needed to purposefully allocate time before, during, and at the conclusion of a course ([7], [24], [25],
[27], [31], [32], [35]). To streamline the student learning experience, educators needed to select content
that stimulated engagement, was at the appropriate education level, and aligned with the learning
outcomes ([27], [31]). Additionally, it was important to budget time for trial and review ([24], [25]). For
example, they needed to set aside time to embed questions and prompts in the learning materials, align the
automated scoring with the course, and learn how to use the platform personally. Once a course was
underway, educators needed to spend time sorting through student posts and responses ([7], [31], [32]),
and responding to student inquiries ([35]).
Student Time On Task (n = 4). While it is common to set aside time for in-class discussions, extra time
is required to comment when using Perusall due to its dynamic nature ([23], [29], [31], [36]). Scripts and
prompts, while beneficial in guiding student responses, reduced the time and effort students spent on
providing responses ([29], [31]). Moreover, student annotation required more time than is traditionally given
during in-class discussions ([23]). Additionally, it was not uncommon for students to post last-minute
responses, limiting their peers’ ability to reply, thereby negatively affecting participation scores ([36]).
3.3.2 System Scoring (n = 4)
System scoring challenges arose because the Perusall scoring algorithm could be disruptive for some
students. Specifically, some students found the scoring process unclear and confusing ([23], [27], [33]).
Also, students whose primary language was not supported on the platform faced scoring barriers ([30]).
3.3.3 Technology Glitches (n = 4)
While limited in scope, technology glitches can disrupt the annotation and commenting process. For
example, previous annotations impaired content selection for new posts ([27], [35]), student posts were
temporarily displaced ([34]), or, because the platform is web-based, students' browser choice affected
their experience as some functionality differed ([30]).
3.4 Discussion
This rapid systematic review outlined the context, benefits, and challenges of using the social annotation
platform Perusall in higher education. Perusall is most prominent in the USA for undergraduate science
and education studies. The primary benefit of using Perusall was student engagement, specifically, the
ability to interact with peers and course materials and to support student comfort while learning.
Connecting with peers through Perusall enhanced motivation via communal knowledge-building ([23],
[24], [25], [26], [28]), relationship-building ([27], [28], [31], [34]), and increased learning engagement ([6],
[23], [29], [30], [31], [32], [33]). Additionally, students typically appeared more comfortable sharing
insights on Perusall than traditional options such as lecture discussion sections ([23], [24], [25], [31]).
Another benefit was the enhancement of student learning outcomes through cognition and scoring.
Continuous use of the learning materials, guided by social learning, improved
subject knowledge synthesis ([7], [23], [25], [34]). Considering quantitative metrics, student scores within
Perusall were often aligned with exam grades ([35]). Further, compared to controls or prior groups,
students using the platform also recorded higher final exam grades ([6], [28]).
Time-related challenges involved educator time management and student time on task ([7], [24], [25],
[27], [31], [32], [35]). For time management, educators needed to build in planning time to review content
appropriateness, embed questions, and trial their content for continuous review. Students struggled to
budget enough time to provide high-quality insight without guidance, a problem worsened by last-minute
posting, which restricted their peers' ability to respond and improve their scores. The Perusall scoring
system also presented challenges because students were confused about the calculation process ([23],
[27], [30], [33]). At the same time, technology glitches, such as restrictions on posting new annotations
over content with prior posts, temporary post displacement, and browser compatibility issues, were also
challenging ([27], [30], [34], [35]).
3.5 Limitations and Further Research
Perusall is a relatively new platform with limited research and methodological certainty. Further research
should focus on a broader context and increased transparency into use scenarios. For example,
researchers could look at how the use of Perusall impacts students’ grades and motivation. Furthermore,
we could better understand tool efficacy with an increased focus on cultural differences, gender-based
differences, and the effects of extended (e.g., more than a semester) periods of use.
3.6 Educational Recommendations
Following our review, we provide seven recommendations for the preparation and implementation
phases of using Perusall. During the course preparation phase, instructors need to:
Review content appropriateness and ensure that media and links are up to date;
Embed analysis and synthesis prompts;
Use a rubric, and model or provide samples of the expected attributes of annotations and responses; and
Conduct a trial with your content within the system to limit potential disruptions during the course.
During a course, instructors should consider:
Providing students with an active learning opportunity to see how the system scoring works, reducing
confusion (e.g., during class time or through an asynchronous video);
Scaffolding posting and response timelines to maximize opportunities for students to engage while
recognizing their tendency to produce last-minute posts; and
Scheduling structured weekly time to monitor the process.
REFERENCES
[1] Z. Akyol, D. R. Garrison, and M. Y. Ozden, ‘Development of a community of inquiry in online and
blended learning contexts’, Procedia - Social and Behavioral Sciences, vol. 1, no. 1, pp. 1834–
1838, 2009, doi: 10.1016/j.sbspro.2009.01.324.
[2] S. Caskurlu, J. C. Richardson, Y. Maeda, and K. Kozan, ‘The qualitative evidence behind the
factors impacting online learning experiences as informed by the community of inquiry framework:
A thematic synthesis’, Computers & Education, vol. 165, p. 104111, May 2021, doi:
10.1016/j.compedu.2020.104111.
[3] G. Kim and R. Gurvitch, ‘Online Education Research Adopting the Community of Inquiry
Framework: A Systematic Review’, Quest, vol. 72, no. 4, pp. 395–409, Oct. 2020, doi:
10.1080/00336297.2020.1761843.
[4] S. Baker, B. Bangeni, R. Burke, and A. Hunma, ‘The invisibility of academic reading as social
practice and its implications for equity in higher education: a scoping study’, Higher Education
Research & Development, vol. 38, no. 1, pp. 142–156, Jan. 2019, doi:
10.1080/07294360.2018.1540554.
[5] G. King and M. Sen, ‘How Social Science Research Can Improve Teaching’, APSC, vol. 46, no.
03, pp. 621–629, Jul. 2013, doi: 10.1017/S1049096513000619.
[6] K. Miller, B. Lukoff, G. King, and E. Mazur, ‘Use of a Social Annotation Platform for Pre-Class
Reading Assignments in a Flipped Introductory Physics Class’, Front. Educ., vol. 3, p. 8, Mar.
2018, doi: 10.3389/feduc.2018.00008.
[7] B. Adams and N. S. Wilson, ‘Investigating Student’s During-Reading Practices Through Social
Annotation’, Literacy Research and Instruction, vol. 61, no. 4, pp. 339–360, Oct. 2022, doi:
10.1080/19388071.2021.2008560.
[8] S. Feng, S. D’Mello, and A. C. Graesser, ‘Mind wandering while reading easy and difficult texts’,
Psychon Bull Rev, vol. 20, no. 3, pp. 586–592, Jun. 2013, doi: 10.3758/s13423-012-0367-y.
[9] E. Novak, R. Razzouk, and T. E. Johnson, ‘The educational use of social annotation tools in
higher education: A literature review’, The Internet and Higher Education, vol. 15, no. 1, pp. 39–
49, Jan. 2012, doi: 10.1016/j.iheduc.2011.09.002.
[10] H. Ghadirian, K. Salehi, and A. F. M. Ayub, ‘Social annotation tools in higher education: a
preliminary systematic review’, IJLT, vol. 13, no. 2, p. 130, 2018, doi: 10.1504/IJLT.2018.092096.
[11] J. Bettany-Saltikov and R. McSherry, How to do a systematic literature review in nursing: A step-
by-step guide, 2nd ed. Open University Press/McGraw-Hill Education, 2016.
[12] S. Moola et al., ‘Conducting systematic reviews of association (etiology): The Joanna Briggs
Institute’s approach’, International Journal of Evidence-Based Healthcare, vol. 13, no. 3, pp. 163–
169, Sep. 2015, doi: 10.1097/XEB.0000000000000064.
[13] A. Pollock and E. Berge, ‘How to do a systematic review’, International Journal of Stroke, vol. 13,
no. 2, pp. 138–156, Feb. 2018, doi: 10.1177/1747493017743796.
[14] A. C. Tricco et al., ‘Rapid reviews and the methodological rigor of evidence synthesis: a JBI
position statement’, JBI Evidence Synthesis, vol. 20, no. 4, pp. 944–949, Apr. 2022, doi:
10.11124/JBIES-21-00371.
[15] J. T. Van Der Steen, G. Ter Riet, C. A. Van Den Bogert, and L. M. Bouter, ‘Causes of reporting
bias: a theoretical framework’, F1000Res, vol. 8, p. 280, Jul. 2019, doi:
10.12688/f1000research.18310.2.
[16] J. T. Van Der Steen et al., ‘Determinants of selective reporting: A taxonomy based on content
analysis of a random selection of the literature’, PLoS ONE, vol. 13, no. 2, p. e0188247, Feb.
2018, doi: 10.1371/journal.pone.0188247.
[17] M. J. Page et al., ‘The PRISMA 2020 statement: an updated guideline for reporting systematic
reviews’, BMJ, p. n71, Mar. 2021, doi: 10.1136/bmj.n71.
[18] M. Sandelowski, C. I. Voils, and J. Barroso, ‘Defining and Designing Mixed Research Synthesis
Studies’, Res Sch, vol. 13, no. 1, p. 29, 2006, Available:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2809982/
[19] S. Pagotto, ‘Introducing Omni: A New Academic Search Tool from 14 Ontario University Libraries
| Ontario Council of University Libraries’. Available: https://ocul.on.ca/introducing-omni
[20] J. Popay et al., ‘Guidance on the conduct of narrative synthesis in systematic reviews: A product
from the ESRC methods programme’, Economic and Social Research Council, vol. 1, no. b92, p.
92, Apr. 2006.
[21] J. Saldana, The Coding Manual for Qualitative Researchers, 4th ed. Sage Publications, 2021.
[22] R. Thornberg and K. Charmaz, The SAGE Handbook of Qualitative Data Analysis. SAGE
Publications, 2014. doi: 10.4135/9781446282243.
[23] A. G. D’Entremont and A. Eyking, ‘Student and instructor experience using collaborative
annotation via Perusall in upper year and graduate courses’, PCEEA, Jun. 2021, doi:
10.24908/pceea.vi0.14835.
[24] L. Gray, ‘How Can We Get Students to Do the Reading? Perusall May Help’, The Journal of
innovation, Teaching & Digital Learning Excellence, vol. 1, no. 1, pp. 23–27, 2021.
[25] A. J. Sigmon and M. J. Bodek, ‘Use of an Online Social Annotation Platform to Enhance a Flipped
Organic Chemistry Course’, J. Chem. Educ., vol. 99, no. 2, pp. 538–545, Feb. 2022, doi:
10.1021/acs.jchemed.1c00889.
[26] N. A. Theodosiou and J. D. Corbin, ‘Redesign your in-person course for online: Creating
connections and promoting engagement for better learning’, Ecology and Evolution, vol. 10, no.
22, pp. 12561–12572, Nov. 2020, doi: 10.1002/ece3.6844.
[27] J. Tian, ‘Investigating Students’ Use of a Social Annotation Tool in an English for Science and
Technology Course’, in Emerging Technologies for Education, vol. 11984, E. Popescu, T. Hao, T.-
C. Hsu, H. Xie, M. Temperini, and W. Chen, Eds., Cham: Springer International Publishing, 2020,
pp. 299–309. doi: 10.1007/978-3-030-38778-5_33.
[28] A. S. Walker, ‘Perusall: Harnessing AI Robo-Tools and Writing Analytics to Improve Student
Learning and Increase Instructor Efficiency’, The Journal of Writing Analytics, vol. 3, no. 1, pp.
227–263, 2019, doi: 10.37514/JWA-J.2019.3.1.11.
[29] V. de Boer and H. Spoelstra, ‘The effects of collaboration scripts on the number and quality of
student interactions in a social annotation tool’, in Seventh International Conference on Higher
Education Advances, May 2021. Available:
http://ocs.editorial.upv.es/index.php/HEAD/HEAd21/paper/view/12862
[30] M. Francisco and C. Amado, ‘Perusall’s Machine Learning Towards Self-regulated Learning’, in
Innovative Technologies and Learning, vol. 13117, Y.-M. Huang, C.-F. Lai, and T. Rocha, Eds.,
Cham: Springer International Publishing, 2021, pp. 49–58. doi: 10.1007/978-3-030-91540-7_6.
[31] S. C. Lee and F. M. Yeong, ‘Fostering student engagement using online, collaborative reading
assignments mediated by Perusall’, TAPS, vol. 3, no. 3, pp. 46–48, Sep. 2018, doi:
10.29060/TAPS.2018-3-3/PV2000.
[32] M. W. Liberatore, ‘Board # 20: Annotations and Discussions of Textbooks and Papers Using a
Web-based System (Work in Progress)’, Jun. 2017. Available: https://peer.asee.org/board-20-
annotations-and-discussions-of-textbooks-and-papers-using-a-web-based-system-work-in-
progress
[33] A. Ortigosa-Blanch and E. Planells-Artigot, ‘Peruse this! Use of an educational social platform for
a Global Entrepreneurship flipped class’, in INNODOCT 2020, Jan. 2021. Available:
http://ocs.editorial.upv.es/index.php/INNODOCT/INN2020/paper/view/11855
[34] T. McFarlin, ‘Using Open-Source, Collaborative Online Reading to Teach Property’, Saint Louis
University Law Journal, vol. 64, no. 3, Apr. 2020, [Online]. Available:
https://scholarship.law.slu.edu/lj/vol64/iss3/4
[35] C. Suhre, K. Winnips, V. de Boer, P. Valdivia, and H. Beldhuis, ‘Students’ experiences with the
use of a social annotation tool to improve learning in flipped classrooms’, in Fifth International
Conference on Higher Education Advances, May 2019. Available:
http://ocs.editorial.upv.es/index.php/HEAD/HEAD19/paper/view/9131
[36] C. Linzell, ‘Critical Peer Mentor Groups: Amplifying Support During Student Teaching’,
Department of Teaching, Learning, and Teacher Education: Dissertations, Theses, and Student
Research, Dec. 2021, [Online]. Available: https://digitalcommons.unl.edu/teachlearnstudent/134
APPENDIX A - ARTICLE SUMMARIES

Ref | Type | Search | Source | Resource | Purpose
[7] | Journal | OMNI | Routledge | Literacy Research and Instruction | To investigate social annotation and COP in online graduate courses.
[23] | Conf. | OMNI | DOAJ | Proceedings of CEEA-ACEG21 | To examine Perusall in the context of a social constructivist approach for online readings.
[29] | Conf. | Perusall Site | UMCG | HEAd’21 | To determine reading content engagement in control and social annotation groups.
[30] | Conf. | OMNI | Springer | ICITL 2021 | To analyze Perusall’s features in the context of peer and self-regulated learning.
[24] | Journal | Perusall Site | DCCC Library | JIT&DLE | To develop insight on student reading completion and perspectives of Perusall.
[31] | Journal | OMNI | DOAJ | The Asia Pacific Scholar | To explore the use of online reading to foster authentic student engagement.
[32] | Conf. | Perusall Site | ASEE PEER | 2017 ASEE | To determine the impact of augmenting static online textbooks with interactive web tools.
[36] | Thesis | OMNI | Digital Commons | University of Nebraska | To share experiences in critical peer mentor groups during student teaching experiences.
[34] | Journal | Perusall Site | Digital Commons | St. Louis University Law Journal | To explore Perusall’s effect on community and accessible, engaging learning materials.
[6] | Journal | OMNI | Frontiers | Frontiers in Education | To examine the relationship between exam performance and learning engagement.
[33] | Conf. | Perusall Site | RiuNet | INNODOCT/20 | To review the experience of implementing Perusall during COVID-19.
[25] | Journal | OMNI | ACS | JCED | To explore the effect of Perusall on student isolation while watching video materials.
[35] | Conf. | Perusall Site | UMCG | HEAd'19 | To assess students' perceived effectiveness and usefulness of Perusall.
[26] | Journal | OMNI | Wiley | Ecology and Evolution | To reflect on the use of Perusall in the transition to online learning due to COVID-19.
[27] | Conf. | OMNI | Springer | SETE 2019 | To investigate linguistic and pedagogical insights associated with digital annotation.
[28] | Journal | Perusall Site | WAC | Journal of Writing Analytics | To investigate the role of artificial intelligence as a robo-tutor aiding instructor grading processes.
APPENDIX B - ARTICLE INSIGHTS

Ref | Location | Level | Discipline | Grades | Sample | Female | Male | Certainty | Design
[7] | USA | Grad | Education | x | 12 | x | x | Med. | MM. Annotation content analysis
[23] | Canada | Grad | Mechanical Engineering | x | 25 | x | x | Med. | MM. Comment content analysis, Likert5 survey w/ open-ended questions
[29] | NED | UG | Media Studies | x | 59 | x | x | Good | MM. Annotation content analysis, 5 papers
[30] | Portugal | UG | Education | 20% of final | 84 | 67 | 17 | Good | MM. Reports & grades, 2 open-ended question questionnaires, content analysis
[24] | USA | UG | English/writing | 60% of final | 19 | x | x | Low | Qual. Autoethnography
[31] | Singapore | UG | Sciences | 8% of final | 245 | x | x | Low | Qual. Comment content analysis
[32] | USA | UG | Sciences | x | 10 | 2 | 8 | Low | Qual. Content analysis
[36] | USA | UG | Education | x | 15 | 15 | 0 | Low | Qual. Phenomenological; content analysis; interviews
[34] | USA | Grad | Legal studies | x | 47 | x | x | Low | MM. Autoethnography & Likert4 survey
[6] | USA | UG | Engineering | x | 300 | 162 | 138 | Good | MM. Activity, content, and grade analysis; 2 comparison terms & 2 implementation; social constructivist approach
[33] | Spain | UG | Business | 10% of final | 25 | 17 | 8 | Low | MM. Content analysis and 10-point scaled questionnaire
[25] | USA | UG | Sciences | 10% of final | 63 | x | x | Low | MM. Content analysis and MM survey (numerical and open-ended)
[35] | NED | x | Multiple | x | x | x | x | Low | Quant. Content analysis and scaled survey of student perceptions
[26] | USA | UG | Sciences | 10% of final | 67 | x | x | Low | MM. Autoethnographic and case study design; content analysis and survey
[27] | China | UG | Sciences | x | 38 | 28 | 10 | Med. | MM. Content analysis
[28] | USA | UG | English/writing | Max 5% of final | 125 | 75 | 50 | Med. | MM. Content analysis and quant synthesis of findings, 54 students as control