To cite this article: Reyhan Topal & Farzin Shargh (2023): Teaching Students How to Find and
Identify Reliable Online Sources: A Series of Exercises, Journal of Political Science Education,
DOI: 10.1080/15512169.2022.2163899
To link to this article: https://doi.org/10.1080/15512169.2022.2163899
Published online: 11 Jan 2023.
POLITICAL SCIENCE INSTRUCTION
Teaching Students How to Find and Identify Reliable
Online Sources: A Series of Exercises
Reyhan Topal and Farzin Shargh
State University of New York at Albany
ABSTRACT
Online information sources help undergraduate students acquire
knowledge and find supporting evidence for their papers. However,
this practicality is often clouded by unreliable sources. As such,
instructors need better strategies to teach students how to reach
reliable online sources and evaluate the accuracy of the online
information they obtain. In this article, we introduce a series of
exercises that teach students how to identify the reliability of a
source, fact-check information, use online search tools effectively,
draw on online academic blogs while checking the reliability of the
information they present, and use online data/survey sources. We
believe that educating students on identifying and utilizing reliable
sources is essential for their intellectual development and their
ability to conduct research independently. Therefore, our exercises
aim to provide a roadmap for instructors to improve their students’
virtual literacy.
ARTICLE HISTORY
Received 21 July 2022
Accepted 20 December 2022
KEYWORDS
Teaching exercises; online
sources; reliability
Introduction
The education system in the U.S. emphasizes the importance of good writing skills such
as successfully supporting arguments or claims, developing narratives to convey real or
imagined experiences, and writing clearly and coherently (Applebee and Langer 2015).
However, from our experience working with undergraduate students, many do not
know how to find reliable sources. The expansion of the Internet in recent decades and
students’ access to information online have exacerbated concerns over the reputation
and reliability of the material students use in their research and writing. The
Internet undoubtedly helps students with their education and with learning new material,
and most of our own students rely on online sources. This is natural given that
most undergraduate students currently belong to the technology generation. However,
based on our observation of the reference pages of students’ assignments, most settle
for simply googling their research topic and relying on websites that appear to answer
their question, regardless of how reliable the source is.
In addition to students’ academic ill-preparation in this area, this concern has
implications outside of academia as well in the age of technology, including combatting
misinformation and fake news, which are a significant worry and raise national security
concerns (Wilner 2018; Monsees 2020). As such, we believe that college-level instructors,
regardless of the research activities of their institutions, must incorporate methods
of finding reputable information and checking the reliability of sources into the courses
that they teach. This contributes to students’ intellectual development in academic and
professional research techniques, in addition to shaping them to become aware of the
dangers of misinformation and how to identify it.
CONTACT Reyhan Topal rtopal@albany.edu Department of Political Science, State University of New York at
Albany, Albany, New York, USA.
© 2023 Taylor & Francis Group, LLC
A broad range of literature points to students’ lack of knowledge on distinguishing
and finding reputable information before college (Coiro et al. 2015; Clark, Schmeichel,
and Garrett 2020). Meanwhile, the literature on digital literacy shows that many colleges
do teach students “how to assess the trustworthiness of information on the web” (Kahne,
Lee, and Feezell 2012, 8). While students might be learning to check the reputation
of online sources in practice, they seem to rely on their prior experience with finding
information, including using the search engines that they trust to return consistently
reliable results (Hargittai et al. 2010). As such, some research suggests that practical
situations might be the best way to teach about reliability (Metzger 2007).
Many scholars have attempted to approach this pedagogical challenge creatively. For
example, Musgrove et al. (2018) propose using exercises such as the CRAAP (Currency,
Relevance, Authority, Accuracy, and Purpose) worksheet to teach undergraduate
students about reliability and about fact-checking portions of their research. With a
similar approach, scholars suggest using engaging activities, such as playing games and
creating memes, to relay the importance of this subject to the current generation of students
(Ireland 2018; Asal et al. 2021; Topal and Shargh 2022). In this paper, we propose that
higher-education instructors utilize a series of practical exercises aimed at teaching
students to identify reliable online sources and fact-check information, in addition to
discussing some potentially useful sources, including online search tools, academic blogs,
and online data sources, for their academic and professional research. While our
discipline is political science, we believe any social science instructor could utilize this
approach to maximize students’ intellectual development in college through participating
and succeeding in research.
Exercises
In this section, we introduce our exercises in five categories. In the “reliability” category,
we present our reliability criteria and ask students to evaluate the reliability of given
pieces. In “fact-checking,” we show how to educate students on online fact-checking
tools they can employ to evaluate the reliability of online sources. In “online search
tools,” we propose strategies for teaching students online search tools, such as Google
Scholar and LexisNexis. In “online academic blogs,” we discuss how to help students
get familiar with widely known online academic blogs. In “online data sources,”
we present tips on how instructors can give students hands-on
experience with online qualitative and quantitative data sources.
Reliability
In this step, we teach students how to find and identify reliable online sources. Some
students might have a general sense of what reliability means, but we still start our
discussion by defining the concept as “the extent to which information can be trusted”
(Mislevy 2004, 8). After asking students what reliability means from their perspective,
we also explain to them why using reliable sources in their research projects
is vital for their intellectual development. There are a great number of online sources
about politics, written by people with different perspectives, ideologies, and
political agendas that may affect students’ way of thinking. Therefore, we believe that
teaching students how to evaluate the reliability of online sources is essential to improving
their critical thinking skills.
Once we clarify that students need to differentiate between reliable and unreliable
sources, we anticipate that they will be more enthusiastic to learn about the concept.
Then, we introduce our criteria that students can employ when evaluating an online
source’s reliability. After examining the scholarly debates on how to define (Moss 2004)
and teach reliability (Sanchez, Wiley, and Goldman 2006), we prepared the following
criteria:
i. The online source should contain the following bibliographic information:
author name, text title, publication date, and publisher name (including the
name of online platforms) so that students can know if the work was written by
an expert or an anonymous person with an unknown background.
ii. The claims of the online source should be testable so that students can verify or
falsify them.
iii. The online source should include in-depth and objective analysis so that
students can tell whether it was written to impose an ideology.
iv. The online source should be neutral and inclusive so that students can avoid
online sources with biased, discriminatory, and hateful content.
After discussing each criterion, we also show students a couple of reliable and
unreliable sources to help them have a better understanding of how we evaluate reliability.
Then, we proceed to the next step and ask our students to practice what they learned
about reliability through two exercises. For the first exercise, we divide students into
groups and distribute the same reliable online news article or blog post to all groups.
We ask them to discuss whether the distributed piece is reliable or not with their group
members by using the criteria above. For the second one, we distribute the same
unreliable online piece to all groups. This time, we tell students to classify the online piece as
reliable, relatively reliable, or unreliable because we want them to understand that it is
not always easy to label sources as definitely reliable or unreliable. Then, we ask each
group to justify their classifications based on our criteria. At the end of these exercises,
regardless of their decisions, we ask group members to discuss why they ended up with
similar or different views. We believe these exercises improve students’ ability to
evaluate the reliability of the online sources they plan to use in their written assignments.
Fact-checking
Although online information sources help students acquire knowledge to develop their
research easily, they may also pave the way for exposure to inaccurate or
incomplete information. Existing studies show that college students struggle to evaluate
the accuracy of online information (Bartlett and Miller 2011; McGrew et al. 2018). This
problem has been exacerbated by online ads, which appear at the top of search results
and receive more interaction regardless of whether they are reliable. Therefore,
instructors need to educate their students on reading laterally and checking the facts
they find online.
To improve our students’ fact-checking skills, we start our discussion by defining
fact-checking as “a process of verifying the accuracy of information” (Brodsky et al.
2021) and clarifying why checking the accuracy of the information in an online source
is an integral part of doing research. In the “reliability” section, we teach our students
to differentiate between reliable and unreliable sources. In this step, we deepen their
understanding by teaching them how to check the accuracy of the available information.
For this purpose, we discuss the definitions of the following concepts that our students
should know before retrieving data from the Internet: fake news, misinformation,
disinformation, and malinformation. Then, we show our students how to detect these
phenomena in an online source through exercises. For the first exercise, we divide students
into groups and distribute the same online news article to each group. Then, we ask
them to decide with their group members whether the article contains any of these four
types of content. We allow them to search the facts on the Internet. At the end of the
exercise, we ask groups to explain how they made their decision: Did they google the facts?
Did they compare the facts with those in other online news sources? Did they use
library resources to find academic works that support or refute the facts? We aim to
understand which sources students employ to check the facts without using specific
fact-checking tools.
Before the next exercise, we teach our students several online fact-checking tools,
such as Google Fact Check Explorer, PolitiFact, Snopes, and Truth or Fiction, and
how to use them effectively. We also highlight their shortcomings by showing that
some of them do not provide users with proper results about many political facts.
For example, when we search “Is Obama alive?” on Google Fact Check Explorer in
class, the result is “No, Obama didn’t announce Biden’s death,” which is not relevant
to our question. In doing so, we show our students that they should not rely
entirely on these online fact-checking tools, should make additional searches when
necessary, and should make sure to cross-check the information they find online. Then,
we assign the same news article we used in the previous exercise and ask groups to
decide whether the article presents fake news, misinformation, disinformation, or
malinformation using those online fact-checking tools. We ask each group to explain
whether the online tools tell them something different from what they found in the
previous exercise. In doing so, we show our students that different fact-checking
strategies can give different results about the accuracy of information, so they should
use a variety of strategies while checking the facts in online sources, such as browsing
the official documents and announcements on governmental websites for recent political
documents, searching for statistics that bear on a political overgeneralization, and using
online archival sources to check the reliability of a historical fact.
At the end of the group activity, we close the discussion by highlighting how
fact-checking can eliminate the risk of disinformation (Walter et al. 2020) and whether
fact-checking can help people change their minds when they learn the real facts about
policy issues (Pennycook et al. 2021).
Online search tools
Finding peer-reviewed academic articles is one of the many areas of research that is
often challenging for undergraduate students. As mentioned, many students rely on the
initial results from a search engine rather than an extensive literature review. However,
some of these online sources are of lower quality than peer-reviewed academic
journals due to reliability issues. As such, it is imperative to show undergraduate
students how to search for reputable sources and the various avenues they have for
researching material.
We mainly focus on three routes: peer-reviewed journal articles, news articles, and
archival search tools. We suggest starting with peer-reviewed journal articles by
discussing Google Scholar. From our experience, while Google Scholar is a new and exciting
tool for many students, its similarity to the regular Google search engine makes it
convenient to use. Further, Google Scholar, unlike other databases for locating scholarly
works, does not require users to create an account or provide personal information to
use the service. In this section, we also mention techniques such as specifying keywords
that focus results on appropriate books or articles of interest. To search for news
articles, we discuss how to use LexisNexis. As a tool readily available to most students
through university libraries, LexisNexis provides access to articles that are locked
or require a paid subscription to retrieve. Finally, we discuss archival data sources,
which from our experience are the data sources most unfamiliar to undergraduates.
Archival data sources can be explained through specific examples that might require
primary sources of a historical character. Based on these examples, some online archival
sources, including local, state, and national archives, and digitized collections from
presidential libraries, are shown to students in class. While these are important tools, we
do believe it is vital for students to also learn about their limitations, such as potential
bias (Jensenius et al. 2018), indexing and sorting issues with Google Scholar results
(Halevi, Moed, and Bar-Ilan 2017; Jacsó 2008), and the ethics of using archival research
on sensitive topics (Subotić 2021).
To give undergraduate students more hands-on experience, we suggest students
use search tools to support claims made in actual pieces of writing. Depending on the
plan for the class, two exercises could be helpful. First, if the class has short writing
assignments, like memos or short drafts, instructors could distribute the submitted
assignments anonymously among students and ask them to add a few sentences to the
work of their peers based on the online search tools they learned in class. Anonymous
distribution is supported by many learning management systems (LMS), such as Canvas
and Blackboard, which alleviates the burden of manually circulating the assignments. A
second possible exercise could be assigning the first couple of paragraphs of articles
from the class reading list and asking students to support their claims by citing an
online source that they found using the tools we mentioned in class. These two exercises
allow students to practice the techniques discussed for finding reputable sources
while showing students that scholarly work is not an unattainable task, and they let
students discover and support claims using sources like many social scientists do.
Online academic blogs
With the ubiquity of online platforms over the last decade, we observe an increasing
number of online academic blogs where scholars worldwide contribute to their fields
(Esarey and Wood 2018). Academic blogging is increasingly acknowledged as an
important aspect of the political science discipline (Sides 2011; Farley 2013). Therefore,
it is important to teach our students about the online political science blogs they can
use while doing research.
To familiarize our students with online academic blogs, we start by listing
the most common blogs, such as Political Science Now, The Monkey Cage, The Duck
of Minerva, War on the Rocks, POLITICO Magazine, Political Violence at a Glance,
and FiveThirtyEight Politics. We show our students the website of each blog, its
sections or subtopics, and examples of scholarly pieces it publishes. This way, students
get a general idea about online blogs. Then, we teach our students how to use online
academic blogs effectively through two exercises. For the first in-class exercise, we
distribute to our students a piece that we draw from an online academic blog and that
includes relevant information about a course topic. For example, for an International
Relations (IR) Theory or International Security class, we distribute Tyler Bowen’s (2022)
“Russia’s Invasion of Ukraine and NATO’s Crisis of Nuclear Credibility,” published in
War on the Rocks. We then ask our students to identify the main themes of the IR
theories in this piece, which features a good number of them, such as the balance of
power, great power competition, deterrence, and alliances. With this exercise, we aim to
show our students how the theories they have learned in class are employed by scholars
to explain current political developments. In doing so, we also expect our students to
learn how to integrate these theories into their research more effectively.
For our second in-class exercise, we propose a long-term agenda to encourage our
students to follow online academic blogs regularly. After our lecture on online academic
blogs, we ask three to four students each week to read a blog post about current political
developments before class and talk about it briefly. Our primary goal is to instill in
our students the habit of following political blogs. An alternative approach is to ask
students to check the reliability of the information in these posts and explain
their findings in class. Thus, they can also consolidate what they have learned during
our lectures on reliability and fact-checking.
Online data sources
Another area of research that many undergraduate students are not as comfortable with
is using quantitative data sources to support the arguments and claims they make in
papers. From our experience, this is especially the case for undergraduate students in
the social sciences and humanities. Students in these fields are less likely to have a
mathematics or statistics class requirement in their programs of study (Buchler 2009).
Many of these students expect quantitative and statistical methods to be tools solely for
STEM students (Markle 2017). However, as the methodology literature in the social
sciences suggests, there is a renewed shift toward appreciating quantitative research
tools in addition to qualitative methods in these fields (Tarrow 1995; King, Keohane,
and Verba 2021). As such, we believe it is appropriate for undergraduate students in
the social sciences, including political science, to become familiar with quantitative and
statistical approaches to validating information in research.
First, students must know how to find quantitative datasets to be able to use them.
As such, we suggest instructors showcase some of these sources depending on the topic
of the class. For example, in classes on American politics, the U.S. Census Bureau
provides extensive data to researchers. Further, instructors of the global economy and
international relations can utilize datasets provided by the World Bank Open Data. More
specifically, the Global Terrorism Database provided by the National Consortium for
the Study of Terrorism and Responses to Terrorism (START) at the University of
Maryland is suitable for a class on terrorism. Pew Research Center’s databases are
another accessible source that could be useful to students who plan to conduct research
in both American politics and international affairs. We also recommend discussing
Google Dataset Search as a tool to find online data on specific issues or variables of
interest.
To practice using quantitative data to support arguments, instructors can adapt
any of the exercises we have already mentioned by concentrating on quantitative and
survey data. For example, as in the online search tool exercise, instructors could
distribute a passage on a specific topic and ask students to support its claims using the
online data sources discussed in class. Another exercise for practicing the use of
quantitative data could resemble the popular game “two truths and a lie.” In this
exercise, we distribute a specific dataset among students and draft three short paragraphs
based on the data. Two of the paragraphs are truths drawn from the dataset, while the
third is an exaggeration or an utterly false claim. We then ask students to identify each
paragraph as a truth or a lie based on the dataset provided to them. However, not all
students have the same prior statistical and mathematical knowledge, and they may
come from diverse academic majors. For example, many undergraduate engineering
students are required to take mathematics classes such as calculus and linear algebra, or
introductory statistics classes, while extensive mathematical or statistical training is not
expected of students majoring in the social sciences or humanities. Given the varying
backgrounds of students taking social science classes, we suggest constructing paragraphs
based on simple statistical and mathematical concepts such as averages, minimums,
maximums, and outliers. This exercise introduces examples of online data sources to
undergraduate students and shows them that quantitative data is not frightening but
can be a convenient tool for supporting arguments in research papers and fact-checking
others’ claims.
Conclusion
In college, students get to experience different steps of doing research under the
guidance of their instructors. To this end, they learn to ask proper questions, review the
literature, collect evidence, and write papers. Teaching students various aspects of doing
research facilitates their intellectual development and encourages them to become
independent researchers. Through their research supervision, instructors can increase
the chance of college students’ participating in professional projects, especially those
that will be published in academic journals. Further, in the age of technology, even
students who are not planning to conduct research in their future careers must be
skilled in identifying misinformation and fake news by finding reputable information
and checking the reliability of the information they encounter.
In this article, we addressed one of the challenges students encounter while doing
research: finding reliable information online. With the prevalence of online information
sources, we observe that college students increasingly retrieve information from the
Internet. Although online sources enable students to access information quickly, they
pose the risk of exposure to unreliable and inaccurate information. With students facing
difficulty finding and identifying reliable online sources, instructors need better
strategies to teach them how to overcome this difficulty. For this purpose, we presented a
roadmap for instructors willing to educate their students on the quality of online
sources. Our roadmap entails five main topics for instructors to cover during their
lectures: reliability, fact-checking, online search tools, online academic blogs, and online
data sources. For each topic, we discussed our teaching strategies step by step and
recommended several in-class exercises instructors may use to have their students
practice what they learned. While our strategies are widely germane to the social
sciences, we believe they could be especially useful for instructors teaching politics and
related topics, as we designed our exercises by considering the needs of students
majoring in political science, government, and international relations. With this
endeavor, we hope to facilitate students’ academic and professional development and
maximize their chances of success when participating in a research project.
Notes on contributors
Reyhan Topal is a Ph.D. candidate in Political Science at University at Albany, SUNY. Her
concentrations are in International Relations and Comparative Politics, and her research interests
include technology, political violence, authoritarianism, and teaching methodologies.
Farzin Shargh is a third-year Ph.D. student in Political Science at University at Albany, SUNY.
His concentrations are in Comparative Politics and International Relations, and his research
interests include pedagogy, Middle East politics, and human rights.
ORCID
Reyhan Topal http://orcid.org/0000-0002-0196-1175
Farzin Shargh http://orcid.org/0000-0002-9044-8022
References
Applebee, Arthur N., and Judith A. Langer. 2015. Writing Instruction That Works: Proven Methods for Middle and High School Classrooms. New York: Teachers College Press.
Asal, Victor, Charmaine Willis, Christopher Linebarger, and Nakissa Jahanbani. 2021. “Teaching about Oppression and Rebellion: The ‘Peasants Are Revolting’ Game.” PS: Political Science & Politics 54 (2):331–335. doi:10.1017/S1049096520001675.
Bartlett, Jamie, and Carl Miller. 2011. Truth, Lies, and the Internet: A Report into Young People’s Digital Fluency. London, UK: Demos.
Bowen, Tyler. 2022. “Russia’s Invasion of Ukraine and NATO’s Crisis of Nuclear Credibility.” War on the Rocks. July 1. https://warontherocks.com/2022/04/russias-invasion-of-ukraine-and-natos-crisis-of-nuclear-credibility/
Brodsky, Jessica E., Patricia J. Brooks, Donna Scimeca, Ralitsa Todorova, Peter Galati, Michael Batson, Robert Grosso, Michael Matthews, Victor Miller, and Michael Caulfield. 2021. “Improving College Students’ Fact-Checking Strategies through Lateral Reading Instruction in a General Education Civics Course.” Cognitive Research: Principles and Implications 6 (1):18–23. doi:10.1186/s41235-021-00291-4.
Buchler, Justin. 2009. “Teaching Quantitative Methodology to the Math Averse.” PS: Political Science & Politics 42 (3):527–530. doi:10.1017/S1049096509090842.
Clark, Christopher H., Mardi Schmeichel, and H. James Garrett. 2020. “Social Studies Teacher Perceptions of News Source Credibility.” Educational Researcher 49 (4):262–272. doi:10.3102/0013189X20909823.
Coiro, Julie, Carla Coscarelli, Cheryl Maykel, and Elena Forzani. 2015. “Investigating Criteria That Seventh Graders Use to Evaluate the Quality of Online Information.” Journal of Adolescent & Adult Literacy 59 (3):287–297. doi:10.1002/jaal.448.
Esarey, Justin, and Andrew R. Wood. 2018. “Blogs, Online Seminars, and Social Media as Tools of Scholarship in Political Science.” PS: Political Science & Politics 51 (4):811–819.
Farley, Robert. 2013. “Complicating the Political Scientist as Blogger.” PS: Political Science & Politics 46 (2):383–386. doi:10.1017/S1049096513000061.
Halevi, Gali, Henk Moed, and Judit Bar-Ilan. 2017. “Suitability of Google Scholar as a Source of Scientific Information and as a Source of Data for Scientific Evaluation—Review of the Literature.” Journal of Informetrics 11 (3):823–834. doi:10.1016/j.joi.2017.06.005.
Hargittai, Eszter, Lindsay Fullerton, Ericka Menchen-Trevino, and Kristin Yates Thomas. 2010. “Trust Online: Young Adults’ Evaluation of Web Content.” International Journal of Communication 4 (27):468–494.
Ireland, Sonnet. 2018. “Fake News Alerts: Teaching News Literacy Skills in a Meme World.” The Reference Librarian 59 (3):122–128. doi:10.1080/02763877.2018.1463890.
Jacsó, Péter. 2008. “Google Scholar Revisited.” Online Information Review 32 (1):102–114. doi:10.1108/14684520810866010.
Jensenius, Francesca R., Mala Htun, David J. Samuels, David A. Singer, Adria Lawrence, and Michael Chwe. 2018. “The Benefits and Pitfalls of Google Scholar.” PS: Political Science & Politics 51 (4):820–824. doi:10.1017/S104909651800094X.
Kahne, Joseph, Nam-Jin Lee, and Jessica Timpany Feezell. 2012. “Digital Media Literacy Education and Online Civic and Political Participation.” International Journal of Communication 6:1–24.
King, Gary, Robert O. Keohane, and Sidney Verba. 2021. Designing Social Inquiry: Scientific Inference in Qualitative Research. Princeton: Princeton University Press.
Markle, Gail. 2017. “Factors Influencing Achievement in Undergraduate Social Science Research Methods Courses: A Mixed Methods Analysis.” Teaching Sociology 45 (2):105–115. doi:10.1177/0092055X16676302.
McGrew, Sarah, Joel Breakstone, Teresa Ortega, Mark Smith, and Sam Wineburg. 2018. “Can Students Evaluate Online Sources? Learning from Assessments of Civic Online Reasoning.” Theory & Research in Social Education 46 (2):165–193. doi:10.1080/00933104.2017.1416320.
Metzger, Miriam J. 2007. “Making Sense of Credibility on the Web: Models for Evaluating Online Information and Recommendations for Future Research.” Journal of the American Society for Information Science and Technology 58 (13):2078–2091. doi:10.1002/asi.20672.
Mislevy, Robert J. 2004. “Can There Be Reliability without ‘Reliability’?” Journal of Educational and Behavioral Statistics 29 (2):241–244. doi:10.3102/10769986029002241.
Monsees, Linda. 2020. “‘A War against Truth’: Understanding the Fake News Controversy.” Critical Studies on Security 8 (2):116–129. doi:10.1080/21624887.2020.1763708.
Moss, Pamela A. 2004. “The Meaning and Consequences of ‘Reliability’.” Journal of Educational and Behavioral Statistics 29 (2):245–249. doi:10.3102/10769986029002245.
Musgrove, Ann T., Jillian R. Powers, Lauri C. Rebar, and Glenn J. Musgrove. 2018. “Real or Fake? Resources for Teaching College Students How to Identify Fake News.” College & Undergraduate Libraries 25 (3):243–260. doi:10.1080/10691316.2018.1480444.
Pennycook, Gordon, Ziv Epstein, Mohsen Mosleh, Antonio A. Arechar, Dean Eckles, and David G. Rand. 2021. “Shifting Attention to Accuracy Can Reduce Misinformation Online.” Nature 592 (7855):590–595. doi:10.1038/s41586-021-03344-2.
Sanchez, Christopher A., Jennifer Wiley, and Susan R. Goldman. 2006. “Teaching Students to Evaluate Source Reliability during Internet Research Tasks.” International Society of the Learning Sciences 2:662–666.
Sides, John. 2011. “The Political Scientist as Blogger.” PS: Political Science & Politics 44 (2):267–271. doi:10.1017/S1049096511000060.
Subotić, Jelena. 2021. “Ethics of Archival Research on Political Violence.” Journal of Peace Research 58 (3):342–354. doi:10.1177/0022343319898735.
Tarrow, Sidney. 1995. “Bridging the Quantitative-Qualitative Divide in Political Science.” American Political Science Review 89 (2):471–474. doi:10.2307/2082444.
Topal, Reyhan, and Farzin Shargh. 2022. “Bringing the Students Back In: How to Re-Engage Students in a ‘Post-Covid’ World.” Political Science Educator 26 (1). August 11. https://educate.apsanet.org/bringing-the-students-back-in-how-to-re-engage-students-in-a-postcovid-world
Walter, Nathan, Jonathan Cohen, R. Lance Holbert, and Yasmin Morag. 2020. “Fact-Checking: A Meta-Analysis of What Works and for Whom.” Political Communication 37 (3):350–375. doi:10.1080/10584609.2019.1668894.
Wilner, Alex S. 2018. “Cybersecurity and Its Discontents: Artificial Intelligence, the Internet of Things, and Digital Misinformation.” International Journal: Canada’s Journal of Global Policy Analysis 73 (2):308–316. doi:10.1177/0020702018782496.