POLITICAL SCIENCE INSTRUCTION
Teaching Students How to Find and Identify Reliable
Online Sources: A Series of Exercises
Reyhan Topal and Farzin Shargh
State University of New York at Albany
ABSTRACT
Online information sources help undergraduate students acquire knowledge and find supporting evidence for their papers. However, this practicality is often clouded with unreliable sources. As such, instructors need better strategies to teach students how to reach reliable online sources and evaluate the accuracy of the online information they obtain. In this article, we introduce a series of exercises to teach students how to identify the reliability of a source, ways to fact-check information, steps to use online search tools effectively, examples of online academic blogs to utilize and check the reliability of the information that they present, and how to use online data/survey sources. We believe that educating students on identifying and utilizing reliable sources is essential for their intellectual development and ability to conduct their research independently. Therefore, our exercises aim to provide a roadmap for instructors to improve their students' virtual literacy.
ARTICLE HISTORY
Received 21 July 2022
Accepted 20 December 2022
KEYWORDS
Teaching exercises; online sources; reliability
Introduction
The education system in the U.S. emphasizes the importance of good writing skills such as successfully supporting arguments or claims, developing narratives to convey real or imagined experiences, and writing clearly and coherently (Applebee and Langer 2015). However, from our experience working with undergraduate students, many do not know how to find reliable sources. The expansion of the Internet in recent decades and students' access to information online have exacerbated concerns over the reputation and reliability of the material students use in their research and writing. The Internet undoubtedly helps students with their education and with learning new material, and most of our own students rely on online sources. This is natural given that most undergraduate students currently come from the technology generation. However, based on our observation of the reference pages of students' assignments, most simply google their research topic and rely on websites that appear to answer their question, regardless of how reliable the source is.
In addition to students' academic ill-preparation in this area, this concern also has implications outside of academia in the age of technology, including combatting misinformation and fake news, which are a significant worry and warrant national security
concerns (Wilner 2018; Monsees 2020). As such, we believe that college-level instructors, regardless of the research activities of their institutions, must incorporate methods of finding reputable information and checking the reliability of sources into the courses that they teach. This contributes to students' intellectual development in academic and professional research techniques, in addition to making them aware of the dangers of misinformation and how to identify it.
A broad range of literature points to students' lack of knowledge about distinguishing and finding reputable information before college (Coiro et al. 2015; Clark, Schmeichel, and Garrett 2020). Meanwhile, literature on digital literacy shows that many colleges do teach students how to "access the trustworthiness of information on the web" (Kahne, Lee, and Feezell 2012, 8). While students might be learning to check the reputation of online sources in practice, they seem to rely on their prior experience with finding information, including using the search engines that they trust to return consistently reliable results (Hargittai et al. 2010). As such, some research suggests that practical situations might be the best way to teach about reliability (Metzger 2007).
Many scholars approach this pedagogical challenge creatively. For example, Musgrove et al. (2018) propose using exercises such as the CRAAP (Currency, Relevance, Authority, Accuracy, and Purpose) worksheet to teach undergraduate students about reliability and fact-checking portions of their research. With a similar approach, scholars suggest using interesting activities, such as playing games and creating memes, to relay the importance of this subject to the current generation of students (Ireland 2018; Asal et al. 2021; Topal and Shargh 2022). In this paper, we propose that higher education instructors utilize a series of practical exercises aimed at teaching students to identify reliable online sources and fact-check information, and we also discuss some potentially useful sources, including online search tools, academic blogs, and online data sources, for their academic and professional research. While our discipline is political science, we believe any social science instructor could use this approach to maximize students' intellectual development in college through participating and succeeding in research.
Exercises
In this section, we introduce our exercises in five categories. In the "reliability" category, we present our reliability criteria and ask students to evaluate the reliability of given pieces. In "fact-checking," we show how to educate students on the online fact-checking tools they can employ to evaluate the reliability of online sources. In "online search tools," we propose strategies for teaching students online search tools, such as Google Scholar and LexisNexis. In "online academic blogs," we discuss how to help students get familiar with widely known online academic blogs. In "online data sources," we present helpful tips to show how instructors can give students hands-on experience with online qualitative and quantitative data sources.
Reliability
In this step, we teach students how to find and identify reliable online sources. Some
students might have a general sense of what reliability means, but still, we start our
discussion by defining the concept as "the extent to which information can be trusted" (Mislevy 2004, 8). After asking students what reliability means from their perspective, we also explain to students why benefiting from reliable sources in their research projects is vital for their intellectual development. There are a great number of online sources about politics written by people with different perspectives, ideologies, and political agendas that may affect students' way of thinking. Therefore, we believe that teaching students how to evaluate the reliability of online sources is essential to improving their critical thinking skills.
Once we clarify that students need to differentiate between reliable and unreliable sources, we anticipate that they will be more enthusiastic to learn about the concept. Then, we introduce our criteria that students can employ when evaluating an online source's reliability. After examining the scholarly debates on how to define (Moss 2004) and teach reliability (Sanchez, Wiley, and Goldman 2006), we prepared the following criteria; a sketch after the list shows how they might be encoded as a simple checklist:
i. The online source should contain the following bibliographic information:
author name, text title, publication date, and publisher name (including the
name of online platforms) so that students can know if the work was written by
an expert or an anonymous person with an unknown background.
ii. The claims of the online source should be testable so that students can verify or
falsify them.
iii. The online source should include in-depth and objective analysis so that students can know if it was written to impose an ideology.
iv. The online source should be neutral and inclusive so that students can avoid online sources with biased, discriminatory, or hateful content.
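For instructors who want a tangible worksheet artifact, the four criteria above can be turned into a simple checklist. The sketch below is our own illustration, not part of the original exercises; the yes/no scoring and the thresholds for the "reliable," "relatively reliable," and "unreliable" labels are invented for demonstration.

# Hypothetical checklist encoding of the four reliability criteria.
# The scoring thresholds are illustrative, not prescribed by the article.
CRITERIA = [
    "Bibliographic information is present (author, title, date, publisher)",
    "Claims are testable (can be verified or falsified)",
    "Analysis is in-depth and objective",
    "Content is neutral and inclusive (no biased or hateful language)",
]

def classify(answers):
    """Map yes/no answers on the four criteria to a rough reliability label."""
    met = sum(answers)
    if met == len(CRITERIA):
        return "reliable"
    if met >= 2:
        return "relatively reliable"
    return "unreliable"

# Example: a source that meets the first three criteria but not the fourth.
print(classify([True, True, True, False]))  # -> relatively reliable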
After discussing each criterion, we also show students a couple of reliable and unreliable sources to help them have a better understanding of how we evaluate reliability. Then, we proceed to the next step and ask our students to practice what they learned about reliability through two exercises. For the first exercise, we divide students into groups and distribute the same reliable online news article or blog post to all groups. We ask them to discuss with their group members whether the distributed piece is reliable or not by using the criteria above. For the second one, we distribute the same unreliable online piece to all groups. This time, we tell students to classify the online piece as reliable, relatively reliable, or unreliable because we want them to understand that it is not always easy to label sources as definitely reliable or unreliable. Then, we ask each group to justify their classifications based on our criteria. At the end of these exercises, regardless of their decisions, we ask group members to discuss why they ended up with similar or different views. We believe these exercises improve students' ability to evaluate the reliability of the online sources they plan to use in their written assignments.
Fact-checking
Although online information sources help students acquire knowledge to develop their
research easily, they may also pave the way for their exposure to inaccurate or
incomplete information. Existing studies show that college students struggle to evaluate
the accuracy of online information (Bartlett and Miller 2011; McGrew et al. 2018). This
problem has been exacerbated by online ads, which appear at the top of search results
and receive more interaction regardless of whether they are reliable or not. Therefore,
instructors need to educate their students on reading laterally and checking the facts they find online.
To improve our students' fact-checking skills, we start our discussion by defining fact-checking as "a process of verifying the accuracy of information" (Brodsky et al. 2021) and clarifying why checking the accuracy of the information in an online source is an integral part of doing research. In the "reliability" section, we teach our students to differentiate between reliable and unreliable sources. In this step, we deepen their understanding by teaching them how to check the accuracy of the available information. For this purpose, we discuss the definitions of the following concepts that our students should know before retrieving data from the Internet: fake news, misinformation, disinformation, and malinformation. Then, we show our students how to detect these phenomena in an online source through exercises. For the first exercise, we divide students into groups and distribute the same online news article to each group. Then, we ask them to decide with their group members whether the article entails any of these four concepts. We allow them to search the facts on the Internet. At the end of the exercise, we ask groups to explain how they made their decision: Did they google the facts? Did they compare the facts with those in other online news sources? Did they benefit from library resources to find academic works that support or refute the facts? We aim to understand which sources students employ to check the facts without using specific fact-checking tools.
Before the next exercise, we teach our students several online fact-checking tools, such as Google Fact Check Explorer, PolitiFact, Snopes, and Truth or Fiction, and how to use them effectively. We also highlight their shortcomings by showing that some of them do not provide users with proper results for many political facts. For example, when we search "Is Obama alive?" on Google Fact Check Explorer in class, the result is "No, Obama didn't announce Biden's death," which is not relevant to our question. In doing so, we show our students that they should not rely entirely on these online fact-checking tools, should run additional searches when necessary, and should cross-check the information they find online. Then, we assign the same news article we used in the previous exercise and ask groups to decide whether the article presents fake news, misinformation, disinformation, or malinformation using those online fact-checking tools. We ask each group to explain whether the online tools tell them something different from what they found in the previous exercise. In doing so, we show our students that different fact-checking strategies can give different results about the accuracy of information, so they should use a variety of strategies while checking the facts in online sources, such as browsing the official documents and announcements on governmental websites for recent political developments, searching for statistics about a political overgeneralization, and using online archival sources to check the reliability of a historical fact.
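For instructors who want to show how such tools can also be queried programmatically, the sketch below uses the Google Fact Check Tools API's claims:search endpoint, which searches the same ClaimReview data as Google Fact Check Explorer. It is a minimal illustration under stated assumptions: a valid API key is required (the FACT_CHECK_API_KEY environment variable is our placeholder), and the field names follow the publicly documented response schema.

import os
import requests

# Publicly documented endpoint of the Google Fact Check Tools API.
API_URL = "https://factchecktools.googleapis.com/v1alpha1/claims:search"

def search_fact_checks(query, api_key, language="en"):
    """Return fact-checked claims matching a free-text query."""
    params = {"query": query, "key": api_key, "languageCode": language}
    response = requests.get(API_URL, params=params, timeout=10)
    response.raise_for_status()
    return response.json().get("claims", [])

if __name__ == "__main__":
    key = os.environ["FACT_CHECK_API_KEY"]  # placeholder for a real API key
    for claim in search_fact_checks("Is Obama alive?", key):
        for review in claim.get("claimReview", []):
            print(claim.get("text"), "->",
                  review.get("publisher", {}).get("name"),
                  review.get("textualRating"), review.get("url"))

As in the in-class example, the results returned for a query like this may be tangential, which reinforces the point that automated fact-check lookups should be cross-checked against other strategies.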
At the end of the group activity, we close the discussion by highlighting how fact-checking can eliminate the risk of disinformation (Walter et al. 2020) and whether fact-checking can help people change their minds when they learn the real facts about policy issues (Pennycook et al. 2021).
Online search tools
Finding peer-reviewed academic articles is one of the many areas of research that is
often challenging for undergraduate students. As mentioned, many students rely on the
initial results from a search engine rather than an extensive literature review. However,
some of these online sources are of lower quality compared to peer-reviewed academic
journals due to reliability issues. As such, it is imperative to show undergraduate students how to search for reputable sources and the various avenues they have for researching material.
We mainly focus on three routes: peer-reviewed journal articles, news articles, and archival search tools. We suggest starting with peer-reviewed journal articles by discussing Google Scholar. From our experience, while Google Scholar is a new and exciting tool for many students, its similarity to the regular Google search engine makes it convenient to use. Further, Google Scholar, unlike other databases for locating scholarly works, does not require users to create an account or provide personal information to use the service. In this section, we mention techniques such as specifying keywords that focus results on appropriate books or articles of interest. To search for news articles, we discuss how to use LexisNexis. As a tool readily available to most students through university libraries, LexisNexis provides access to articles that are locked or require a paid subscription to retrieve. Finally, we discuss archival data sources, which from our experience are the most unfamiliar data sources to undergraduates. Archival data sources can be explained through specific examples that might require primary sources of a historical character. Based on these examples, we show students in class some online archival sources, including local, state, and national archives and digitized collections from presidential libraries. While these are important tools, we believe it is vital for students to also learn about their limitations, such as potential bias (Jensenius et al. 2018), indexing and sorting issues with Google Scholar results (Halevi, Moed, and Bar-Ilan 2017; Jacsó 2008), and the ethics of using archival research on sensitive topics (Subotić 2021).
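To make the keyword-specification point concrete, the snippet below is a hedged illustration (not part of the original exercises) of how a Google Scholar query can be narrowed with an exact phrase, an author restriction, and a year range. The query-string parameters (q, as_ylo, as_yhi) reflect the URLs Scholar currently generates and could change; the example search terms are invented.

from urllib.parse import urlencode

def scholar_url(keywords, phrase=None, author=None, year_from=None, year_to=None):
    """Build a Google Scholar URL that narrows results by phrase, author, and year."""
    terms = list(keywords)
    if phrase:
        terms.append(f'"{phrase}"')         # exact-phrase match
    if author:
        terms.append(f'author:"{author}"')  # restrict to a particular author
    params = {"q": " ".join(terms)}
    if year_from:
        params["as_ylo"] = year_from        # earliest publication year
    if year_to:
        params["as_yhi"] = year_to          # latest publication year
    return "https://scholar.google.com/scholar?" + urlencode(params)

# Example: recent work on nuclear deterrence mentioning "credibility".
print(scholar_url(["nuclear", "deterrence"], phrase="credibility",
                  year_from=2015, year_to=2022))

In class, the same filters can of course be applied through Scholar's advanced search form; the point of the sketch is simply that a focused query, rather than the first page of a generic search, is what turns Scholar into a literature-review tool.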
To give a more hands-on experience to undergraduate students, we suggest students
use search tools to support claims made in actual pieces of writing. Depending on the
plan for the class, two exercises could be helpful. First, if the class has short writing
assignments, like memos or short drafts, instructors could distribute the submitted
assignments anonymously among students and ask them to add a few sentences to the
work of their peers based on the online search tools they learned in class. Anonymous distribution is supported by many learning management systems (LMS), such as Canvas and Blackboard, which alleviates the burden of manually circulating the assignments. A second possible exercise could be assigning the first couple of paragraphs of articles from the class reading list and asking students to support their claims by citing an online source that they found using the tools we mentioned in class. These two exercises allow students to practice the techniques discussed for finding reputable sources
while showing them that scholarly work is not an unattainable task, and they let students discover and support claims using sources as many social scientists do.
Online academic blogs
With the ubiquity of online platforms over the last decade, we observe an increasing
number of online academic blogs where scholars worldwide contribute to their fields
(Esarey and Wood 2018). Academic blogging is increasingly acknowledged as an
important aspect of the political science discipline (Sides 2011; Farley 2013). Therefore,
it is important to introduce our students to the online political science blogs they can use while doing research.
To familiarize our students with online academic blogs, we start by listing the most common ones, such as Political Science Now, The Monkey Cage, The Duck of Minerva, War on the Rocks, POLITICO Magazine, Political Violence at a Glance, and FiveThirtyEight Politics. We show our students the website of each blog, its sections or subtopics, and examples of scholarly pieces published there. This way, students get a general idea about online blogs. Then, we teach our students how to use online academic blogs effectively through two exercises. For the first in-class exercise, we distribute to our students a piece drawn from an online academic blog that includes relevant information about a course topic. For example, for an International Relations (IR) Theory or International Security class, we distribute Tyler Bowen's (2022) "Russia's Invasion of Ukraine and NATO's Crisis of Nuclear Credibility," published in War on the Rocks. We then ask our students to identify the main themes of the IR theories in this piece, which touches on a good number of them, such as the balance of power, great power competition, deterrence, and alliances. With this exercise, we aim to show our students how the theories they have learned in class are employed by scholars to explain current political developments. In doing so, we also expect our students to learn how to integrate these theories into their research more effectively.
For our second in-class exercise, we propose a long-term agenda to encourage our students to follow online academic blogs regularly. After our lecture on online academic blogs, we ask three to four students each week to read a blog post about current political developments before class and talk about it briefly. Our primary goal is to inculcate in our students the habit of following political blogs. An alternative approach can be asking students to check the reliability of the information in these posts and explain their findings in class. Thus, they can also consolidate what they have learned during our lectures on reliability and fact-checking.
Online data sources
Another area of research that many undergraduate students are not as comfortable with
is using quantitative data sources to support arguments and the claims they make in
papers. From our experience, this is especially the case for undergraduate students in
social sciences and humanities. Students in these fields are less likely to have a mathematics or statistics class requirement in their programs of study (Buchler 2009). Many of
these students expect quantitative and statistical methods to be a tool solely for STEM
students (Markle 2017). However, as methodology literature in social sciences suggests,
there is a renewed shift toward appreciating quantitative research tools in addition to
qualitative methods in these fields (Tarrow 1995; King, Keohane, and Verba 2021). As
such, we believe it is appropriate for undergraduate students in social sciences, including political science, to become familiar with quantitative and statistical approaches to
validating information in research.
First, students must know how to find quantitative datasets in order to use them. As such, we suggest instructors showcase some of these sources depending on the topic of the class. For example, in classes on American Politics, the U.S. Census Bureau provides extensive data to researchers. Further, instructors of courses on the global economy and international relations can utilize datasets provided by World Bank Open Data. More specifically, the Global Terrorism Database, provided by the National Consortium for the Study of Terrorism and Responses to Terrorism (START) at the University of Maryland, is suitable for a class on terrorism. Pew Research Center's databases are another accessible source that could be useful to students who plan to conduct research in both American politics and international affairs. We also recommend discussing Google Dataset Search as a tool to find online data on specific issues or variables of interest.
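As one concrete illustration of how students can pull such data programmatically rather than downloading spreadsheets, the sketch below queries the World Bank Open Data API (v2). It is a minimal example under stated assumptions: the indicator code NY.GDP.MKTP.CD (GDP in current US$) is just one choice, and the JSON layout follows the API's documented format, where the first element is paging metadata and the second is the list of observations.

import requests

def worldbank_series(country, indicator, per_page=100):
    """Return (year, value) pairs for a World Bank indicator and country code."""
    url = f"https://api.worldbank.org/v2/country/{country}/indicator/{indicator}"
    resp = requests.get(url, params={"format": "json", "per_page": per_page}, timeout=10)
    resp.raise_for_status()
    _meta, records = resp.json()  # first element: paging info; second: observations
    return [(r["date"], r["value"]) for r in records if r["value"] is not None]

# Example: print a few recent GDP observations for the United States.
for year, gdp in worldbank_series("US", "NY.GDP.MKTP.CD")[:5]:
    print(year, f"{gdp:,.0f}")

A similar pattern works for the Census Bureau's and other providers' public APIs, though endpoints and authentication requirements differ.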
To practice using quantitative data to support arguments, instructors could adapt any of the exercises we have already mentioned by concentrating on quantitative and survey data. For example, like the online search tool exercise, instructors could distribute a passage on a specific topic and ask students to support its claims using the online data sources we discuss in class. Another exercise for practicing the use of quantitative data could resemble the popular game "two truths and a lie." In this exercise, we distribute a specific dataset among students and draft three short paragraphs based on the data. Two of the paragraphs are truths drawn from the dataset, while the third is an exaggeration or an utterly false claim. We then ask students to identify each paragraph as a truth or a lie based on the dataset provided to them. However, not all students have the same amount of prior statistical and mathematical training, and they may come from diverse academic majors. For example, many undergraduate engineering students are required to take mathematics classes such as calculus and linear algebra, or introductory statistics classes, while extensive mathematical or statistical training is not expected of students majoring in social sciences or humanities. Given the varying backgrounds of students taking social science classes, we suggest constructing the paragraphs around simple statistical and mathematical concepts such as averages, minimums, maximums, and outliers, as in the sketch below. This exercise introduces examples of online data sources to undergraduate students and shows them that quantitative data is not frightening but can be a convenient tool for supporting arguments in research papers and fact-checking others' claims.
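The following is a hedged sketch of how an instructor might prepare the "two truths and a lie" paragraphs: compute the simple statistics (average, minimum, maximum, outliers) around which the true and false statements are written. The dataset values here are invented for illustration.

import pandas as pd

# Hypothetical class dataset; in practice this could come from a source such as
# the Global Terrorism Database or World Bank Open Data.
data = pd.DataFrame({
    "country": ["A", "B", "C", "D", "E"],
    "incidents": [12, 3, 45, 7, 150],
})

series = data["incidents"]
summary = {
    "average": series.mean(),
    "minimum": series.min(),
    "maximum": series.max(),
}

# Flag outliers as values more than 1.5 interquartile ranges above the third quartile.
q1, q3 = series.quantile([0.25, 0.75])
outliers = data[series > q3 + 1.5 * (q3 - q1)]

print(summary)   # material for the two "truth" paragraphs
print(outliers)  # material for checking an exaggerated "lie" paragraph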
Conclusion
In college, students get to experience different steps of doing research under the guidance of their instructors. To this end, they learn to ask proper questions, review the literature, collect evidence, and write papers. Teaching various aspects of doing research
to students facilitates their intellectual development and encourages them to become independent researchers. Through their research supervision, instructors can increase the chance of college students' participation in professional projects, especially those that will be published in academic journals. Further, in the age of technology, even students who are not planning on conducting research in their future careers must be skilled in identifying misinformation and fake news by finding reputable information and checking the reliability of the information they encounter.
In this article, we addressed one of the challenges students encounter while doing research: finding reliable information online. With the prevalence of online information sources, we observe that college students increasingly retrieve information from the Internet. Although online sources enable students to access information quickly, they pose the risk of exposure to unreliable and inaccurate information. With students facing difficulty finding and identifying reliable online sources, instructors need better strategies to teach them how to overcome this difficulty. For this purpose, we presented a roadmap for instructors willing to educate their students on the quality of online sources. Our roadmap entails five main topics for instructors to cover during their lectures: reliability, fact-checking, online search tools, online academic blogs, and online data sources. For each topic, we discussed our teaching strategies step by step and recommended several in-class exercises that instructors can use to help their students practice what they have learned. While our strategies are broadly germane to the social sciences, we believe they could be especially useful for instructors teaching politics and related topics, as we designed our exercises with the needs of students majoring in political science, government, and international relations in mind. With this endeavor, we hope to facilitate students' academic and professional development and maximize their chances of success when participating in a research project.
Notes on contributors
Reyhan Topal is a Ph.D. candidate in Political Science at University at Albany, SUNY. Her con-
centrations are in International Relations and Comparative Politics, and her research interests
include technology, political violence, authoritarianism, and teaching methodologies.
Farzin Shargh is a third-year Ph.D. student in Political Science at University at Albany, SUNY.
His concentrations are in Comparative Politics and International Relations, and his research
interests include pedagogy, Middle East politics, and human rights.
ORCID
Reyhan Topal http://orcid.org/0000-0002-0196-1175
Farzin Shargh http://orcid.org/0000-0002-9044-8022
References
Applebee, Arthur N., and Judith A. Langer. 2015. Writing Instruction That Works: Proven Methods for Middle and High School Classrooms. New York: Teachers College Press.
Asal, Victor, Charmaine Willis, Christopher Linebarger, and Nakissa Jahanbani. 2021. "Teaching about Oppression and Rebellion: The 'Peasants Are Revolting' Game." PS: Political Science & Politics 54 (2):331–335. doi:10.1017/S1049096520001675.
Bartlett, Jamie, and Carl Miller. 2011. Truth, Lies, and the Internet: A Report into Young People's Digital Fluency. London, UK: Demos.
Bowen, Tyler. 2022. "Russia's Invasion of Ukraine and NATO's Crisis of Nuclear Credibility." War on the Rocks, July 1. https://warontherocks.com/2022/04/russias-invasion-of-ukraine-and-natos-crisis-of-nuclear-credibility/
Brodsky, Jessica E., Patricia J. Brooks, Donna Scimeca, Ralitsa Todorova, Peter Galati, Michael Batson, Robert Grosso, Michael Matthews, Victor Miller, and Michael Caulfield. 2021. "Improving College Students' Fact-Checking Strategies Through Lateral Reading Instruction in a General Education Civics Course." Cognitive Research: Principles and Implications 6 (1):18–23. doi:10.1186/s41235-021-00291-4.
Buchler, Justin. 2009. "Teaching Quantitative Methodology to the Math Averse." PS: Political Science & Politics 42 (3):527–530. doi:10.1017/S1049096509090842.
Clark, Christopher H., Mardi Schmeichel, and H. James Garrett. 2020. "Social Studies Teacher Perceptions of News Source Credibility." Educational Researcher 49 (4):262–272. doi:10.3102/0013189X20909823.
Coiro, Julie, Carla Coscarelli, Cheryl Maykel, and Elena Forzani. 2015. "Investigating Criteria That Seventh Graders Use to Evaluate the Quality of Online Information." Journal of Adolescent & Adult Literacy 59 (3):287–297. doi:10.1002/jaal.448.
Esarey, Justin, and Andrew R. Wood. 2018. "Blogs, Online Seminars, and Social Media as Tools of Scholarship in Political Science." PS: Political Science & Politics 51 (4):811–819.
Farley, Robert. 2013. "Complicating the Political Scientist as Blogger." PS: Political Science & Politics 46 (2):383–386. doi:10.1017/S1049096513000061.
Halevi, Gali, Henk Moed, and Judit Bar-Ilan. 2017. "Suitability of Google Scholar as a Source of Scientific Information and as a Source of Data for Scientific Evaluation – Review of the Literature." Journal of Informetrics 11 (3):823–834. doi:10.1016/j.joi.2017.06.005.
Hargittai, Eszter, Lindsay Fullerton, Ericka Menchen-Trevino, and Kristin Yates Thomas. 2010. "Trust Online: Young Adults' Evaluation of Web Content." International Journal of Communication 4 (27):468–494.
Ireland, Sonnet. 2018. "Fake News Alerts: Teaching News Literacy Skills in a Meme World." The Reference Librarian 59 (3):122–128. doi:10.1080/02763877.2018.1463890.
Jacsó, Péter. 2008. "Google Scholar Revisited." Online Information Review 32 (1):102–114. doi:10.1108/14684520810866010.
Jensenius, Francesca R., Mala Htun, David J. Samuels, David A. Singer, Adria Lawrence, and Michael Chwe. 2018. "The Benefits and Pitfalls of Google Scholar." PS: Political Science & Politics 51 (4):820–824. doi:10.1017/S104909651800094X.
Kahne, Joseph, Nam-Jin Lee, and Jessica Timpany Feezell. 2012. "Digital Media Literacy Education and Online Civic and Political Participation." International Journal of Communication 6:1–24.
King, Gary, Robert O. Keohane, and Sidney Verba. 2021. Designing Social Inquiry: Scientific Inference in Qualitative Research. Princeton: Princeton University Press.
Markle, Gail. 2017. "Factors Influencing Achievement in Undergraduate Social Science Research Methods Courses: A Mixed Methods Analysis." Teaching Sociology 45 (2):105–115. doi:10.1177/0092055X16676302.
McGrew, Sarah, Joel Breakstone, Teresa Ortega, Mark Smith, and Sam Wineburg. 2018. "Can Students Evaluate Online Sources? Learning from Assessments of Civic Online Reasoning." Theory & Research in Social Education 46 (2):165–193. doi:10.1080/00933104.2017.1416320.
Metzger, Miriam J. 2007. "Making Sense of Credibility on the Web: Models for Evaluating Online Information and Recommendations for Future Research." Journal of the American Society for Information Science and Technology 58 (13):2078–2091. doi:10.1002/asi.20672.
Mislevy, Robert J. 2004. "Can There Be Reliability Without Reliability?" Journal of Educational and Behavioral Statistics 29 (2):241–244. doi:10.3102/10769986029002241.
Monsees, Linda. 2020. "A War Against Truth – Understanding the Fake News Controversy." Critical Studies on Security 8 (2):116–129. doi:10.1080/21624887.2020.1763708.
Moss, Pamela A. 2004. "The Meaning and Consequences of Reliability." Journal of Educational and Behavioral Statistics 29 (2):245–249. doi:10.3102/10769986029002245.
Musgrove, Ann T., Jillian R. Powers, Lauri C. Rebar, and Glenn J. Musgrove. 2018. "Real or Fake? Resources for Teaching College Students How to Identify Fake News." College & Undergraduate Libraries 25 (3):243–260. doi:10.1080/10691316.2018.1480444.
Pennycook, Gordon, Ziv Epstein, Mohsen Mosleh, Antonio A. Arechar, Dean Eckles, and David G. Rand. 2021. "Shifting Attention to Accuracy Can Reduce Misinformation Online." Nature 592 (7855):590–595. doi:10.1038/s41586-021-03344-2.
Sanchez, Christopher A., Jennifer Wiley, and Susan R. Goldman. 2006. "Teaching Students to Evaluate Source Reliability During Internet Research Tasks." International Society of the Learning Sciences 2:662–666.
Sides, John. 2011. "The Political Scientist as Blogger." PS: Political Science & Politics 44 (2):267–271. doi:10.1017/S1049096511000060.
Subotić, Jelena. 2021. "Ethics of Archival Research on Political Violence." Journal of Peace Research 58 (3):342–354. doi:10.1177/0022343319898735.
Tarrow, Sidney. 1995. "Bridging the Quantitative-Qualitative Divide in Political Science." American Political Science Review 89 (2):471–474. doi:10.2307/2082444.
Topal, Reyhan, and Farzin Shargh. 2022. "Bringing the Students Back in: How to Re-Engage Students in a Post-Covid World." Political Science Educator 26 (1). August 11. https://educate.apsanet.org/bringing-the-students-back-in-how-to-re-engage-students-in-a-postcovid-world
Walter, Nathan, Jonathan Cohen, R. Lance Holbert, and Yasmin Morag. 2020. "Fact-Checking: A Meta-Analysis of What Works and for Whom." Political Communication 37 (3):350–375. doi:10.1080/10584609.2019.1668894.
Wilner, Alex S. 2018. "Cybersecurity and Its Discontents: Artificial Intelligence, the Internet of Things, and Digital Misinformation." International Journal: Canada's Journal of Global Policy Analysis 73 (2):308–316. doi:10.1177/0020702018782496.