Source publication
Electoral misinformation, where citizens believe false or misleading claims about the electoral process and electoral institutions—sometimes actively and strategically spread by political actors—is a challenge to public confidence in elections specifically and democracy more broadly. In this article, we analyze a combination of 42 million clicks in...
Contexts in source publication
Context 1
... model 6, using our survey responses, and again focusing on WhatsApp and Facebook, we find no relationship between getting news on these platforms and belief in electoral misinformation. When we include other platforms (see Supplementary Material table 17), we find inconsistent relationships for different platforms and each wave. Given these different patterns, H3 is not confirmed. ...
Context 2
... and his supporters were responsible for sharing and disseminating most of the misinformation examined in the study. In Supplementary Material table 18, we also include results controlling for support for Bolsonaro, which predicts an increase in electoral misinformation belief over time. These effects are stronger in wave 4, which might be a consequence of a "loser effect" following the election outcome. ...
Citations
... Low-information voters are vulnerable to irrelevant cues in the political environment (21), vote against their personal and group interests (22)(23)(24), and are more susceptible to populist, manipulative, and misinformative rhetoric (25). In turn, news exposure leads to more informed citizens (26)(27)(28)(29)(30), increases opinion stability and voting in accordance with one's interests (24, 31), decreases beliefs in misinformation (32)(33)(34), enhances efficacy, tolerance, and the acceptance of democratic norms (35,36), and leads to more equitable voting outcomes (37). Therefore, minimizing interest bias in recommendation algorithms and incentivizing greater consumption of verified news among citizens are of importance. ...
Recommendation algorithms profoundly shape users’ attention and information consumption on social media platforms. This study introduces a computational intervention aimed at mitigating two key biases in algorithms by influencing the recommendation process. We tackle interest bias, or algorithms creating narrow non-news and entertainment information diets, and ideological bias, or algorithms directing the more strongly partisan users to like-minded content. Employing a sock-puppet experiment (N = 8,600 sock puppets) alongside a month-long randomized experiment involving 2,142 frequent YouTube users, we investigate if nudging the algorithm by playing videos from verified and ideologically balanced news channels in the background increases recommendations to and consumption of news. We additionally test if providing balanced news input to the algorithm promotes diverse and cross-cutting news recommendations and consumption. We find that nudging the algorithm significantly and sustainably increases both recommendations to and consumption of news and also minimizes ideological biases in recommendations and consumption, particularly among conservative users. In fact, recommendations have stronger effects on users’ exposure than users’ exposure has on subsequent recommendations. In contrast, nudging the users has no observable effects on news consumption. Increased news consumption has no effects on a range of survey outcomes (e.g., knowledge, participation, polarization, misperceptions), adding to the growing evidence of limited attitudinal effects of on-platform exposure. The intervention does not adversely affect user engagement on YouTube, showcasing its potential for real-world implementation. These findings underscore the influence wielded by platform recommender algorithms on users’ attention and information exposure.
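As a rough, hypothetical illustration of how such an algorithmic nudge could be operationalized in a sock-puppet setup, the Python sketch below plays videos from a placeholder list of verified, ideologically balanced news channels in a headless browser session. The video URLs, timing parameters, and selection logic are assumptions for illustration only, not the study's actual implementation.

```python
# Hypothetical sketch of an "algorithmic nudge": autoplay videos from a
# balanced list of verified news channels so the platform recommender
# registers news watch time. URLs and timings are placeholders; a real
# deployment would also need to handle consent dialogs and playback start.
import random
import time

from selenium import webdriver

# Placeholder video URLs; a real study would draw from a vetted,
# ideologically balanced list of verified news outlets.
BALANCED_NEWS_VIDEOS = [
    "https://www.youtube.com/watch?v=VIDEO_ID_LEFT",
    "https://www.youtube.com/watch?v=VIDEO_ID_CENTER",
    "https://www.youtube.com/watch?v=VIDEO_ID_RIGHT",
]


def nudge_session(n_videos: int = 3, watch_seconds: int = 120) -> None:
    """Play a randomized set of news videos in a background browser session."""
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")  # run without a visible window
    driver = webdriver.Chrome(options=options)
    try:
        picks = random.sample(BALANCED_NEWS_VIDEOS,
                              k=min(n_videos, len(BALANCED_NEWS_VIDEOS)))
        for url in picks:
            driver.get(url)            # load the video page
            time.sleep(watch_seconds)  # let it play so watch time accrues
    finally:
        driver.quit()


if __name__ == "__main__":
    nudge_session()
```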
This research paper examines how disinformation campaigns harm self-determination processes, focusing on the psychological mechanisms by which public opinion can be manipulated, including confirmation bias and emotional manipulation. This body of work takes on the task of understanding how these tactics can affect political choices, from voting behavior to support for a protest, hampering the development of democracy. The paper reveals the real-world consequences of disinformation for self-determination movements through case studies of significant events, such as the Brexit referendum and the 2016 U.S. presidential election. It is vital to have high-level international legal frameworks and accountability mechanisms to address this problem. The proposals include establishing international conventions, increasing transparency on social media platforms, strengthening public media literacy, and establishing partnerships among stakeholders. This study concludes that protecting self-determination rights demands collective efforts to address misinformation, ensuring that people can engage appropriately with accurate information when participating in democratic processes. Addressing these challenges helps safeguard the integrity of public discourse and the fundamental rights of people worldwide in an increasingly complex information landscape.
Studies have found limited evidence consistent with the theory that partisan and like-minded online news exposure has demonstrable effects on political outcomes. Most of this prior research, however, has focused on the particular case of the United States even as concern elsewhere in the world has grown about political parallelism in media content online, which has sometimes been blamed for heightened social divisiveness. This article investigates the impact of online partisan news consumption on voting behavior and social polarization during the 2022 elections in Brazil, a country where the public’s ties to political parties have historically been more limited or nonexistent but where ideologically aligned news content online has markedly increased in recent years. Drawing on a unique dataset linking behavioral web-tracking data of 2,200 internet users in Brazil and 4 survey waves with the same respondents, conducted before, during, and after the 2022 presidential elections, we find no significant relationship between the use of partisan media and either vote choice or social polarization overall; however, we do find weak and inconsistent evidence that trust in news moderates the impact of partisan media on social polarization.
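As a purely illustrative sketch of how linked web-tracking and panel survey data of this kind might be analyzed, the snippet below fits a pooled regression of a polarization outcome on partisan-media use across waves, with respondent-clustered standard errors. The data frame and variable names are hypothetical stand-ins and do not correspond to the study's actual specification.

```python
# Hypothetical sketch: pooled OLS of a survey outcome on web-tracked
# partisan media use across waves, clustering errors by respondent.
# Column names (respondent_id, wave, partisan_media_use, polarization)
# are illustrative, not the study's actual variables.
import pandas as pd
import statsmodels.formula.api as smf


def fit_panel_model(df: pd.DataFrame):
    """df: one row per respondent-wave, combining tracking and survey measures."""
    model = smf.ols(
        "polarization ~ partisan_media_use + C(wave)",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["respondent_id"]})
    return model


# Example usage with a linked respondent-wave file (hypothetical path):
# df = pd.read_csv("linked_tracking_survey.csv")
# print(fit_panel_model(df).summary())
```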