Article

Money, Time, and Political Knowledge: Distinguishing Quick Recall and Political Learning Skills

American Journal of Political Science (Impact Factor: 2.76). 12/2007; 52(1):169-183. DOI: 10.1111/j.1540-5907.2007.00306.x

ABSTRACT Surveys provide widely cited measures of political knowledge. Do seemingly arbitrary features of survey interviews affect their validity? Our answer comes from experiments embedded in a representative survey of over 1200 Americans. A control group was asked political knowledge questions in a typical survey context. Treatment groups received the questions in altered contexts. One group received a monetary incentive for answering the questions correctly. Another was given extra time. The treatments increase the number of correct answers by 11–24%. Our findings imply that conventional knowledge measures confound respondents' recall of political facts with variation in their motivation to exert effort during survey interviews. Our work also suggests that existing measures fail to capture relevant political search skills and, hence, provide unreliable assessments of what many citizens know when they make political decisions. As a result, existing knowledge measures likely underestimate people's capacities for informed decision making.

    • "There have been criticisms of the use of factual questions as an indicator of what people know about politics (e.g., Graber 2001), but most of the discussion has focused on issues of measurement. In particular, past work has demonstrated that aspects of the interview context, such as question format, respondent incentives, and survey protocol, have powerful effects on observed levels of knowledge (e.g., Gibson and Caldeira 2009; Miller and Orr 2008; Mondak 2001; Prior and Lupia 2008). These efforts have resulted in valuable insights regarding optimal methods for measuring political knowledge (Boudreau and Lupia 2013). "
    ABSTRACT: Political knowledge is a central concept in the study of public opinion and political behavior. Yet what the field collectively believes about this construct is based on dozens of studies using different indicators of knowledge. We identify two theoretically relevant dimensions: a temporal dimension that corresponds to the time when a fact was established and a topical dimension that relates to whether the fact is policy-specific or general. The resulting typology yields four types of knowledge questions. In an analysis of more than 300 knowledge items from late in the first decade of the 2000s, we examine whether classic findings regarding the predictors of knowledge withstand differences across types of questions. In the case of education and the mass media, the mechanisms for becoming informed operate differently across question types. However, differences in the levels of knowledge between men and women are robust, reinforcing the importance of including gender-relevant items in knowledge batteries.
    American Political Science Association 11/2014; 108(4):840-855. DOI: 10.1017/S0003055414000392 · 3.05 Impact Factor
    • "To be sure, these measures are not without their problems. Prior and Lupia (2008) demonstrate that conventional knowledge measures often underestimate the degree to which the general public are able to engage in informed decision making, as the typical survey context provides few incentives for respondents to exert much effort to recall the correct response. We have no reason, however, to expect these effects to differ in any systematic way between respondents in initiative and non-initiative states. "
    ABSTRACT: Current literature often suggests that more information and choices will enhance citizens’ general political knowledge. Notably, some studies indicate that a greater number of state ballot initiatives raise Americans’ knowledge through increases in motivation and supply of political information. By contrast, we contend that political psychology theory and findings indicate that, at best, more ballot measures will have no effect on knowledge. At worst, greater use of direct democracy should make it more costly to learn about institutions of representative government and lessen motivation by overwhelming voters with choices. To test this proposition, we develop a new research design and draw upon data more appropriate to assessing the question at hand. We also make use of a propensity score matching algorithm to assess the balance in the data between initiative-state and non-initiative-state voters. Controlling for a wide variety of variables, we find that there is no empirical relationship between ballot initiatives and political knowledge. These results add to a growing list of findings which cast serious doubt on the educative potential of direct democracy.
    Political Behavior 06/2014; 37(2). DOI: 10.1007/s11109-014-9273-5 · 1.63 Impact Factor
    • "Evidence of campaign learning among individuals who fail more general awareness tests means that they can learn about politics, but they habitually do not. Prior and Lupia (2008) show that small incentives greatly reduce the correlations between socioeconomic status indicators and ability to correctly answer factual questions about politics in a survey-based experiment. They conclude that the experimental treatments (e.g. "
    ABSTRACT: Revisionists demonstrate campaigns mobilize, educate, activate predispositions, and change minds. Attention has turned from the “minimum effects” thesis to questions about the conditions under which campaigns matter and questions about which types of people are susceptible to campaign effects. Focusing on whether campaign effects are mediated by chronic political awareness, I find that current scholarship on this question is mixed. Some find that campaigns affect the politically unaware most, some find bigger effects among more aware citizens, and some find similar effects across the awareness distribution. Noting the possibility that awareness mediates different types of campaign effects differently (e.g. priming, persuasion, or learning), Zaller’s Receive-Accept-Sample framework is consulted to develop expectations. I test the RAS-generated predictions using the 2004 National Annenberg Election Survey pre/post panel. The results support the theory that awareness mediates different campaign effects differently. Keywords: Political awareness, Information effects, Campaign effects, Persuasion, Priming, Learning
    Political Behavior 01/2011; 33(2):203-223. DOI: 10.1007/s11109-010-9129-6 · 1.63 Impact Factor
