Article

Skilled or unskilled, but still unaware of it: how perceptions of difficulty drive miscalibration in relative comparisons.

Ross School of Business, University of Michigan, Ann Arbor, MI 48109, USA.
Journal of Personality and Social Psychology (Impact Factor: 5.08). 02/2006; 90(1):60-77. DOI: 10.1037/0022-3514.90.1.60
Source: PubMed

ABSTRACT People are inaccurate judges of how their abilities compare to others'. J. Kruger and D. Dunning (1999, 2002) argued that unskilled performers in particular lack metacognitive insight about their relative performance and disproportionately account for better-than-average effects. The unskilled overestimate their actual percentile of performance, whereas skilled performers more accurately predict theirs. However, not all tasks show this bias. In a series of 12 tasks across 3 studies, the authors show that on moderately difficult tasks, best and worst performers differ very little in accuracy, and on more difficult tasks, best performers are less accurate than worst performers in their judgments. This pattern suggests that judges at all skill levels are subject to similar degrees of error. The authors propose that a noise-plus-bias model of judgment is sufficient to explain the relation between skill level and accuracy of judgments of relative standing.
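The noise-plus-bias account lends itself to a brief simulation. The sketch below is an illustrative reconstruction, not the authors' actual model: the anchor percentiles (65 for an easy-seeming task, 35 for a hard-seeming one), the noise level, and the shrinkage weight are all assumed values chosen only to show the qualitative pattern, in which every judge anchors on the same biased mean and adds noisy partial self-knowledge.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# True skill and the resulting true percentile ranks (0-100).
skill = rng.normal(size=n)
true_pct = 100 * (np.argsort(np.argsort(skill)) / (n - 1))

def estimate_percentiles(anchor, noise_sd=25.0, weight=0.5):
    """Noise-plus-bias judge: a noisy reading of one's own standing,
    shrunk toward a shared (biased) anchor percentile."""
    noisy_signal = true_pct + rng.normal(0.0, noise_sd, n)
    est = weight * noisy_signal + (1 - weight) * anchor
    return np.clip(est, 0, 100)

for label, anchor in [("easy-seeming task", 65), ("hard-seeming task", 35)]:
    err = estimate_percentiles(anchor) - true_pct  # positive = overestimation
    print(f"{label}: bottom-quartile error {err[true_pct < 25].mean():+.1f}, "
          f"top-quartile error {err[true_pct > 75].mean():+.1f}")
```

With a high anchor (easy-seeming task), the bottom quartile overestimates heavily while the top quartile is fairly accurate; with a low anchor (hard-seeming task), the pattern reverses and the best performers become the least accurate. This reproduces the paper's qualitative result without positing any metacognitive deficit specific to the unskilled.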

    ABSTRACT: Inaccurate judgments of task difficulty and invested mental effort may negatively affect how accurately students monitor their own performance. When students cannot accurately monitor their own performance, they cannot control their learning effectively (e.g., allocate adequate mental effort and study time). Although students' judgments of task difficulty and invested mental effort are closely related to their study behaviors, it remains an open question how the accuracy of these judgments can be improved in learning from problem solving. The present study focused on the impact of three types of instructional support on the accuracy of students' judgments of difficulty and invested mental effort in relation to their performance while learning genetics in a computer-based environment. Sixty-seven university students with different levels of prior knowledge received either incomplete worked-out examples, completion problems, or conventional problems. Results indicated that lower prior knowledge students performed better with completion problems, whereas higher prior knowledge students performed better with conventional problems. Incomplete worked-out examples resulted in an overestimation of performance, that is, an illusion of understanding, whereas completion and conventional problems showed neither over- nor underestimation. The findings suggest that completion problems can be used to avoid students' misjudgments of their competencies.
    Contemporary Educational Psychology 01/2015; 41. DOI:10.1016/j.cedpsych.2015.01.001 · 2.20 Impact Factor
    ABSTRACT: This paper addresses measurement and conceptual issues related to the realism of people's confidence judgments about their own cognitive abilities. We employed three cognitive tests: listening and reading subtests from the Test of English as a Foreign Language (TOEFL iBT) and a synonyms vocabulary test. The sample consisted of community college students. Our results show that the participants tended to be overconfident about their cognitive abilities on most tasks, representing poor realism. Significant group differences were noted with respect to gender and race/ethnicity: female and European American participants showed less overconfidence than male, African American, or Hispanic participants. We point out that there appear to be significant individual differences in the understanding of subjective probabilities, and these differences can influence the realism of confidence judgments.
    European Journal of Psychological Assessment 01/2009; 25(2):123. DOI:10.1027/1015-5759.25.2.123 · 2.53 Impact Factor
