Table 1. 2 × 2 agreement matrix showing 57% rater agreement. The associated r value in this example was −.27, which is statistically significant in the opposite direction.


Source publication
The ability to measure agreement between two independent observers is vital to any observational study. We use a unique situation, the calculation of inter-rater reliability for transcriptions of a parrot’s speech, to present a novel method of dealing with inter-rater reliability which we believe can be applied to situations in which speech from hu...

Context in source publication

Context 1
... benefit of this approach was that the statistic addressed instances in which raters agreed upon what the behavior was, as well as instances in which raters agreed upon what the behavior was not, thus circumventing the main pitfall of the percent agreement statistic (Kaufman and Rosenthal 2009; see Table 2). As Table 1 demonstrates, percent agreement often overestimates the actual agreement between observers, resulting in high percent agreement and an r statistic that is low or even statistically significant in the opposite direction. Brennan and Light (1974) also assumed the marginal totals to be fixed. ...
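To make the pitfall concrete, the short Python sketch below computes both statistics from a hypothetical 2 × 2 agreement matrix. The cell counts are assumptions chosen only to reproduce the figures quoted in the Table 1 caption (57% agreement, r of roughly −.27); they are not the published counts. For a 2 × 2 table, Pearson's r reduces to the phi coefficient.

import math

# Hypothetical 2 x 2 agreement matrix (assumed counts, not the published
# data): rows are rater 1's coding, columns are rater 2's coding.
#                     rater 2: behavior   rater 2: no behavior
a, b = 57, 21       # rater 1: behavior
c, d = 22, 0        # rater 1: no behavior

n = a + b + c + d

# Percent agreement counts only the cells where the raters match (a and d).
percent_agreement = (a + d) / n

# For a 2 x 2 table, Pearson's r is the phi coefficient.
phi = (a * d - b * c) / math.sqrt((a + b) * (c + d) * (a + c) * (b + d))

print(f"percent agreement: {percent_agreement:.0%}")  # 57%
print(f"phi (r):           {phi:.2f}")                # -0.27

Because almost all of the nominal agreement sits in a single cell and the raters never jointly record the behavior's absence, the agreement rate looks respectable while the correlation runs in the opposite direction, which is exactly the overestimation the excerpt describes.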