Reliability of ratings for multiple judges: Intraclass correlation and metric scales

Published Date

1991

Type

Article

Abstract

Scale-dependent procedures are presented for assessing the reliability of ratings for multiple judges using intraclass correlation. Scale type is defined in terms of admissible transformations, and standardizing transformations for ratio and interval scales are presented to solve the problem of adjusting ratings for "arbitrary scale factors" (unit and/or origin of the scale). The theory of meaningfulness of numerical statements is introduced and the coefficient of relational agreement (Stine, 1989b) is defined as the degree of agreement among judges, with respect to (scale-dependent) empirically meaningful relationships. Other topics discussed include the treatment of variability due to judges in relation to scale type, and the reliability of magnitude estimates in psychophysics. Index terms: coefficient of agreement, intraclass correlation, meaningfulness, metric scales, reliability of rating scales.
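The abstract's core tool, intraclass correlation for ratings by multiple judges, can be illustrated with a minimal sketch. The function name and the choice of the two-way random-effects, single-rater form ICC(2,1) are assumptions for illustration; this is the standard ANOVA-based coefficient, not the paper's own scale-dependent procedure:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random-effects intraclass correlation, single rater.

    ratings: (n targets x k judges) array of metric-scale ratings.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # one mean per rated target
    col_means = x.mean(axis=0)   # one mean per judge
    # Mean squares from the two-way ANOVA decomposition
    ms_targets = k * ((row_means - grand) ** 2).sum() / (n - 1)
    ms_judges = n * ((col_means - grand) ** 2).sum() / (k - 1)
    resid = x - row_means[:, None] - col_means[None, :] + grand
    ms_error = (resid ** 2).sum() / ((n - 1) * (k - 1))
    # Judge variance stays in the denominator, so systematic
    # differences between judges lower the coefficient.
    return (ms_targets - ms_error) / (
        ms_targets + (k - 1) * ms_error + k * (ms_judges - ms_error) / n
    )
```

With identical columns the coefficient is exactly 1, while a constant offset between judges (e.g. one judge rating every target one point higher) pulls it below 1, which matches the abstract's concern with how variability due to judges is treated.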

Previously Published Citation

Fagot, Robert F. (1991). Reliability of ratings for multiple judges: Intraclass correlation and metric scales. Applied Psychological Measurement, 15, 1-11. doi:10.1177/014662169101500101

Other identifiers

doi:10.1177/014662169101500101

Suggested citation

Fagot, Robert F. (1991). Reliability of ratings for multiple judges: Intraclass correlation and metric scales. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/113942.
