On the Quantification and Generalizability of Differential Prediction in Selection Systems

Published Date

2019-05

Type

Thesis or Dissertation

Abstract

Differential prediction analyses allow personnel psychologists to determine whether the regression lines linking a predictor variable to a criterion/performance variable are comparable between a referent group and a legally protected focal group. Although many decades of research on cognitive tests have indicated that differential prediction does occur for racial/ethnic minority groups in the U.S. relative to Whites, the bulk of evidence has indicated that these differences result in the overprediction of Black and Hispanic individuals’ performance from cognitive test scores, which does not indicate predictive bias against these groups. However, research published over the past decade by Aguinis, Culpepper, and Pierce (2010, 2016) has questioned the accuracy and generalizability of past findings, arguing that the historic trends could have been caused by statistical artifacts. In a series of four studies, I present methodological advancements in the quantification of differential prediction and supply substantive analyses that refute the findings reported by Aguinis et al. (2010, 2016). Specifically, I (1) offer derivations of simplified effect-size estimation procedures for differential prediction analyses with accompanying standard-error estimators, (2) illustrate how composite predictors influence differential prediction effects, (3) demonstrate the generalizability of White-minority and male-female differential prediction in the post-secondary education admissions domain, and (4) present findings from a simulation study designed to identify which features of selection systems could cause statistical artifacts to bias the results of differential prediction analyses conducted on cognitive test scores.
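As a point of reference for the regression-line comparisons described above, the sketch below shows a standard Cleary-style moderated regression in Python, the usual operationalization of differential prediction: a group indicator tests for intercept differences and a predictor-by-group interaction tests for slope differences. The variable names and simulated data are hypothetical illustrations and are not drawn from the dissertation’s analyses.

    # Minimal sketch of a Cleary-style moderated regression test for
    # differential prediction: do the intercept and/or slope of the
    # predictor-criterion regression differ across a referent group and
    # a focal group? All data here are simulated for illustration only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    group = rng.integers(0, 2, n)        # 0 = referent group, 1 = focal group
    x = rng.normal(0, 1, n)              # standardized predictor (e.g., test score)
    # Criterion generated with a small intercept difference between groups
    y = 0.5 * x - 0.2 * group + rng.normal(0, 1, n)

    df = pd.DataFrame({"y": y, "x": x, "group": group})

    # Moderated regression: the group term tests intercept differences;
    # the x:group interaction tests slope differences.
    model = smf.ols("y ~ x + group + x:group", data=df).fit()
    print(model.summary())
    print("Intercept difference (focal - referent):", model.params["group"])
    print("Slope difference (focal - referent):", model.params["x:group"])

In this framing, a positive intercept difference for the focal group indicates that a common regression line underpredicts that group’s performance, whereas a negative difference indicates overprediction.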

Description

University of Minnesota Ph.D. dissertation. May 2019. Major: Psychology. Advisors: Paul Sackett, Nathan Kuncel. 1 computer file (PDF); xix, 321 pages.

Suggested citation

Dahlke, Jeffrey. (2019). On the Quantification and Generalizability of Differential Prediction in Selection Systems. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/206301.
