Faking and the Validity of Personality Tests: Using New Faking-Resistant Measures to Study Some Old Questions
Authors
Huber, Christopher
Published Date
2017-02
Type
Thesis or Dissertation
Abstract
Despite strong evidence supporting the validity of personality measures for personnel selection, their susceptibility to faking has been a persistent concern. Research has found that many job applicants exaggerate their possession of desirable traits, and there are reasons to believe that this distortion reduces criterion-related validity. However, the lack of studies that combine experimental control with real-world generalizability makes it difficult to isolate the effects of applicant faking. Experimental studies have typically induced faking using explicit instructions to fake, which elicit unusually extreme faking compared to typical applicant settings. A variety of non-experimental approaches have also been employed, but these approaches are largely inadequate for establishing cause-and-effect relationships. Thus, researchers continue to debate whether applicant faking substantially attenuates the validity of personality tests.

The present study used a new experimental framework to study this question and related methodological issues in the faking literature. First, it included a subtle incentive to fake in addition to explicit instructions to respond honestly or fake good. Second, it compared faking on standard Likert scales to faking on multidimensional forced choice (MFC) scales designed to resist deception. Third, it compared more and less fakable versions of the same MFC inventory to eliminate confounding differences between MFC and Likert scales. The result was a 3 × 3 design that simultaneously manipulated the motivation and ability to fake, allowing for a more rigorous examination of the faking–validity relationship.

Results indicated complex relationships between faking and the validity of personality scores. Directed fakers were much better at raising their scores on Likert scales than MFC measures of the same traits. However, MFC scales failed to retain more validity than Likert scales when participants faked. Supplemental analyses suggested that extreme faking decimated the construct validity of all scales regardless of their fakability. Faking also added new common method variance to the Likert scales, which in turn contributed to the scales’ criterion-related validity.

In addition to the effects of faking, the present study investigated two recurring methodological issues in the faking literature. First, I investigated the claim that directed faking is fundamentally different from typical faking by comparing results from directed and incentivized fakers. Directed faking results generally replicated using a subtle incentive to fake, but the effects were much smaller and less consistent. Second, some have argued that traditional criterion-related validity coefficients fail to capture the negative effects of faking on actual selection decisions. I investigated this possibility by creating simulated selection pools in which fakers and honest responders competed for limited positions. The simulation results generally indicated reasonable correspondence between validity estimates and selected group performance, suggesting that validity coefficients adequately reflected the effects of faking. Results are interpreted using existing theories of faking, and new methodologies are proposed to advance the study of typical faking behavior.
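To make the selection-pool idea concrete, the following is a minimal, hypothetical sketch (not code from the dissertation): honest responders report their trait standing with error, a fraction of fakers inflate their observed scores, the top scorers are selected, and the selected group's mean performance is examined alongside the pool's observed validity coefficient. The function name, faking model, and all parameter values (faker rate, inflation amount, true trait–performance validity, selection ratio) are illustrative assumptions.

```python
# Hypothetical selection-pool simulation: fakers and honest responders
# compete for a limited number of positions selected top-down by test score.
import numpy as np

rng = np.random.default_rng(42)

def simulate_pool(n=1000, faker_rate=0.3, inflation=1.0, true_validity=0.4, selection_ratio=0.2):
    """Return (observed validity in the pool, mean performance of the selected group)."""
    trait = rng.standard_normal(n)                                   # true trait standing
    performance = true_validity * trait + np.sqrt(1 - true_validity**2) * rng.standard_normal(n)
    score = trait + 0.5 * rng.standard_normal(n)                     # honest self-report with error
    fakers = rng.random(n) < faker_rate
    score[fakers] += inflation                                       # fakers inflate their scores
    observed_r = np.corrcoef(score, performance)[0, 1]               # criterion-related validity
    selected = np.argsort(score)[-int(selection_ratio * n):]         # top-down selection
    return observed_r, performance[selected].mean()

r, selected_mean = simulate_pool()
print(f"observed validity = {r:.3f}, mean performance of selected group = {selected_mean:.3f}")
```

Sweeping the assumed faker rate and inflation across plausible values would mimic, in spirit, the comparison between validity estimates and selected-group performance described in the abstract.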
Description
University of Minnesota Ph.D. dissertation. February 2017. Major: Psychology. Advisor: Nathan Kuncel. 1 computer file (PDF); xii, 280 pages.
Suggested citation
Huber, Christopher. (2017). Faking and the Validity of Personality Tests: Using New Faking-Resistant Measures to Study Some Old Questions. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/185605.