Browsing by Author "Candell, Gregory L."
Now showing 1 - 2 of 2
Item
An iterative procedure for linking metrics and assessing item bias in item response theory (1988)
Candell, Gregory L.; Drasgow, Fritz
The presence of biased items may seriously affect methods used to link metrics in item response theory. An iterative procedure designed to minimize this methodological problem was examined in a Monte Carlo investigation using the two-parameter item response model. The iterative procedure links the scales of independently calibrated parameter estimates using only those items identified as unbiased. Two methods for transforming parameter estimates to a common metric were incorporated into the iterative procedure. The first method links scales by equating the first two moments of the distributions of estimated item difficulties. The second method determines the linking transformation by minimizing differences across IRT characteristic curve estimates. Results indicate that iterative linking provides a substantial improvement in item bias detection over the noniterative approach. Index terms: Item bias, Item response theory, Iterative method, Linking, Metric linking, Two-parameter item response model.

Item
Modeling incorrect responses to multiple-choice items with multilinear formula score theory (1989)
Drasgow, Fritz; Levine, Michael V.; Williams, Bruce; McLaughlin, Mary E.; Candell, Gregory L.
Multilinear formula score theory (Levine, 1984, 1985, 1989a, 1989b) provides powerful methods for addressing important psychological measurement problems. In this paper, a brief review of multilinear formula scoring (MFS) is given, with specific emphasis on estimating option characteristic curves (OCCs). MFS was used to estimate OCCs for the Arithmetic Reasoning subtest of the Armed Services Vocational Aptitude Battery. A close match was obtained between empirical proportions of option selection for examinees in 25 ability intervals and the modeled probabilities of option selection. In a second analysis, accurately estimated OCCs were obtained for simulated data. To evaluate the utility of modeling incorrect responses to the Arithmetic Reasoning test, the amounts of statistical information about ability were computed for dichotomous and polychotomous scorings of the items. Consistent with earlier studies, moderate gains in information were obtained for low to slightly above average abilities. Index terms: item response theory, marginal maximum likelihood estimation, maximum likelihood estimation, multilinear formula scoring, option characteristic curves, polychotomous measurement, test information function.
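The moment-matching link described in the first abstract (often called the mean-and-sigma method) can be illustrated with a short sketch. The code below is an assumption-laden illustration rather than the procedure used in the paper: it takes two-parameter-model difficulty and discrimination estimates from two independent calibrations, computes the linear transformation that equates the first two moments of the difficulty distributions, and places the focal-form parameters on the reference metric. The function name and the example arrays are hypothetical.

```python
import numpy as np

def mean_sigma_link(b_ref, b_foc, a_foc):
    """Mean-and-sigma linking for 2PL item parameter estimates.

    Equates the first two moments of the difficulty distributions:
    b* = A * b + B, with discriminations rescaled as a* = a / A.
    In an iterative bias-detection procedure, only items currently
    judged unbiased would be passed in on later iterations.
    """
    b_ref, b_foc, a_foc = map(np.asarray, (b_ref, b_foc, a_foc))
    A = b_ref.std(ddof=1) / b_foc.std(ddof=1)   # slope of the linear link
    B = b_ref.mean() - A * b_foc.mean()         # intercept of the linear link
    return A * b_foc + B, a_foc / A             # (b, a) on the reference metric

# Hypothetical parameter estimates from two independent calibrations
b_reference = np.array([-1.2, -0.4, 0.1, 0.8, 1.5])
b_focal     = np.array([-0.9, -0.1, 0.4, 1.1, 1.9])
a_focal     = np.array([ 1.1,  0.8, 1.4, 0.9, 1.2])

b_linked, a_linked = mean_sigma_link(b_reference, b_focal, a_focal)
print(b_linked, a_linked)
```

In the iterative procedure the abstract describes, a link of this kind would be recomputed after removing items flagged as biased, repeating until the set of linking items stabilizes; the characteristic-curve method mentioned as the second option determines A and B by minimizing differences between estimated item characteristic curves instead.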