This study examined the relationship of an expert-system
scored constrained free-response item (requiring
the student to debug a faulty computer program) to
two other item types: (1) multiple-choice and (2) free-response
(requiring production of a program). Confirmatory
factor analysis was used to test the fit of a
three-factor model to these data and to compare its
fit with that of three alternative models. These models were
fit using two random-half samples, one given a faulty
program containing one bug and the other a program
with three bugs. A single-factor model best fit the data
for the sample taking the one-bug constrained free-response
item, whereas a two-factor model fit the data somewhat
better for the second sample. In addition, the factor
intercorrelations showed this item type to be highly related
to both the free-response and multiple-choice
measures. Index terms: artificial intelligence, constructed-response items, expert-system scoring, free-response
items, open-ended items.
Bennett, Randy E., Rock, Donald A., Braun, Henry I., Frye, Douglas, Spohrer, James C., & Soloway, Elliot. (1990). The relationship of expert-system scored constrained free-response items to multiple-choice and open-ended items. Applied Psychological Measurement, 14, 151-162. doi:10.1177/014662169001400204
Retrieved from the University of Minnesota Digital Conservancy.