Browsing by Author "Tatsuoka, Kikumi K."
Now showing 1 - 3 of 3
Effects of response format on diagnostic assessment of scholastic achievement (1992)
Birenbaum, Menucha; Tatsuoka, Kikumi K.; Gutvirtz, Yaffa
The effect of response format on diagnostic assessment of students' performance on an algebra test was investigated. Two sets of parallel, open-ended (OE) items and a set of multiple-choice (MC) items, which were stem-equivalent to one of the OE item sets, were compared using two diagnostic approaches: a "bug" analysis and a rule-space analysis. Items with identical format (parallel OE items) were more similar than items with different formats (OE vs. MC). Index terms: bug analysis, diagnostic assessment, free-response, item format, multiple-choice, rule space.

Indices for detecting unusual patterns: Links between two general approaches and potential applications (1983)
Tatsuoka, Kikumi K.; Linn, Robert L.
Two distinct approaches, one based on item response theory and the other based on observed item responses and standard summary statistics, have been proposed to identify unusual patterns of responses to test items. A link between these two approaches is provided by showing certain correspondences between Sato's S-P curve theory and item response theory. This link makes possible several extensions of Sato's caution index that take advantage of the results of item response theory. Several such indices are introduced, and their use is illustrated by application to a set of achievement test data. Two of the newly introduced extended indices were found to be very effective for identifying persons who consistently use an erroneous rule in attempting to solve signed-number arithmetic problems. The potential importance of this result is briefly discussed.

Open-ended versus multiple-choice response formats--it does make a difference for diagnostic purposes (1987)
Birenbaum, Menucha; Tatsuoka, Kikumi K.
The purpose of the present study was to examine the effect of response format—open-ended (OE) versus multiple-choice (MC)—on the diagnosis of examinee misconceptions in a procedural task. A test in fraction addition arithmetic was administered to 285 eighth-grade students, 148 of whom responded to the OE version of the test and 137 to the MC version. The two datasets were compared with respect to the underlying structure of the test, the number of different error types, and the diagnosed sources of misconception (bugs) reflected in the response patterns. The overall results indicated considerable differences between the two formats, with more favorable results for the OE format. The effect of item format on examinee responses has been studied extensively in the past decade. The equivalence of open-ended (OE) items (also known as free-response or recall items) and multiple-choice (MC) items (also known as recognition items) has been addressed by psychometricians and cognitive psychologists. From an information-processing point of view, different models for the two formats have been suggested (e.g., Bender, 1980). The commonly held view suggests that recall items require examinees to both search for and retrieve information, whereas recognition items require them only to discriminate among the presented information.
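The 1983 paper above builds on Sato's caution index, which flags an examinee whose pattern of right and wrong answers runs against item difficulty (e.g., missing easy items while solving hard ones). As a rough illustration only, the classical (non-IRT) form of the index can be sketched as follows; the function and parameter names are my own, and this is not the paper's extended IRT-based version:

```python
import numpy as np

def caution_index(responses, item_totals):
    """Sketch of Sato's classical caution index for one examinee.

    responses   : 0/1 scores of the examinee on each item
    item_totals : number of examinees in the group who answered
                  each item correctly (a proxy for item easiness)

    The index compares the examinee's observed pattern with the
    Guttman pattern of the same total score (all successes on the
    easiest items): 0 means a perfectly consistent pattern, and
    larger values signal increasingly aberrant responding.
    """
    responses = np.asarray(responses, dtype=float)
    item_totals = np.asarray(item_totals, dtype=float)
    score = int(responses.sum())

    # Build the Guttman pattern: same score, easiest items correct.
    easiest_first = np.argsort(-item_totals)
    guttman = np.zeros_like(responses)
    guttman[easiest_first[:score]] = 1.0

    # Caution index: 1 minus the ratio of the covariance of the
    # observed pattern with item totals to that of the Guttman pattern.
    centered_totals = item_totals - item_totals.mean()
    num = np.dot(responses - responses.mean(), centered_totals)
    den = np.dot(guttman - guttman.mean(), centered_totals)
    return 1.0 - num / den

# A Guttman-consistent pattern (easy items right) scores ~0,
# while the reversed pattern (only hard items right) scores high.
print(caution_index([1, 1, 0, 0], [90, 70, 50, 30]))
print(caution_index([0, 0, 1, 1], [90, 70, 50, 30]))
```

The extensions discussed in the abstract replace the group-based item totals with quantities derived from item response theory, but the underlying logic of comparing an observed pattern to an expected one is the same.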