Browsing by Author "Sireci, Stephen G."
Analyzing test content using cluster analysis and multidimensional scaling (1992). Sireci, Stephen G.; Geisinger, Kurt F.

A new method for evaluating the content representation of a test is illustrated. Item similarity ratings were obtained from content domain experts in order to assess whether their ratings corresponded to item groupings specified in the test blueprint. Three expert judges rated the similarity of items on a 30-item multiple-choice test of study skills. The similarity data were analyzed using a multidimensional scaling (MDS) procedure followed by a hierarchical cluster analysis of the MDS stimulus coordinates. The results indicated a strong correspondence between the similarity data and the arrangement of items as prescribed in the test blueprint. The findings suggest that analyzing item similarity data with MDS and cluster analysis can provide substantive information pertaining to the content representation of a test. The advantages and disadvantages of using MDS and cluster analysis with item similarity data are discussed. Index terms: cluster analysis, content validity, multidimensional scaling, similarity data, test construction.

Using subject-matter experts to assess content representation: An MDS analysis (1995). Sireci, Stephen G.; Geisinger, Kurt F.

Demonstration of content domain representation is of central importance in test validation. An expanded version of the method of content evaluation proposed by Sireci & Geisinger (1992) was evaluated with respect to a national licensure examination and a nationally standardized social studies achievement test. Two groups of 15 subject-matter experts (SMEs) rated the similarity of all item pairs comprising a test, and then rated the relevance of the items to the content domains listed in the test blueprints. The similarity ratings were analyzed using multidimensional scaling (MDS); the item relevance ratings were analyzed using procedures proposed by Hambleton (1984) and Aiken (1980).
The SMEs’ perceptions of the underlying content structures of the tests emerged in the MDS solutions. All dimensions were germane to the content domains measured by the tests. Some of these dimensions were consistent with the content structure specified in the test blueprint; others were not. Correlation and regression analyses of the MDS item coordinates and item relevance ratings indicated that using both item similarity and item relevance data provided more information about content representation than using either approach alone. The implications of the procedure for test validity are discussed, and suggestions for future research are provided. Index terms: construct validity, content validity, cluster analysis, multidimensional scaling, subject-matter experts, test construction.
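The core procedure described in these abstracts, embedding item similarity ratings with MDS and then hierarchically clustering the resulting stimulus coordinates, can be sketched in a few lines. This is a minimal illustration, not the authors' actual analysis: it uses classical (Torgerson) MDS and Ward linkage as stand-ins for whichever specific MDS and clustering procedures were used, and the 6-item dissimilarity matrix is entirely hypothetical.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def classical_mds(dissim, ndim=2):
    """Classical (Torgerson) MDS: embed a dissimilarity matrix in ndim dimensions."""
    n = dissim.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    b = -0.5 * j @ (dissim ** 2) @ j         # double-centered squared dissimilarities
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:ndim]    # keep the largest eigenvalues
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

# Hypothetical aggregated dissimilarities for 6 items: items 0-2 and 3-5 form
# two content groups (low within-group, high between-group dissimilarity).
D = np.array([
    [0, 1, 1, 5, 5, 5],
    [1, 0, 1, 5, 5, 5],
    [1, 1, 0, 5, 5, 5],
    [5, 5, 5, 0, 1, 1],
    [5, 5, 5, 1, 0, 1],
    [5, 5, 5, 1, 1, 0],
], dtype=float)

coords = classical_mds(D, ndim=2)                # MDS stimulus coordinates
Z = linkage(coords, method="ward")               # hierarchical clustering of coordinates
labels = fcluster(Z, t=2, criterion="maxclust")  # cut the dendrogram into two clusters
```

The recovered cluster labels can then be compared against the item groupings prescribed in the test blueprint; agreement between the two is the kind of evidence of content representation the abstracts describe.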