Authors: Sireci, Stephen G.; Geisinger, Kurt F.
Issued: 1995
Available: 2011-11-28
Citation: Sireci, Stephen G., & Geisinger, Kurt F. (1995). Using subject-matter experts to assess content representation: An MDS analysis. Applied Psychological Measurement, 19, 241-255. doi:10.1177/014662169501900303
URI: https://hdl.handle.net/11299/118281
Abstract: Demonstration of content domain representation is of central importance in test validation. An expanded version of the method of content evaluation proposed by Sireci & Geisinger (1992) was evaluated with respect to a national licensure examination and a nationally standardized social studies achievement test. Two groups of 15 subject-matter experts (SMEs) rated the similarity of all item pairs comprising a test, and then rated the relevance of the items to the content domains listed in the test blueprints. The similarity ratings were analyzed using multidimensional scaling (MDS); the item relevance ratings were analyzed using procedures proposed by Hambleton (1984) and Aiken (1980). The SMEs' perceptions of the underlying content structures of the tests emerged in the MDS solutions. All dimensions were germane to the content domains measured by the tests. Some of these dimensions were consistent with the content structure specified in the test blueprint; others were not. Correlation and regression analyses of the MDS item coordinates and item relevance ratings indicated that using both item similarity and item relevance data provided more information about content representation than either approach alone. The implications of the procedure for test validity are discussed and suggestions for future research are provided.
Index terms: construct validity, content validity, cluster analysis, multidimensional scaling, subject-matter experts, test construction.
Language: en
Title: Using subject-matter experts to assess content representation: An MDS analysis
Type: Article
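
The sketch below is not the authors' analysis; it is a minimal illustration, assuming numpy and scikit-learn, of the workflow the abstract describes: averaging SME item-pair similarity ratings, converting them to dissimilarities, scaling them with MDS, and correlating the resulting item coordinates with mean item-relevance ratings. All data values and rating scales are hypothetical placeholders.

    # Illustrative sketch of MDS on SME similarity ratings (hypothetical data)
    import numpy as np
    from sklearn.manifold import MDS

    rng = np.random.default_rng(0)
    n_items, n_smes = 10, 15

    # Simulated SME similarity ratings for every item pair (1 = dissimilar ... 9 = similar)
    sim = rng.integers(1, 10, size=(n_smes, n_items, n_items)).astype(float)
    sim = (sim + sim.transpose(0, 2, 1)) / 2      # force symmetry within each SME
    mean_sim = sim.mean(axis=0)                   # average ratings across SMEs
    np.fill_diagonal(mean_sim, 9.0)               # an item is maximally similar to itself

    # Convert similarities to dissimilarities and scale with (metric) MDS
    dissim = mean_sim.max() - mean_sim
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dissim)

    # Simulated mean item-relevance ratings to one blueprint content domain (0-1 scale)
    relevance = rng.uniform(0, 1, size=n_items)

    # Correlate each MDS dimension with the relevance ratings, a rough analogue
    # of the correlation/regression step the abstract mentions
    for d in range(coords.shape[1]):
        r = np.corrcoef(coords[:, d], relevance)[0, 1]
        print(f"Dimension {d + 1} vs. relevance: r = {r:.2f}")

In practice, dimensions whose coordinates align with the relevance ratings would suggest that the SMEs' perceived content structure matches the blueprint, which is the comparison the correlation and regression analyses in the study are intended to support.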