Title: Reconceptualizing statistical literacy: developing an assessment for the modern introductory statistics course
Author: Ziegler, Laura Ann
Type: Thesis or Dissertation
Issued: 2014-06
Available: 2014-08-26
URI: https://hdl.handle.net/11299/165153
Description: University of Minnesota Ph.D. dissertation. June 2014. Major: Educational Psychology. Advisors: Joan Garfield and Michelle Everson. 1 computer file (PDF); xv, 433 pages; appendices A-L.
Keywords: Assessment; Simulation; Statistics education research
Language: en-US

Abstract:

The purpose of this study was to develop the Basic Literacy In Statistics (BLIS) assessment for students in an introductory statistics course at the postsecondary level that includes, to some extent, simulation-based methods. The definition of statistical literacy used in the development of the assessment was the ability to read, understand, and communicate statistical information. Evidence of reliability, validity, and value was collected during the development of the assessment using a mixed-methods approach.

There was a need for a new assessment for introductory statistics courses. Multiple instruments were available to assess students in introductory statistics courses (e.g., Comprehensive Assessment of Outcomes in a First Statistics Course, CAOS; delMas, Garfield, Ooms, & Chance, 2007; Goals and Outcomes Associated with Learning Statistics, GOALS; Garfield, delMas, & Zieffler, 2012); however, no available assessment focused on statistical literacy. In addition, many introductory statistics courses now teach new content such as simulation-based methods (e.g., Garfield et al., 2012; Tintle, VanderStoep, Holmes, Quisenberry, & Swanson, 2011). The BLIS assessment was developed to meet this need.

Throughout the development of the BLIS assessment, evidence of reliability, validity, and value was collected. A test blueprint was created based on a review of textbooks that incorporate simulation-based methods (e.g., Catalysts for Change, 2013), reviewed by six experts in statistics education, and modified to provide evidence of validity. A preliminary version of the assessment included 19 items chosen from existing instruments and 18 new items. To collect evidence of reliability and validity, the assessment was reviewed by the six experts and revised. Additional rounds of revisions were made based on cognitive interviews (N = 6), a pilot test (N = 76), and a field test (N = 940), all of which were conducted with students who had recently completed or were currently enrolled in an introductory statistics course at the postsecondary level. Instructors who administered the assessment to their students in the field test completed a survey to gather evidence of the value of the BLIS assessment to statistics educators (N = 26).

Data from the field test were examined using analyses based on Classical Test Theory (CTT) and Item Response Theory (IRT). For individual item scores, coefficient alpha was high (.83). Because the BLIS assessment contains testlets, the Partial Credit (PC) model was fit to the data. Evidence of reliability and validity was high; however, more items with high difficulty levels could improve the precision of ability estimates for higher-achieving students.
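For reference, the two psychometric quantities named above follow standard formulations, not anything specific to this dissertation: Cronbach's coefficient alpha for a set of k items, and Masters' Partial Credit model for polytomously scored (testlet) items. A minimal sketch in LaTeX notation:

% Coefficient alpha: k items, item variances \sigma^2_{Y_i}, total-score variance \sigma^2_X
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right)

% Partial Credit model: probability that person n with ability \theta_n scores
% x (of 0, ..., m_i) on item i with step difficulties \delta_{ij}
P(X_{ni} = x) =
  \frac{\exp\!\left(\sum_{j=0}^{x} (\theta_n - \delta_{ij})\right)}
       {\sum_{h=0}^{m_i} \exp\!\left(\sum_{j=0}^{h} (\theta_n - \delta_{ij})\right)},
\qquad \text{with } \sum_{j=0}^{0} (\theta_n - \delta_{ij}) \equiv 0.

In this framing, the harder items the abstract calls for would correspond to larger step-difficulty parameters \delta_{ij}, which is what would sharpen ability estimates \theta_n for higher-achieving students.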
Instructors who completed the survey indicated that the BLIS assessment has high value to statistics educators. Therefore, the BLIS assessment could provide valuable information to researchers conducting studies about students' understanding of statistical literacy in an introductory statistics course that includes simulation-based methods.