Browsing by Subject "progress monitoring"
Now showing 1 - 2 of 2
Item: An Evaluation of the Accuracy of Time Series Interpretations of CBM-R Progress Monitoring Data (2015-06), Van Norman, Ethan
Curriculum-based measurement of reading (CBM-R) is used to monitor the effects of academic interventions for individual students. Decisions to continue, modify, or terminate instructional programs are made by interpreting patterns of observations collected across time. Educators visually analyze the data or apply decision rules to evaluate student progress. Despite the popularity of CBM-R as a progress monitoring tool, there is a paucity of research evaluating the accuracy of visual analysis and decision rules. Inaccurate interpretations undermine the use of CBM-R as a progress monitoring tool because educators may continue ineffective interventions or prematurely terminate effective ones. This project investigated the accuracy of visual analysis and decision rules. In Study 1, a large extant dataset was analyzed to identify measurement characteristics of CBM-R progress monitoring data. In Study 2, the accuracy of visual analysis and decision rules was evaluated by comparing responses from visual analysts and decision rules with the responses of an expert panel; 108 progress monitoring graphs were evaluated, and the ways in which these graphs differed were informed by the results of Study 1. The results suggest that the evaluation method, the number of weeks of data collected, the variability of observations, and whether the student is making adequate progress all influence the probability of a correct decision. Educators and researchers can improve that probability by visually analyzing progress monitoring graphs with a goal line and trend line, minimizing variability, and collecting data for longer than six weeks. Implications of the findings, limitations, and directions for future research are discussed.

Item: The IEP Data Collection Intentions Scale (IDCIS): Scale Development and Validation for Intended Score Interpretation and Use in Early Childhood (2019-08), Rudolph, Brenna
There is evidence that a research-to-practice gap exists in Early Childhood Special Education (ECSE) teachers' collection of data documenting students' progress toward their Individualized Education Program (IEP) goals and objectives (i.e., IEP data collection). Because research in this area is scarce and the existing literature has notable limitations, however, it is unclear what factors cause and maintain this gap. Given that teachers are ultimately responsible for deciding whether and how to engage in IEP data collection, better understanding teachers' intentions to collect IEP data is a logical first step. With an emphasis on improving the measurement techniques used in previous studies, this cross-sectional survey study aimed to validate the intended interpretations and uses of scores from a newly developed scale, the IEP Data Collection Intentions Scale (IDCIS). Following survey completion by 368 ECSE teachers across the state of Minnesota, confirmatory factor analysis, item analysis, and item response modeling were performed to support scale development. Results indicated that, following minor adjustments, the IDCIS can be used to produce precise measures of teachers' attitudes, subjective norms, self-efficacy, controllability, and intentions related to the collection of IEP data. Furthermore, scores from IDCIS administration support valid and reliable inferences about teachers' standing on each construct, which can inform the creation and modification of implementation supports and thereby narrow the gap between what is known and what is practiced in classrooms with respect to data collection.
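The first item refers to decision rules applied to graphs with a trend line and a goal line but does not specify the rules themselves. The sketch below is a minimal illustration of one generic rule of that kind, assuming an ordinary least-squares trend projected to a goal date; the function name, parameters, and example data are hypothetical and are not taken from the study.

```python
# Illustrative sketch only (an assumption, not the decision rules evaluated in
# the dissertation): one generic trend-line rule for CBM-R progress monitoring.
# Fit an ordinary least-squares trend to weekly words-correct-per-minute (WCPM)
# scores and compare the projected score at the goal date against the goal.
import numpy as np

def trend_line_decision(weeks, scores, goal, goal_week, min_weeks=6):
    """Return 'continue', 'modify', or 'insufficient data' for one student."""
    weeks = np.asarray(weeks, dtype=float)
    scores = np.asarray(scores, dtype=float)
    # The abstract's findings suggest collecting more than six weeks of data
    # before interpreting the trend.
    if len(scores) <= min_weeks:
        return "insufficient data"
    slope, intercept = np.polyfit(weeks, scores, deg=1)  # OLS trend line
    projected = slope * goal_week + intercept            # projected score at the goal date
    return "continue" if projected >= goal else "modify"

# Hypothetical example: eight weeks of WCPM scores with a goal of 90 WCPM at week 20.
print(trend_line_decision(range(1, 9), [42, 45, 44, 48, 50, 49, 53, 55],
                          goal=90, goal_week=20))
```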