Browsing by Subject "assessment"
Now showing 1 - 20 of 30
Item Analyzing Demographics: Assessing Library Use Across the Institution (2013-01-24)
Nackerud, Shane; Fransen, Jan; Peterson, Kate; Mastel, Kristen
In Fall 2011, staff at the University of Minnesota Libraries-Twin Cities undertook a project to measure how often, and in what ways, students used the Libraries' services. Partnering with the University's Office of Institutional Research, the team investigated ways to match library service usage to individual accounts while retaining patron privacy, in order to determine who was – and was not – using the library. With complete data sets, the group was able to determine overall usage rates for undergraduate and graduate students and compare how students in different colleges used library services. This article discusses data gathering techniques, analysis, and initial findings.

Item Arts for Academic Achievement: A Compilation of Evaluation Findings from 2004-2006 (2007-03)
Ingram, Debra; Meath, Judy
This report summarizes results of the first two years of a three-year evaluation of the Arts for Academic Achievement (AAA) program. To accomplish its goals, AAA provides schools with a structure, resources, and support for collaborative projects between teachers and artists. The purpose of the projects is to increase the amount and quality of arts-based and arts-integrated learning by students. The major objectives of this study were to 1) examine student learning, as measured by standardized tests, in a larger set of grade levels, and 2) measure student effects not otherwise captured by standardized assessments.

Item Assessing the Development of Students' Statistical Thinking: An Exploratory Study (2017-02)
Le, Laura
Developing students’ statistical thinking has been stressed as an important learning objective for statistics courses. In general, statistical thinking has been defined as “thinking like an expert applied statistician.” However, there is currently no consensus on the characteristics that make up statistical thinking, and no known assessment measures the complete construct. The purpose of this study was to assess students’ statistical thinking in an introductory statistics course based on modeling and simulation. Specifically, the research question was: what components of students’ statistical thinking are revealed and developed in an introductory course based on modeling and simulation? To assess this, an assessment called Modeling to Elicit Statistical Thinking (MODEST) was created, based on a model of statistical thinking and using a type of problem that has been suggested to assess expert-like thinking (i.e., a Model-Eliciting Activity; MEA). To help ensure that MODEST measured statistical thinking, several phases of feedback and pilot testing were carried out during assessment development. In the field test phase, MODEST was administered online twice, at the beginning and at the end of the semester, to students enrolled in an introductory course based on modeling and simulation. Responses from 88 students were scored with a detailed rubric to answer the research question. The results indicated that students entered the course with a moderate amount of statistical thinking (average score = 52%) and left having developed some statistical thinking as a result of the course (average score difference = 6%; 95% CI: 2% to 10%). Although the increase in overall statistical thinking was statistically significant, it was moderate in size (Cohen’s d = 0.34), which suggests that more could be done in the course to develop students’ statistical thinking. MODEST can be a valuable addition to the statistics education community by filling the gap in assessing students’ statistical thinking; both statistics education researchers and instructors would benefit from using MODEST to understand statistical thinking.
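An illustrative aside on the statistics reported in the abstract above: the mean pre/post difference, its 95% confidence interval, and Cohen's d are standard paired-sample computations. The minimal Python sketch below shows one common way to compute them; the scores are fabricated (with parameters chosen only to resemble the reported summary) and are not the study's data.

```python
# Hypothetical illustration of the paired-sample summary statistics
# reported in the MODEST abstract; the data below are simulated, not
# the study's actual scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(52, 15, size=88)         # pre-course scores (%), n = 88
post = pre + rng.normal(6, 18, size=88)   # post-course scores (%)

diff = post - pre
n = len(diff)
mean_diff = diff.mean()

# 95% confidence interval for the mean paired difference
se = diff.std(ddof=1) / np.sqrt(n)
ci_low, ci_high = stats.t.interval(0.95, df=n - 1, loc=mean_diff, scale=se)

# Cohen's d for paired samples: mean difference / SD of the differences
# (one common convention; standardizing by the pre-test SD is another)
d = mean_diff / diff.std(ddof=1)

print(f"mean diff = {mean_diff:.1f}%, 95% CI = ({ci_low:.1f}%, {ci_high:.1f}%), d = {d:.2f}")
```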
Item Best practices for field days : assessment tool and observation protocol (St. Paul, MN: University of Minnesota Extension Service, 2009)
Carlson, Stephan; Heimlich, Joe; Storksdieck, Martin
The Best Practices for Field Days (BPFD) Assessment Tool provides systematic observation methods to evaluate the success of field days in meeting intended educational outcomes. It uses evaluator observations to help organizers improve learning conditions and to help presenters develop their skills. Using this assessment tool will improve programs and enhance the student experience.

Item Best practices for field days : Assessment tool and observation protocol (St. Paul, MN: University of Minnesota Extension Service, 2009)
Carlson, Stephan; Heimlich, Joe; Storksdieck, Martin; Meyer, Nathan

Item Best Practices for Field Days: 2005 Report of Outcomes and Impacts: Making an Impact with Environmental Field Days: Workshops for Organizers and Presenters (St. Paul, MN: University of Minnesota Extension Service, 2005)
Meyer, Nathan
Initiated in 2002 by members of the Environmental Science Education (ESE) Area of Expertise, Best Practices for Field Days (BPFD) is a University of Minnesota Extension Service professional development program for the people involved in field days. Participants learn how to design and deliver educational events that apply six research-based practices to maximize the educational impact of these events:
- centering the event around a single theme,
- assessing the audience before the event,
- planning the setting for effective education,
- using appropriate teaching methods,
- developing and implementing regular evaluation, and
- integrating marketing.
A variety of products and services encompass the program: technical articles, a curriculum and planning tools that can be purchased online, customized workshops, and in-depth evaluations of events. By maximizing the impact of field days for the more than 10,000 students who participate annually, the BPFD program seeks to increase the educational return on the thousands of volunteer hours and public dollars invested each year in these events. Program impacts include: a) collaboration and more economical, efficient, and effective field day programs; b) creating an “interest pipeline” for youth to explore careers in natural resources, science, and technology; and c) increasing citizens' environmental literacy and their ability to protect and enhance natural resources and the environment through programs that reach young people.

Item Best Practices for Field Days: 2008 Children’s Water Festival Evaluation: Presentation Skills for 29 Learning Stations (St. Paul, MN: University of Minnesota Extension Service, 2009)
Carlson, Stephan; Wang, Hui-Hui
Twenty-nine stations were observed by sixteen observers. Because the study could not control how many times each station was observed, some stations were observed only once, while others were observed more than once. The station “Water! Science Museum” was observed the most frequently, a total of sixteen times by sixteen different observers. The stations observed only once were: “Well, Well, Well”, “Disappearing Waterfall Mystery”, “Streams Creatures”, “Lakes & Rivers & Oceans, Oh My”, “Backyard Water Recycling”, and “Groundwater on the Move.” The following stations were not observed at all: “Water! Water! From the River to the River” and “Water Arcade.”
Item Best Practices for Field Days: Environmental Field Days Assessment Tool: Focus Group Results (St. Paul, MN: University of Minnesota Extension Service, 2008)
Marczak, Mary; Carlson, Stephan
The following results are based on two focus groups conducted in October 2007. The 14 participants (7 in each focus group) had completed the Assessment Tool training and had used the tools to assess environmental field days. Focus group questions addressed both the quality of the training and the tools themselves (individual and holistic). Participants were asked not only to describe their experiences but also to recommend improvements to the training and the tools. During the focus group discussions it became clear that participants also wanted to address the actual day of the observation, the field day experience itself. Thus, the results address three key areas: 1) the day of the training; 2) the field day experience; and 3) the tools.

Item Best Practices for Field Days: Modified Delphi used for Observation Tool Development (St. Paul, MN: University of Minnesota Extension Service, 2009)
Heimlich, Joe; Carlson, Stephan; Tanner, Dawn; Storksdieck, Martin
Forty people from across the country were invited to join the Best Practices for Field Days (BPFD) Delphi panel to develop an effective observation instrument for determining the quality of field day components that represent best practices. Thirty-nine people accepted the invitation and 27 participated.

Item Beyond Butts in Seats: Creating campus and community partnerships through meaningful outreach (2015)
Farrell, Shannon L.; Mastel, Kristen
In order to stay relevant and meet the needs of our existing and potential users, libraries are forming partnerships and engaging users in numerous ways outside of the classroom. How do we measure the impact of our outreach programming? High attendance numbers may show that we had excellent swag and food at an event, but is counting heads a meaningful assessment measure? This poster shares examples of various kinds of outreach, discusses opportunities for forging partnerships, considers the impact of different outreach activities, and examines new assessment strategies that move beyond simple head counts.

Item College Readiness Center Survey (2008)
Stoffel, Lolyann

Item Customer Satisfaction Survey Results and Analysis (2007)
Rousseau, Matthew

Item A Game-Based Solution to the Lack of Training and Assessment Opportunities for Spatial Reasoning (2023-01)
VanMeerten, Nicolaas
Spatial reasoning is an important skill that people use daily, and there is strong evidence that people with enhanced spatial reasoning skills are more likely to pursue successful careers related to Science, Technology, Engineering, and Mathematics (STEM). Spatial reasoning skills are also malleable, which suggests that spatial reasoning training and assessment could be used to enhance academic outcomes in STEM. However, there are relatively few readily accessible training or assessment opportunities for spatial reasoning. Commercial video games should be adapted to create more spatial reasoning training environments; video games provide unique affordances that support training and learning, including (1) delivering an appropriate level of challenge and (2) the ease of integrating assessment. I found evidence of a relationship between performance in Optica, a mobile puzzle game, and spatial reasoning skills among middle-school students. Specifically, I found a relationship between the number of levels completed in Optica and score on the PSVT:R by comparing multiple linear regression models using the Akaike Information Criterion. Optica has thus shown potential as a suitable virtual environment for training and assessing spatial reasoning skills. Although the study had limitations, they can be remedied through updates to the design of the game, telemetry collection, and enhanced experimental design. I believe that Optica should be iterated upon to develop it into a fully fledged game environment for training and assessing spatial reasoning skills, which would benefit many areas of STEM simultaneously.
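An illustrative aside on the model comparison described in the abstract above: comparing linear regression models with the Akaike Information Criterion (AIC) favors the model with the lower AIC. Below is a minimal Python sketch using statsmodels; the column names, the covariate (age), and the data are hypothetical, not from the study.

```python
# Hypothetical sketch of AIC-based comparison of linear regression
# models; the variables and data are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "levels_completed": rng.integers(0, 40, size=120),  # game progress
    "age": rng.integers(11, 15, size=120),              # assumed covariate
})
# Simulated spatial reasoning score (PSVT:R-like outcome)
df["psvt_r"] = 10 + 0.3 * df["levels_completed"] + rng.normal(0, 4, size=120)

# Baseline model vs. a model that adds game performance
m0 = smf.ols("psvt_r ~ age", data=df).fit()
m1 = smf.ols("psvt_r ~ age + levels_completed", data=df).fit()

# Lower AIC indicates the better-supported model among those compared
print(f"AIC without game data: {m0.aic:.1f}")
print(f"AIC with levels completed: {m1.aic:.1f}")
```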
Item GIS Based Wetland Assessment Methodology for Urban Watershed Planning (1997)
Snyder, Douglas J.

Item Historic Murphy's Landing Program Evaluation (2006)
Peterson, Jamie

Item The Image Assessment of Viet-Nam Among U.S. Tourists (2004-06)
Tran-Tuan, Hung
Tourism to Viet-Nam has not reached its full potential. The feasibility of attracting more international tourists to Viet-Nam is quite high, particularly from the USA market. However, as of 2003, Vietnamese marketing and tourism studies of the USA market were neither complete nor comprehensive, so an assessment of Viet-Nam's tourism image is necessary and timely. Using an onsite survey, this research focused on understanding the image of Viet-Nam among U.S. tourists in Viet-Nam. Results from 100 tourists indicated that Viet-Nam's image is positive overall. Respondents were mainly seniors who were highly educated and had discretionary income to travel. A four-factor solution of the image variables revealed that world heritage sites, atmosphere and attractions, service value, and tourism quality were rated highly by tourists. The Vietnamese people and their dynamic society represented the holistic image, and the world heritage sites were identified as unique attractions. Respondents felt that traveling to Viet-Nam was an exciting experience, which, combined with the friendliness of the Vietnamese people, reflected the psychological dimension of Viet-Nam's image. Recommendations are made concerning environmental issues, price inequality, and further research, so that planners, managers, and marketers at various levels can make Viet-Nam more appealing to the USA market.

Item An Investigation Into the Validity of Using a CSA to Inform Hypotheses Regarding Student Behavior (2015-05)
Peterson, Meredith
Identifying the functions of challenging behavior can lead to interventions that are effective in decreasing challenging behavior in students, leading in turn to improved academic and social outcomes. The purpose of this study was to determine the degree to which a contingency space analysis (CSA) could lead to effective intervention for the challenging behavior of middle school students in a general education classroom. Participants were four middle school students previously identified by their classroom teachers as engaging in persistent patterns of challenging behavior. A CSA was conducted with each participant, and hypotheses about the function of each participant's behavior were developed. Interventions targeting the functions of teacher attention, peer attention, and escape were then implemented, and the results were compared to those of the CSA. Results indicated that the CSA accurately predicted the most effective intervention for three of the four participants.
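An illustrative aside on the method named in the abstract above: a contingency space analysis typically contrasts the conditional probability of a consequence given that the behavior occurred with its probability given that the behavior did not occur, estimated from interval observation data. The Python sketch below works under that assumption; the intervals and the "teacher attention" consequence are fabricated, not the study's observations.

```python
# Hedged sketch of the core computation in a contingency space analysis
# (CSA); the interval data below are fabricated for illustration.
behavior  = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]  # 1 = behavior observed in interval
attention = [1, 1, 0, 1, 0, 0, 1, 0, 0, 1]  # 1 = teacher attention in interval

def conditional_p(consequence, condition):
    """Estimate P(consequence | condition) over paired observation intervals."""
    hits = sum(c for c, b in zip(consequence, condition) if b)
    total = sum(condition)
    return hits / total if total else 0.0

p_given_behavior = conditional_p(attention, behavior)
p_given_no_behavior = conditional_p(attention, [1 - b for b in behavior])

# A large gap between the two probabilities suggests the consequence is
# contingent on the behavior, supporting an attention-maintained hypothesis.
print(f"P(attention | behavior)    = {p_given_behavior:.2f}")
print(f"P(attention | no behavior) = {p_given_no_behavior:.2f}")
```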
Item IT competence for all: Propel your staff to new heights (Haworth Press, 2008-12)
Eells, Linda L.; Jaguszewski, Janice M.
In 2005, the University of Minnesota Libraries charged a task force with developing a list of core information technology (IT) skills that could be expected of all 300 staff, from technical services to reference services to stacks maintenance. Once this list was developed, the task force designed and administered an online self-assessment survey to identify gaps and patterns in staff computer skills. Both the development of the core competencies and the administration of the assessment are discussed. Also provided are recommendations for next steps, including using the assessment reports and data gathered in the process to develop a training and professional development curriculum focused on staff members' specific identified training needs.