OIR Presentations
Browsing OIR Presentations by Issue Date
Now showing 1 - 20 of 63
Identifying Students at Risk: Utilizing Survival Analysis to Study Student Athlete Attrition (2006-10)
Radcliffe, Peter M.; Huesman, Ronald L. Jr.; Kellogg, John P.

Modeling the Incidence and Timing of Student Attrition: A Survival Analysis Approach to Retention Analysis (2006-11)
Radcliffe, Peter M.; Huesman, Ronald L. Jr.; Kellogg, John P.

Modeling the Incidence and Timing of Student Attrition: A Survival Analysis Approach to Retention Analysis (2007-06)
Radcliffe, Peter M.; Huesman, Ronald L. Jr.; Kellogg, John P.

Modeling Student Academic Success: Does Usage of Campus Recreation Facilities Make a Difference? (2007-09)
Huesman, Ronald L. Jr.; Brown, Anthony K.; Lee, Giljae; Kellogg, John P.; Radcliffe, Peter M.

Correlations Between Average Faculty Salaries and Institutional Rankings for Top-ranked Institutions (2007-10)
Goldfine, Leonard S.; Huesman, Ronald L. Jr.; Jones-White, Daniel R.

Identifying Factors Related to Student Success: Utilizing Multinomial Logit Regression to Study Graduation in Higher Education (2007-10)
Huesman, Ronald L. Jr.; Radcliffe, Peter M.; Jones-White, Daniel R.; Kellogg, John P.; Lee, Giljae

Hitting a Moving Target: Navigating the Landscape of Ever-Changing College Rankings (2008-05)
Goldfine, Leonard S.; Jones-White, Daniel R.; Huesman, Ronald L. Jr.; Lee, Giljae

Redefining Student Success: Assessing Different Multinomial Regression Techniques for the Study of Student Retention and Graduation Across Institutions of Higher Education (2008-05)
Jones-White, Daniel R.; Radcliffe, Peter M.; Huesman, Ronald L. Jr.; Kellogg, John P.

Redefining Student Success: Assessing Different Multinomial Regression Techniques for the Study of Student Retention and Graduation Across Institutions of Higher Education (2009-06)
Jones-White, Daniel R.; Radcliffe, Peter M.; Huesman, Ronald L. Jr.; Kellogg, John P.
2008 AIR Best Paper, presented at the 2009 Annual Association for Institutional Research Forum.

Shifting Student Demographics and Their Impact on a Midwestern Higher Education Institution's Transformation: Preparing for Change (2009-06)
Frazier, Christina; Howard, Rich; Banks, Barbara; Kellogg, John P.

Counting Out Time: Utilizing Zero Modified Count Regression to Model Time-to-Degree Attainment (2009-06)
Jones-White, Daniel R.; Radcliffe, Peter M.; Huesman, Ronald L. Jr.; Kellogg, John P.

Priced Out? Does Financial Aid Affect Student Success? (2009-10)
Jones-White, Daniel R.; Radcliffe, Peter M.; Lorenz, Linda

Developing a Focused Structured Student Outcomes Assessment Program Experience at a Large Public University (2009-10)
Huesman, Ronald L. Jr.

What is it that satisfies faculty? Rank as a consideration in factors related to job satisfaction (2009-10)
Johnson, Gina M.

Reporting with the New Racial and Ethnic Data (2009-10-01)
Lorenz, Linda

Priced Out? Does Financial Aid Affect Student Success? (2010-06)
Jones-White, Daniel R.; Radcliffe, Peter M.; Lorenz, Linda

The Politics of Equity Research (2010-10)
Goldfine, Leonard S.; Radcliffe, Peter M.
The neutrality of an IR office can be put to the test when it is tasked with conducting an equity study. Even the best-intentioned and most carefully reasoned study is subject to political considerations that have little to do with the pursuit of truth. From the choice of variables to include in a regression model to the interpretation of results, what is said, how it is said, and from whom a message comes are all as important as any actual statistical results. This session presents a road map to some of the pitfalls an IR office can face when asked to perform an equity study. Resources from the literature, as well as anecdotal experience, are used to illustrate the often exasperating decisions and negotiations institutional researchers face when moving beyond the realm of pure research and into studies that could have a large and immediate impact on the University and on the lives of its employees and students.

Serving to Learn: Does Community Based Learning Participation Contribute to More Desirable Student Outcomes? (2010-10)
Jones-White, Daniel R.; Soria, Krista M.; Huesman, Ronald L. Jr.
Given the increasing emphasis on public engagement on many college campuses, it is important to assess the extent to which engagement opportunities provide meaningful and valuable experiences for college students. While there is growing evidence to support the notion that public engagement opportunities (e.g., service-learning, volunteerism) contribute to student academic success, Eyler, Giles, and Gray (1999) caution that there is no consensus on the impact of engagement experiences on academic achievement indicators, such as grade point average. Given this apparent lack of agreement in the research, this study attempts to identify whether there is a relationship between participation in community-based learning activities and first-year achievement in college. To assess whether there is an independent relationship between participating in different community-based activities and first-year student grade point average, this study utilizes a sample of first-time, full-time freshmen enrolled at the University of Minnesota-Twin Cities in Fall 2009 who participated in a unique survey of student engagement (the SERU survey) administered during Spring 2010.

Plug and Play: Developing a Flexible Program Assessment Model (2010-10)
Huesman, Ronald L. Jr.; Radcliffe, Peter M.; Jones-White, Daniel R.
The presentation outlines a program assessment design for improving the educational and personal experiences of University of Minnesota students. A recent assessment of a scholarship program for at-risk students is used to illustrate the approach. Donors, alumni relations, and academic affairs units are often involved in the development of scholarship programs aimed at improving the success of at-risk students. These programs often have a financial, programmatic, and/or advisory component aimed at improving student success. Collaborations between these units and institutional research and assessment professionals can provide meaningful exchanges of ideas and perspectives, and open up unique opportunities for assessing the impact of participation in these programs. Student success is often measured in terms of academic performance, retention, and graduation rates. Plugging a general program participation variable into a comprehensive regression model of student success provides a baseline for assessing the effectiveness of a program while controlling for the effects of other factors. Used in conjunction with qualitative approaches (e.g., focus groups, surveys), this framework can broaden the outcomes of interest as needed. Along with a standard reporting template, the approach provides a flexible framework for assessing similar programs in a timely, consistent, and responsible manner that serves multiple needs.
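The "plug and play" idea in the final abstract, regressing a student-success outcome on a program-participation indicator alongside standard controls so that the participation coefficient gives a baseline, adjusted program effect, can be sketched as follows. This is a minimal illustration only: the synthetic data, variable names, and plain OLS specification are assumptions for demonstration, not the presenters' actual model or data.

```python
import numpy as np

# Illustrative synthetic data; a real assessment would draw on
# institutional records for the cohort being studied.
rng = np.random.default_rng(0)
n = 500
hs_gpa = rng.normal(3.2, 0.4, n)        # control: high-school GPA
test_score = rng.normal(25.0, 4.0, n)   # control: admissions test score
participated = rng.integers(0, 2, n)    # 1 = in the scholarship program
# Simulate first-year GPA with a modest built-in program effect of +0.15.
first_year_gpa = (0.5 + 0.6 * hs_gpa + 0.02 * test_score
                  + 0.15 * participated + rng.normal(0.0, 0.3, n))

# "Plug in" the participation dummy: the design matrix holds an intercept,
# the controls, and the program indicator, so its fitted coefficient is the
# program effect adjusted for the other factors.
X = np.column_stack([np.ones(n), hs_gpa, test_score, participated])
beta, *_ = np.linalg.lstsq(X, first_year_gpa, rcond=None)
print(f"adjusted program effect on first-year GPA: {beta[3]:.3f}")
```

The same template works for other outcomes and programs: swap the outcome variable (retention, graduation) or the participation dummy, keep the controls, and compare the adjusted coefficients across programs.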