Author: Hulme-Lowe, Christopher
Issued: April 2016
Accessioned/Available: 2016-08-19
URI: https://hdl.handle.net/11299/181741
Title: Regularized Marginal Maximum Likelihood: The Use of Shrinkage and Selection Operators for Item Parameter Estimation in the Two-Parameter Logistic Model
Type: Thesis or Dissertation
Language: en
Description: University of Minnesota Ph.D. dissertation, April 2016. Major: Psychology. Advisors: Niels Waller, David Weiss. 1 computer file (PDF); vii, 166 pages.

Abstract: Regularized parameter estimation has attracted considerable interest in the statistical and machine learning communities as a powerful estimation method that remains viable when classical maximum likelihood estimation is not. In this paper, we describe a method for applying regularized estimation to IRT item parameter estimation. The proposed method, Regularized Marginal Maximum Likelihood (RMML), is based on the well-known Marginal Maximum Likelihood (MML) method but penalizes the discrimination parameter estimates with the aim of eliminating poorly performing items. A series of Monte Carlo simulation studies compares RMML estimates to estimates from both MML and Bayesian estimation under a variety of conditions. The results of these simulations demonstrate that RMML is useful when item parameters must be estimated from a small sample of examinees, and they provide insight into directions for further research in this area.
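To make the idea concrete, the following is a minimal sketch of a penalized marginal log-likelihood of the kind the abstract describes, assuming the two-parameter logistic (2PL) model with a standard normal ability density and an L1 (lasso-type) penalty on the discrimination parameters; the L1 form and the tuning parameter \(\lambda\) are assumptions for illustration, since the abstract does not specify the exact penalty.

\[
\ell_{\text{RMML}}(\mathbf{a}, \mathbf{b})
= \sum_{i=1}^{N} \log \int \prod_{j=1}^{J}
P_j(\theta)^{x_{ij}} \left[1 - P_j(\theta)\right]^{1 - x_{ij}} \phi(\theta)\, d\theta
\;-\; \lambda \sum_{j=1}^{J} \lvert a_j \rvert,
\qquad
P_j(\theta) = \frac{1}{1 + \exp\!\left[-a_j(\theta - b_j)\right]},
\]

where \(x_{ij}\) is examinee \(i\)'s response to item \(j\), \(a_j\) and \(b_j\) are the item discrimination and difficulty parameters, and \(\phi\) is the standard normal density. Under this sketch, items whose estimated \(a_j\) is shrunk to zero carry no information about \(\theta\) and could be flagged as poorly performing, which is one way a shrinkage-and-selection operator can serve the item-elimination goal stated in the abstract.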