DSpace

University of Minnesota Digital Conservancy > University of Minnesota - Twin Cities > School of Statistics > Dr. Charles Geyer

Please use this identifier to cite or link to this item: http://hdl.handle.net/11299/58433

Title: Estimating Normalizing Constants and Reweighting Mixtures
Authors: Geyer, Charles J.
Issue Date: 9-Jul-1994
Citation: Technical Report No. 568, School of Statistics, University of Minnesota
Series/Report no.: Technical Report
Abstract: Markov chain Monte Carlo (the Metropolis-Hastings algorithm and the Gibbs sampler) is a general multivariate simulation method that permits sampling from any stochastic process whose density is known up to a constant of proportionality. It has recently received much attention as a method of carrying out Bayesian, likelihood, and frequentist inference in analytically intractable problems. Although many applications of Markov chain Monte Carlo do not need estimation of normalizing constants, three do: calculation of Bayes factors, calculation of likelihoods in the presence of missing data, and importance sampling from mixtures. Here reverse logistic regression is proposed as a solution to the problem of estimating normalizing constants, and convergence and asymptotic normality of the estimates are proved under very weak regularity conditions. Markov chain Monte Carlo is most useful when combined with importance reweighting so that a Monte Carlo sample from one distribution can be used for inference about many distributions. In Bayesian inference, reweighting permits the calculation of posteriors corresponding to a range of priors using a Monte Carlo sample from just one posterior. In likelihood inference, reweighting permits the calculation of the whole likelihood function using a Monte Carlo sample from just one distribution in the model. Given this estimate of the likelihood, a parametric bootstrap calculation of the sampling distribution of the maximum likelihood estimate can be done using just one more Monte Carlo sample. Although reweighting can save much calculation, it does not work well unless the distribution being reweighted places appreciable mass in all regions of interest. Hence it is often not advisable to sample from a distribution in the model. Reweighting a mixture of distributions in the model performs much better, but this cannot be done unless the mixture density is known, and this requires knowledge of the normalizing constants, or at least good estimates such as those provided by reverse logistic regression.
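The reverse logistic regression idea mentioned in the abstract can be illustrated with a minimal sketch (not taken from the report). Pooled draws from several distributions, each with density known only up to a normalizing constant, are treated as a classification problem: each point is "assigned" to its source distribution with probability proportional to the sample size times the unnormalized density times an unknown factor exp(eta_j), and maximizing this quasi-likelihood over the eta_j estimates the log normalizing constants up to a common additive constant. The toy example below uses direct sampling from two mean-zero normals in place of MCMC output; all names and the two-distribution setup are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Two unnormalized densities h_j(x) = exp(-x^2 / (2 s_j^2));
# the true normalizing constants are c_j = sqrt(2*pi) * s_j,
# so the true ratio c_2 / c_1 equals sigmas[1] / sigmas[0] = 3.
sigmas = np.array([1.0, 3.0])
n = np.array([2000, 2000])  # sample sizes from each distribution

# Pooled sample (direct sampling stands in for MCMC output here).
x = np.concatenate([rng.normal(0.0, s, size=m) for s, m in zip(sigmas, n)])
labels = np.repeat([0, 1], n)  # which distribution each point came from

def log_h(x):
    # Log unnormalized densities evaluated at every point, shape (N, 2).
    return -0.5 * x[:, None] ** 2 / sigmas[None, :] ** 2

def neg_quasi_loglik(eta):
    # Reverse logistic regression objective: each pooled point is
    # "classified" to its source with probability proportional to
    # n_j * h_j(x) * exp(eta_j), where eta_j plays the role of -log c_j.
    logits = log_h(x) + eta[None, :] + np.log(n)[None, :]
    log_p = logits - np.logaddexp(logits[:, 0], logits[:, 1])[:, None]
    # Each point contributes the log probability of its true label.
    return -log_p[np.arange(len(x)), labels].sum()

# eta is identifiable only up to a common additive constant,
# so fix eta_0 = 0 and optimize only the difference d = eta_1 - eta_0.
res = minimize(lambda d: neg_quasi_loglik(np.array([0.0, d[0]])), [0.0])
ratio = np.exp(-res.x[0])  # estimated c_2 / c_1, should be near 3
print(ratio)
```

Only the ratio of normalizing constants is recoverable, which is exactly what is needed for Bayes factors or for forming the mixture density discussed in the abstract.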
URI: http://purl.umn.edu/58433
Appears in Collections:Dr. Charles Geyer

Files in This Item:

File       Size       Format
tr568r.ps  214.61 kB  Postscript

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.