Title: Uncertainty Assessment and Convergence Analysis for Markov Chain Monte Carlo Algorithms
Author: Li, Haoxiang
Type: Thesis or Dissertation
Date issued: May 2024 (deposited 2024-07-24)
Persistent URL: https://hdl.handle.net/11299/264325
Language: English (en)
Description: University of Minnesota Ph.D. dissertation, May 2024. Major: Statistics. Advisors: Qian Qin, Galin Jones. 1 computer file (PDF); x, 100 pages.
Keywords: Drift and minorization; Geometric ergodicity; Markov chain Monte Carlo (MCMC); MCMC uncertainty assessment; Strong invariance principle; Time in-homogeneous cyclic MCMC samplers

Abstract:

This dissertation focuses on two fundamental areas of Markov chain Monte Carlo (MCMC) research: uncertainty assessment and asymptotic properties of Monte Carlo estimators, and convergence analysis of MCMC algorithms. We establish a multivariate strong invariance principle (SIP) and develop uncertainty assessment tools for time in-homogeneous cyclic MCMC samplers, and we conduct a convergence analysis of data augmentation algorithms for Bayesian robust multivariate linear regression with incomplete data.

Time in-homogeneous cyclic MCMC samplers, including deterministic scan Gibbs samplers and Metropolis-within-Gibbs samplers, are extensively used for sampling from multi-dimensional distributions. We establish a multivariate SIP for the Markov chains associated with these samplers. The rate of this SIP essentially matches the tightest rate available for time-homogeneous Markov chains. The SIP implies the strong law of large numbers (SLLN) and the central limit theorem (CLT), and plays an essential role in uncertainty assessment. Using the SIP, we give conditions under which the multivariate batch means estimator of the covariance matrix in the multivariate CLT is strongly consistent. We also provide conditions under which a multivariate fixed-volume sequential termination rule, which is associated with the concept of effective sample size (ESS), is asymptotically valid. These uncertainty assessment tools are demonstrated through various numerical experiments.

Gaussian mixtures are commonly used to model heavy-tailed error distributions in robust linear regression. Combining the likelihood of a multivariate robust linear regression model with a standard improper prior distribution yields an analytically intractable posterior distribution that can be sampled using a data augmentation algorithm. When the response matrix has missing entries, there are unique challenges to implementing the algorithm and analyzing its convergence properties. Conditions for geometric ergodicity are provided when the incomplete data have a "monotone" structure. In the absence of a monotone structure, an intermediate imputation step is necessary to implement the algorithm; in this case, we provide sufficient conditions for the algorithm to be Harris ergodic. Finally, we show that, when there is a monotone structure and intermediate imputation is unnecessary, intermediate imputation slows the convergence of the underlying Markov chain, while post hoc imputation does not. An R package implementing the data augmentation algorithm is provided.
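
For a concrete picture of the uncertainty-assessment tools summarized in the abstract, the following R sketch computes a standard multivariate batch means estimate of the CLT covariance matrix and a multivariate effective sample size derived from it. This is a generic illustration under the usual batch-size choice (roughly the square root of the chain length), not code from the dissertation or its R package; the function names mbm_cov and multi_ess and the toy AR(1) chain are purely illustrative.

```r
# Multivariate batch means estimate of the asymptotic covariance matrix
# in the Markov chain CLT, from an n x p matrix of MCMC output.
mbm_cov <- function(chain, batch_size = floor(sqrt(nrow(chain)))) {
  n <- nrow(chain)
  a <- floor(n / batch_size)                         # number of full batches
  used <- chain[1:(a * batch_size), , drop = FALSE]
  # batch means: one row per batch
  bmeans <- t(sapply(1:a, function(k) {
    colMeans(used[((k - 1) * batch_size + 1):(k * batch_size), , drop = FALSE])
  }))
  centred <- sweep(bmeans, 2, colMeans(used))
  # b / (a - 1) times the sum of outer products of centred batch means
  (batch_size / (a - 1)) * crossprod(centred)
}

# Multivariate effective sample size: n times the p-th root of
# |sample covariance| / |estimated asymptotic covariance|.
multi_ess <- function(chain) {
  n <- nrow(chain); p <- ncol(chain)
  lambda <- cov(chain)        # sample covariance of the output
  sigma  <- mbm_cov(chain)    # batch means estimate of asymptotic covariance
  n * (det(lambda) / det(sigma))^(1 / p)
}

# Example: a toy 2-dimensional AR(1) chain
set.seed(1)
n <- 1e4
x <- matrix(0, n, 2)
for (t in 2:n) x[t, ] <- 0.7 * x[t - 1, ] + rnorm(2)
mbm_cov(x)
multi_ess(x)
```

The fixed-volume sequential termination rule mentioned in the abstract can then be viewed, roughly, as continuing the simulation until an effective sample size of this kind exceeds a pre-specified threshold.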
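
Similarly, the following is a minimal sketch of one sweep of a data augmentation sampler of the general kind analyzed in the second part, written for the complete-data case with multivariate-t errors (a Gaussian scale mixture) and the improper prior proportional to |Sigma|^(-(p+1)/2). It omits the missing-data and monotone-structure machinery that the dissertation actually studies; the function name da_sweep, the degrees-of-freedom parameter nu, and the specific conditional draws are standard textbook forms used here only for illustration.

```r
# One sweep of a data augmentation sampler for multivariate linear
# regression with multivariate-t errors (a Gaussian scale mixture),
# complete data, and the improper prior |Sigma|^{-(p+1)/2}.
da_sweep <- function(Y, X, B, Sigma, nu) {
  n <- nrow(Y); p <- ncol(Y); q <- ncol(X)

  ## I-step: draw the latent mixing weights given (B, Sigma) and the data.
  R <- Y - X %*% B                              # residuals, n x p
  d <- rowSums((R %*% solve(Sigma)) * R)        # Mahalanobis distances
  w <- rgamma(n, shape = (nu + p) / 2, rate = (nu + d) / 2)

  ## P-step: draw (Sigma, B) given the weights via a weighted conjugate update.
  XtWX  <- crossprod(X * w, X)                      # X' W X
  B_hat <- solve(XtWX, crossprod(X * w, Y))         # weighted least squares fit
  S     <- crossprod((Y - X %*% B_hat) * sqrt(w))   # weighted residual SSCP
  Sigma <- solve(rWishart(1, df = n - q, Sigma = solve(S))[, , 1])  # inverse Wishart draw
  Z     <- matrix(rnorm(q * p), q, p)
  B     <- B_hat + t(chol(solve(XtWX))) %*% Z %*% chol(Sigma)       # matrix normal draw

  list(B = B, Sigma = Sigma, w = w)
}
```

Iterating da_sweep yields the underlying Markov chain whose convergence properties (geometric or Harris ergodicity, and the effect of intermediate versus post hoc imputation under missing data) are the subject of the second part of the dissertation.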