Optimization and Sampling using Iterative Algorithms

Bhattacharya, Riddhiman

University of Minnesota Ph.D. dissertation. July 2022. Major: Statistics. Advisor: Tiefeng Jiang. 1 computer file (PDF); xii, 160 pages.

Permanent link: https://hdl.handle.net/11299/243076
Issued: 2022-07. Accessioned: 2022-11-14. Available: 2022-11-14.
Type: Thesis or Dissertation
Language: English
Keywords: Diffusion; Langevin; M-SGD; Optimization; Sampling

Abstract

Sampling and optimization are two key pillars of statistics, with applications in many other fields such as machine learning, physics, and chemistry, to name a few. Although the literature on both optimization and sampling algorithms has grown rapidly in recent years, some key questions about the properties of these algorithms remain unanswered. In this thesis we study several such algorithms in detail and address some of those questions. The common thread connecting all the algorithms we consider is that they are iterative in nature. We study the Langevin Monte Carlo (LMC) algorithm with an incorrect (inexact) gradient and establish asymptotic results. We also consider the Multiplicative Stochastic Gradient Descent (M-SGD) algorithm and show that its error term is asymptotically Gaussian. We further show that, under suitable assumptions, the M-SGD algorithm stays within order-of-the-step-size distance of a certain diffusion, irrespective of the weights used. We then establish convergence of the algorithm and a central limit theorem (CLT) around the optimum in the strongly convex regime. Lastly, we consider the preconditioned LMC algorithm and derive non-asymptotic bounds in the strongly convex regime.
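
To fix notation for the LMC setting discussed in the abstract, the following is a minimal sketch, assuming a target density proportional to exp(-U) on R^d and a gradient oracle that returns the gradient of U only approximately (the "incorrect gradient" setting). The function name grad_U_hat, the step size h, and the perturbation in the usage example are illustrative assumptions, not the thesis's construction.

```python
import numpy as np

def lmc_inexact(grad_U_hat, x0, h, n_iters, rng=None):
    """Langevin Monte Carlo with an approximate gradient.

    Iterates x_{k+1} = x_k - h * g(x_k) + sqrt(2h) * xi_k,
    where g approximates grad U for a target pi ~ exp(-U)
    and xi_k ~ N(0, I).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_iters, x.size))
    for k in range(n_iters):
        noise = rng.standard_normal(x.size)
        x = x - h * grad_U_hat(x) + np.sqrt(2.0 * h) * noise
        samples[k] = x
    return samples

# Example: target N(0, I), i.e. U(x) = ||x||^2 / 2, with a small
# perturbation standing in for the gradient error.
draws = lmc_inexact(lambda x: x + 0.01 * np.sin(x), np.zeros(2),
                    h=0.05, n_iters=5000)
```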
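For M-SGD, one common formulation (an assumption here, not a quote from the thesis) multiplies each per-sample gradient by an i.i.d. mean-one random weight, so that the reweighted average gradient remains unbiased for the full gradient; the abstract's "irrespective of the weights used" refers to the choice of weight distribution. A hedged sketch under that assumption, with exponential(1) weights chosen purely for illustration:

```python
import numpy as np

def msgd(grad_i, n_samples, x0, lr, n_iters, rng=None):
    """Multiplicative SGD: average per-sample gradients after
    scaling each by an i.i.d. mean-one random weight, so the
    weighted gradient is unbiased for the full-batch gradient."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        w = rng.exponential(1.0, size=n_samples)  # E[w_i] = 1
        g = sum(w[i] * grad_i(i, x) for i in range(n_samples)) / n_samples
        x = x - lr * g
    return x

# Example: least squares, where grad of f_i(x) = 0.5*(a_i.x - b_i)^2
# is (a_i.x - b_i) * a_i; the objective is strongly convex here.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((100, 3)), rng.standard_normal(100)
g_i = lambda i, x: (A[i] @ x - b[i]) * A[i]
x_hat = msgd(g_i, 100, np.zeros(3), lr=0.01, n_iters=2000)
```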
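Preconditioned LMC rescales both the drift and the injected noise by a positive-definite matrix M, which can accelerate mixing for ill-conditioned targets. The sketch below assumes a constant preconditioner and a Cholesky factor for the noise scaling; it is illustrative rather than the thesis's exact scheme.

```python
import numpy as np

def plmc(grad_U, M, x0, h, n_iters, rng=None):
    """Preconditioned LMC with a fixed positive-definite M:
    x_{k+1} = x_k - h * M @ grad_U(x_k) + sqrt(2h) * L @ xi_k,
    where M = L L^T, so the noise has covariance 2h * M."""
    rng = np.random.default_rng() if rng is None else rng
    L = np.linalg.cholesky(M)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        xi = rng.standard_normal(x.size)
        x = x - h * (M @ grad_U(x)) + np.sqrt(2.0 * h) * (L @ xi)
    return x
```

With M equal to the identity this reduces to the plain LMC update above; a natural choice in the strongly convex regime is an approximation to the inverse Hessian of U at the optimum.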