Author: Kadkhodaie Elyaderani, Mojtaba
Date issued: 2014-05
Date deposited: 2014-09-09
URI: https://hdl.handle.net/11299/165536
Description: University of Minnesota M.S. thesis. May 2014. Major: Electrical Engineering. Advisor: Zhi-Quan Luo. 1 computer file (PDF); v, 46 pages, appendix A.

Abstract: Consider a class of convex minimization problems in which the objective function is the sum of a smooth convex function and a non-smooth convex regularization term. This class of problems includes several popular applications, such as compressive sensing and sparse group LASSO. In this thesis, we introduce a general class of approximate proximal splitting (APS) methods for solving such minimization problems. The APS class includes many well-known algorithms, such as the proximal splitting method (PSM), the block coordinate descent (BCD) method, and approximate gradient projection methods for smooth convex optimization. We establish the linear convergence of APS methods under a local error bound assumption. Since the latter is known to hold for compressive sensing and sparse group LASSO problems, our analysis implies the linear convergence of the BCD method for these problems without a strong convexity assumption.

Language: en-US
Keywords: Approximate proximal splitting; Block coordinate descent; Convex optimization; Linear convergence rate
Title: Convergence analysis of the approximate proximal splitting method for non-smooth convex optimization
Type: Thesis or Dissertation
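The composite objective described in the abstract (smooth loss plus non-smooth regularizer) includes the LASSO as a special case. As context for the problem class, the following is a minimal sketch of the basic proximal gradient iteration (ISTA) for the LASSO, min 0.5||Ax - b||^2 + lam*||x||_1 — a standard baseline method, not the thesis's APS framework; the function names and the fixed step size 1/L are illustrative choices.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (element-wise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, num_iters=500):
    # Proximal gradient method for min 0.5*||Ax - b||^2 + lam*||x||_1.
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the
    # gradient of the smooth least-squares term.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                    # gradient of smooth part
        x = soft_threshold(x - grad / L, lam / L)   # proximal (splitting) step
    return x
```

Each iteration splits the objective: a gradient step on the smooth term followed by an exact proximal step on the non-smooth term. APS methods, as described in the abstract, generalize this pattern by allowing the proximal subproblem to be solved approximately.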