Modern machine learning problems that emerge from real-world applications typically involve estimating high dimensional model parameters, whose number may be of the same order as, or even significantly larger than, the number of measurements. In such high dimensional settings, statistically consistent estimation of the true underlying models via classical approaches is often impossible due to lack of identifiability. A recent solution to this issue is to incorporate regularization functions into estimation procedures that promote the intrinsic low-complexity structure of the underlying models. Statistical studies have established successful recovery of model parameters via structure-exploiting regularized estimators, and computational efforts have examined efficient numerical procedures for accurately solving the associated optimization problems. In this dissertation, we study the statistical and computational aspects of some regularized estimators that are successful in reconstructing high dimensional models. The investigated estimation frameworks are motivated by their applications in different areas of engineering, such as structural health monitoring and recommendation systems. In particular, the group Lasso recovery guarantees established in Chapter 2 provide insight into the application of this estimator for localizing material defects in the context of a structural diagnostics problem. Chapter 3 describes the convergence study of an accelerated variant of the well-known alternating direction method of multipliers (ADMM) for minimizing strongly convex functions. The analysis is complemented by experimental evidence of the algorithm's applicability to a ranking problem. Finally, Chapter 4 presents a local convergence analysis of regularized factorization-based estimators for reconstructing low-rank matrices. Interestingly, the analysis of this chapter reveals the interplay between statistical and computational aspects of such (non-convex) estimators.
Therefore, it can be useful in a wide variety of problems that involve low-rank matrix estimation.
University of Minnesota Ph.D. dissertation. September 2017. Major: Electrical/Computer Engineering. Advisor: Jarvis Haupt. 1 computer file (PDF); ix, 153 pages.
Kadkhodaie Elyaderani, Mojtaba.
A Computational and Statistical Study of Convex and Nonconvex Optimization with Applications to Structured Source Demixing and Matrix Factorization Problems.
Retrieved from the University of Minnesota Digital Conservancy,
Content distributed via the University of Minnesota's Digital Conservancy may be subject to additional license and use restrictions applied by the depositor.