A Framework for Nonconvex Robust Subspace Recovery
2018-07
Type
Thesis or Dissertation
Abstract
This thesis consists of three works from my Ph.D. research. The first part is an overview of the problem of Robust Subspace Recovery (RSR); the next two parts present algorithms for solving this problem along with supporting mathematical analysis. RSR involves finding an underlying low-dimensional subspace in a dataset that is possibly corrupted with outliers. While the problem is easy to state, its underlying nonconvexity has made it difficult to develop provably optimal algorithms. The first chapter of this thesis gives a comprehensive review of the algorithms developed for RSR. After this, we discuss our proposed solutions to this problem. The first proposed algorithm, which we refer to as Fast Median Subspace (FMS), is designed to robustly determine the underlying subspace of such datasets while having lower computational complexity than existing accurate methods. We prove convergence of the FMS iterates to a stationary point. Further, under two special models of data, FMS converges to a point near the global minimum with overwhelming probability. Under these models, we show that the iteration complexity is globally sublinear and locally r-linear. For one of the models, these results hold for any fixed fraction of outliers less than 1. Numerical experiments on synthetic and real data demonstrate its competitive speed and accuracy. Our second proposed algorithm performs geodesic gradient descent on the Grassmannian manifold. In the accompanying mathematical analysis, we prove that, if a deterministic condition holds for a dataset, the underlying subspace is the only stationary point and local minimizer in a specified neighborhood. We further show that if this deterministic condition is satisfied, the geodesic gradient descent method over the Grassmannian manifold can exactly recover the underlying subspace with proper initialization.
Proper initialization by principal component analysis is guaranteed under a similar stability condition. Under slightly stronger assumptions, the gradient descent method with an adaptive step-size scheme achieves linear convergence. The practicality of the deterministic condition is demonstrated on some statistical models of data, and the method achieves nearly state-of-the-art recovery guarantees on the Haystack Model. We show that our gradient method can exactly recover the underlying subspace for any fixed fraction of outliers less than 1, provided that the sample size is large enough.
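The FMS iteration summarized in the abstract can be sketched as an iteratively reweighted least squares procedure: each step refits a top-d principal subspace after downweighting every point by its distance to the current subspace estimate, so outliers lose influence. The sketch below is an illustrative simplification, not the thesis's exact algorithm; the function name, fixed iteration count, and `eps` regularizer are assumptions.

```python
import numpy as np

def fms(X, d, n_iter=100, eps=1e-10):
    """IRLS-style sketch of a Fast Median Subspace iteration.

    X : (n, D) data matrix; d : target subspace dimension.
    Each step fits the top-d principal subspace of the data after
    reweighting every point by the inverse of its distance to the
    current subspace estimate, progressively downweighting outliers.
    """
    # initialize with ordinary PCA (top-d right singular vectors), shape (D, d)
    V = np.linalg.svd(X, full_matrices=False)[2][:d].T
    for _ in range(n_iter):
        dist = np.linalg.norm(X - X @ V @ V.T, axis=1)   # point-to-subspace distances
        w = 1.0 / np.maximum(dist, eps)                  # inverse-distance weights
        Xw = X * np.sqrt(w)[:, None]                     # reweighted data
        V = np.linalg.svd(Xw, full_matrices=False)[2][:d].T
    return V
```

The inverse-distance weights make each iteration solve a weighted PCA whose fixed point mimics a geometric-median criterion, which is the source of the robustness to outliers.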
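A minimal sketch of geodesic gradient descent for the least absolute deviations energy on the Grassmannian, using the standard closed-form geodesic obtained from a thin SVD of the tangent-space gradient. The shrinking, gradient-normalized step size is an illustrative stand-in for the thesis's adaptive step-size scheme, and all names and defaults here are assumptions.

```python
import numpy as np

def energy(X, V):
    """Least absolute deviations energy: sum of point-to-subspace distances."""
    return np.linalg.norm(X - X @ V @ V.T, axis=1).sum()

def riemannian_grad(X, V, eps=1e-10):
    """Riemannian gradient of the energy at V; it satisfies V.T @ G = 0."""
    R = X - X @ V @ V.T                       # residuals, orthogonal to span(V)
    dist = np.maximum(np.linalg.norm(R, axis=1), eps)
    return -(R / dist[:, None]).T @ (X @ V)   # (D, d) tangent vector

def geodesic_step(V, G, t):
    """Move along the Grassmannian geodesic in the descent direction -G."""
    U, S, Wt = np.linalg.svd(G, full_matrices=False)
    return ((V @ Wt.T) * np.cos(S * t)) @ Wt - (U * np.sin(S * t)) @ Wt

def ggd(X, d, step=0.1, n_iter=200):
    """Geodesic gradient descent with PCA initialization and a shrinking,
    gradient-normalized step (a stand-in for an adaptive scheme)."""
    V = np.linalg.svd(X, full_matrices=False)[2][:d].T
    for k in range(n_iter):
        G = riemannian_grad(X, V)
        s_max = np.linalg.svd(G, compute_uv=False).max() + 1e-15
        V = geodesic_step(V, G, step / (s_max * (k + 1)))
        V = np.linalg.qr(V)[0]                # guard against numerical drift
    return V
```

Because the gradient lies in the tangent space at V, each geodesic step stays exactly on the Grassmannian (up to floating-point error), and for small steps the energy strictly decreases along the direction -G.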
Description
University of Minnesota Ph.D. dissertation. July 2018. Major: Mathematics. Advisor: Gilad Lerman. 1 computer file (PDF); ix, 185 pages.
Suggested citation
Maunu, Tyler. (2018). A Framework for Nonconvex Robust Subspace Recovery. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/201143.