Title: Low dimensional approximations: problems and algorithms
Author: Ngo, Thanh Trung
Type: Thesis or Dissertation
Date: 2014-07-10 (deposited); 2014-05 (degree date)
URI: https://hdl.handle.net/11299/163875
Description: University of Minnesota Ph.D. dissertation. May 2014. Major: Computer Science. Advisor: Youcef Saad. 1 computer file (PDF); viii, 121 pages.
Language: en-US
Keywords: Data analysis; Dimension reduction; Low dimensional approximation; Matrix completion; Numerical linear algebra; Optimization

Abstract: High dimensional data usually have intrinsic low rank representations. These low rank representations not only reveal the hidden structure of the data but also reduce the computational cost of data analysis. Therefore, finding low dimensional approximations of the data is an essential task in many data mining applications.

Classical low dimensional approximations rely on two universal tools: the eigenvalue decomposition and the singular value decomposition. These two different but related decompositions are of high importance in a large number of areas in science and engineering. As a result, research in numerical linear algebra has produced efficient algorithms for solving eigenvalue and singular value problems, and because the available solvers are so well developed, they are often used as black boxes in data analysis.

This thesis explores numerical linear algebra techniques and extends well-known methods for low rank approximations to solve new problems in data analysis. Specifically, we carefully analyze trace ratio optimization and argue that this problem can be solved efficiently. We also propose efficient algorithms for low rank matrix approximations with missing entries. In addition, we reformulate and analyze classical problems from a different perspective, which reveals the connection between the proposed methods and traditional methods in numerical linear algebra. The performance of the proposed algorithms is established both theoretically and through extensive experiments in dimension reduction and collaborative filtering.
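To make the classical tool mentioned in the abstract concrete, the following is a minimal illustrative sketch (not the dissertation's proposed algorithms) of the standard truncated singular value decomposition, which by the Eckart-Young theorem yields the best rank-k approximation of a matrix; the function name rank_k_approx and the synthetic data are assumptions for illustration only.

import numpy as np

def rank_k_approx(A, k):
    # Best rank-k approximation of A (Frobenius and spectral norm)
    # via the truncated SVD, per the Eckart-Young theorem.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)  # A = U diag(s) Vt
    # Keep only the k leading singular triplets.
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Usage: a 100 x 20 matrix that is nearly rank 5 plus small noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 20))
A += 1e-3 * rng.standard_normal(A.shape)

A5 = rank_k_approx(A, 5)
print(np.linalg.norm(A - A5) / np.linalg.norm(A))  # small relative error

This sketch assumes the full data matrix is available; the matrix completion problems treated in the thesis instead seek low rank approximations when many entries are missing, which this classical decomposition alone does not handle.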