Yan, Qi. 2015-08-13. https://hdl.handle.net/11299/173806

University of Minnesota Ph.D. dissertation. April 2015. Major: Statistics. Advisor: Xiaotong Shen. 1 computer file (PDF); vii, 71 pages.

In multi-response regression, pursuit of two different types of structures is essential to battle the curse of dimensionality. In this thesis, we seek a sparsest decomposition representation of a parameter matrix as a sum of sparse and low-rank matrices, among many overcomplete decompositions. On this basis, we propose a constrained method subject to two nonconvex constraints, for sparseness and low-rank structure respectively. Computationally, obtaining an exact global optimizer is rather challenging. To overcome the difficulty, we use an alternating directions method that solves a low-rank subproblem and a sparseness subproblem alternately: we derive an exact solution to the low-rank subproblem, and, for the sparse subproblem, an exact solution in a special case and an approximate solution in general through a surrogate of the L0-constraint and difference convex programming. Theoretically, we establish convergence rates of a global minimizer in the Hellinger distance, providing insight into why pursuit of two different types of decomposed structures is expected to deliver higher estimation accuracy than counterparts based on either sparseness alone or low-rank approximation alone. Numerical examples illustrate these aspects, in addition to applications to facial image recognition and multiple time series analysis.

In regression analysis, variables can often be combined into groups based on prior knowledge; genomic data, for example, can be naturally divided into biologically meaningful groups. Luan and Li (2008) and Yin et al. (2012) utilize the group structure and propose block coordinate descent procedures for group additive regression models and nonparametric additive models.
Their simulation results demonstrate the good performance of the proposed algorithms in terms of support recovery and prediction accuracy. However, neither investigates the asymptotic properties of the methods. In this thesis, we generalize a smoothing-spline-based group L2Boosting algorithm and study its theoretical properties for estimation of high-dimensional additive models with group variables.

Title: Coherent Pursuit and Boosting Learning
Type: Thesis or Dissertation
Language: en
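The alternating scheme for the sparse-plus-low-rank decomposition described in the abstract can be sketched as follows. This is a simplified illustration under stated assumptions, not the thesis's exact algorithm: plain hard thresholding stands in for the L0-surrogate and difference convex step of the sparse subproblem, and a truncated SVD (the Eckart–Young best rank-r approximation) solves the low-rank subproblem; the function name and signature are hypothetical.

```python
import numpy as np

def sparse_plus_lowrank(Theta, s, r, n_iter=100):
    """Alternately fit S (at most s nonzero entries) and L (rank at most r)
    so that S + L approximates Theta in Frobenius norm.

    Each subproblem is solved exactly, so the objective ||Theta - S - L||_F
    is nonincreasing across iterations."""
    S = np.zeros_like(Theta, dtype=float)
    L = np.zeros_like(Theta, dtype=float)
    for _ in range(n_iter):
        # Low-rank subproblem: best rank-r approximation of Theta - S.
        U, d, Vt = np.linalg.svd(Theta - S, full_matrices=False)
        L = (U[:, :r] * d[:r]) @ Vt[:r, :]
        # Sparse subproblem: keep the s largest-magnitude residual entries
        # (hard thresholding; a stand-in for the L0-surrogate/DC step).
        R = Theta - L
        S = np.zeros_like(Theta, dtype=float)
        if s > 0:
            idx = np.unravel_index(
                np.argsort(np.abs(R), axis=None)[-s:], R.shape)
            S[idx] = R[idx]
    return S, L
```

Because the first low-rank step starts from S = 0, the joint fit can never be worse than a pure rank-r approximation of Theta, which mirrors the abstract's point that pursuing both structures should beat low-rank approximation alone.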
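The group boosting idea in the second part can likewise be sketched. This is a hedged illustration, not the thesis's algorithm: the thesis works with smoothing-spline base learners, whereas this sketch uses ordinary least-squares fits per group to keep the example short; the function name, `groups` mapping, and step size `nu` are assumptions. At each step the residual is fit within every group, the group with the smallest residual sum of squares is selected, and a shrunken update is applied.

```python
import numpy as np

def group_l2_boost(X, y, groups, n_steps=200, nu=0.1):
    """Group L2Boosting sketch with linear least-squares base learners.

    groups: dict mapping group id -> list of column indices of X.
    Returns per-group coefficient vectors and the fitted values."""
    f = np.zeros_like(y, dtype=float)
    coefs = {g: np.zeros(len(cols)) for g, cols in groups.items()}
    for _ in range(n_steps):
        r = y - f  # current residual
        best = None
        for g, cols in groups.items():
            Xg = X[:, cols]
            bg, *_ = np.linalg.lstsq(Xg, r, rcond=None)
            rss = np.sum((r - Xg @ bg) ** 2)
            if best is None or rss < best[0]:
                best = (rss, g, bg)
        _, g, bg = best
        coefs[g] += nu * bg                 # shrunken update of chosen group
        f += nu * X[:, groups[g]] @ bg
    return coefs, f
```

The small step size nu plays the usual boosting role of shrinkage: many small updates to the selected groups, which tends to leave coefficients of irrelevant groups at zero and thereby recovers the group support.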