Methodologies and Algorithms on Some Non-convex Penalized Models for Ultra High Dimensional Data (2016-06)
Peng, Bo

In recent years, penalized models have gained considerable importance for variable selection and estimation problems in high dimensional settings. Among the candidates, the l1-penalized model, better known as the LASSO, remains widely applied across diverse fields, supported by sophisticated methodology and mature algorithms. As a promising alternative to the LASSO, however, non-convex penalized methods, such as the smoothly clipped absolute deviation (SCAD) and minimax concave penalty (MCP) methods, produce asymptotically unbiased shrinkage estimates and offer attractive advantages over the LASSO. In this thesis, we propose a complete methodology and theory for multiple non-convex penalized models. The proposed theoretical framework includes error bounds for the estimators, the oracle property, and variable selection behavior. Instead of common least-squares models, we focus on quantile regression and support vector machines (SVMs) for the exploration of heterogeneity and binary classification. Although we demonstrate that the current local linear approximation (LLA) optimization algorithm possesses the theoretical properties needed to achieve the oracle estimator in two iterations, computation is highly challenging when p is large due to the non-smoothness of the loss function and the non-convexity of the penalty function. Hence, we also explore the potential of coordinate descent algorithms for fitting the selected models, establishing convergence properties and demonstrating significant speedups over current approaches. Simulated and real data analyses are carried out to examine the performance of non-convex penalized models and to illustrate the computational speed advantage of our algorithms.
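For readers unfamiliar with the penalties named in the abstract, the following is a minimal NumPy sketch, not code from the thesis, of the SCAD and MCP penalty functions using their standard definitions (Fan and Li, 2001; Zhang, 2010), plus the SCAD derivative, which is the per-coordinate weight an LLA step assigns when it relaxes the non-convex problem to a weighted LASSO:

```python
import numpy as np

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty; a = 3.7 is the value suggested by Fan and Li (2001)."""
    t = np.abs(t)
    return np.where(
        t <= lam,
        lam * t,                                            # LASSO-like near zero
        np.where(
            t <= a * lam,
            (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),  # quadratic transition
            lam**2 * (a + 1) / 2,                               # constant: no shrinkage bias
        ),
    )

def mcp_penalty(t, lam, gamma=3.0):
    """Minimax concave penalty (MCP); gamma > 1 controls concavity."""
    t = np.abs(t)
    return np.where(
        t <= gamma * lam,
        lam * t - t**2 / (2 * gamma),   # concave ramp
        gamma * lam**2 / 2,             # flat beyond gamma * lam
    )

def scad_derivative(t, lam, a=3.7):
    """p'_lam(|t|): the weight used when LLA linearizes SCAD into a weighted LASSO."""
    t = np.abs(t)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1))
```

Both penalties match the l1 penalty near zero but flatten out for large coefficients, which is why, unlike the LASSO, they can yield asymptotically unbiased estimates: large signals incur a constant penalty and are not shrunk. The zero derivative beyond a * lam is also what lets a two-step LLA iteration reach the oracle estimator, since strong coefficients become effectively unpenalized.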