Title: $L_1$-regularized Quantile Regression with Many Regressors under Lean Assumptions
Author: Wang, Lan
Date issued: 2019
Date deposited: 2019-03-13
Handle: https://hdl.handle.net/11299/202063
Language: en
Type: Preprint

Abstract: $L_1$-regularized quantile regression ($l_1$-QR) is a fundamental technique for analyzing high-dimensional economic data that are heterogeneous and may have heavy-tailed random errors. This paper investigates conditional quantile estimation when the number of regressors exceeds the sample size. It establishes that $l_1$-QR possesses properties resembling those of $L_1$-regularized least squares regression (LS-Lasso) under generally weaker conditions, and that it enjoys near-optimal performance for a much richer class of error distributions, including some for which LS-Lasso is sub-optimal. Our results build upon and substantially generalize the earlier work of Belloni and Chernozhukov (2011) and Wang (2013). We establish properties of $l_1$-QR in several novel directions, under both the popular hard sparsity assumption and a more relaxed soft sparsity condition that allows many regressors to have small effects. These new theoretical guarantees fill important gaps in the literature and lend strong support to the applicability of quantile regression in the high-dimensional setting.
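
The estimator described in the abstract minimizes the check (pinball) loss plus an $L_1$ penalty on the coefficients. As a minimal illustrative sketch only (not the paper's method or any accompanying software), the code below solves this objective by plain subgradient descent on synthetic data with more regressors than observations ($p > n$), a sparse true coefficient vector, and heavy-tailed $t_2$ errors; the function name `l1_qr` and all tuning choices (step size, penalty level, iteration count) are hypothetical.

```python
import numpy as np

def check_loss(u, tau):
    # Pinball/check loss: rho_tau(u) = u * (tau - 1{u < 0})
    return u * (tau - (u < 0))

def l1_qr(X, y, tau=0.5, lam=0.1, n_iter=2000, step=0.5):
    """Subgradient descent sketch for l1-penalized quantile regression.

    Minimizes (1/n) * sum_i rho_tau(y_i - x_i' beta) + lam * ||beta||_1,
    returning the best iterate found.
    """
    n, p = X.shape
    beta = np.zeros(p)
    best, best_obj = beta.copy(), np.inf
    for t in range(1, n_iter + 1):
        r = y - X @ beta
        # Subgradient of the average check loss w.r.t. beta
        g = -(X.T @ (tau - (r < 0))) / n
        # Subgradient of the l1 penalty (0 at exact zeros)
        g = g + lam * np.sign(beta)
        # Diminishing step size, standard for subgradient methods
        beta = beta - (step / np.sqrt(t)) * g
        obj = check_loss(y - X @ beta, tau).mean() + lam * np.abs(beta).sum()
        if obj < best_obj:
            best, best_obj = beta.copy(), obj
    return best

# Synthetic high-dimensional example: p > n, sparse truth, heavy-tailed errors
rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_t(df=2, size=n)  # t_2 noise: heavy tails

beta_hat = l1_qr(X, y, tau=0.5, lam=0.1)  # median regression (tau = 0.5)
```

Setting `tau=0.5` gives penalized median regression, the case most directly comparable to LS-Lasso; other quantile levels in $(0,1)$ trace out the conditional distribution, which is what makes the method suited to heterogeneous data.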