The task of statistical regression is to learn the conditional distribution of a response given predictors. Because learning the conditional distribution directly is hard, one usually focuses instead on functionals of it. Traditionally, the functional modeled is the conditional mean, which is intuitive and useful. Nevertheless, the conditional mean can hardly capture the full picture of the conditional distribution, for instance its tail behavior. Quantile Regression (QR) and Expectile Regression (ER), introduced by Koenker and Efron respectively, learn regression percentiles, which offer a broader view of the conditional distribution than the conditional mean does. In this thesis, we propose a new boosting algorithm for learning regression percentiles in the context of QR. We also provide LARS-like variable selection strategies for ER, along with the corresponding solution path. Finally, we study irregular problems of M-estimation, discussing their connections to extreme value theory and to some recent algorithms for solving QR.