Browsing by Subject "Regularization"
Now showing 1 - 5 of 5
Item Existence, uniqueness and stability of solutions of generalized Tikhonov-Phillips functionals (University of Minnesota. Institute for Mathematics and Its Applications, 2011-08) Mazzieri, G.L.; Spies, Ruben D.; Temperini, Karina G.

Item High Dimensional Statistical Models: Applications to Climate (2015-09) Chatterjee, Soumyadeep
Recent years have seen enormous growth in the collection and curation of datasets across domains that often involve thousands or even millions of variables. Examples include social networking websites, geophysical sensor networks, cancer genomics, climate science, and many more. In many applications it is of prime interest to understand the dependencies between variables, so that predictive models may be designed from knowledge of those dependencies. However, traditional statistical methods, such as least squares regression, are often inapplicable for such tasks, because the available sample size is much smaller than the problem dimensionality. We therefore require new models and methods for statistical data analysis that provide provable estimation guarantees even in such high dimensional scenarios, and that admit efficient implementation and optimization routines. Statistical models satisfying both criteria will be important for solving prediction problems in many scientific domains. High dimensional statistical models have attracted interest from both the theoretical and applied machine learning communities in recent years. Of particular interest are parametric models, which consider estimation of coefficient vectors when the sample size is much smaller than the dimensionality of the problem. Although most existing work focuses on analyzing sparse regression methods using L1 norm regularizers, there exist other "structured" norm regularizers that encode more interesting structure in the sparsity induced on the estimated regression coefficients.
In the first part of this thesis, we conduct a theoretical study of such structured regression methods. First, we prove statistical consistency of regression with a hierarchical tree-structured norm regularizer known as hiLasso. Second, we formulate a generalization of the popular Dantzig Selector for sparse linear regression to any norm regularizer, called the Generalized Dantzig Selector, and provide statistical consistency guarantees for its estimates. Further, we provide the first known results on non-asymptotic rates of consistency for the recently proposed k-support norm regularizer. Finally, we show that in the presence of measurement errors in covariates, the tools we use for proving consistency in the noiseless setting are inadequate for establishing statistical consistency.

In the second part of the thesis, we consider applications of regularized regression methods to statistical modeling problems in climate science. First, we apply the Sparse Group Lasso, a special case of hiLasso, to predictive modeling of land climate variables from measurements of atmospheric variables over the oceans. Extensive experiments illustrate that structured sparse regression provides both better performance and more interpretable models than unregularized regression and even unstructured sparse regression methods. Second, we apply regularized regression methods to discovering stable factors for predictive modeling in climate. Specifically, we consider the problem of determining the dominant factors influencing winter precipitation over the Great Lakes region of the US. Using a sparse linear regression method, followed by random permutation tests, we mine stable sets of predictive features from a pool of possible predictors.
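The pipeline just described, sparse linear regression followed by a permutation test to screen for stable predictors, can be sketched in a few lines. This is an illustrative sketch only, not the thesis code: the small coordinate-descent Lasso solver and the synthetic data below are assumptions chosen for demonstration.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the L1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/2n)||y - X b||^2 + lam * ||b||_1 by coordinate descent."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual that excludes feature j's current contribution.
            r_j = y - X @ beta + X[:, j] * beta[j]
            beta[j] = soft_threshold(X[:, j] @ r_j, n * lam) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.standard_normal((n, p))
beta_true = np.array([3.0, -2.0, 0, 0, 0, 0, 0, 0])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

beta_hat = lasso_cd(X, y, lam=0.05)
# A crude stability screen in the spirit of a permutation test:
# coefficients fit against a permuted response collapse toward zero.
beta_perm = lasso_cd(X, rng.permutation(y), lam=0.05)
```

Features whose fitted coefficients stand far above the permutation-null magnitudes would be retained as "stable"; the thesis uses repeated random permutations rather than the single shuffle shown here.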
Some of the stable factors discovered through this process are shown to relate to known physical processes influencing precipitation over the Great Lakes.

Item Regularization methods for inverse problems (2011-03) Orozco Rodríguez, José Alberto
Many applications in industry and science require the solution of an inverse problem. To obtain a stable estimate of the solution of such problems, it is often necessary to implement a regularization strategy. In the first part of the present work, a multiplicative regularization strategy is analyzed and compared with Tikhonov regularization. In the second part, an inverse problem that arises in financial mathematics is analyzed and its solution is regularized. Tikhonov regularization for the solution of discrete ill-posed problems is well documented in the literature. The L-curve criterion is one of a few techniques preferred for the selection of the Tikhonov parameter. A more recent and less well known approach is multiplicative regularization, which, unlike Tikhonov regularization, does not require the selection of a parameter. We analyze a multiplicative regularization strategy for the solution of discrete ill-posed problems by comparing it with Tikhonov regularization aided by the L-curve criterion. We then proceed to analyze the stability of a method for estimating the risk-neutral density (RND) for the price of an asset from option prices. RND estimation is an inverse problem. The method analyzed first applies the principle of maximum entropy, where the maximum entropy solution (MES) corresponds to the estimated RND. Next, it provides an effective characterization of the constraint qualification (CQ) under which the MES can be computed by solving the dual problem, in which an explicit function of finitely many variables is minimized. In our analysis, we show that the MES is stable under parameter perturbation, but the parameters are unstable under data perturbation.
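The Tikhonov-plus-L-curve workflow referenced in the abstract above can be sketched as follows. This is a generic illustration rather than the dissertation's code; the discrete maximum-curvature corner rule and the Hilbert-matrix test problem are assumptions chosen for demonstration.

```python
import numpy as np

def tikhonov(A, b, lam):
    """Solve min ||A x - b||^2 + lam^2 ||x||^2 via an augmented least-squares system."""
    p = A.shape[1]
    A_aug = np.vstack([A, lam * np.eye(p)])
    b_aug = np.concatenate([b, np.zeros(p)])
    return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

def l_curve_lambda(A, b, lams):
    """Pick lambda at the corner (maximum discrete curvature) of the
    log-log L-curve of residual norm versus solution norm."""
    pts = []
    for lam in lams:
        x = tikhonov(A, b, lam)
        pts.append((np.log(np.linalg.norm(A @ x - b)),
                    np.log(np.linalg.norm(x))))
    pts = np.array(pts)
    best, best_i = -np.inf, 0
    for i in range(1, len(lams) - 1):
        p0, p1, p2 = pts[i - 1], pts[i], pts[i + 1]
        # Curvature of the circle through three consecutive points.
        area2 = abs((p1[0] - p0[0]) * (p2[1] - p0[1])
                    - (p1[1] - p0[1]) * (p2[0] - p0[0]))
        denom = (np.linalg.norm(p1 - p0) * np.linalg.norm(p2 - p1)
                 * np.linalg.norm(p2 - p0))
        k = 2.0 * area2 / denom if denom > 0 else 0.0
        if k > best:
            best, best_i = k, i
    return lams[best_i]

# Ill-posed test problem: a Hilbert matrix with a noisy right-hand side.
n = 10
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
x_true = np.ones(n)
rng = np.random.default_rng(0)
b = A @ x_true + 1e-4 * rng.standard_normal(n)

lam = l_curve_lambda(A, b, np.logspace(-8, 0, 30))
x_reg = tikhonov(A, b, lam)
x_naive = np.linalg.solve(A, b)  # unregularized: wildly amplifies the noise
```

The naive solve is dominated by amplified noise, while the Tikhonov solution chosen by the L-curve corner remains close to the true solution, which is exactly the stabilization effect the abstract describes.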
When noisy data are used, we show how to project the data so that the CQ is satisfied and the method can be used. To stabilize the method, we use Tikhonov regularization and choose the penalty parameter via the L-curve method. We demonstrate with numerical examples that the method then becomes much more stable to perturbations in the data. Accordingly, we perform a convergence analysis of the regularized solution.

Item Regularized Learning of High-dimensional Sparse Graphical Models (2012-07) Xue, Lingzhou
High-dimensional graphical models are important tools for characterizing complex interactions within a large-scale system. In this thesis, our emphasis is on utilizing the increasingly popular regularization technique to learn sparse graphical models, and our focus is on two types of graphs: the Ising model for binary data and the nonparanormal graphical model for continuous data. In the first part, we propose an efficient procedure for learning a sparse Ising model based on a non-concave penalized composite likelihood, which extends the methodology and theory of non-concave penalized likelihood. An efficient solution path algorithm is devised using a novel coordinate-minorization-ascent algorithm. Asymptotic oracle properties of our proposed estimator are established with NP-dimensionality. We demonstrate its finite sample performance via simulation studies and real applications to the study of the Human Immunodeficiency Virus type 1 protease structure. In the second part, we study the nonparanormal graphical model, which is much more robust than the Gaussian graphical model while retaining the good interpretability of the latter. In this thesis we show that the nonparanormal graphical model can be efficiently estimated using a unified regularized rank estimation scheme that does not require estimating the unknown transformation functions in the nonparanormal graphical model.
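The key point in the abstract above, that rank statistics sidestep the unknown marginal transformations, can be illustrated with a small sketch. This is an assumption-laden example (the Spearman version of the rank transform, synthetic monotone-transformed Gaussian data), not code from the thesis.

```python
import numpy as np

def nonparanormal_corr(X):
    """Estimate the latent Gaussian correlation matrix from ranks only.

    Spearman's rho is invariant to monotone marginal transformations,
    so the unknown transformation functions never need to be estimated.
    """
    n, p = X.shape
    # Column-wise ranks (0..n-1), then standardized.
    R = np.argsort(np.argsort(X, axis=0), axis=0).astype(float)
    R = (R - R.mean(axis=0)) / R.std(axis=0)
    rho = (R.T @ R) / n                   # Spearman correlation matrix
    S = 2.0 * np.sin(np.pi * rho / 6.0)   # map to latent Gaussian correlation
    np.fill_diagonal(S, 1.0)
    return S

# Latent Gaussian pair with correlation 0.7, observed through two very
# different monotone transformations of the margins.
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.7], [0.7, 1.0]])
Z = rng.multivariate_normal(np.zeros(2), cov, size=4000)
X = np.column_stack([np.exp(Z[:, 0]), Z[:, 1] ** 3])

S = nonparanormal_corr(X)  # recovers the latent 0.7 despite the transforms
```

The estimated correlation matrix S can then replace the sample covariance as the input to a regularized precision-matrix estimator, which is the role it plays in the rank-based procedures studied in the thesis.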
In particular, we study the rank-based Graphical LASSO, the rank-based Dantzig selector, and the rank-based CLIME. We establish their theoretical properties in the setting where the dimension is nearly exponentially large relative to the sample size. The proposed rank-based estimators are shown to work as well as their oracle counterparts on both simulated and real data.

Item Sparse representations for limited data tomography (University of Minnesota. Institute for Mathematics and Its Applications, 2007-11) Liao, Hstau Y.; Sapiro, Guillermo