Browsing by Subject "Sufficient dimension reduction"
Now showing 1 - 3 of 3
Item: Partial sufficient dimension reduction in regression (2011-07), Kim, Do Hyang
In this thesis we propose a new model-based reduction method to reduce the dimension of one set of predictors while maintaining another set of predictors and a response, if a response is present. Based on the probabilistic PCA model (Tipping and Bishop 1999) and the PFC model (Cook 2007), we develop new models in the partial dimension reduction context: partial probabilistic PCA models, partial PFC models, and combined models. We estimate the parameters of interest for the partial sufficient reduction using the maximum likelihood method. Methods are also proposed for prediction in partial PFC models.

Item: Sufficient dimension reduction and variable selection (2010-12), Chen, Xin
Sufficient dimension reduction (SDR) in regression was first introduced by Cook (2004). It reduces the dimension of the predictor space without loss of information, which is very helpful when the number of predictors is large, and it alleviates the "curse of dimensionality" for many statistical methods. In this thesis, we study the properties of a dimension reduction method named "continuum regression"; we propose a unified method, coordinate-independent sparse estimation (CISE), that can simultaneously achieve sparse sufficient dimension reduction and efficiently screen out irrelevant and redundant variables; and we introduce a new dimension reduction method called "principal envelope models".

Item: Sufficient dimension reduction for complex data structures (2014-06), Ding, Shanshan
Data with complex structures, such as array-valued predictors or responses, are commonly encountered in modern statistical applications. Such data typically contain intrinsic relationships among the entries of each array-valued variable. Conventional sufficient dimension reduction (SDR) methods cannot efficiently exploit these structures and are inappropriate for such complex data.
In this thesis, we propose a class of sufficient dimension reduction methods for data with array-valued predictors or responses, including model-based dimension reduction methods (dimension folding principal component analysis (PCA) and dimension folding principal fitted components (PFC)), moment-based sufficient dimension reduction methods (tensor sliced inverse regression (SIR)), and envelope methods. The proposed methods can simultaneously reduce the multiple dimensions of a predictor or response without losing information relevant to prediction or classification. We study the asymptotic properties of these methods and demonstrate their efficiency in both theoretical and numerical studies.
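The tensor SIR method above builds on classical sliced inverse regression (Li, 1991), which none of the abstracts spells out. As background, a minimal sketch of ordinary (vector-predictor) SIR in NumPy is given below; the function name and defaults are illustrative, not taken from any of the theses, and the tensor/dimension-folding extensions are not shown.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=2):
    """Estimate SDR directions by sliced inverse regression (Li, 1991).

    Illustrative sketch: standardize X, average the standardized
    predictors within slices of the ordered response, and take the
    top eigenvectors of the between-slice covariance.
    """
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Partition observations into slices of roughly equal size by y
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    dirs = inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return dirs
```

On simulated data where y depends on X only through a single linear combination, the first estimated direction should align with that combination up to scale; dimension folding and tensor SIR generalize this idea to array-valued X without flattening away its structure.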