Data with complex structure, such as array-valued predictors or responses, are commonly encountered in modern statistical applications. Such data typically contain intrinsic relationships among the entries of each array-valued variable. Conventional sufficient dimension reduction (SDR) methods cannot efficiently exploit these structures and are therefore ill-suited to such complex data. In this thesis, we propose a class of sufficient dimension reduction methods for data with array-valued predictors or responses, comprising model-based methods, namely dimension folding principal component analysis (PCA) and dimension folding principal fitted components (PFC); a moment-based method, tensor sliced inverse regression (SIR); and envelope methods. The proposed methods can simultaneously reduce the multiple dimensions of a predictor or response without losing any information relevant to prediction or classification. We study the asymptotic properties of these methods and demonstrate their efficiency in both theoretical and numerical studies.