Equivalence of Several Two-stage Methods for Linear Discriminant Analysis
Howland, Peg; Park, Haesun
Report, issued 2003-09-25; made available 2020-09-02
https://hdl.handle.net/11299/215584
Language: en-US

Abstract: Linear discriminant analysis (LDA) has been used for decades to extract features that preserve class separability. It is classically defined as an optimization problem involving covariance matrices that represent the scatter within and between clusters. The requirement that one of these matrices be nonsingular restricts its application to data sets in which the dimension of the data does not exceed the sample size. Recently, the applicability of LDA has been extended by using the generalized singular value decomposition (GSVD) to circumvent the nonsingularity requirement. Alternatively, many studies have taken a two-stage approach in which the first stage reduces the dimension of the data enough so that it can be followed by classical LDA. In this paper, we justify the two-stage approach by establishing its equivalence to the single-stage LDA/GSVD method, provided either principal component analysis or latent semantic indexing is used in the first stage over a certain range of intermediate dimensions. We also present a computationally simpler choice for the first stage, and conclude with a discussion of the relative merits of each approach.
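
The two-stage approach described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' LDA/GSVD algorithm or their equivalence construction: stage one applies PCA (via the SVD of the centered data) to reduce an undersampled data set to an intermediate dimension at which the within-class scatter matrix becomes nonsingular, and stage two applies classical LDA to the reduced data. The function name `two_stage_lda` and the choice of intermediate dimension `k=5` are illustrative assumptions, not from the paper.

```python
import numpy as np

def two_stage_lda(X, y, k):
    """Two-stage dimension reduction: PCA to k dims, then classical LDA.

    Illustrative sketch. X is n x m (n samples, m features), with m
    possibly exceeding n, so the within-class scatter of X is singular
    and classical LDA cannot be applied to X directly.
    """
    # Stage 1: PCA via SVD of the centered data; keep the top-k components.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:k].T                 # n x k reduced representation

    # Stage 2: classical LDA on the reduced data.
    classes = np.unique(y)
    mu = Z.mean(axis=0)
    Sw = np.zeros((k, k))             # within-class scatter
    Sb = np.zeros((k, k))             # between-class scatter
    for c in classes:
        Zc = Z[y == c]
        mc = Zc.mean(axis=0)
        Sw += (Zc - mc).T @ (Zc - mc)
        d = (mc - mu).reshape(-1, 1)
        Sb += len(Zc) * (d @ d.T)

    # Solve Sb w = lambda Sw w; for k small enough, Sw is nonsingular,
    # so we can reduce to an ordinary eigenproblem on inv(Sw) @ Sb.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1]
    G = evecs[:, order[:len(classes) - 1]].real   # k x (r - 1) transform
    return Z @ G, Vt[:k].T @ G        # projected data, overall m x (r-1) map

# Example: 10 samples in 50 dimensions (dimension exceeds sample size).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (5, 50)), rng.normal(3, 1, (5, 50))])
y = np.array([0] * 5 + [1] * 5)
proj, W = two_stage_lda(X, y, k=5)
```

The paper's point is that for an appropriate range of intermediate dimensions `k`, composing the two stages recovers the same discriminant transform as running LDA/GSVD in a single stage; the sketch above only demonstrates the mechanics of the two-stage pipeline.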