Browsing by Author "Park, Cheonghee"
Now showing 1 - 8 of 8
Item: A Comparison of Generalized LDA Algorithms for Undersampled Problems (2003-12-11). Park, Cheonghee; Park, Haesun.
Linear Discriminant Analysis (LDA) is a dimension reduction method that finds an optimal linear transformation maximizing the between-class scatter and minimizing the within-class scatter. In undersampled problems, where the number of samples is smaller than the dimension of the data space, it is difficult to apply LDA due to the singularity of the scatter matrices caused by high dimensionality. To make LDA applicable to undersampled problems, several generalizations of LDA have been proposed recently. In this paper, we present the theoretical and algorithmic relationships among several generalized LDA algorithms and compare their computational complexities and performance in text classification and face recognition. Toward a practical dimension reduction method for high-dimensional data, an efficient algorithm is also proposed.

Item: A Comparative Study of Linear and Nonlinear Feature Extraction Methods (2004-11-17). Park, Cheonghee; Park, Haesun; Pardalos, Panos.
Linear Discriminant Analysis (LDA) is a dimension reduction method that finds an optimal linear transformation maximizing the between-class scatter and minimizing the within-class scatter. However, in undersampled problems, where the number of samples is smaller than the dimension of the data space, it is difficult to apply LDA due to the singularity of the scatter matrices caused by high dimensionality. To make LDA applicable, several generalizations of LDA have been proposed. This paper presents theoretical and algorithmic relationships among several generalized LDA algorithms. Utilizing these relationships, computationally efficient approaches to these algorithms are proposed. We also present nonlinear extensions of these LDA algorithms, in which the original data space is mapped to a feature space by an implicit nonlinear mapping through kernel methods.
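The LDA setup recurring throughout these abstracts is the search for a transformation that maximizes between-class scatter S_b while minimizing within-class scatter S_w. As a baseline for the generalized variants, here is a minimal NumPy sketch of the classical criterion via the eigenproblem of S_w^{-1} S_b; it assumes S_w is nonsingular, which is exactly what fails in the undersampled setting these papers address, and is a generic textbook illustration rather than any algorithm from the papers:

```python
import numpy as np

def lda_transform(X, y, k):
    """Classical LDA: reduce X (n x d) to k dimensions by maximizing
    between-class scatter S_b and minimizing within-class scatter S_w,
    via the eigenvectors of S_w^{-1} S_b. Assumes S_w is nonsingular
    (roughly n > d), which fails for undersampled problems."""
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    S_w = np.zeros((d, d))
    S_b = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)          # within-class scatter
        diff = (mc - mean_all)[:, None]
        S_b += len(Xc) * (diff @ diff.T)        # between-class scatter
    evals, evecs = np.linalg.eig(np.linalg.solve(S_w, S_b))
    order = np.argsort(-evals.real)             # top-k discriminant directions
    return X @ evecs[:, order[:k]].real         # n x k reduced data
```

The generalized algorithms compared in these papers replace the explicit inverse of S_w, which does not exist when n < d, with pseudo-inverse or GSVD-based constructions.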
A generalized eigenvalue problem is formulated in the transformed feature space, and the generalized LDA algorithms are applied to solve it. The performance and computational complexity of these linear and nonlinear discriminant analysis algorithms are compared theoretically and experimentally.

Item: A Fast Dimension Reduction Algorithm with Applications on Face Recognition and Text Classification (2003-12-19). Park, Cheonghee; Park, Haesun.
In undersampled problems, where the number of samples is smaller than the dimension of the data space, it is difficult to apply Linear Discriminant Analysis (LDA) due to the singularity of the scatter matrices caused by high dimensionality. We propose a fast dimension reduction method based on a simple modification of Principal Component Analysis (PCA) and the orthogonal decomposition. The proposed algorithm is an efficient way to perform LDA for undersampled problems. Our experimental results in face recognition and text classification demonstrate the effectiveness of the proposed method.

Item: A new optimization criterion for generalized discriminant analysis on undersampled problems (2003-06-10). Ye, Jieping; Janardan, Ravi; Park, Cheonghee; Park, Haesun.
We present a new optimization criterion for discriminant analysis. The new criterion extends the optimization criteria of classical linear discriminant analysis (LDA) by introducing the pseudo-inverse when the scatter matrices are singular. It is applicable regardless of the relative sizes of the data dimension and sample size, overcoming a limitation of classical LDA. Recently, a new algorithm called LDA/GSVD for structure-preserving dimension reduction was introduced, which extends classical LDA to very high-dimensional undersampled problems by using the generalized singular value decomposition (GSVD). The solution from the LDA/GSVD algorithm is a special case of the solution for our generalized criterion, which is also based on the GSVD.
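A standard way around the singularity in undersampled problems, in the same spirit as the PCA-based fast algorithm described above though not the paper's exact method, is a two-stage reduction: project the centered data onto its principal subspace, where the dimensionality drops to the numerical rank, and run LDA there. A sketch under those assumptions, with a small ridge term (a hypothetical `reg` parameter of this sketch, not from the paper) guarding against a within-class scatter that is still rank-deficient:

```python
import numpy as np

def reduced_lda(X, y, k, reg=1e-8):
    """Two-stage reduction for undersampled data (n < d): PCA via the
    thin SVD first, then LDA in the r-dimensional principal subspace.
    A generic sketch of the idea, not the paper's algorithm."""
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    r = int((s > 1e-10 * s[0]).sum())     # numerical rank of centered data
    Z = (X - mu) @ Vt[:r].T               # coordinates in the PCA subspace
    Sw = np.zeros((r, r))
    Sb = np.zeros((r, r))
    for c in np.unique(y):
        Zc = Z[y == c]
        mc = Zc.mean(axis=0)
        Sw += (Zc - mc).T @ (Zc - mc)
        diff = mc[:, None]                # overall mean of Z is exactly 0
        Sb += len(Zc) * (diff @ diff.T)
    # ridge-regularized LDA in the reduced space
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + reg * np.eye(r), Sb))
    order = np.argsort(-evals.real)
    return Z @ evecs[:, order[:k]].real
```

Because PCA keeps all directions in which the data varies, no discriminant information is lost in the first stage; the second stage then works with matrices of size at most n x n instead of d x d.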
We also present an approximate solution for our GSVD-based solution, which reduces computational complexity by finding sub-clusters of each cluster and using their centroids to capture the structure of each cluster. This reduced problem yields much smaller matrices, to which the GSVD can be applied efficiently. Experiments on text data with up to 7,000 dimensions show that the approximation algorithm produces results close to those produced by the exact algorithm.

Item: An Efficient Algorithm for LDA Utilizing the Relationship between LDA and the Generalized Minimum Squared Error Solution (2004-03-03). Park, Cheonghee; Park, Haesun.
In this paper, we study the relationship between Linear Discriminant Analysis (LDA) and the generalized Minimum Squared Error (MSE) solution. We show that the generalized MSE solution is equivalent to applying a certain classification rule in the space transformed by LDA. The relationship of the MSE solution with Fisher Discriminant Analysis (FDA) is extended to multi-class problems and to undersampled problems, where classical LDA is not applicable due to the singularity of the scatter matrices. We propose an efficient algorithm for LDA that can be performed through the relationship with the MSE procedure, without solving the eigenvalue problem. Extensive experiments verify the theoretical results and demonstrate that the classification rule induced by the MSE procedure can be applied effectively in the space dimension-reduced by LDA.

Item: Fingerprint Classification Using Nonlinear Discriminant Analysis (2003-09-16). Park, Cheonghee; Park, Haesun.
We present a new approach for fingerprint classification based on nonlinear feature extraction.
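The LDA/MSE relationship studied above has a well-known two-class instance: least-squares regression onto class-coded targets n/n1 and -n/n2 recovers the Fisher discriminant direction up to scale, with no eigenvalue problem solved. The sketch below illustrates only that classical fact, not the paper's generalized multi-class and undersampled procedure:

```python
import numpy as np

rng = np.random.default_rng(2)
X1 = rng.normal(0, 0.5, (30, 3))
X2 = rng.normal(2, 0.5, (30, 3))
X = np.vstack([X1, X2])
n1, n2 = len(X1), len(X2)
n = n1 + n2

# MSE solution: least squares with targets n/n1 (class 1) and -n/n2 (class 2)
t = np.concatenate([np.full(n1, n / n1), np.full(n2, -n / n2)])
A = np.hstack([np.ones((n, 1)), X])          # bias column + data
w = np.linalg.lstsq(A, t, rcond=None)[0]
mse_direction = w[1:]                        # drop the bias term

# Fisher discriminant direction: S_w^{-1} (m1 - m2)
m1_, m2_ = X1.mean(axis=0), X2.mean(axis=0)
Sw = (X1 - m1_).T @ (X1 - m1_) + (X2 - m2_).T @ (X2 - m2_)
fisher = np.linalg.solve(Sw, m1_ - m2_)

# the two directions coincide up to scale, so |cosine| is close to 1
cos = mse_direction @ fisher / (np.linalg.norm(mse_direction)
                                * np.linalg.norm(fisher))
```

The paper's contribution is to carry this equivalence over to multiple classes and to the undersampled case, where the inverse of S_w used above does not exist.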
Utilizing the Discrete Fourier Transform, we construct reliable and efficient directional images that contain the representative part of local ridge orientations in fingerprints, and apply kernel discriminant analysis to the constructed directional images, reducing the dimension dramatically and extracting the most discriminant features. Kernel Discriminant Analysis is a nonlinear extension of Linear Discriminant Analysis (LDA) based on kernel functions. It performs LDA in the feature space transformed by a kernel-based nonlinear mapping, extracting optimal features to maximize class separability in the reduced-dimensional space. We show the effectiveness of the feature extraction method in fingerprint classification. Experimental results show that the proposed method achieves competitive performance compared with other published results.

Item: Kernel Discriminant Analysis based on Generalized Singular Value Decomposition (2003-03-28). Park, Cheonghee; Park, Haesun.
In Linear Discriminant Analysis (LDA), a dimension-reducing linear transformation is found in order to better distinguish clusters from each other in the reduced-dimensional space. However, LDA has the limitation that one of the scatter matrices is required to be nonsingular, and nonlinearly clustered structure is not easily captured. We propose a nonlinear discriminant analysis based on kernel functions and the generalized singular value decomposition, called KDA/GSVD, which is a nonlinear extension of LDA and works regardless of the nonsingularity of the scatter matrices in either the input space or the feature space. Our experimental results show that KDA/GSVD is a very effective nonlinear dimension reduction method.

Item: Nonlinear Feature Extraction based on Centroids and Kernel Functions (2002-12-19). Park, Cheonghee; Park, Haesun.
A nonlinear feature extraction method is presented that can reduce the data dimension down to the number of clusters, providing dramatic savings in computational costs.
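The kernel discriminant analysis these abstracts build on can be illustrated with the empirical kernel map: represent each point by its vector of kernel similarities to the training set and run LDA on that representation. The sketch below is a simplification for illustration, using a small ridge term where KDA/GSVD instead uses the GSVD precisely so that singular scatter matrices need no regularization; the RBF kernel and all parameter values are assumptions of this sketch:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def kernel_lda_scores(X, y, gamma=1.0, reg=1e-6):
    """One-dimensional discriminant scores from a ridge-regularized
    LDA on the empirical kernel map K (an illustration of the idea,
    not KDA/GSVD itself)."""
    K = rbf_kernel(X, X, gamma)        # n x n implicit representation
    n = len(X)
    m = K.mean(axis=0)
    Sw = np.zeros((n, n))
    Sb = np.zeros((n, n))
    for c in np.unique(y):
        Kc = K[y == c]
        mc = Kc.mean(axis=0)
        Sw += (Kc - mc).T @ (Kc - mc)
        d = (mc - m)[:, None]
        Sb += len(Kc) * (d @ d.T)
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + reg * np.eye(n), Sb))
    w = evecs[:, np.argmax(evals.real)].real
    return K @ w                       # discriminant score per sample
```

On data that is only nonlinearly separable, such as one class inside a ring of the other, these scores separate the classes where a linear discriminant in the input space cannot.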
The dimension-reducing nonlinear transformation is obtained by implicitly mapping the input data into a feature space using a kernel function, and then finding a linear mapping based on an orthonormal basis of centroids in the feature space that maximally separates the between-cluster relationship. The experimental results demonstrate that our method extracts nonlinear features effectively, so that competitive classification performance can be obtained with linear classifiers in the reduced-dimensional space.
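The centroid idea in the final abstract exploits a convenient identity: the feature-space inner product between phi(x) and a class centroid equals the average kernel value between x and that class's training points, so features of dimension equal to the number of clusters come out with no explicit mapping. A sketch of that step alone, as a hedged illustration; the orthonormal-basis construction the abstract describes is omitted, and the RBF kernel is this sketch's assumption:

```python
import numpy as np

def centroid_kernel_features(X, Xtrain, ytrain, gamma=1.0):
    """Map each row of X to its feature-space inner products with the
    class centroids, using <phi(x), mean_j phi(x_j)> = mean_j k(x, x_j).
    Output dimension equals the number of classes. Omits the
    orthonormalization step described in the abstract."""
    sq = ((X ** 2).sum(1)[:, None] + (Xtrain ** 2).sum(1)[None, :]
          - 2 * X @ Xtrain.T)
    K = np.exp(-gamma * sq)            # RBF kernel against the training set
    return np.column_stack([K[:, ytrain == c].mean(axis=1)
                            for c in np.unique(ytrain)])
```

After this reduction a simple linear classifier, or even nearest centroid, can operate in a space whose dimension is just the number of clusters, which is the computational saving the abstract refers to.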