Show simple item record

dc.contributor.advisor  Narasimha Murty, M
dc.contributor.author  Prakash, M
dc.date.accessioned  2025-12-01T09:02:17Z
dc.date.available  2025-12-01T09:02:17Z
dc.date.submitted  1996
dc.identifier.uri  https://etd.iisc.ac.in/handle/2005/7521
dc.description.abstract  The learning subspace methods (LSMs) of classification are decision-theoretic pattern recognition methods in which the primary model for a class is a linear subspace of the Euclidean pattern space. Classification is based on the orthogonal projections of a pattern onto these subspaces. The classification of a pattern is independent of its magnitude, a property that is desirable in certain applications. The decision surfaces are quadratic. The LSMs have the potential to extract the required features automatically; they are extremely fast at classification time, and their hardware realization is easy. The limitations of the LSMs include the restriction to quadratic decision surfaces and poor design scalability. In this thesis, we propose new LSMs that overcome these limitations. The proposed methods use weighted and multi-subspace representations. The weighted representation associates different weights with different basis vectors in the computation of the orthogonal projection distances. The multi-subspace representation uses more than one subspace to represent each class; this yields a piecewise approximation and helps to overcome the limitation of quadratic decision surfaces. By combining the weighted representation with Hebbian learning appropriately, scalability is improved. Scalability is also improved by the ability to obtain the required number of subspaces for each class and to store partial computations. Based on experimental results, we conclude that the learning subspace methods are good general-purpose classifiers on problems where classification is independent of magnitude: the design complexity is low, the classification speed is high, and their generalization is comparable to that of other classifiers.
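The classification scheme the abstract describes (a class modelled by a linear subspace, decisions by largest orthogonal projection) can be illustrated with a minimal CLAFIC-style sketch in Python/NumPy. This is not the thesis's weighted or multi-subspace method; the class name, parameters, and subspace-fitting-by-SVD choice are illustrative assumptions, shown only to make the projection-based decision rule and its magnitude invariance concrete.

```python
import numpy as np

class SubspaceClassifier:
    """Illustrative CLAFIC-style subspace classifier (not the thesis's method).

    Each class is modelled by a linear subspace spanned by the leading
    principal directions of its training patterns; a test pattern is
    assigned to the class whose subspace captures the largest squared
    orthogonal projection.
    """

    def __init__(self, n_dims=2):
        self.n_dims = n_dims  # subspace dimensionality per class (assumed fixed)
        self.bases = {}       # class label -> orthonormal basis, shape (d, n_dims)

    def fit(self, X, y):
        for label in np.unique(y):
            Xc = X[y == label]
            # Leading right singular vectors span the class subspace.
            _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
            self.bases[label] = Vt[: self.n_dims].T
        return self

    def predict(self, X):
        labels = list(self.bases)
        # Squared length of the projection onto each class subspace.
        # Scaling a pattern x by c scales every class score by c**2,
        # so the argmax (the decision) is magnitude-independent.
        scores = np.stack(
            [np.linalg.norm(X @ self.bases[l], axis=1) ** 2 for l in labels],
            axis=1,
        )
        return np.array(labels)[np.argmax(scores, axis=1)]
```

Classification cost is one matrix product per class, which reflects the abstract's point that LSMs are fast at classification time; the quadratic decision surfaces arise because the score difference between two classes is a quadratic form in the pattern.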
dc.language.iso  en_US
dc.relation.ispartofseries  T04045
dc.rights  I grant Indian Institute of Science the right to archive and to make available my thesis or dissertation in whole or in part in all forms of media, now or hereafter known. I retain all proprietary rights, such as patent rights. I also retain the right to use in future works (such as articles or books) all or part of this thesis or dissertation.
dc.subject  Learning Subspace Methods
dc.subject  Orthogonal Projection
dc.subject  Multi-Subspace Representation
dc.title  Learning subspace methods using weighted and multi-subspace representations
dc.type  Thesis
dc.degree.name  PhD
dc.degree.level  Doctoral
dc.degree.grantor  Indian Institute of Science
dc.degree.discipline  Engineering

