    Learning subspace methods using weighted and multi-subspace representations

    View/Open
    T04045.pdf (12.75 MB)
    Author
    Prakash, M
    Abstract
    The learning subspace methods (LSMs) of classification are decision-theoretic pattern recognition methods where the primary model for a class is a linear subspace of the Euclidean pattern space. Classification is based on the orthogonal projections on these subspaces. Classification of a pattern is independent of its magnitude, and this property is desirable in certain applications. The decision surfaces are quadratic. The LSMs have the potential to extract the required features automatically. They are extremely fast at classification time, and their hardware realization is easy. The limitations of the LSMs include the restriction to quadratic decision surfaces and poor design scalability. In this thesis, we have proposed new LSMs to overcome these limitations. The proposed methods use weighted and multi-subspace representations. The weighted representation associates different weights with different basis vectors in the computation of the orthogonal projection distances. The multi-subspace representation uses more than one subspace to represent each class. This representation obtains a piecewise approximation and helps to overcome the limitation of quadratic decision surfaces. By combining the weighted representation and Hebbian learning appropriately, scalability is improved. Scalability is also improved by the ability to obtain the required number of subspaces for each class and the ability to store partial computations. Based on experimental results, we conclude that the learning subspace methods are good general-purpose classifiers on problems where classification is independent of magnitude. The design complexity is low, the classification speed is high, and their generalization is comparable to that of other classifiers.
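    The core idea described in the abstract — representing each class by one or more linear subspaces and classifying a pattern by its weighted orthogonal projection onto them — can be sketched in a few lines. The function names, the Gram–Schmidt training step, and the particular weighted/multi-subspace scoring below are illustrative assumptions for exposition, not the thesis's exact algorithms.

    ```python
    # Illustrative sketch of subspace-method classification.
    # Assumptions (not from the thesis): subspace bases are obtained by
    # Gram-Schmidt on sample vectors; a class may hold several (basis,
    # weights) pairs (multi-subspace); weights scale the squared
    # projections on individual basis vectors (weighted representation).

    def gram_schmidt(vectors, tol=1e-10):
        """Orthonormalise a list of vectors; drop linearly dependent ones."""
        basis = []
        for v in vectors:
            w = list(v)
            for b in basis:
                d = sum(wi * bi for wi, bi in zip(w, b))
                w = [wi - d * bi for wi, bi in zip(w, b)]
            norm = sum(wi * wi for wi in w) ** 0.5
            if norm > tol:
                basis.append([wi / norm for wi in w])
        return basis

    def projection_score(x, basis, weights=None):
        """Weighted sum of squared projections of x on the basis vectors."""
        if weights is None:
            weights = [1.0] * len(basis)
        return sum(wt * sum(xi * bi for xi, bi in zip(x, b)) ** 2
                   for wt, b in zip(weights, basis))

    def classify(x, class_subspaces):
        """class_subspaces maps label -> list of (basis, weights) pairs.
        Multi-subspace: a class scores with its best-matching subspace."""
        return max(class_subspaces,
                   key=lambda lbl: max(projection_score(x, b, w)
                                       for b, w in class_subspaces[lbl]))
    ```

    Because every class score scales with the squared norm of the input, the argmax is unchanged when the pattern is rescaled — the magnitude-independence property the abstract highlights.
    
    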
    URI
    https://etd.iisc.ac.in/handle/2005/7521
    Collections
    • Computer Science and Automation (CSA) [531]

    etd@IISc is a joint service of SERC & J R D Tata Memorial (JRDTML) Library || Powered by DSpace software || DuraSpace
    Contact Us | Send Feedback | Thesis Templates
    Theme by Atmire NV