Show simple item record

dc.contributor.advisor: Shevade, Shirish
dc.contributor.author: Bapat, Tanuja
dc.date.accessioned: 2013-06-20T09:54:39Z
dc.date.accessioned: 2018-07-31T04:38:22Z
dc.date.available: 2013-06-20T09:54:39Z
dc.date.available: 2018-07-31T04:38:22Z
dc.date.issued: 2013-06-20
dc.date.submitted: 2011
dc.identifier.uri: https://etd.iisc.ac.in/handle/2005/2065
dc.identifier.abstract: http://etd.iisc.ac.in/static/etd/abstracts/2661/G24953-Abs.pdf [en_US]
dc.description.abstract: Many real-world problems, such as handwritten digit recognition or semantic scene classification, are treated as multiclass or multi-label classification problems. Solutions to these problems using support vector machines (SVMs) are well studied in the literature. In this work, we focus on building sparse max-margin classifiers for multiclass and multi-label classification. A sparse representation of the resulting classifier is important from both the efficient-training and fast-inference viewpoints, especially when the training and test sets are large. Very few of the existing multiclass and multi-label classification algorithms directly control the sparsity of the designed classifiers; further, these algorithms were not found to be scalable. Motivated by this, we propose new formulations for sparse multiclass and multi-label classifier design and give efficient algorithms to solve them. The formulation for sparse multi-label classification also incorporates prior knowledge of label correlations. In both cases, the classification model is designed using a common set of basis vectors shared across all the classes. These basis vectors are greedily added to an initially empty model to approximate the target function. The sparsity of the classifier can be controlled by a user-defined parameter, dmax, which indicates the maximum number of common basis vectors. The computational complexity of these algorithms for multiclass and multi-label classifier design is O(l k^2 dmax^2), where l is the number of training examples and k is the number of classes. The inference time for the proposed multiclass and multi-label classifiers is O(k dmax). Numerical experiments on various real-world benchmark datasets demonstrate that the proposed algorithms yield sparse classifiers that require fewer basis vectors than state-of-the-art algorithms to attain the same generalization performance. A very small value of dmax results in a significant reduction in inference time. Thus, the proposed algorithms provide useful alternatives to the existing algorithms for sparse multiclass and multi-label classifier design. [en_US]
dc.language.iso: en_US [en_US]
dc.relation.ispartofseries: G24953 [en_US]
dc.subject: Artificial Intelligence [en_US]
dc.subject: Machine Learning [en_US]
dc.subject: Multiclass Classification [en_US]
dc.subject: Multi-label Classification [en_US]
dc.subject: Sparse Max-Margin Multiclass Classifier Design [en_US]
dc.subject: Sparse Max-Margin Classifiers [en_US]
dc.subject: Sparse Max-Margin Multi-label Classifier Design [en_US]
dc.subject: Support Vector Machine (SVM) [en_US]
dc.subject: Sparse Classifiers [en_US]
dc.subject.classification: Computer Science [en_US]
dc.title: Sparse Multiclass And Multi-Label Classifier Design For Faster Inference [en_US]
dc.type: Thesis [en_US]
dc.degree.name: MSc Engg [en_US]
dc.degree.level: Masters [en_US]
dc.degree.discipline: Faculty of Engineering [en_US]
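The abstract above describes classifiers in which all k classes share a common pool of at most dmax basis vectors, so that scoring one test example costs O(k dmax). The sketch below illustrates that shared-basis inference step only; it is not the thesis's formulation or code, and the class and function names, the RBF kernel choice, and the randomly generated parameters are illustrative assumptions.

import numpy as np

def rbf_kernel(x, b, gamma=1.0):
    """RBF kernel between a test point x and one basis vector b (assumed kernel choice)."""
    return np.exp(-gamma * np.sum((x - b) ** 2))

class SharedBasisClassifier:
    """Hypothetical multiclass classifier with basis vectors shared across all classes."""

    def __init__(self, basis_vectors, weights):
        # basis_vectors: (d_max, n_features) -- one common pool for every class
        # weights:       (k, d_max)          -- one weight row per class
        self.basis_vectors = basis_vectors
        self.weights = weights

    def predict(self, x):
        # d_max kernel evaluations, shared by every class ...
        kvec = np.array([rbf_kernel(x, b) for b in self.basis_vectors])
        # ... followed by k * d_max multiply-adds: O(k * d_max) inference cost.
        scores = self.weights @ kvec
        return int(np.argmax(scores))

# Toy usage with random parameters (d_max = 5 basis vectors, k = 4 classes).
rng = np.random.default_rng(0)
clf = SharedBasisClassifier(rng.normal(size=(5, 3)), rng.normal(size=(4, 5)))
print(clf.predict(rng.normal(size=3)))  # prints a class index in {0, ..., 3}

Because the dmax kernel evaluations are shared by every class, shrinking dmax cuts both the kernel computations and the k x dmax score accumulations, which is where the reported reduction in inference time comes from.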

