
dc.contributor.advisor: Sathiya Keerthi, S ; Narasimha Murty, M
dc.contributor.author: Shevade, S K
dc.date.accessioned: 2025-10-07T10:51:54Z
dc.date.available: 2025-10-07T10:51:54Z
dc.date.submitted: 2001
dc.identifier.uri: https://etd.iisc.ac.in/handle/2005/7147
dc.description.abstract: This thesis proposes efficient algorithms for training Support Vector Machines (SVMs) on large-scale datasets, with emphasis on the regression problem. Training a support vector machine leads to a convex quadratic programming problem, which is challenging to solve at large scale in real-world applications when the available memory is limited. An important source of inefficiency in the SMO (Sequential Minimal Optimization) algorithm for regression is removed by taking clues from the Karush-Kuhn-Tucker (KKT) conditions, and two modified versions of the original SMO algorithm for regression are devised. These modified algorithms scale well and perform significantly better than the original SMO algorithm. A proof of finite convergence for the modified SMO algorithms is given. An efficient implementation of the SMO algorithm for solving the ν-SVM regression problem is presented and evaluated on large real-world datasets. Extension of these ideas to the ν-SVM classification problem is discussed. The ν-SVM problem formulation for novelty detection is modified, and an efficient SMO algorithm for solving this problem is presented. It is compared with the modified Nearest Point Algorithm (NPA) used for finding the distance of a convex polytope from the origin. A new linear programming formulation of the regression problem is proposed, and it is empirically verified that the linear regularizer enforces sparseness in the SVM design. The special structure of this linear program is utilized to determine the values of the hyperparameters in the formulation in a systematic way.
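The abstract notes that SVM training reduces to a convex quadratic program and that SMO solves it by repeatedly optimizing a minimal working set of dual variables. As a rough illustration only (not the thesis's modified algorithms), the sketch below applies an SMO-style pairwise update to the SVR dual in the variables β_i = α_i − α_i*, with the ε-insensitive tube width set to 0 to keep the update smooth; the linear kernel, the toy data, and the value of C are all assumptions made for this example.

```python
import numpy as np

def smo_svr_sketch(X, y, C=1e3, tol=1e-8, max_passes=200):
    """Toy SMO-style pairwise ascent on the SVR dual with epsilon = 0
    (a simplification; the thesis treats the full epsilon-insensitive case).
    Variables beta_i = alpha_i - alpha_i* satisfy sum(beta) = 0 and
    |beta_i| <= C, so each step moves a pair (i, j) in opposite directions,
    which preserves the equality constraint."""
    n = len(y)
    K = X @ X.T                       # linear kernel matrix (an assumption)
    beta = np.zeros(n)
    for _ in range(max_passes):
        moved = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                eta = K[i, i] + K[j, j] - 2.0 * K[i, j]   # curvature along e_i - e_j
                if eta <= 0:
                    continue
                # gradient of the dual objective along the feasible direction
                g = (y[i] - y[j]) - (K[i] - K[j]) @ beta
                delta = g / eta
                # clip so both beta_i and beta_j stay inside [-C, C]
                delta = np.clip(delta,
                                max(-C - beta[i], beta[j] - C),
                                min(C - beta[i], beta[j] + C))
                beta[i] += delta
                beta[j] -= delta
                moved += abs(delta)
        if moved < tol:               # no pair moved: KKT-style stopping check
            break
    b = np.mean(y - K @ beta)         # offset from stationarity of the dual
    return beta, b

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])    # y = 2x + 1
beta, b = smo_svr_sketch(X, y)
pred = (X @ X.T) @ beta + b           # recovers y = 2x + 1 on the training points
```

With ε = 0 and a large C the dual optimum interpolates the data, so the fitted values match y; the full ε-insensitive objective studied in the thesis instead yields a sparse β, with many examples falling strictly inside the tube.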
dc.language.iso: en_US
dc.relation.ispartofseries: T05070
dc.rights: I grant Indian Institute of Science the right to archive and to make available my thesis or dissertation in whole or in part in all forms of media, now or hereafter known. I retain all proprietary rights, such as patent rights. I also retain the right to use in future works (such as articles or books) all or part of this thesis or dissertation.
dc.subject: Support Vector Machines (SVMs)
dc.subject: Sequential Minimal Optimization (SMO)
dc.subject: Regression and Classification
dc.title: Some efficient algorithms for support vector machines
dc.type: Thesis
dc.degree.level: PhD
dc.degree.level: Doctoral
dc.degree.grantor: Indian Institute of Science
dc.degree.discipline: Engineering

