Show simple item record

dc.contributor.advisor: Bhattacharyya, Chiranjib
dc.contributor.author: Jagarlapudi, Saketha Nath
dc.date.accessioned: 2010-07-08T06:48:22Z
dc.date.accessioned: 2018-07-31T04:39:49Z
dc.date.available: 2010-07-08T06:48:22Z
dc.date.available: 2018-07-31T04:39:49Z
dc.date.issued: 2010-07-08
dc.date.submitted: 2008
dc.identifier.uri: https://etd.iisc.ac.in/handle/2005/733
dc.description.abstract: This thesis explores Chance-Constrained Programming (CCP) in the context of learning. It is shown that chance-constrained approaches lead to improved algorithms for three important learning problems: classification with specified error rates, large-dataset classification, and Ordinal Regression (OR). Using moments of the training data, the CCPs are posed as Second Order Cone Programs (SOCPs), and novel iterative algorithms for solving the resulting SOCPs are derived. Borrowing ideas from robust optimization theory, the proposed formulations are made robust to moment-estimation errors. A maximum-margin classifier with specified false-positive and false-negative rates is derived. The key idea is to employ a chance constraint for each class which implies that the actual misclassification rate does not exceed the specified one. The formulation is applied to the case of biased classification. The problems of large-dataset classification and ordinal regression are addressed by deriving formulations which employ chance constraints for clusters in the training data rather than a constraint for each data point. Since the number of clusters can be substantially smaller than the number of data points, the resulting formulations have far fewer variables and inequalities and hence scale well to large datasets. The scalable classification and OR formulations are extended to feature spaces, and the kernelized duals turn out to be instances of SOCPs with a single cone constraint. Exploiting this special structure, fast iterative solvers that outperform generic SOCP solvers are proposed. Compared to state-of-the-art learners, the proposed algorithms achieve speed-ups as high as 10000 times when the specialized SOCP solvers are employed. The proposed formulations involve second-order moments of the data and are hence susceptible to moment-estimation errors. A generic way of making the formulations robust to such estimation errors is illustrated. Two novel confidence sets for the moments are derived, and it is shown that when either confidence set is employed, the robust formulations again yield SOCPs. (en_US)
dc.language.iso: en_US (en_US)
dc.relation.ispartofseries: G22339 (en_US)
dc.subject: Machine Learning (en_US)
dc.subject: Classification (en_US)
dc.subject: Dataset Classification (en_US)
dc.subject: Ordinal Regression (OR) (en_US)
dc.subject: Chance-Constrained Programming (CCP) (en_US)
dc.subject: Classification - Algorithms (en_US)
dc.subject: Ordinal Regression - Algorithms (en_US)
dc.subject: Machine Learning - Algorithms (en_US)
dc.subject: Second Order Cone Programs (SOCPs) (en_US)
dc.subject: Maximum Margin Classification (en_US)
dc.subject: Focused Crawling (en_US)
dc.subject: Large Datasets (en_US)
dc.subject: Error Rates (en_US)
dc.subject.classification: Computer Science (en_US)
dc.title: Learning Algorithms Using Chance-Constrained Programs (en_US)
dc.type: Thesis (en_US)
dc.degree.name: PhD (en_US)
dc.degree.level: Doctoral (en_US)
dc.degree.discipline: Faculty of Engineering (en_US)
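The abstract describes posing chance constraints as second-order cone constraints using moments of the data. A minimal numpy sketch of that reduction, assuming the standard multivariate Chebyshev bound (under which Pr(w·x ≥ b) ≥ η is implied by w·μ − b ≥ κ‖Σ^{1/2}w‖ with κ = √(η/(1−η))); the function name and toy numbers here are illustrative, not taken from the thesis:

```python
import numpy as np

def soc_margin(w, b, mu, sigma, eta):
    """Slack of the second-order cone constraint that certifies the
    chance constraint Pr(w.x >= b) >= eta under the Chebyshev bound:
        w.mu - b >= kappa * ||Sigma^{1/2} w||,  kappa = sqrt(eta / (1 - eta)).
    A positive return value means the chance constraint is certified."""
    kappa = np.sqrt(eta / (1.0 - eta))
    # Symmetric square root of the (positive semidefinite) covariance.
    vals, vecs = np.linalg.eigh(sigma)
    sqrt_sigma = vecs @ np.diag(np.sqrt(np.maximum(vals, 0.0))) @ vecs.T
    return float(w @ mu - b - kappa * np.linalg.norm(sqrt_sigma @ w))

# Toy class with moments estimated from training data (illustrative values).
mu = np.array([2.0, 1.0])
sigma = np.array([[0.5, 0.1], [0.1, 0.3]])
w = np.array([1.0, 1.0])

print(soc_margin(w, b=0.0, mu=mu, sigma=sigma, eta=0.80))  # positive: certified
print(soc_margin(w, b=0.0, mu=mu, sigma=sigma, eta=0.95))  # negative: not certified
```

Tightening η inflates κ, so the same hyperplane can satisfy the cone constraint at a loose error rate but violate it at a strict one; an SOCP solver would optimize w and b subject to such constraints, one per class (or per cluster, in the scalable formulations).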

