Show simple item record

dc.contributor.advisor  Tyagi, Himanshu
dc.contributor.author  Jha, Shubham Kumar
dc.date.accessioned  2026-01-20T08:43:50Z
dc.date.available  2026-01-20T08:43:50Z
dc.date.submitted  2025
dc.identifier.uri  https://etd.iisc.ac.in/handle/2005/8277
dc.description.abstract  This thesis systematically studies the role of side-information in problems arising in distributed quantization and optimization. In the first part of the thesis, we study a universal version of the classic Wyner-Ziv problem from information theory, where the side-information is a noisy version of the sender's Gaussian observations with noise variance unknown to the sender. For this problem, we develop a practical quantization scheme based on Polar codes that is universally rate-optimal for all values of the unknown noise variance. In the second part of the thesis, we study the efficacy of the Wyner-Ziv compression strategy for gradient compression in distributed first-order smooth optimization. We begin by establishing an information-theoretic lower bound on optimization accuracy when only finite-precision gradients are used. We then develop gradient quantizers that match this lower bound when used with standard first-order optimization algorithms. Finally, we study the fundamental limits of distributed optimization over wireless channels. We begin with a setting where a server optimizes an objective function using a single client, which supplies gradient estimates over a white Gaussian channel. We provide fundamental limits on optimization accuracy when only a finite number of channel uses is permitted. We then extend our study to the multiple-client setting, where all clients communicate their gradient estimates over a multiple-access channel. For this case, too, we establish an information-theoretic lower bound on optimization accuracy for a given number of channel uses and develop a computationally tractable communication scheme that optimally adapts to the channel conditions. Our scheme constructs side-information by exploiting the closeness among the clients' distributions and leverages it to enhance overall performance.  en_US
dc.description.sponsorship  PMRF  en_US
dc.language.iso  en_US  en_US
dc.relation.ispartofseries  ;ET01245
dc.rights  I grant Indian Institute of Science the right to archive and to make available my thesis or dissertation in whole or in part in all forms of media, now or hereafter known. I retain all proprietary rights, such as patent rights. I also retain the right to use in future works (such as articles or books) all or part of this thesis or dissertation.  en_US
dc.subject  Distributed Systems  en_US
dc.subject  Optimization  en_US
dc.subject  Wireless Federated Learning  en_US
dc.subject  Multi-agent learning  en_US
dc.subject  Information Theory  en_US
dc.subject  Multi-terminal source coding  en_US
dc.subject  Wyner-Ziv problem  en_US
dc.subject.classification  Research Subject Categories::INTERDISCIPLINARY RESEARCH AREAS  en_US
dc.title  Quantization using Side-Information for Distributed Optimization: Theory and Practice  en_US
dc.type  Thesis  en_US
dc.degree.name  PhD  en_US
dc.degree.level  Doctoral  en_US
dc.degree.grantor  Indian Institute of Science  en_US
dc.degree.discipline  Engineering  en_US

