Show simple item record

dc.contributor.advisor: Murthy, Chandra R
dc.contributor.author: Sharma, Abhay
dc.date.accessioned: 2017-07-12T11:37:40Z
dc.date.accessioned: 2018-07-31T04:48:50Z
dc.date.available: 2017-07-12T11:37:40Z
dc.date.available: 2018-07-31T04:48:50Z
dc.date.issued: 2017-07-12
dc.date.submitted: 2015
dc.identifier.uri: https://etd.iisc.ac.in/handle/2005/2645
dc.identifier.abstract: http://etd.iisc.ac.in/static/etd/abstracts/3449/G26729-Abs.pdf
dc.description.abstract: Consider a large population containing a small number of defective items. A commonly encountered goal is to identify the defective items, for example, in order to isolate them. In the classical non-adaptive group testing (NAGT) approach, one groups the items into subsets, or pools, and tests each pool for the presence of a defective item. Using the outcomes of the tests, a fundamental goal of group testing is to reliably identify the complete set of defective items with as few tests as possible. In contrast, this thesis studies a non-defective subset identification problem, where the primary goal is to identify a "subset" of "non-defective" items given the test outcomes. The main contributions of this thesis are:

We derive upper and lower bounds on the number of non-adaptive group tests required to identify a given number of non-defective items with arbitrarily small probability of incorrect identification as the population size goes to infinity. We show that an impressive reduction in the number of tests is achievable compared to the approach of first identifying all the defective items and then picking the required number of non-defective items from the complement set. For example, in the asymptotic regime with the population size N → ∞, to identify L non-defective items out of a population containing K defective items, when the tests are reliable, our results show that O(K log K · L/N) measurements are sufficient when L ≪ N − K and K is fixed. In contrast, the number of tests required by the conventional approach grows with N as O(K log K · log(N/K)). Our results are derived using a general sparse signal model, by virtue of which they are also applicable to other important sparse-signal-based applications such as compressive sensing.

We present a bouquet of computationally efficient and analytically tractable non-defective subset recovery algorithms.
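The pooling model just described can be sketched in a few lines. This is a minimal noiseless illustration (the Bernoulli pooling design and all sizes are assumptions chosen for the example, not the constructions analyzed in the thesis); it shows why negative pools directly certify non-defective items:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, T = 200, 5, 40                      # population, defectives, pools
defective = rng.choice(N, size=K, replace=False)

# Random pooling design: A[t, i] = True if item i is placed in pool t.
A = rng.random((T, N)) < 0.1
# Noiseless NAGT outcome: a pool tests positive iff it holds a defective.
y = A[:, defective].any(axis=1)

# Key observation for non-defective subset identification: any item that
# appears in at least one *negative* pool is certainly non-defective.
non_defective = np.flatnonzero(A[~y].any(axis=0))
```

With reliable tests, this certificate is exact: no defective item can ever appear in a negative pool, so `non_defective` and `defective` are always disjoint.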
By analyzing the probability of error of the algorithms, we obtain bounds on the number of tests required for non-defective subset recovery with arbitrarily small probability of error. By comparing with the information-theoretic lower bounds, we show that the upper bounds on the number of tests are order-wise tight up to a log K factor, where K is the number of defective items. Our analysis accounts for the impact of both additive noise (false positives) and dilution noise (false negatives). We also provide extensive simulation results that compare the relative performance of the different algorithms and provide further insights into their practical utility. The proposed algorithms significantly outperform the straightforward approaches of testing items one by one, and of first identifying the defective set and then choosing the non-defective items from the complement set, in terms of the number of measurements required to ensure a given success rate.

We investigate the use of adaptive group testing to find a spectrum hole of a specified bandwidth in a given wideband of interest. We propose a group-testing-based spectrum hole search algorithm that exploits the sparsity of the primary spectral occupancy by testing a group of adjacent sub-bands in a single test. This is enabled by a simple and easily implementable sub-Nyquist sampling scheme for signal acquisition by the cognitive radios. Energy-based hypothesis tests provide an occupancy decision over the group of sub-bands, and this forms the basis of the proposed algorithm to find contiguous spectrum holes of a specified bandwidth. We extend this framework to a multistage sensing algorithm that can be employed in a variety of spectrum sensing scenarios, including non-contiguous spectrum hole search.
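The energy-based group test over adjacent sub-bands can be caricatured as follows. The white-Gaussian signal model, the SNR, and the threshold are illustrative assumptions for the sketch, not the sub-Nyquist acquisition scheme developed in the thesis:

```python
import numpy as np

rng = np.random.default_rng(2)
n_subbands, n_samples, snr = 8, 1024, 10.0

# Hypothetical occupancy map: sub-bands 2 and 5 carry primary signals.
occupied = np.zeros(n_subbands, dtype=bool)
occupied[[2, 5]] = True

def group_energy_test(group, threshold):
    """Declare a group of adjacent sub-bands occupied if the average
    per-sample energy across the group exceeds the threshold."""
    noise = rng.standard_normal((len(group), n_samples))
    signal = np.where(occupied[group][:, None],
                      np.sqrt(snr) * rng.standard_normal((len(group), n_samples)),
                      0.0)
    energy = ((noise + signal) ** 2).mean()
    return bool(energy > threshold)

# An all-idle group averages unit energy; a group containing an occupied
# sub-band averages roughly 1 + snr/len(group), so a threshold between
# the two separates the hypotheses with high probability.
busy_group = group_energy_test([1, 2, 3], threshold=2.0)   # contains sub-band 2
idle_group = group_energy_test([6, 7], threshold=2.0)      # all idle
```

One negative group test clears several sub-bands at once, which is the source of the reduction in detection delay when occupancy is sparse.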
Our analysis allows one to identify the sparsity and SNR regimes where group testing leads to significantly lower detection delays compared to a conventional bin-by-bin energy detection scheme. We illustrate the performance of the proposed algorithms via Monte Carlo simulations.
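In the same Monte Carlo spirit, a simple ranking heuristic shows how non-defective items can be declared under the additive-noise model mentioned above, by favoring items that appear in few positive pools. This is an illustrative stand-in, not one of the thesis's recovery algorithms:

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, T, L = 200, 5, 100, 10   # population, defectives, tests, items wanted
defective = rng.choice(N, size=K, replace=False)
A = rng.random((T, N)) < 0.1
y = A[:, defective].any(axis=1)

# Additive noise: an idle pool may spuriously test positive.
y |= rng.random(T) < 0.05

# Score each item by how many positive pools it joined. A defective item
# turns every pool it joins positive, so it collects a high score, while
# many non-defective items land mostly in negative pools.
score = A[y].sum(axis=0)
declared = np.argsort(score)[:L]          # the L least-suspicious items
```

Unlike the noiseless negative-pool certificate, this decision can err, which is why the number of tests needed for a target error probability is the quantity of interest.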
dc.language.iso: en_US
dc.relation.ispartofseries: G26729
dc.subject: Non-Adaptive Group Testing (NAGT)
dc.subject: Acoustic Noise
dc.subject: Noisy Nonadaptive Group Testing
dc.subject: Non-Defective Subset Identification
dc.subject: Non-Defective Subset Recovery Algorithms
dc.subject: Cognitive Radios
dc.subject: Group Testing Spectrum Hole Search Algorithm
dc.subject: Additive Noise
dc.subject: Dilution Noise
dc.subject: Approximation Algorithms
dc.subject: Healthy Subset Identification
dc.subject: Data Streaming
dc.subject.classification: Communication Engineering
dc.title: Finding A Subset Of Non-defective Items From A Large Population: Fundamental Limits And Efficient Algorithms
dc.type: Thesis
dc.degree.name: PhD
dc.degree.level: Doctoral
dc.degree.discipline: Faculty of Engineering



