Bayesian Techniques for Joint Sparse Signal Recovery: Theory and Algorithms
Abstract
This thesis contributes new theoretical results, solution concepts, and algorithms concerning the Bayesian
recovery of multiple joint sparse vectors from noisy and underdetermined linear measurements.
The thesis is written in two parts. The first part focuses on recovering the nonzero support of multiple
joint sparse vectors from their linear compressive measurements, an important canonical problem in multisensor
signal processing. The support recovery performance of a well-known Bayesian inference technique
called Multiple Sparse Bayesian Learning (MSBL) is analyzed using tools from large deviation theory.
Improved sufficient conditions are derived for perfect support recovery by MSBL with arbitrarily high
probability.
We show that the support error probability in MSBL decays exponentially fast with the number
of joint sparse vectors, and that the rate of decay depends on the restricted eigenvalues and null space structure
of the self Khatri-Rao product of the sensing matrix used to generate the measurements. New insights into
MSBL’s objective are developed that enhance our understanding of MSBL’s ability to recover supports of
size greater than the number of measurements available per joint sparse vector. These insights are
formalized into a novel covariance matching framework for sparsity pattern recovery.
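For concreteness, the MSBL cost function underlying this covariance matching viewpoint takes the following standard Type-II maximum likelihood form (the notation here follows the common MSBL literature rather than the thesis itself):

```latex
% MSBL minimizes the negative log marginal likelihood of the measurements
% Y = [y_1, ..., y_L] under a Gaussian prior with per-row variances
% gamma_i, collected in Gamma = diag(gamma):
\mathcal{L}(\boldsymbol{\gamma})
  = \log\left|\sigma^{2}\mathbf{I} + \mathbf{A}\boldsymbol{\Gamma}\mathbf{A}^{\mathsf{T}}\right|
  + \operatorname{tr}\!\left(\left(\sigma^{2}\mathbf{I}
      + \mathbf{A}\boldsymbol{\Gamma}\mathbf{A}^{\mathsf{T}}\right)^{-1}
      \widehat{\mathbf{R}}_{\mathbf{y}}\right),
\qquad
\widehat{\mathbf{R}}_{\mathbf{y}} = \frac{1}{L}\,\mathbf{Y}\mathbf{Y}^{\mathsf{T}}.
```

Minimizing this cost can be read as matching the parameterized model covariance \(\sigma^{2}\mathbf{I} + \mathbf{A}\boldsymbol{\Gamma}\mathbf{A}^{\mathsf{T}}\) to the sample covariance \(\widehat{\mathbf{R}}_{\mathbf{y}}\), which is the covariance matching interpretation referred to above.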
Next, we characterize the restricted isometry property of a generic Khatri-Rao product matrix in terms of
its restricted isometry constants (RICs). Upper bounds for the RICs of Khatri-Rao product matrices are of
independent interest as they feature in the sample complexity analysis of several linear inverse problems
of fundamental importance, including the above support recovery problem. We derive deterministic and
probabilistic upper bounds for the RICs of the Khatri-Rao product of two matrices. The newly obtained
RIC bounds are then used to derive performance guarantees for MSBL based support recovery.
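As a small illustration of the quantity being bounded, the NumPy sketch below (helper names are illustrative, not from the thesis) forms the column-wise Khatri-Rao product of a matrix with itself and brute-forces an empirical restricted isometry constant for small sparsity levels:

```python
import itertools
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker (Khatri-Rao) product of A (m x n) and B (p x n)."""
    m, n = A.shape
    p, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(m * p, n)

def empirical_ric(M, k):
    """Brute-force RIC estimate delta_k for a matrix with unit-norm columns:
    the largest eigenvalue deviation of M_S^T M_S from the identity,
    maximized over all k-column submatrices M_S."""
    n = M.shape[1]
    delta = 0.0
    for S in itertools.combinations(range(n), k):
        eigs = np.linalg.eigvalsh(M[:, S].T @ M[:, S])
        delta = max(delta, abs(eigs[0] - 1.0), abs(eigs[-1] - 1.0))
    return delta

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 8))
KR = khatri_rao(A, A)                  # self Khatri-Rao product, 36 x 8
KR /= np.linalg.norm(KR, axis=0)       # unit-norm columns, so delta_1 = 0
```

The exhaustive search is only feasible for toy sizes, which is precisely why analytical RIC upper bounds of the kind derived in this part are needed.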
Building upon the new insights about MSBL, a novel covariance matching based support recovery
algorithm is conceived. It uses a Rényi divergence objective which reduces to the MSBL objective in a special
case. We show that the Rényi divergence objective can be expressed as a difference of two submodular set
functions, and hence it can be optimized via an iterative majorization-minimization procedure to generate
the support estimate. The resulting algorithm is empirically shown to be several times faster than existing
support recovery methods while offering comparable performance.
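The majorization-minimization step can be illustrated on a toy difference-of-submodular problem. The sketch below (the functions and weights are illustrative, not the thesis's Rényi divergence objective) replaces the subtracted submodular function by a permutation-based modular lower bound that is tight at the current support estimate, so each iteration cannot increase the objective:

```python
import itertools
import math

V = [0, 1, 2, 3]                        # toy ground set of support candidates
wg = {0: 1.0, 1: 0.4, 2: 0.9, 3: 0.3}   # illustrative per-element costs
wh = {0: 4.0, 1: 1.0, 2: 3.0, 3: 0.5}   # illustrative reward weights

def g(S):   # modular cost (trivially submodular)
    return sum(wg[e] for e in S)

def h(S):   # submodular reward: concave function of a modular sum
    return math.sqrt(sum(wh[e] for e in S))

def modular_lower_bound(h, order):
    """Modular lower bound of a (normalized) submodular h built from a
    permutation of V; it is tight on every prefix of the permutation."""
    m, chain, prev = {}, set(), h(set())
    for e in order:
        chain.add(e)
        cur = h(chain)
        m[e] = cur - prev
        prev = cur
    return m

def subsets(V):
    for k in range(len(V) + 1):
        yield from (set(c) for c in itertools.combinations(V, k))

S = set()                               # current support estimate
for _ in range(10):
    # order the current estimate first so the bound is tight at S
    order = sorted(S) + [e for e in V if e not in S]
    m = modular_lower_bound(h, order)
    # surrogate g(T) - m(T) >= g(T) - h(T); minimize it exactly (toy size)
    S_next = min(subsets(V), key=lambda T: g(T) - sum(m[e] for e in T))
    if S_next == S:
        break
    S = S_next
```

Because the surrogate upper-bounds the true objective and touches it at the current iterate, the objective value is nonincreasing across iterations; in the thesis's algorithm this exact-minimization inner step is replaced by a scalable procedure over the candidate supports.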
The second part of the thesis focuses on developing decentralized extensions of MSBL for in-network
estimation of multiple joint sparse vectors from linear compressive measurements using a network of nodes.
A common issue in implementing decentralized algorithms is the high cost of exchanging information
between the network nodes. To mitigate this problem, we examine two different approaches
to reduce the amount of inter-node communication in the network. In the first decentralized extension of
MSBL, the network nodes exchange information only via a small set of predesignated bridge nodes. For
this bridge node based network topology, the MSBL optimization is then performed using decentralized
Alternating Direction Method of Multipliers (ADMM). The convergence of decentralized ADMM in a bridge
node based network topology is analyzed separately for a generic consensus optimization problem, and a linear rate
of convergence is established. Our second decentralized extension of MSBL reduces the communication
complexity by adaptively censoring the information exchanged between the network nodes,
exploiting the inherent sparsity of the exchanged information. The performance of the proposed
decentralized schemes is evaluated using both simulated and real-world data.
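The consensus formulation underlying both schemes can be sketched with a toy global-consensus ADMM on a scalar averaging problem (this is the textbook variant with a single fusion step, not the bridge-node topology analyzed in the thesis):

```python
import numpy as np

# Each node i holds private data a_i and a local copy x_i of the shared
# variable; consensus ADMM drives every x_i to the minimizer of
# sum_i (x - a_i)^2 / 2, i.e. the network-wide average of the a_i.
a = np.array([1.0, 4.0, -2.0, 7.0, 0.5])   # illustrative per-node data
n = a.size
rho = 1.0                                   # ADMM penalty parameter
x = np.zeros(n)                             # local primal variables
u = np.zeros(n)                             # scaled dual variables
z = 0.0                                     # global consensus variable

for _ in range(200):
    # local step at node i: argmin_x (x - a_i)^2/2 + (rho/2)(x - z + u_i)^2
    x = (a + rho * (z - u)) / (1.0 + rho)
    # fusion step: gather the local variables (in the thesis's first scheme,
    # this exchange is routed through the designated bridge nodes)
    z = np.mean(x + u)
    # dual update, kept local at each node
    u = u + x - z
```

For this quadratic problem the iterates converge linearly to consensus on the average of the node data, mirroring the linear convergence rate established for the bridge-node topology.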