Improved Derivative-Based Regularizations for Imaging Inverse Problems
Abstract
Images undergo degradation during acquisition due to physical limitations
inherent in the capturing devices. Addressing this degradation and recovering
high-quality images constitute the image recovery problem, a crucial concern
with diverse applications across fields such as biology, astronomy, and
medicine, where enhancing the resolution of captured images significantly
influences the science. Examples of this challenge include reconstructing
computed tomography images, magnetic resonance imaging, image deconvolution,
and microscopic image reconstruction.
Image recovery is frequently approached using regularization techniques, with
derivative-based regularizations being popular because they exploit image
smoothness and yield interpretable results free of artifacts.
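As a concrete illustration (with notation assumed for this abstract), such
methods estimate the image u from measurements b = Au + n, where A models the
forward (degradation) process and n is noise, by solving

    min_u (1/2) ||Au - b||_2^2 + lambda R(u),

with R the regularization functional and lambda > 0 balancing data fidelity
against regularity.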
Total Variation (TV) regularization, proposed by Rudin, Osher, and Fatemi, is a
seminal approach to image recovery. TV penalizes the norm of the image
gradient, aggregated over all pixel locations. Because TV drives the derivative
norm toward small values, it favors piece-wise constant solutions, producing
what is known as the "staircase effect."
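In a common discrete form (a sketch; D denotes a discrete gradient operator and
k indexes pixel locations),

    TV(u) = sum_k ||(Du)_k||_2.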
To mitigate the staircase effect, the Hessian-Schatten norm (HSN)
regularization employs second-order derivatives: it sums, across all pixels,
the lp norm of the eigenvalues of the image Hessian. HSN demonstrates superior
structure-preserving properties compared to TV; however, its solutions tend to
be overly smoothed.
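Concretely (again a sketch, with H_k u denoting the Hessian of u at pixel k and
lambda_1, lambda_2 its eigenvalues),

    HSN_p(u) = sum_k ( |lambda_1(H_k u)|^p + |lambda_2(H_k u)|^p )^(1/p),

i.e., the Schatten p-norm of the Hessian accumulated over the image.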
To address this over-smoothing, we introduce a non-convex shrinkage penalty
applied to the Hessian's eigenvalues in place of the convex lp norm.
Importantly, the shrinkage penalty is not defined in closed form, but is
specified indirectly through its proximal operator. This makes constructing a
provably convergent algorithm difficult, since the singular values are
themselves defined through a non-linear operation. Nevertheless, we derive a
provably convergent algorithm built on proximal operations, proving convergence
by establishing that the proposed regularization satisfies restricted proximal
regularity. Images recovered with this regularization are sharper than those
obtained with its convex counterparts.
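To illustrate the idea, one standard example of a prox-defined non-convex
shrinkage (not necessarily the exact penalty used in this work) is p-shrinkage,

    s_{p,lambda}(x) = sign(x) max( |x| - lambda^(2-p) |x|^(p-1), 0 ),

which reduces to soft thresholding at p = 1 and, for p < 1, shrinks large
values less aggressively, the behavior that yields sharper reconstructions.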
In subsequent work, we extend the Hessian-Schatten norm itself: by combining
Schatten norms of the Hessian with a smoothness constraint, we broaden its
scope. The resulting regularization can be derived as a Lagrange dual of the
Hessian-Schatten norm, akin to the construction of total generalized variation,
and it generalizes first- and second-order TV (TV-1, TV-2), HSN, and
second-order Total Generalized Variation. Furthermore, we present an efficient
variable splitting scheme for solving the associated image restoration
problems.
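As a sketch of the variable splitting idea in generic form (not the specific
scheme of this work), a problem min_u f(u) + g(Lu), with L a derivative
operator, is rewritten as

    min_{u,z} f(u) + g(z)   subject to   z = Lu,

so that alternating updates treat the data term f and the regularizer g in
separate, simpler sub-problems.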
Total Generalized Variation (TGV) is an important generalization of Total
Variation. TGV combines multiple orders of derivatives, and higher-order TGV
leads to improved recovered image quality, as validated through numerical
experiments in image denoising.
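For reference, the widely used second-order TGV of Bredies, Kunisch, and Pock
can be written in discrete form (notation assumed) as

    TGV_alpha^2(u) = min_v  alpha_1 sum_k ||(Du)_k - v_k|| + alpha_0 sum_k ||(Ev)_k||,

where D is a discrete gradient, E is a symmetrized derivative acting on the
auxiliary vector field v, and alpha_0, alpha_1 > 0. Each additional order
introduces a further auxiliary field.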
Consequently, there is a need for an algorithm capable of solving TGV
regularization of any order. While various methods address TGV regularization,
most are confined to second-order TGV, and only a few explore orders greater
than three. To our knowledge, no existing algorithm solves image recovery
problems with TGV regularization of order exceeding three under a general
forward model; the difficulty stems from the intricate nature of the TGV
representation. We surmount this obstacle by presenting two simple matrix-based
representations of TGV, the direct and compact forms, and we prove that both
are equivalent to the original TGV definition. Leveraging the compact
representation, we propose a generalized ADMM-based algorithm that solves TGV
regularization of any order.
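As a sketch of the underlying ADMM template in its generic scaled form (not the
specific updates of this work), for min_{x,z} f(x) + g(z) subject to
Ax + Bz = c, the iterations are

    x^{k+1} = argmin_x  f(x) + (rho/2) ||Ax + Bz^k - c + u^k||_2^2,
    z^{k+1} = argmin_z  g(z) + (rho/2) ||Ax^{k+1} + Bz - c + u^k||_2^2,
    u^{k+1} = u^k + Ax^{k+1} + Bz^{k+1} - c,

with penalty parameter rho > 0 and scaled dual variable u.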