Some information theoretic and signal detection problems in the context of nearly Gaussian densities
Abstract
It is well known that the use of the Gaussian assumption
when the governing probability density function is only nearly
Gaussian causes serious errors in estimation and detection theory.
Defining a nearly Gaussian pdf f(x) to be of the form f(x) =
ᾱg(x) + αh(x), where 0 < α < 1, ᾱ + α = 1, g(x) is a pure Gaussian
p.d.f. and h(x) some unknown p.d.f., we attempt to investigate the
Gaussian assumption in Information Theory. The entropy of a random
variable with such a continuous density f(x) is maximised over all
densities h(x) which have specified variance. This result is used
in determining the behaviour of the capacity of an additive Gaussian
channel whose input is from a nearly Gaussian source, and the Rate
Distortion function of such a source. It is shown numerically that
both of these show a significant decrease from the pure Gaussian
case. A robust encoding scheme for discrete sources with inaccurately
known probabilities of the form ᾱp_i + αq_i, i = 1, ..., N,
where p_i and q_i are probabilities, is presented, and investigations
of some other questions in Information Theory are made.
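As a point of reference for the decreases noted above, the pure Gaussian baselines are the textbook closed forms C = ½log₂(1 + P/N) for the additive Gaussian channel and R(D) = ½log₂(σ²/D) for a Gaussian source under squared-error distortion. A minimal sketch of these baselines (the numeric values of P, N, σ² and D are purely illustrative, not taken from this work):

```python
import math

def gaussian_capacity(P, N):
    """Capacity (bits per use) of an additive Gaussian channel:
    signal power P, noise power N."""
    return 0.5 * math.log2(1.0 + P / N)

def gaussian_rate_distortion(var, D):
    """R(D) (bits per sample) of a Gaussian source with variance var
    under mean-squared-error distortion D."""
    return 0.5 * math.log2(var / D) if D < var else 0.0

# Illustrative numbers only.
print(gaussian_capacity(P=3.0, N=1.0))           # 1.0 bit per channel use
print(gaussian_rate_distortion(var=4.0, D=1.0))  # 1.0 bit per sample
```

The abstract's numerical results compare the nearly Gaussian case against these closed forms.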
The Fisher Information of a continuous nearly Gaussian
density is minimised over all pdfs h(x) which have a specified
variance.
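For a location parameter, the Fisher information functional in question is I(f) = ∫ (f′(x))²/f(x) dx, which equals 1/σ² for a pure Gaussian of variance σ². A hedged numerical sketch (trapezoidal quadrature and central differences; the contaminating h is taken as a wider Gaussian purely for illustration, not as the minimising h of this work):

```python
import math

def gauss_pdf(x, var):
    """Zero-mean Gaussian density with variance var."""
    return math.exp(-x * x / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def fisher_information(f, lo=-30.0, hi=30.0, n=60001):
    """I(f) = integral of f'(x)^2 / f(x) dx, via central differences
    for f' and the trapezoidal rule for the integral."""
    h = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        x = lo + i * h
        fx = f(x)
        if fx < 1e-300:          # skip the numerically negligible tails
            continue
        d = (f(x + 1e-5) - f(x - 1e-5)) / 2e-5
        w = 0.5 if i in (0, n - 1) else 1.0
        total += w * d * d / fx
    return total * h

# Pure Gaussian, variance 1: I(f) = 1.
print(fisher_information(lambda x: gauss_pdf(x, 1.0)))   # ≈ 1.0

# Nearly Gaussian mixture (α = 0.1, contaminant variance 9; illustrative only).
alpha = 0.1
f_mix = lambda x: (1 - alpha) * gauss_pdf(x, 1.0) + alpha * gauss_pdf(x, 9.0)
print(fisher_information(f_mix))
```

By convexity of the functional, the mixture's Fisher information falls below the pure Gaussian value of 1, consistent with the minimisation described above.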
This result is used in the problem of detecting a constant
signal in nearly Gaussian noise. A non-linearity is obtained which
is shown to provide an improvement over a known result in robust
detection.
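For context, the locally optimum detector for a weak constant signal in noise of density f passes each observation through the score non-linearity g(x) = −f′(x)/f(x); for pure Gaussian noise this is linear, while a nearly Gaussian density suppresses it in the tails. A minimal sketch (the mixture parameters are illustrative, not the optimised non-linearity of this work):

```python
import math

def gauss_pdf(x, var):
    """Zero-mean Gaussian density with variance var."""
    return math.exp(-x * x / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def score(f, x, eps=1e-6):
    """Locally optimum nonlinearity g(x) = -f'(x)/f(x),
    with f' approximated by a central difference."""
    return -(f(x + eps) - f(x - eps)) / (2.0 * eps * f(x))

# Nearly Gaussian noise: α = 0.1 contamination by a wider Gaussian.
alpha = 0.1
f_mix = lambda x: (1 - alpha) * gauss_pdf(x, 1.0) + alpha * gauss_pdf(x, 9.0)

# Pure Gaussian noise (variance 1): the score is the linear detector g(x) = x.
print(score(lambda x: gauss_pdf(x, 1.0), 2.0))   # ≈ 2.0
# Nearly Gaussian noise: the nonlinearity is suppressed in the tails.
print(score(f_mix, 2.0), score(f_mix, 6.0))
```

The tail suppression is what makes such detectors robust: large observations, likely due to the contaminating component, are down-weighted relative to the linear Gaussian detector.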

