Edgeworth approximations of the Kullback-Leibler distance toward problems in image analysis (with J.-J. Lin and R. A. Levine), submitted for publication, 2001.
Evaluation of syntheses or simulated data is often done subjectively
through visual comparisons with the original samples. This subjective
evaluation is particularly dominant in the area of texture modeling
and simulation. In order to objectively evaluate the similarity
(or difference) between original samples and syntheses, we propose an
approximation for the Kullback-Leibler distance based on Edgeworth
expansions (EKLD). We use this approximation to study the sampling
distribution of the original and synthesized images. As part of our
development, we present numerical examples
to study the behavior of EKLD for sample mean distributions and
illustrate the advantages of our approach for evaluating the
differential entropy and choosing the least statistically dependent
basis from wavelet packet dictionaries. Finally, we show how EKLD can be
used in statistical image processing to validate synthetic
representations of images.
Keywords: Differential entropy, cumulants, least statistically
dependent basis, wavelet packet dictionary, image processing.
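As a rough illustration of the idea (not the paper's exact EKLD formula), the leading term of an Edgeworth-type approximation matches each distribution by its low-order cumulants; with only the first two cumulants (mean and variance), the KL distance reduces to the closed-form expression between two Gaussians. The sketch below, with hypothetical function names, computes that second-cumulant term from two samples; the paper's EKLD adds correction terms built from higher cumulants, which are omitted here.

```python
# Hedged sketch: only the Gaussian (second-cumulant) term of an
# Edgeworth-type KL approximation. Function names are illustrative,
# not taken from the paper.
import math
import statistics


def gaussian_kl(mu_p, var_p, mu_q, var_q):
    """Closed-form KL(N(mu_p, var_p) || N(mu_q, var_q)) in nats."""
    return (math.log(math.sqrt(var_q / var_p))
            + (var_p + (mu_p - mu_q) ** 2) / (2.0 * var_q)
            - 0.5)


def ekld_gaussian_term(sample_p, sample_q):
    """Estimate the second-cumulant (Gaussian) term from two samples.

    An Edgeworth expansion would correct this with terms built from
    third and fourth sample cumulants; those corrections are omitted.
    """
    mu_p, var_p = statistics.fmean(sample_p), statistics.variance(sample_p)
    mu_q, var_q = statistics.fmean(sample_q), statistics.variance(sample_q)
    return gaussian_kl(mu_p, var_p, mu_q, var_q)
```

For identical samples the term is zero, and it grows as the sample means or variances separate, which is the qualitative behavior one would compare against a subjective visual evaluation of a synthesis.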