Today I read a published paper titled “Guaranteed bounds on the Kullback-Leibler divergence of univariate mixtures using piecewise log-sum-exp inequalities”.
Some of the math was a bit obtuse, but the method looks useful: it could be applied to fingerprinting malware, or to tagging and labeling similar images via information divergence. The fact that this technique does not require Monte-Carlo simulation is significant.
The abstract is:
The Kullback-Leibler divergence between two mixture models is a core primitive in many signal processing tasks. Since the Kullback-Leibler divergence of mixtures does not admit a closed-form formula, it is in practice either estimated using costly Monte-Carlo stochastic integration or approximated using various techniques. We present a fast and generic method that builds algorithmically closed-form lower and upper bounds on the entropy, the cross-entropy and the Kullback-Leibler divergence of mixtures. We illustrate the versatile method by reporting on our experiments for approximating the Kullback-Leibler divergence between univariate exponential mixtures, Gaussian mixtures and Rayleigh mixtures.
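To get a feel for the building block the paper uses, here is a minimal Python sketch (assuming NumPy and SciPy) of the plain log-sum-exp inequality applied to mixture log-densities: for a k-component mixture, max_i [log w_i + log p_i(x)] ≤ log m(x) ≤ max_i [log w_i + log p_i(x)] + log k. The Gaussian mixture parameters below are made up, and the quadrature is only there to sanity-check the resulting KL bounds numerically; the paper's actual contribution is a piecewise refinement of this inequality that yields closed-form bounds with no numerical integration at all.

import numpy as np
from scipy.stats import norm

# Two univariate Gaussian mixtures (weights, means, std devs) -- illustrative, made-up parameters.
w1, mu1, s1 = np.array([0.3, 0.7]), np.array([-1.0, 2.0]), np.array([0.5, 1.0])
w2, mu2, s2 = np.array([0.6, 0.4]), np.array([0.0, 3.0]), np.array([1.0, 0.5])

def log_mixture(x, w, mu, s):
    """Exact log-density of the mixture, computed with a stable log-sum-exp."""
    a = np.log(w) + norm.logpdf(x[:, None], mu, s)   # shape (len(x), k)
    amax = a.max(axis=1)
    return amax + np.log(np.exp(a - amax[:, None]).sum(axis=1))

def log_mixture_bounds(x, w, mu, s):
    """Pointwise bounds from the log-sum-exp inequality:
       max_i a_i(x) <= log m(x) <= max_i a_i(x) + log k."""
    a = np.log(w) + norm.logpdf(x[:, None], mu, s)
    lower = a.max(axis=1)
    upper = lower + np.log(a.shape[1])
    return lower, upper

# Check the inequality on a grid and turn it into crude KL bounds by quadrature
# (the paper instead integrates the piecewise bounds in closed form).
x = np.linspace(-8.0, 10.0, 20001)
dx = x[1] - x[0]
lm1, lm2 = log_mixture(x, w1, mu1, s1), log_mixture(x, w2, mu2, s2)
lo1, up1 = log_mixture_bounds(x, w1, mu1, s1)
lo2, up2 = log_mixture_bounds(x, w2, mu2, s2)

m1 = np.exp(lm1)
kl = np.sum(m1 * (lm1 - lm2)) * dx          # reference value by quadrature
kl_upper = np.sum(m1 * (up1 - lo2)) * dx    # bound log m1 above, log m2 below
kl_lower = np.sum(m1 * (lo1 - up2)) * dx    # bound log m1 below, log m2 above

print(f"KL (quadrature)            ~ {kl:.4f}")
print(f"LSE-inequality bounds       [{kl_lower:.4f}, {kl_upper:.4f}]")

With a single max over all components the gap between the bounds is 2·log k everywhere, which is loose; the piecewise construction in the paper splits the real line into regions where the dominating component changes, tightening the sandwich considerably while keeping every term integrable in closed form.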