Generalizing the Alpha-Divergences and the Oriented Kullback–Leibler Divergences with Quasi-Arithmetic Means

Frank Nielsen

Abstract

The family of α-divergences, including the oriented forward and reverse Kullback–Leibler divergences, is often used in signal processing, pattern recognition, and machine learning, among others. Choosing a suitable α-divergence can either be done beforehand according to some prior knowledge of the application domains or directly learned from data sets. In this work, we generalize the α-divergences using a pair of strictly comparable weighted means. Our generalization allows us to obtain in the limit case α → 1 the 1-divergence, which provides a generalization of the forward Kullback–Leibler divergence, and in the limit case α → 0, the 0-divergence, which corresponds to a generalization of the reverse Kullback–Leibler divergence. We then analyze the condition for a pair of weighted quasi-arithmetic means to be strictly comparable and describe the family of quasi-arithmetic α-divergences including its subfamily of power homogeneous α-divergences. In particular, we study the generalized quasi-arithmetic 1-divergences and 0-divergences and show that these counterpart generalizations of the oriented Kullback–Leibler divergences can be rewritten as equivalent conformal Bregman divergences using strictly monotone embeddings. Finally, we discuss the applications of these novel divergences to k-means clustering by studying the robustness property of the centroids.
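For orientation, the sketch below shows the classical construction that the paper generalizes: the α-divergence between discrete positive measures written as the integrated gap between the weighted arithmetic mean and the weighted geometric mean of (p, q), with the limits α → 1 and α → 0 recovering the oriented (extended) Kullback–Leibler divergences. This is a minimal illustration under those assumptions; the function name alpha_divergence and the NumPy-based interface are hypothetical, and the paper's contribution, replacing this pair of means by a pair of strictly comparable quasi-arithmetic means, is not reproduced here.

    import numpy as np

    def alpha_divergence(p, q, alpha, eps=1e-12):
        # Classical alpha-divergence between discrete positive measures,
        # expressed as the gap between the weighted arithmetic mean and the
        # weighted geometric mean of (p_i, q_i). The paper's generalization
        # swaps this pair of means for strictly comparable quasi-arithmetic
        # means (not implemented in this sketch).
        p = np.asarray(p, dtype=float) + eps
        q = np.asarray(q, dtype=float) + eps
        if np.isclose(alpha, 1.0):   # limit alpha -> 1: forward extended KL divergence
            return float(np.sum(p * np.log(p / q) + q - p))
        if np.isclose(alpha, 0.0):   # limit alpha -> 0: reverse extended KL divergence
            return float(np.sum(q * np.log(q / p) + p - q))
        arithmetic = alpha * p + (1.0 - alpha) * q   # weighted arithmetic mean
        geometric = p**alpha * q**(1.0 - alpha)      # weighted geometric mean
        return float(np.sum(arithmetic - geometric) / (alpha * (1.0 - alpha)))

    # Usage: as alpha approaches 1, the value approaches the forward KL divergence.
    p = [0.2, 0.5, 0.3]
    q = [0.4, 0.4, 0.2]
    print(alpha_divergence(p, q, 0.999))  # close to the alpha = 1 value
    print(alpha_divergence(p, q, 1.0))    # forward Kullback–Leibler divergence KL(p:q)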