Correntropy

Introduction

Correntropy is a nonlinear similarity measure between two random variables.

History

The name correntropy is a portmanteau of correlation and entropy.

Definition

There are several possible definitions, depending on the interpretation, and they are not necessarily equivalent. Let X and Y be two random variables.

Ideal view

$V(X,Y) = \operatorname{E}\left[\delta(X - Y)\right]$

where δ is the Dirac delta distribution.

Similarity view

$V(X,Y) = \operatorname{E}\left[\kappa(X, Y)\right]$

where κ is a non-negative definite kernel function. Unlike in the ideal view, κ need not be shift-invariant.
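A common choice of κ in the correntropy literature (a convention assumed here, not stated above) is the Gaussian kernel with bandwidth parameter σ, which is non-negative definite and happens to be shift-invariant. A minimal sketch:

```python
import math

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian kernel: exp(-(x - y)^2 / (2 * sigma^2)).

    A non-negative definite choice for kappa; sigma controls
    the scale at which x and y are considered similar.
    """
    return math.exp(-((x - y) ** 2) / (2.0 * sigma ** 2))
```

The kernel equals 1 when x = y and decays toward 0 as |x - y| grows relative to σ.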

Estimator view

$\hat{V}(X,Y) = \frac{1}{N}\sum_{k=1}^N \kappa(x(k), y(k))$

where $\{(x(k),y(k))\}_{k=1}^N$ is the set of paired observations.
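The estimator above is a sample mean of kernel evaluations over the paired observations. A minimal sketch, assuming the Gaussian kernel as κ (a common choice in the literature, not fixed by the definition):

```python
import math

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian kernel: exp(-(x - y)^2 / (2 * sigma^2))."""
    return math.exp(-((x - y) ** 2) / (2.0 * sigma ** 2))

def correntropy_estimate(xs, ys, sigma=1.0):
    """Sample estimator V_hat(X, Y): the mean of kappa(x(k), y(k))
    over the N paired observations."""
    if len(xs) != len(ys):
        raise ValueError("xs and ys must be paired observations of equal length")
    n = len(xs)
    return sum(gaussian_kernel(xs[k], ys[k], sigma) for k in range(n)) / n
```

For identical sequences the estimate is exactly 1, since every kernel evaluation is κ(x, x) = 1; it approaches 0 when the samples are far apart relative to σ.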