Correntropy is a nonlinear similarity measure between two random variables. The name correntropy is a blend of "correlation" and "entropy".
There are several possible definitions, depending on the interpretation, and they are not necessarily equivalent. Let X and Y be two random variables.

In the "ideal" view, correntropy is defined as

V(X, Y) = E[δ(X − Y)],

where δ is the Dirac delta distribution.
In the kernel view, correntropy is defined as

V(X, Y) = E[κ(X, Y)],

where κ is a non-negative definite kernel function. It is not required that κ be shift-invariant, as it is in the ideal-view case.
In practice, correntropy is estimated from samples as

V̂(X, Y) = (1/N) Σᵢ κ(xᵢ, yᵢ),

where {(xᵢ, yᵢ)}, i = 1, …, N, is the set of observations.
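The sample estimator above can be sketched in a few lines of Python. This is a minimal illustration, not code from the cited papers; it assumes the common choice of a Gaussian kernel κ_σ(x, y) = exp(−(x − y)² / (2σ²)), though the definition permits any non-negative definite kernel:

```python
import math

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian kernel: non-negative definite and shift-invariant
    # (it depends only on the difference x - y).
    return math.exp(-((x - y) ** 2) / (2 * sigma ** 2))

def sample_correntropy(xs, ys, sigma=1.0):
    # V_hat(X, Y) = (1/N) * sum_i kappa(x_i, y_i)
    n = len(xs)
    return sum(gaussian_kernel(x, y, sigma) for x, y in zip(xs, ys)) / n

# Identical samples attain the Gaussian kernel's maximum, so the
# estimate is 1.0; the value decreases as the samples diverge.
print(sample_correntropy([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # → 1.0
```

The kernel bandwidth σ is a free parameter of the estimator; it controls how quickly the similarity decays as the two samples move apart.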
- Santamaría, I.; Pokharel, P. P.; Príncipe, J. C. "Generalized Correlation Function: Definition, Properties, and Application to Blind Equalization". IEEE Transactions on Signal Processing, 2006, 54, 2187–2197.
- Liu, W.; Pokharel, P. P.; Príncipe, J. C. "Correntropy: Properties and Applications in Non-Gaussian Signal Processing". IEEE Transactions on Signal Processing.