Information Theoretic Learning

From ITL Wiki

Welcome to Information Theoretic Learning


ITL Brief Introduction

Information Theoretic Learning (ITL) was initiated in the late 1990s at CNEL and has been a centerpiece of the research effort there. ITL replaces the conventional second-order statistical descriptors of variance and covariance with descriptors from information theory, namely entropy and divergences, estimated directly from the data. ITL can be used to adapt linear or nonlinear filters, and in both supervised and unsupervised machine learning applications. See the ITL Resource Center for tutorials, examples, and Matlab code.
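As a concrete illustration of the kind of descriptor ITL estimates directly from samples, the following is a minimal sketch (in plain Python rather than the Matlab of the Resource Center, and not code from this site) of the standard Parzen-window estimator of Renyi's quadratic entropy, whose pairwise double sum is the "information potential" used throughout ITL:

```python
import math

def gaussian(u, sigma):
    # One-dimensional Gaussian kernel G_sigma(u).
    return math.exp(-u * u / (2.0 * sigma * sigma)) / (math.sqrt(2.0 * math.pi) * sigma)

def renyi_quadratic_entropy(samples, sigma=1.0):
    """Parzen-window estimator of Renyi's quadratic entropy:
        H2(X) = -log( (1/N^2) * sum_i sum_j G_{sigma*sqrt(2)}(x_i - x_j) )
    The double sum over sample pairs is the information potential; the
    sigma*sqrt(2) width comes from convolving two Gaussian windows.
    (Illustrative sketch; function names are this example's own.)
    """
    n = len(samples)
    s = sigma * math.sqrt(2.0)
    ip = sum(gaussian(xi - xj, s) for xi in samples for xj in samples) / (n * n)
    return -math.log(ip)
```

Because the estimator never fits a density model explicitly, it behaves as the intro describes: a more spread-out sample yields a smaller information potential and hence a larger entropy estimate.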


Current Projects

  1. Correntropy Dependence Measure
  2. Nonlinearity tests based on correntropy
  3. Pitch detection based on correntropy
  4. Nonlinear Granger causality based on correntropy
  5. Compressive sampling based on correntropy
  6. ITL feature extraction for mine recognition
  7. The Principle of Relevant Entropy
  8. On-line KLMS is intrinsically regularized
  9. Nonlinear adaptive filters in RKHS
  10. Active learning strategies
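Several of the projects above build on correntropy. As a hedged sketch (plain Python, not the site's Matlab toolbox; names are this example's own), the usual sample estimator of correntropy between two paired series is a kernel average of their differences, which weighs higher-order moments and thus generalizes ordinary correlation:

```python
import math

def correntropy(x, y, sigma=1.0):
    """Sample estimator of correntropy:
        V_sigma(X, Y) = (1/N) * sum_i G_sigma(x_i - y_i)
    where G_sigma is the Gaussian kernel. The estimate peaks at
    G_sigma(0) when the two series are identical and shrinks as
    their pointwise differences grow relative to sigma.
    """
    n = len(x)
    norm = 1.0 / (math.sqrt(2.0 * math.pi) * sigma)
    return sum(norm * math.exp(-(xi - yi) ** 2 / (2.0 * sigma ** 2))
               for xi, yi in zip(x, y)) / n
```

The kernel width sigma sets the scale at which similarity is judged; large errors are saturated by the Gaussian rather than amplified quadratically, which is what makes correntropy-based criteria robust to outliers.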


Invited Talks


Related Publications