Chapters
From ITL Wiki


Chapter 1
 A Review of Adaptive Systems and Information Theory
 Problem formulation: Wiener filtering
 The adaptive linear combiner
 Descriptors of Information Theory: entropy and divergence
 Source Coding Theorem
Chapter 2
 Adaptive Information Filtering
 Definition of Renyi's entropy
 Renyi's nonparametric estimators and their properties: mean and variance
 Information potential and forces
 Divergence measures: quadratic distances
 Information potential and forces in the joint space
 Fast Computation of IP and CIP
Chapter 3
 Algorithms for Adaptive Information Filtering
 Error entropy criterion
 Algorithms for adaptation
 Minimum error entropy (MEE)
 Recursive information potential MEE
 Stochastic Information Gradient
 Normalized MEE
 Fixed point MEE
 Adaptation of linear filters with divergence
 Backpropagation of information forces
 Fast Renyi's entropy calculations
Chapter 4
 Supervised Applications of Information Theoretic Learning
 MEE and M-estimation
 Noise robust properties of MEE
 Nonlinear system identification
 Nonlinear channel equalization
 Feature extraction with ITL
 Classification with ITL
Chapter 5
 Unsupervised Learning with ITL
 Clustering evaluation function
 Differential entropy clustering
 Clustering algorithm based on cross information potential
 Information Theoretic Clustering
 A novel principle for unsupervised learning
 Hebbian learning and maximum entropy
 Blind deconvolution with ITL
 Independent component analysis with ITL
Chapter 6
 ITL and Kernel Methods
 Definition of RKHS
 Information Potential as a central moment of the projected data
 Interpretation of SVMs in ITL terms
 A RKHS for ITL
 Adaptive Algorithms in RKHS: KLMS, KRLS, KAPA
Chapter 7
 Generalized Similarity Measures in RKHS
 Definition of Correntropy and its Applications
 Definition of the Correntropy RKHS
 Correntropy Matched Filters
 Correntropy Wiener Filters
 Correntropy Principal Component Analysis
 Other Correntropy based Algorithms