
Kernel Adaptive Filtering Toolbox
A Matlab benchmarking toolbox for kernel adaptive filtering.
Kernel adaptive filtering algorithms are online and adaptive regression algorithms based on kernels. They are suitable for nonlinear filtering, prediction, tracking and nonlinear regression in general. This toolbox includes algorithms, demos, and tools to compare their performance.
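To make the idea concrete, here is a minimal from-scratch sketch of one such algorithm (kernel least-mean-square) in plain Matlab. It is illustrative only, not the toolbox implementation; the data matrices X (inputs, one row per sample) and Y (outputs) are assumed to exist, and the parameter values are arbitrary:

```matlab
% Illustrative kernel LMS sketch (not toolbox code).
% The filter stores past inputs in a dictionary and predicts with a
% kernel expansion: f(x) = sum_j alpha(j) * k(dict(j,:), x).
eta = 0.5;       % learning rate (illustrative value)
sigma = 1;       % Gaussian kernel width (illustrative value)
dict = [];       % dictionary of stored inputs
alpha = [];      % expansion coefficients
for i = 1:size(X,1)
    x = X(i,:);
    if isempty(dict)
        y_est = 0;
    else
        k = exp(-sum((dict - repmat(x,size(dict,1),1)).^2,2)/(2*sigma^2));
        y_est = alpha'*k;                % predict with current expansion
    end
    err = Y(i) - y_est;                  % instantaneous error
    dict = [dict; x];                    % grow the dictionary
    alpha = [alpha; eta*err];            % stochastic gradient update
end
```

The dictionary grows with every sample here; the algorithms in the toolbox differ mainly in how they limit this growth and how they update the coefficients.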
Maintainer: Steven Van Vaerenbergh
Contributors:
 Miguel Lazaro-Gredilla
 Sohan Seth
 Masahiro Yukawa
 Masaaki Takizawa
 Osamu Toda
 Dominik Rzepka
 Pantelis Bouboulis
Official web: https://sourceforge.net/projects/kafbox
This toolbox is a collaborative effort: every developer wishing to contribute code or suggestions can do so. More info below.
Directories included in the toolbox
 data/ - data sets
 demo/ - demos and test files
 lib/ - algorithm libraries and utilities
Setup

Run
install.m
to add the toolbox folders to the path. 
Type
savepath
to save the changes to the path.
Octave / Matlab pre-2008a
This toolbox uses the
classdef
command, which is not supported in Matlab versions prior to 2008a and not yet in Octave. The older 0.x versions of this toolbox do not use classdef and can therefore be used with all versions of Matlab and Octave: http://sourceforge.net/projects/kafbox/files/
Usage
Each kernel adaptive filtering algorithm is implemented as a Matlab class. To use one, first define its options:
options = struct('nu',1E-4,'kerneltype','gauss','kernelpar',32);
Next, create an instance of the filter. E.g., for an instance of the KRLS algorithm that uses the ALD criterion, run:
kaf = aldkrls(options);
One iteration of training is performed by feeding one input-output data pair to the filter:
kaf = kaf.train(x,y);
The outputs for one or more test inputs are evaluated as follows:
Y_test = kaf.evaluate(X_test);
Example: time-series prediction
Code from
demo/demo_prediction.m
% Demo: 1-step ahead prediction on Lorenz attractor time-series data
[X,Y] = kafbox_data(struct('file','lorenz.dat','embedding',6));

% make a kernel adaptive filter object of class aldkrls with options:
% ALD threshold 1E-4, Gaussian kernel, and kernel width 32
kaf = aldkrls(struct('nu',1E-4,'kerneltype','gauss','kernelpar',32));

%% RUN ALGORITHM
N = size(X,1);
Y_est = zeros(N,1);
for i=1:N,
    if ~mod(i,floor(N/10)), fprintf('.'); end % progress indicator, 10 dots
    Y_est(i) = kaf.evaluate(X(i,:)); % predict the next output
    kaf = kaf.train(X(i,:),Y(i));    % train with one input-output pair
end
fprintf('\n');
SE = (Y-Y_est).^2; % test error

%% OUTPUT
fprintf('MSE after first 1000 samples: %.2fdB\n\n',10*log10(mean(SE(1001:end))));
Result:
MSE after first 1000 samples: -40.17dB
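The per-sample errors from the demo can also be visualized as a learning curve. A sketch, assuming the demo's workspace variables Y, Y_est and N are still available (the window length w is an arbitrary choice):

```matlab
% Smoothed learning curve from the demo's prediction errors.
SE = (Y-Y_est).^2;                            % instantaneous squared errors
w = 100;                                      % smoothing window (arbitrary)
mse_db = zeros(N-w+1,1);
for i = 1:N-w+1
    mse_db(i) = 10*log10(mean(SE(i:i+w-1))); % windowed MSE in dB
end
plot(w:N, mse_db);
xlabel('iteration'); ylabel('windowed MSE (dB)');
title('ALD-KRLS learning curve on Lorenz data');
```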
Citing KAFBOX
If you use this toolbox in your research please cite "A Comparative Study of Kernel Adaptive Filtering Algorithms":
@conference{vanvaerenbergh2013comparative,
  author    = {Steven Van Vaerenbergh and Santamar{\'\i}a, Ignacio},
  booktitle = {2013 IEEE Digital Signal Processing (DSP) Workshop and IEEE Signal Processing Education (SPE)},
  title     = {A Comparative Study of Kernel Adaptive Filtering Algorithms},
  year      = {2013},
  note      = {Software available at \url{http://sourceforge.net/projects/kafbox/}}
}
Included algorithms
 Approximate Linear Dependency Kernel Recursive Least-Squares (ALD-KRLS), as proposed in Y. Engel, S. Mannor, and R. Meir, "The kernel recursive least-squares algorithm", IEEE Transactions on Signal Processing, volume 52, no. 8, pages 2275-2285, 2004.
 Sliding-Window Kernel Recursive Least-Squares (SW-KRLS), as proposed in S. Van Vaerenbergh, J. Via, and I. Santamaria, "A sliding-window kernel RLS algorithm and its application to nonlinear channel identification", 2006 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Toulouse, France, 2006.
 Naive Online Regularized Risk Minimization Algorithm (NORMA), as proposed in J. Kivinen, A. Smola and C. Williamson, "Online Learning with Kernels", IEEE Transactions on Signal Processing, volume 52, no. 8, pages 2165-2176, 2004.
 Kernel Least-Mean-Square (KLMS), as proposed in W. Liu, P.P. Pokharel, and J.C. Principe, "The Kernel Least-Mean-Square Algorithm", IEEE Transactions on Signal Processing, vol. 56, no. 2, pages 543-554, Feb. 2008.
 Fixed-Budget Kernel Recursive Least-Squares (FB-KRLS), as proposed in S. Van Vaerenbergh, I. Santamaria, W. Liu and J.C. Principe, "Fixed-Budget Kernel Recursive Least-Squares", 2010 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Dallas, Texas, U.S.A., March 2010.
 Kernel Recursive Least-Squares Tracker (KRLS-T), as proposed in S. Van Vaerenbergh, M. Lazaro-Gredilla, and I. Santamaria, "Kernel Recursive Least-Squares Tracker for Time-Varying Regression", IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 8, pages 1313-1326, Aug. 2012.
 Quantized Kernel Least Mean Square (QKLMS), as proposed in B. Chen, S. Zhao, P. Zhu, and J.C. Principe, "Quantized Kernel Least Mean Square Algorithm", IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 1, pages 22-32, Jan. 2012.
 Random Fourier Feature Kernel Least Mean Square (RFF-KLMS) algorithm, as proposed in A. Singh, N. Ahuja and P. Moulin, "Online learning with kernels: Overcoming the growing sum problem", 2012 IEEE International Workshop on Machine Learning for Signal Processing (MLSP), Sept. 2012.
 Extended Kernel Recursive Least Squares (EX-KRLS), as proposed in W. Liu, I. Park, Y. Wang and J.C. Principe, "Extended kernel recursive least squares algorithm", IEEE Transactions on Signal Processing, volume 57, no. 10, pages 3801-3814, Oct. 2009.
 Gaussian-process based estimation of the parameters of KRLS-T, as proposed in S. Van Vaerenbergh, I. Santamaria, and M. Lazaro-Gredilla, "Estimation of the forgetting factor in kernel recursive least squares", 2012 IEEE International Workshop on Machine Learning for Signal Processing (MLSP), Sept. 2012.
 Kernel Affine Projection (KAP) algorithm with Coherence Criterion, as proposed in C. Richard, J.C.M. Bermudez, and P. Honeine, "Online Prediction of Time Series Data With Kernels", IEEE Transactions on Signal Processing, vol. 57, no. 3, pages 1058-1067, March 2009.
 Kernel Normalized Least-Mean-Square (KNLMS) algorithm with Coherence Criterion, as proposed in C. Richard, J.C.M. Bermudez, and P. Honeine, "Online Prediction of Time Series Data With Kernels", IEEE Transactions on Signal Processing, vol. 57, no. 3, pages 1058-1067, March 2009.
 Recursive Least-Squares algorithm with exponential weighting (RLS), as described in S. Haykin, "Adaptive Filter Theory (3rd Ed.)", Prentice Hall, Chapter 13.
 Multikernel Normalized Least Mean Square algorithm with Coherence-based Sparsification (MKNLMS-CS), as proposed in M. Yukawa, "Multikernel Adaptive Filtering", IEEE Transactions on Signal Processing, vol. 60, no. 9, pages 4672-4682, Sept. 2012.
 Parallel HYperslab Projection along Affine SubSpace (PHYPASS) algorithm, as proposed in M. Takizawa and M. Yukawa, "An Efficient Data-Reusing Kernel Adaptive Filtering Algorithm Based on Parallel Hyperslab Projection Along Affine Subspace", 2013 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), pages 3557-3561, May 2013.
 Fixed-budget kernel least mean squares (FB-KLMS) algorithm, as proposed in D. Rzepka, "Fixed-budget kernel least mean squares", 2012 IEEE 17th Conference on Emerging Technologies & Factory Automation (ETFA), Krakow, Poland, Sept. 2012.
 Leaky Kernel Affine Projection Algorithm (LKAPA, including KAPA-1 and KAPA-3) and Normalized Leaky Kernel Affine Projection Algorithm (NLKAPA, including KAPA-2 and KAPA-4), as proposed in W. Liu and J.C. Principe, "Kernel Affine Projection Algorithms", EURASIP Journal on Advances in Signal Processing, Volume 2008, Article ID 784292, 12 pages.
 Kernel Affine Projection Subgradient Method (KAPSM), as proposed in K. Slavakis, S. Theodoridis, and I. Yamada, "Online kernel-based classification using adaptive projection algorithms", IEEE Transactions on Signal Processing, vol. 56, no. 7, pages 2781-2796, 2008.
 Kernel Least Mean Squares algorithm with Coherence-Sparsification criterion and L1-norm regularization (KLMS-CS-L1) and with active L1-norm regularization (KLMS-CS-AL1), as proposed in W. Gao, J. Chen, C. Richard, J. Huang, and R. Flamary, "Kernel LMS algorithm with forward-backward splitting for dictionary learning", 2013 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Vancouver, Canada, March 2013.
 Mixture Kernel Least Mean Square (MXKLMS) algorithm, as proposed in R. Pokharel, S. Seth, and J.C. Principe, "Mixture kernel least mean square", 2013 International Joint Conference on Neural Networks (IJCNN), pages 1-7, 4-9 Aug. 2013.
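Because every algorithm above exposes the same train/evaluate interface, benchmarking two of them on the same task only requires swapping the class name. A sketch, under the assumption that the qklms class accepts eta (step size) and epsu (quantization threshold) options; all parameter values here are illustrative, not tuned:

```matlab
% Compare ALD-KRLS and QKLMS on the Lorenz prediction task.
[X,Y] = kafbox_data(struct('file','lorenz.dat','embedding',6));
filters = {aldkrls(struct('nu',1E-4,'kerneltype','gauss','kernelpar',32)), ...
           qklms(struct('eta',0.5,'epsu',0.1,'kerneltype','gauss','kernelpar',32))};
N = size(X,1);
for f = 1:length(filters)
    kaf = filters{f};
    SE = zeros(N,1);
    for i = 1:N
        y_est = kaf.evaluate(X(i,:));   % predict before training
        SE(i) = (Y(i) - y_est)^2;
        kaf = kaf.train(X(i,:),Y(i));   % online update
    end
    fprintf('Filter %d MSE: %.2f dB\n', f, 10*log10(mean(SE(1001:end))));
end
```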
How to contribute code to the toolbox
Option 1: email it to me (steven@gtas.dicom.unican.es)
Option 2: fork the toolbox on GitHub, push your changes, then send me a pull request.
License
This source code is released under the FreeBSD License.

Changes to previous version:
 Improvements and demo script for profiler
 Initial version of documentation
 Several new algorithms
 Supported Operating Systems: Platform Independent
 Data Formats: Any Format Supported By Matlab
 Tags: Regression, Online Learning, Kernel Methods, Gaussian Processes
Other available revisions

Version 1.4 (May 26, 2014):
 Improvements and demo script for profiler
 Initial version of documentation
 Several new algorithms
Version 1.3 (October 21, 2013): Inclusion of Gaussian process based parameter estimation, and several new regression algorithms.
Version 1.2 (September 2, 2013): Initial announcement on mloss.org.