A mutual information library for C/C++, with Mex bindings for MATLAB.
This toolbox is aimed at people who wish to use mutual information for feature selection, and provides a range of information-theoretic functions. All functions estimate the probabilities from the supplied data vectors. Example implementations of common mutual-information-based feature selection algorithms are provided in both C and MATLAB: CMIM (Fleuret, 2004), mRMR (Peng et al., 2005), and DISR (Bontempi & Meyer, 2006). The implementations contained here are early versions of those in the FEAST library, and the implementations in FEAST should be used in preference to the ones in MIToolbox.
All functions discretise the inputs by rounding down to the nearest integer.
This toolbox was developed to support our work in feature selection, which resulted in the paper "Conditional Likelihood Maximisation: A Unifying Framework for Information Theoretic Feature Selection", G. Brown, A. Pocock, M.-J. Zhao, M. Lujan, JMLR 2012. Please cite this paper if you use our toolbox.
The feature selection algorithms developed for that paper form the FEAST toolbox, which is also published on mloss.
List of functions: Entropy, Conditional Entropy, Joint Entropy, Mutual Information, Conditional Mutual Information, Renyi's Entropy, Renyi's Mutual Information, Weighted Entropy, Weighted Conditional Entropy, and creating a joint random variable.
A Java implementation of the Shannon entropy functions is available from my GitHub page.
- Changes from the previous version: added weighted entropy functions; fixed a few memory-handling bugs.