Matlab code for performing variational inference in the Indian Buffet Process with a linear-Gaussian likelihood model.
We provide two kinds of variational approximations (discussed in depth in the corresponding paper and technical report) that trade off between the speed and accuracy of inference. In general, we find that this software outperforms similarly optimised Gibbs samplers on large, complex datasets, but is less efficient on smaller or lower-dimensional data. Also provided is code for several tunable heuristics that improve the optimisation steps in the inference routines.
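For readers unfamiliar with the model the package targets, the sketch below illustrates the standard linear-Gaussian Indian Buffet Process generative model (a binary feature matrix Z drawn from the IBP prior, with data X = ZA + Gaussian noise). This is a hedged illustration in Python, not the package's Matlab code; the function names and parameter choices are our own.

```python
import numpy as np

def sample_ibp(N, alpha, rng):
    """Draw a binary feature matrix Z from the IBP prior via the
    sequential 'buffet' construction: customer n takes existing dish k
    with probability m_k/n, then samples Poisson(alpha/n) new dishes.
    (Illustrative sketch, not part of the released package.)"""
    dish_counts = []          # m_k: how many customers took dish k so far
    rows = []
    for n in range(1, N + 1):
        row = [rng.random() < m / n for m in dish_counts]
        for k, taken in enumerate(row):
            if taken:
                dish_counts[k] += 1
        k_new = rng.poisson(alpha / n)   # brand-new dishes for this customer
        dish_counts.extend([1] * k_new)
        row.extend([True] * k_new)
        rows.append(row)
    K = len(dish_counts)
    Z = np.zeros((N, K), dtype=int)
    for n, row in enumerate(rows):
        Z[n, :len(row)] = row
    return Z

def sample_linear_gaussian(Z, D, sigma_a, sigma_x, rng):
    """Given binary features Z (N x K), draw feature loadings
    A ~ N(0, sigma_a^2) and observations X = Z A + N(0, sigma_x^2) noise."""
    N, K = Z.shape
    A = rng.normal(0.0, sigma_a, size=(K, D))
    X = Z @ A + rng.normal(0.0, sigma_x, size=(N, D))
    return X, A

rng = np.random.default_rng(0)
Z = sample_ibp(N=20, alpha=2.0, rng=rng)
X, A = sample_linear_gaussian(Z, D=5, sigma_a=1.0, sigma_x=0.1, rng=rng)
```

Variational inference for this model (as in the package) posits a factorised posterior over Z and A and optimises a lower bound on the marginal likelihood, rather than sampling as above.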
The archive includes a toy dataset and a sample test file.
- Changes to previous version:
Initial Announcement on mloss.org.