Projects tagged with Gaussian process.


JMLR GPstuff 4.4

by avehtari - April 15, 2014, 15:26:49 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 7891 views, 2167 downloads, 1 subscription

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2014-04-11 Version 4.4

New features

  • Monotonicity constraint for the latent function.

    • Riihimäki and Vehtari (2010). Gaussian processes with monotonicity information. Journal of Machine Learning Research: Workshop and Conference Proceedings, 9:645-652.
  • State space implementation for GP inference (1D) using Kalman filtering.

    • For the following covariance functions: Squared-Exponential, Matérn-3/2 & 5/2, Exponential, Periodic, Constant
    • Särkkä, S., Solin, A., Hartikainen, J. (2013). Spatiotemporal learning via infinite-dimensional Bayesian filtering and smoothing. IEEE Signal Processing Magazine, 30(4):51-61.
    • Särkkä, S. (2013). Bayesian filtering and smoothing. Cambridge University Press.
    • Solin, A. and Särkkä, S. (2014). Explicit link between periodic covariance functions and state space models. AISTATS 2014.
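
The state-space route can be illustrated outside GPstuff: for the exponential (Ornstein–Uhlenbeck) covariance the latent function is a one-dimensional Markov process, so the GP marginal likelihood can be computed in O(n) with a scalar Kalman filter instead of an O(n³) dense solve. A minimal NumPy sketch (the function name and parameterisation are illustrative, not GPstuff's API):

```python
import numpy as np

def kalman_gp_loglik(t, y, sigma2, ell, noise):
    """O(n) marginal log-likelihood of a GP with exponential (OU)
    covariance k(t,t') = sigma2*exp(-|t-t'|/ell), via Kalman filtering."""
    m, P = 0.0, sigma2          # stationary prior on the scalar state
    ll = 0.0
    prev_t = None
    for tk, yk in zip(t, y):
        if prev_t is not None:  # predict step: discretised OU dynamics
            a = np.exp(-(tk - prev_t) / ell)
            m, P = a * m, a * a * P + sigma2 * (1.0 - a * a)
        S = P + noise           # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * S) + (yk - m) ** 2 / S)
        K = P / S               # Kalman gain; measurement update
        m, P = m + K * (yk - m), (1.0 - K) * P
        prev_t = tk
    return ll
```

The filter's prediction-error decomposition reproduces the dense GP marginal likelihood N(y; 0, K + σ²I) exactly, which is what makes the O(n) route attractive for long 1D series.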

Improvements

  • GP_PLOT function for quick plotting of GP predictions
  • GP_IA now warns if it detects multimodal posterior distributions
  • Much faster EP with the log-Gaussian likelihood (numerical integrals replaced with analytical results)
  • Faster WAIC with a GP_IA array (numerical integrals replaced with analytical results)
  • New demos demonstrating the new features:
    • demo_minimal, minimal demo for regression and classification
    • demo_kalman1, demo_kalman2
    • demo_monotonic, demo_monotonic2

Plus bug fixes


BayesOpt, a Bayesian Optimization toolbox 0.6

by rmcantin - March 26, 2014, 17:48:17 CET [ Project Homepage BibTeX Download ] 4396 views, 998 downloads, 2 subscriptions

About: BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). Interfaces for C, Matlab/Octave and Python are also provided.
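
The methodology itself is compact enough to sketch. The following is a generic Python toy of the GP-surrogate plus expected-improvement loop that this kind of library implements; the function names, RBF kernel, and fixed candidate grid are illustrative assumptions, not BayesOpt's API:

```python
import numpy as np
from scipy.stats import norm

def rbf(a, b, ell=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def bayes_opt(f, bounds=(0.0, 1.0), n_init=3, n_iter=10, noise=1e-6, seed=0):
    """Minimal Bayesian optimisation loop (minimisation): GP surrogate,
    expected-improvement acquisition maximised over a fixed grid."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(*bounds, n_init)            # random initial design
    Y = np.array([f(x) for x in X])
    grid = np.linspace(*bounds, 200)            # candidate test points
    for _ in range(n_iter):
        K = rbf(X, X) + noise * np.eye(len(X))  # GP posterior on the grid
        ks = rbf(grid, X)
        mu = ks @ np.linalg.solve(K, Y)
        var = 1.0 - np.einsum('ij,ji->i', ks, np.linalg.solve(K, ks.T))
        sd = np.sqrt(np.maximum(var, 1e-12))
        best = Y.min()                          # expected improvement
        z = (best - mu) / sd
        ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)
        x_next = grid[np.argmax(ei)]            # evaluate the best candidate
        X = np.append(X, x_next)
        Y = np.append(Y, f(x_next))
    return X[np.argmin(Y)], Y.min()
```

A production implementation replaces the fixed grid with an inner nonlinear optimizer (BayesOpt uses NLOPT for this) and learns the kernel hyperparameters rather than fixing them.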

Changes:

-Complete refactoring of the inner parts of the library. The code is easier to understand and modify, and it allows simpler integration of new algorithms.

-Updated to the latest version of NLOPT (2.4.1). Wrapper code simplified.

-Error codes replaced with exceptions in the C++ interface. The library is exception safe.

-API modified to support new learning methods for kernel hyperparameters (e.g., MCMC). Warning: configuration parameters for learning have changed, so code written against previous versions might not work. Some of the learning methods (such as MCMC) are not yet implemented.

-Added configuration of random numbers (can be fixed for debugging). Fixed an issue where random numbers were drawn from different sources, with potential correlations; all components are now guaranteed to use the same instance of the random engine.

-Improved numerical results (e.g., hyperparameter optimization is done in log space).

-More examples and tests.

-Fixed bugs.

-The default number of inner iterations has been increased, so overall optimization time with the default configuration might be slower, but with improved results.


About: Toeblitz is a MATLAB/Octave package for operations on positive definite Toeplitz matrices. It can solve Toeplitz systems Tx = b in O(n*log(n)) time and O(n) memory, compute matrix inverses T^(-1) (with free log determinant) in O(n^2) time and memory, compute log determinants (without inverses) in O(n^2) time and O(n) memory, and compute traces of products A*T for any matrix A, in minimal O(n^2) time and memory.
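
Toeblitz itself is MATLAB/Octave, but the core operation is easy to demonstrate elsewhere; for instance, SciPy exposes a Toeplitz solve via the Levinson recursion, which runs in O(n²) time and O(n) memory (the O(n*log(n)) figure quoted above requires a faster algorithm than Levinson):

```python
import numpy as np
from scipy.linalg import solve_toeplitz, toeplitz

# First column of a symmetric positive definite Toeplitz matrix
# (geometric decay off the diagonal keeps it well conditioned).
c = 0.5 ** np.arange(6)                    # [1, 0.5, 0.25, ...]
b = np.arange(6, dtype=float)

x = solve_toeplitz(c, b)                   # Levinson: O(n^2) time, O(n) memory
x_ref = np.linalg.solve(toeplitz(c), b)    # dense O(n^3) reference
```

Passing only the first column makes `solve_toeplitz` treat the matrix as symmetric, matching the positive definite Toeplitz setting Toeblitz targets.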

Changes:

Added the tar archive directly instead of via a link.


GP RTSS 1.0

by marc - March 21, 2012, 08:43:52 CET [ BibTeX BibTeX for corresponding Paper Download ] 1410 views, 455 downloads, 1 subscription

About: Gaussian process RTS smoothing (forward-backward smoothing) based on moment matching.
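
GP-RTSS generalises the classical linear-Gaussian Rauch-Tung-Striebel smoother by pushing moments through GP dynamics models; the forward-backward backbone is the linear case, which can be sketched in a few lines (scalar model; names and parameterisation are illustrative, not this package's API):

```python
import numpy as np

def rts_smooth(y, a, q, r, m0=0.0, p0=1.0):
    """Scalar linear-Gaussian RTS smoother: forward Kalman filter,
    then backward Rauch-Tung-Striebel pass. Model:
    x[k+1] = a*x[k] + w, w~N(0,q);  y[k] = x[k] + e, e~N(0,r)."""
    n = len(y)
    mf = np.empty(n); pf = np.empty(n)   # filtered moments
    mp = np.empty(n); pp = np.empty(n)   # one-step predicted moments
    m, p = m0, p0
    for k in range(n):                   # forward filter
        if k > 0:
            m, p = a * m, a * a * p + q  # predict
        mp[k], pp[k] = m, p
        K = p / (p + r)                  # update with observation y[k]
        m, p = m + K * (y[k] - m), (1 - K) * p
        mf[k], pf[k] = m, p
    ms, ps = mf.copy(), pf.copy()        # backward smoothing pass
    for k in range(n - 2, -1, -1):
        G = pf[k] * a / pp[k + 1]        # smoother gain
        ms[k] = mf[k] + G * (ms[k + 1] - mp[k + 1])
        ps[k] = pf[k] + G * G * (ps[k + 1] - pp[k + 1])
    return ms, ps
```

For a linear-Gaussian model the smoothed marginals coincide with the batch Gaussian posterior; GP-RTSS replaces the exact predict step with moment matching through a GP dynamics model.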

Changes:

Initial Announcement on mloss.org.


About: This local and parallel computation toolbox is the Octave and Matlab implementation of several localized Gaussian process regression methods: the domain decomposition method (Park et al., 2011, DDM), partially independent conditional (Snelson and Ghahramani, 2007, PIC), localized probabilistic regression (Urtasun and Darrell, 2008, LPR), and bagging for Gaussian process regression (Chen and Ren, 2009, BGP). Most of the localized regression methods can be applied to general machine learning problems, although DDM is only applicable to spatial datasets. In addition, GPLP provides two parallel implementations of the domain decomposition method. Ease of parallelization is one of the advantages of localized regression, and the two parallel implementations provide good guidance on how to realize this advantage in software.
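
To give a flavour of what "localized" means here, below is a naive Python sketch that conditions each test point only on its nearest block of training data; this omits the inter-block corrections that DDM, PIC, LPR and BGP each add in their own way, and all names and kernel choices are illustrative, not GPLP's API:

```python
import numpy as np

def local_gp_predict(x, y, xstar, n_blocks=4, ell=0.2, noise=1e-4):
    """Naive localized GP regression in 1D: partition the training
    inputs into contiguous blocks, then predict each test point from
    the GP conditioned only on its nearest block. Each solve costs
    O((n/B)^3) instead of O(n^3) for the full GP."""
    order = np.argsort(x)
    blocks = np.array_split(order, n_blocks)          # contiguous partitions
    centers = np.array([x[b].mean() for b in blocks])
    k = lambda a, b: np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)
    preds = np.empty(len(xstar))
    for i, xs in enumerate(xstar):
        b = blocks[np.argmin(np.abs(centers - xs))]   # nearest block only
        K = k(x[b], x[b]) + noise * np.eye(len(b))
        preds[i] = (k(np.array([xs]), x[b]) @ np.linalg.solve(K, y[b]))[0]
    return preds
```

Independent blocks are also what makes these methods easy to parallelize: each block's solve, and each test point's prediction, can run on a separate worker with no communication.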

Changes:

Initial Announcement on mloss.org.