Projects that also appeared in JMLR.
Showing items 1-20 of 42 (page 1 of 3).

JMLR GPstuff 4.4

by avehtari - April 15, 2014, 15:26:49 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 7928 views, 2172 downloads, 1 subscription

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2014-04-11 Version 4.4

New features

  • Monotonicity constraint for the latent function.

    • Riihimäki and Vehtari (2010). Gaussian processes with monotonicity information. Journal of Machine Learning Research: Workshop and Conference Proceedings, 9:645-652.
  • State space implementation for GP inference (1D) using Kalman filtering (a brief sketch of the state-space form follows this list).

    • For the following covariance functions: Squared-Exponential, Matérn-3/2 & 5/2, Exponential, Periodic, Constant
    • Särkkä, S., Solin, A., Hartikainen, J. (2013). Spatiotemporal learning via infinite-dimensional Bayesian filtering and smoothing. IEEE Signal Processing Magazine, 30(4):51-61.
    • Särkkä, S. (2013). Bayesian Filtering and Smoothing. Cambridge University Press.
    • Solin, A. and Särkkä, S. (2014). Explicit link between periodic covariance functions and state space models. AISTATS 2014.
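
For context, a brief sketch of the state-space view behind the Kalman implementation (following the references cited above; the Matérn-3/2 matrices below are the standard ones from that literature, not GPstuff-specific notation). A 1D GP prior with a suitable stationary covariance is rewritten as a linear time-invariant SDE

    \mathrm{d}x(t) = F\,x(t)\,\mathrm{d}t + L\,\mathrm{d}\beta(t), \qquad f(t) = H\,x(t),

e.g. for the Matérn-3/2 covariance k(\tau) = \sigma^2 (1 + \lambda|\tau|)\, e^{-\lambda|\tau|} with \lambda = \sqrt{3}/\ell:

    F = \begin{pmatrix} 0 & 1 \\ -\lambda^2 & -2\lambda \end{pmatrix}, \quad L = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \quad H = \begin{pmatrix} 1 & 0 \end{pmatrix}, \quad q_c = 4\sigma^2\lambda^3 .

Kalman filtering and smoothing over the sorted inputs then performs exact inference in O(n) time instead of the O(n^3) cost of a dense GP.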

Improvements

  • GP_PLOT function for quick plotting of GP predictions
  • GP_IA now warns if it detects multimodal posterior distributions
  • Much faster EP with the log-Gaussian likelihood (numerical integrals replaced by analytical results)
  • Faster WAIC with GP_IA arrays (numerical integrals replaced by analytical results)
  • New demos demonstrating new features etc.
    • demo_minimal, minimal demo for regression and classification
    • demo_kalman1, demo_kalman2
    • demo_monotonic, demo_monotonic2

Plus bug fixes


JMLR Information Theoretical Estimators 0.57

by szzoli - April 10, 2014, 18:35:22 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 32326 views, 6971 downloads, 2 subscriptions

About: ITE (Information Theoretical Estimators) is capable of estimating many different variants of entropy, mutual information, divergence, association measures, cross quantities and kernels on distributions. Thanks to its highly modular design, ITE supports additionally (i) the combinations of the estimation techniques, (ii) the easy construction and embedding of novel information theoretical estimators, and (iii) their immediate application in information theoretical optimization problems.

Changes:
  • Kullback-Leibler divergence estimation based on maximum likelihood estimation plus an analytical formula in the chosen exponential family: added (the closed form is recapped after this list).

  • A new sampling based entropy estimator with KDE correction on the left/right sides: added.

  • Quick tests: updated with the new estimators.
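
For context, the closed form behind the first item above (standard exponential-family algebra, not ITE-specific notation): for two members p_{\theta_1}, p_{\theta_2} of the same exponential family with log-partition function A,

    D_{\mathrm{KL}}(p_{\theta_1} \,\|\, p_{\theta_2}) = A(\theta_2) - A(\theta_1) - (\theta_2 - \theta_1)^\top \nabla A(\theta_1),

so an estimator can fit \theta_1 and \theta_2 by maximum likelihood on the two samples and evaluate this expression directly instead of integrating numerically.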


JMLR Tapkee 1.0

by blackburn - April 10, 2014, 02:45:58 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4153 views, 1103 downloads, 0 subscriptions

About: Tapkee is an efficient and flexible C++ template library for dimensionality reduction.

Changes:

Initial Announcement on mloss.org.


JMLR dlib ml 18.7

by davis685 - April 10, 2014, 01:47:08 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 69274 views, 12137 downloads, 2 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools that facilitate creating complex software in C++ to solve real world problems.

Changes:

The major new feature in this release is a Python API for training histogram-of-oriented-gradient based object detectors and examples showing how to use this type of detector to perform real-time face detection. Additionally, this release also adds simpler interfaces for learning to solve assignment and multi-target tracking problems.
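
As an illustration of the new Python API, here is a minimal face-detection sketch (a sketch only: the image file name is hypothetical, and any loader that yields a NumPy RGB array can replace scikit-image):

    import dlib
    from skimage import io  # any image loader producing a NumPy RGB array works

    detector = dlib.get_frontal_face_detector()  # bundled HOG-based frontal face detector
    img = io.imread("photo.jpg")                 # hypothetical input image

    # The second argument upsamples the image once to help find smaller faces.
    for d in detector(img, 1):
        print("face: left=%d top=%d right=%d bottom=%d"
              % (d.left(), d.top(), d.right(), d.bottom()))

Training a custom histogram-of-oriented-gradient detector follows the same Python API; see the examples shipped with the release.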


JMLR MOA Massive Online Analysis Nov-13

by abifet - April 4, 2014, 03:50:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9579 views, 3933 downloads, 1 subscription

About: Massive Online Analysis (MOA) is a real time analytic tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and it is released under the GNU GPL license.

Changes:

New version November 2013


JMLR MultiBoost 1.2.02

by busarobi - March 31, 2014, 16:13:04 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 19526 views, 3416 downloads, 1 subscription

About: MultiBoost is a multi-purpose boosting package implemented in C++. It is based on the multi-class/multi-task AdaBoost.MH algorithm [Schapire-Singer, 1999]. Basic base learners (stumps, trees, products, Haar filters for image processing) can be easily complemented by new data representations and the corresponding base learners, without interfering with the main boosting engine.
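
For context, the core of AdaBoost.MH as described by Schapire and Singer (1999) (their notation, not MultiBoost's code): the multi-class problem is reduced to binary decisions over (example, label) pairs, with a weight distribution D_t(i, \ell) updated at each boosting round as

    D_{t+1}(i,\ell) = D_t(i,\ell)\, \exp(-\alpha_t\, Y_i[\ell]\, h_t(x_i,\ell)) / Z_t, \qquad f(x,\ell) = \sum_t \alpha_t\, h_t(x,\ell),

where Y_i[\ell] \in \{-1,+1\} indicates whether label \ell is correct for example i, Z_t normalises the distribution, and the base learners listed above supply the weak hypotheses h_t.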

Changes:

Major changes :

  • The “early stopping” feature can now be based on any metric output with the --outputinfo command line argument.

  • Early stopping now works with --slowresume command line argument.

Minor fixes:

  • More informative output when testing.

  • Various compilation glitches with recent clang fixed (OS X/Linux).


JMLR EnsembleSVM 2.0

by claesenm - March 31, 2014, 08:06:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3118 views, 1074 downloads, 1 subscription

About: The EnsembleSVM library offers functionality to perform ensemble learning using Support Vector Machine (SVM) base models. In particular, we offer routines for binary ensemble models using SVM base classifiers. Experimental results have shown the predictive performance to be comparable with standard SVM models but with drastically reduced training time. Ensemble learning with SVM models is particularly useful for semi-supervised tasks.

Changes:

The library has been updated and features a variety of new functionality as well as more efficient implementations of original features. The following key improvements have been made:

  1. Support for multithreading in training and prediction with ensemble models. Since both of these are embarrassingly parallel, this yields a significant speedup (3-fold on quad-core).
  2. Extensive programming framework for aggregation of base model predictions, which allows highly efficient prototyping of new aggregation approaches. Additionally, we provide several predefined strategies, including (weighted) majority voting, logistic regression and nonlinear SVMs of your choice -- be sure to check out the esvm-edit tool! The provided framework also allows you to efficiently program your own, novel aggregation schemes (a generic voting sketch is given below).
  3. Full code transition to C++11, the latest C++ standard, which enabled various performance improvements. The new release requires moderately recent compilers, such as gcc 4.7.2+ or clang 3.2+.
  4. Generic implementations of convenient facilities have been added, such as thread pools, deserialization factories and more.

The API and ABI have undergone significant changes, many of which are due to the transition to C++11.
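
To illustrate the simplest of the predefined aggregation strategies, here is a generic weighted-majority-voting sketch in Python. This is not EnsembleSVM's C++ API, only the underlying idea, with hypothetical base-model outputs:

    import numpy as np

    def weighted_majority_vote(predictions, weights):
        """Aggregate binary base-model predictions in {-1, +1}.

        predictions: array of shape (n_models, n_samples)
        weights:     array of shape (n_models,)
        """
        scores = weights @ predictions       # weighted sum of votes per sample
        return np.where(scores >= 0, 1, -1)  # sign of the weighted vote

    # Hypothetical outputs of three SVM base models on four test points:
    preds = np.array([[ 1, -1,  1,  1],
                      [ 1,  1, -1,  1],
                      [-1, -1,  1,  1]])
    print(weighted_majority_vote(preds, np.array([0.5, 0.3, 0.2])))  # [ 1 -1  1  1]

EnsembleSVM itself ships such strategies predefined (see the esvm-edit tool mentioned above).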


JMLR fastclime 1.2.3

by colin1898 - March 10, 2014, 08:54:41 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 654 views, 118 downloads, 1 subscription

About: The package "fastclime" provides a method to recover the precision matrix efficiently by applying the parametric simplex method. The computation is based on a linear optimization solver. It also contains a generic LP solver and a parameterized LP solver using the parametric simplex method.
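
For context, this is the constrained-ℓ1 (CLIME-type) formulation that such precision-matrix estimators solve, written in standard notation rather than the package's own (Σ̂ denotes the sample covariance and λ a tuning parameter):

    \hat{\Omega} = \arg\min_{\Omega} \|\Omega\|_1 \quad \text{subject to} \quad \|\hat{\Sigma}\,\Omega - I\|_\infty \le \lambda ,

which decouples into one linear program per column of Ω (followed by a symmetrisation step); it is these LPs that the parametric simplex method solves efficiently.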

Changes:

Initial Announcement on mloss.org.


JMLR SHOGUN 3.2.0

by sonne - February 17, 2014, 20:31:36 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 51700 views, 10629 downloads, 5 subscriptions

Rating: 3/5 (based on 6 votes)

About: The SHOGUN machine learning toolbox focuses on large-scale learning methods, in particular Support Vector Machines (SVM), and provides interfaces to Python, Octave, Matlab, R and the command line.

Changes:

This is mostly a bugfix release:

Features

  • Fully support python3 now
  • Add mini-batch k-means [Parijat Mazumdar]
  • Add k-means++ [Parijat Mazumdar]
  • Add sub-sequence string kernel [lambday]

Bugfixes

  • Compile fixes for upcoming swig3.0
  • Speedup for Gaussian process apply()
  • Improve unit / integration test checks
  • libbmrm uninitialized memory reads
  • libocas uninitialized memory reads
  • Octave 3.8 compile fixes [Orion Poplawski]
  • Fix java modular compile error [Bjoern Esser]

JMLR BudgetedSVM v1.1

by nemanja - February 12, 2014, 20:53:45 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 545 views, 80 downloads, 1 subscription

About: BudgetedSVM is an open-source C++ toolbox for scalable non-linear classification. The toolbox can be seen as a missing link between LibLinear and LibSVM, combining the efficiency of linear with the accuracy of kernel SVM. We provide an Application Programming Interface for efficient training and testing of non-linear classifiers, supported by data structures designed for handling data which cannot fit in memory. We also provide command-line and Matlab interfaces, providing users with an efficient, easy-to-use tool for large-scale non-linear classification.

Changes:

Initial Announcement on mloss.org.


JMLR Darwin 1.7

by sgould - January 10, 2014, 01:33:01 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 22833 views, 4857 downloads, 2 subscriptions

About: A platform-independent C++ framework for machine learning, graphical models, and computer vision research and development.

Changes:

Version 1.7:

  • Log file now shows the command line
  • Utility application added for viewing multi-class segmentation legend
  • Added LBP filter response features to multi-class segmentation model
  • Added drwnColourHistogram class
  • Added k-means segmentation method for creating superpixels
  • Application visualizeSuperpixels and mex routines for loading and saving superpixels
  • Improved mex parsing of Matlab objects to support more matrix types
  • Bug fix in drwnOptimizer (thanks to Subarna Tripathi)
  • Updated copyright notice to 2007-2014
  • Other bug fixes and performance improvements

Version 1.6.1:

  • Maximum size of drwnShowDebuggingImage can be set from command line
  • Windows MSVC projects updated to link against OpenCV 2.4.6
  • Fixes for gcc 4.7 (thanks to Sarma Tangirala)
  • Bug fixes and performance improvements

Version 1.6:

  • Changed vision code from OpenCV 1.x C API to OpenCV 2.x C++ API
  • Added drwnHistogram class by Jason Corso
  • Added separate EPSG, EPSF and EPSX parameters to drwnOptimizer and changed signature of solve function
  • Added "-outUnary" option to inferPixelLabels for writing out unary potentials
  • Improved Matlab mex interfaces
  • Added drwnFeatureTransformFactory and improved drwnFactory class
  • Added drwnLinearTransform class
  • Bug fixes and performance improvements

JMLR MLPACK 1.0.8

by rcurtin - January 7, 2014, 05:47:22 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 27658 views, 5536 downloads, 5 subscriptions

Rating: 4.5/5 (based on 1 vote)

About: A scalable, fast C++ machine learning library, with emphasis on usability.

Changes:
  • Memory leak in NeighborSearch index-mapping code fixed.
  • GMMs can be trained using the existing model as a starting point by specifying an additional boolean parameter to GMM::Estimate().
  • Logistic regression implementation added in methods/logistic_regression.
  • Version information is now obtainable via mlpack::util::GetVersion() or the __MLPACK_VERSION_MAJOR, __MLPACK_VERSION_MINOR, and __MLPACK_VERSION_PATCH macros.
  • Fix typos in allkfn and allkrann output.

JMLR Sally 0.8.2

by konrad - December 25, 2013, 13:38:59 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 16267 views, 3261 downloads, 2 subscriptions

About: A Tool for Embedding Strings in Vector Spaces

Changes:

Support for new version of libarchive. Several major and minor bug fixes.


JMLR CTBN-RLE

About: The CTBN-RLE is a C++ package of executables and libraries for inference and learning algorithms for continuous time Bayesian networks (CTBNs).

Changes:

compilation problems fixed


JMLR Waffles 2013-12-09

by mgashler - December 9, 2013, 18:04:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 20384 views, 6348 downloads, 1 subscription

About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.)

Changes:

  • Changed the license from LGPL to CC0.
  • Added classes for stackable autoencoders and restricted Boltzmann machines.
  • Polished up the GBayesianNetwork class and added examples and unit tests.
  • Added support for CMake; the build process now also supports clang and is more Mac-friendly.
  • Simplified some important classes, including GMatrix and GNeuralNet, and enforced const correctness in more places.
  • Nixed most uses of smart pointers.
  • Made all learning algorithms thread-safe and added thread-parallelism to several ensemble methods.
  • Added support for binary division trees and some common activation functions.
  • Added a tool to generate a vector of meta-statistics about a dataset, plus several other small-but-useful tools.
  • Simplified the docs and web site.


JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.4

by hn - November 11, 2013, 14:46:52 CET [ Project Homepage BibTeX Download ] 14705 views, 3882 downloads, 3 subscriptions

Rating: 5/5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.
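
For context, the textbook GP regression equations that exact inference computes (standard notation, not a listing of GPML function calls): given training inputs X with targets y, Gaussian noise variance \sigma_n^2 and a test input x_*,

    \mu_* = k_*^\top (K + \sigma_n^2 I)^{-1} y, \qquad \sigma_*^2 = k(x_*, x_*) - k_*^\top (K + \sigma_n^2 I)^{-1} k_* ,

where K is the kernel matrix over X and k_* is the vector of covariances between x_* and X; the approximate inference methods mentioned in the change list (Laplace, EP, FITC variants) replace this exact computation when the likelihood is non-Gaussian.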

Changes:
  • derivatives w.r.t. inducing points xu in infFITC, infFITC_Laplace, infFITC_EP so that one can treat the inducing points either as fixed given quantities or as additional hyperparameters
  • new GLM likelihood likExp for inter-arrival time modeling
  • new GLM likelihood likWeibull for extremal value regression
  • new GLM likelihood likGumbel for extremal value regression
  • new mean function meanPoly depending polynomially on the data
  • infExact can deal safely with the zero noise variance limit
  • support of GP warping through the new likelihood function likGaussWarp

JMLR glm-ie

About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models) as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference.

Changes:

added factorial mean field inference as a third algorithm complementing expectation propagation and variational Bayes

generalised non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used


JMLR CARP 3.3

by volmeln - November 7, 2013, 15:48:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12488 views, 3993 downloads, 1 subscription

About: CARP: The Clustering Algorithms’ Referee Package

Changes:

Generalized overlap error and some bugs have been fixed


JMLR CAM Java 3.1

by wangny - October 14, 2013, 22:46:03 CET [ Project Homepage BibTeX Download ] 4137 views, 1574 downloads, 1 subscription

About: The CAM R-Java software provides a novel way to solve the blind source separation problem.

Changes:

In this version, we fixed the problem of the software not working under the newest R version, R-3.0.


JMLR scikit-learn 0.14.1

by fabianp - October 4, 2013, 15:01:45 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 10701 views, 3733 downloads, 3 subscriptions

Rating: 4.5/5 (based on 3 votes)

About: The scikit-learn project is a machine learning library in Python.

Changes:

Update for 0.14.1

