All entries.
Showing items 11-20 of 519.

JMLR MOA Massive Online Analysis Nov-13

by abifet - April 4, 2014, 03:50:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9579 views, 3933 downloads, 1 subscription

About: Massive Online Analysis (MOA) is a real-time analytics tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and is released under the GNU GPL license.
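
MOA itself is written in Java; as a language-neutral illustration of the statistical test behind its Hoeffding Trees, the Python sketch below computes the Hoeffding bound that decides when enough stream examples have been seen to commit to a split. This is a minimal sketch of the idea, not MOA code, and the gain values are made up for illustration.

    import math

    def hoeffding_bound(value_range, delta, n):
        # With probability 1 - delta, the observed mean of n i.i.d. samples
        # is within eps of the true mean (value_range bounds the statistic).
        return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

    # A Hoeffding Tree splits once the information-gain lead of the best
    # attribute over the runner-up exceeds eps, so the split chosen on a
    # finite sample almost surely matches the one on the infinite stream.
    g_best, g_second = 0.25, 0.17       # illustrative gain estimates
    n, delta = 1200, 1e-6               # examples seen; confidence parameter
    R = math.log2(2)                    # gain range for a 2-class problem
    eps = hoeffding_bound(R, delta, n)
    print(g_best - g_second > eps)      # True -> safe to split now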

Changes:

New version November 2013


SAMOA 0.0.1

by gdfm - April 2, 2014, 17:09:08 CET [ Project Homepage BibTeX Download ] 220 views, 39 downloads, 1 subscription

About: SAMOA is a platform for mining big data streams. It is a distributed streaming machine learning (ML) framework that provides a programming abstraction for distributed streaming ML algorithms.
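
SAMOA is a Java framework; the toy Python sketch below only conveys the flavor of the processor-over-stream programming model such frameworks expose. All names here are hypothetical, not SAMOA's API.

    # Hypothetical sketch: an event processor over a stream. A real engine
    # would wire processors into a topology and distribute them.
    class Processor:
        def process(self, event):
            raise NotImplementedError

    class CountingProcessor(Processor):
        # A real learner would update model state per event instead.
        def __init__(self):
            self.counts = {}

        def process(self, event):
            self.counts[event["label"]] = self.counts.get(event["label"], 0) + 1

    stream = [{"label": "spam"}, {"label": "ham"}, {"label": "spam"}]
    p = CountingProcessor()
    for event in stream:                # the engine would parallelize this
        p.process(event)
    print(p.counts)                     # {'spam': 2, 'ham': 1}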

Changes:

Initial Announcement on mloss.org.


r-cran-CoxBoost 1.4

by r-cran-robot - April 1, 2014, 00:00:04 CET [ Project Homepage BibTeX Download ] 14604 views, 2918 downloads, 1 subscription

About: Cox models fitted by likelihood-based boosting, for a single survival endpoint or for competing risks.
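
As a rough illustration of componentwise boosting for Cox models, the Python sketch below updates one coefficient per step along the gradient of the Cox partial log-likelihood. It is a simplified sketch only: CoxBoost itself uses penalized partial-likelihood (Newton-type) updates and handles ties, which this does not.

    import numpy as np

    def cox_grad(beta, X, time, event):
        # Gradient of the Cox partial log-likelihood (no tie handling).
        order = np.argsort(time)        # process subjects in time order
        X, event = X[order], event[order]
        w = np.exp(X @ beta)
        rs_w = np.cumsum(w[::-1])[::-1]                      # risk-set sums
        rs_xw = np.cumsum((X * w[:, None])[::-1], 0)[::-1]   # risk-set x*w sums
        return (event[:, None] * (X - rs_xw / rs_w[:, None])).sum(0)

    def componentwise_boost(X, time, event, steps=200, nu=0.05):
        # Update only the single most promising coefficient per step, which
        # is what yields sparse fits in high-dimensional survival data.
        beta = np.zeros(X.shape[1])
        for _ in range(steps):
            g = cox_grad(beta, X, time, event)
            k = np.argmax(np.abs(g))
            beta[k] += nu * np.sign(g[k])
        return beta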

Changes:

Fetched by r-cran-robot on 2014-04-01 00:00:04.738601


r-cran-Boruta 3.0.0

by r-cran-robot - April 1, 2014, 00:00:04 CET [ Project Homepage BibTeX Download ] 4617 views, 981 downloads, 0 subscriptions

About: A wrapper algorithm for all-relevant feature selection
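
The idea behind Boruta can be sketched in a few lines: compare each real feature's importance against shuffled "shadow" copies of the features. Below is a simplified Python sketch using a random forest, illustrating the principle rather than the r-cran package itself.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def boruta_hits(X, y, n_rounds=20, seed=0):
        # A feature scores a 'hit' in a round when its importance beats the
        # best shuffled shadow feature; Boruta then applies a binomial test
        # over rounds to confirm or reject each feature.
        rng = np.random.default_rng(seed)
        hits = np.zeros(X.shape[1], dtype=int)
        for _ in range(n_rounds):
            shadows = rng.permuted(X, axis=0)   # break feature-label link
            Z = np.hstack([X, shadows])
            rf = RandomForestClassifier(n_estimators=100).fit(Z, y)
            imp = rf.feature_importances_
            hits += imp[:X.shape[1]] > imp[X.shape[1]:].max()
        return hits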

Changes:

Fetched by r-cran-robot on 2014-04-01 00:00:04.400248


JMLR MultiBoost 1.2.02

by busarobi - March 31, 2014, 16:13:04 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 19526 views, 3416 downloads, 1 subscription

About: MultiBoost is a multi-purpose boosting package implemented in C++. It is based on the multi-class/multi-task AdaBoost.MH algorithm [Schapire-Singer, 1999]. Basic base learners (stumps, trees, products, Haar filters for image processing) can be easily complemented by new data representations and the corresponding base learners, without interfering with the main boosting engine.
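
For orientation, the Python sketch below shows plain binary AdaBoost with decision stumps; AdaBoost.MH, which MultiBoost implements in C++, generalizes this scheme by keeping one weight per (example, label) pair. This is illustrative only, not MultiBoost's code.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def adaboost(X, y, rounds=50):
        # Binary AdaBoost with stumps; y must be in {-1, +1}.
        n = len(y)
        w = np.full(n, 1.0 / n)
        ensemble = []
        for _ in range(rounds):
            stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            err = w[pred != y].sum()
            if err >= 0.5:              # weak learner no better than chance
                break
            alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
            w *= np.exp(-alpha * y * pred)
            w /= w.sum()                # keep w a distribution
            ensemble.append((alpha, stump))
        return ensemble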

Changes:

Major changes:

  • The “early stopping” feature can now be based on any metric output via the --outputinfo command line argument.

  • Early stopping now works with the --slowresume command line argument.

Minor fixes:

  • More informative output when testing.

  • Fixed various compilation glitches with recent clang (OS X/Linux).


JMLR EnsembleSVM 2.0

by claesenm - March 31, 2014, 08:06:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3118 views, 1074 downloads, 1 subscription

About: The EnsembleSVM library offers functionality to perform ensemble learning using Support Vector Machine (SVM) base models. In particular, we offer routines for binary ensemble models using SVM base classifiers. Experimental results have shown the predictive performance to be comparable with standard SVM models but with drastically reduced training time. Ensemble learning with SVM models is particularly useful for semi-supervised tasks.
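
EnsembleSVM is a C++ library driven by command-line tools; the Python sketch below only illustrates the underlying idea of training SVM base models on subsamples and aggregating them by majority vote. It is hypothetical code conveying the concept, not the library's API.

    import numpy as np
    from sklearn.svm import SVC

    def train_ensemble(X, y, n_models=10, frac=0.3, seed=0):
        # Train each base SVM on a bootstrap subsample of the data.
        rng = np.random.default_rng(seed)
        size = max(1, int(frac * len(y)))
        return [SVC(kernel="rbf").fit(X[idx], y[idx])
                for idx in (rng.choice(len(y), size=size) for _ in range(n_models))]

    def majority_vote(models, X):
        # Aggregate base-model predictions; assumes non-negative int labels.
        preds = np.stack([m.predict(X) for m in models]).astype(int)
        return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, preds)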

Changes:

The library has been updated with a variety of new functionality as well as more efficient implementations of existing features. The following key improvements have been made:

  1. Support for multithreading in training and prediction with ensemble models. Since both tasks are embarrassingly parallel, this yields a significant speedup (roughly 3-fold on a quad-core machine).
  2. An extensive programming framework for aggregating base-model predictions, which allows you to efficiently prototype and program your own, novel aggregation schemes. Additionally, we provide several predefined strategies, including (weighted) majority voting, logistic regression and nonlinear SVMs of your choice -- be sure to check out the esvm-edit tool!
  3. Full code transition to C++11, the latest C++ standard, which enabled various performance improvements. The new release requires moderately recent compilers, such as gcc 4.7.2+ or clang 3.2+.
  4. Generic implementations of convenient facilities have been added, such as thread pools, deserialization factories and more.

The API and ABI have undergone significant changes, many of which are due to the transition to C++11.


Libra 1.0.1

by lowd - March 30, 2014, 09:42:00 CET [ Project Homepage BibTeX Download ] 8701 views, 1810 downloads, 1 subscription

About: The Libra Toolkit is a collection of algorithms for learning and inference with discrete probabilistic models, including Bayesian networks, Markov networks, dependency networks, sum-product networks, arithmetic circuits, and mixtures of trees.
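
As a minimal illustration of one model class Libra supports, the Python sketch below evaluates a tiny sum-product network bottom-up: product nodes multiply child values and sum nodes take weighted sums, so the likelihood of a complete assignment comes out in time linear in network size. This is illustrative only; Libra's own formats and tools differ.

    import math

    # Leaves hold a Bernoulli probability for one binary variable; internal
    # nodes are weighted sums or products over their children.
    def spn_value(node, assignment):
        if node["type"] == "leaf":
            return node["p"] if assignment[node["var"]] else 1 - node["p"]
        vals = [spn_value(c, assignment) for c in node["children"]]
        if node["type"] == "product":
            return math.prod(vals)
        return sum(w * v for w, v in zip(node["weights"], vals))

    spn = {"type": "sum", "weights": [0.6, 0.4], "children": [
        {"type": "product", "children": [
            {"type": "leaf", "var": 0, "p": 0.9},
            {"type": "leaf", "var": 1, "p": 0.2}]},
        {"type": "product", "children": [
            {"type": "leaf", "var": 0, "p": 0.1},
            {"type": "leaf", "var": 1, "p": 0.7}]}]}

    print(spn_value(spn, {0: True, 1: False}))   # 0.444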

Changes:

Version 1.0.1 (3/30/2014):

  • Several new algorithms -- acmn, for learning arithmetic circuits (ACs) using Markov networks (MNs); idspn, for sum-product network (SPN) structure learning; mtlearn, for learning mixtures of trees
  • Several new support programs -- spquery, for exact inference in SPNs; spn2ac, for converting SPNs to ACs
  • Renamed aclearnstruct to acbn
  • Replaced aclearnstruct -noac with separate bnlearn program
  • ...and many more small changes and fixes, throughout!

XGBoost v0.1

by crowwork - March 27, 2014, 07:09:52 CET [ Project Homepage BibTeX Download ] 281 views, 45 downloads, 1 subscription

About: eXtreme gradient boosting (tree) library. Features: a sparse feature format that allows easy handling of missing values and improves computational efficiency, and an efficient parallel implementation that optimizes memory use and computation.
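
As a reference point for what a gradient boosted tree library computes, the Python sketch below fits plain gradient-boosted regression trees under squared loss. It is a generic sketch of the technique and reflects neither XGBoost's sparsity handling nor its parallel implementation.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def gbt_fit(X, y, rounds=100, lr=0.1):
        # Each tree fits the residual, i.e. the negative gradient of the
        # squared loss at the current prediction.
        base = y.mean()
        pred = np.full(len(y), base)
        trees = []
        for _ in range(rounds):
            t = DecisionTreeRegressor(max_depth=3).fit(X, y - pred)
            pred += lr * t.predict(X)
            trees.append(t)
        return base, trees

    def gbt_predict(base, trees, X, lr=0.1):
        return base + lr * sum(t.predict(X) for t in trees)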

Changes:

Initial Announcement on mloss.org.


BayesOpt, a Bayesian Optimization toolbox 0.6

by rmcantin - March 26, 2014, 17:48:17 CET [ Project Homepage BibTeX Download ] 4404 views, 1003 downloads, 2 subscriptions

About: BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design, and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). There are also interfaces for C, Matlab/Octave and Python.
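
The core loop of Bayesian optimization can be sketched briefly: fit a surrogate model to the evaluations so far, then pick the next point by maximizing an acquisition function such as expected improvement. Below is a minimal Python sketch with a Gaussian-process surrogate; BayesOpt's own C++/Python API differs.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor

    def bayes_opt(f, lo, hi, n_init=5, n_iter=20, seed=0):
        # Minimize f on [lo, hi]: fit a GP to evaluations so far, then pick
        # the candidate maximizing expected improvement (EI).
        rng = np.random.default_rng(seed)
        X = rng.uniform(lo, hi, size=(n_init, 1))
        y = np.array([f(x[0]) for x in X])
        for _ in range(n_iter):
            gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
            cand = rng.uniform(lo, hi, size=(512, 1))    # random candidates
            mu, sd = gp.predict(cand, return_std=True)
            z = (y.min() - mu) / np.maximum(sd, 1e-9)
            ei = (y.min() - mu) * norm.cdf(z) + sd * norm.pdf(z)
            x_next = cand[np.argmax(ei)]
            X = np.vstack([X, x_next])
            y = np.append(y, f(x_next[0]))
        return X[np.argmin(y), 0], y.min()

    print(bayes_opt(lambda x: (x - 2.0) ** 2, 0.0, 5.0))  # near (2.0, 0.0)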

Changes:

- Complete refactoring of the library internals. The code is easier to understand and modify, and it allows simpler integration of new algorithms.

- Updated to the latest version of NLOPT (2.4.1). Wrapper code simplified.

- Error codes replaced with exceptions in the C++ interface. The library is exception safe.

- API modified to support new learning methods for kernel hyperparameters (e.g., MCMC). Warning: configuration parameters related to learning have changed, so code using previous versions might not work. Some of the learning methods (such as MCMC) are not yet implemented.

- Added configuration of the random number generator (the seed can be fixed for debugging). Fixed an issue where random numbers were drawn from different sources, with potential correlations; now all components are guaranteed to use the same instance of the random engine.

- Improved numerical results (e.g., hyperparameter optimization is now done in log space).

- More examples and tests.

- Fixed bugs.

- The default number of inner iterations has been increased, so overall optimization time with the default configuration might be slower, but results are improved.


Social Impact theory based Optimizer library 1.0.2

by rishem - March 24, 2014, 08:29:00 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1929 views, 473 downloads, 1 subscription

About: This is an optimization library based on Social Impact Theory (SITO). The optimizer works in the same general way as PSO and GA.
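
Details of the SITO update rule are not given here, so the Python sketch below is only a loose illustration of a population-based, opinion-dynamics style optimizer in the spirit of the CODO variant mentioned in the changes: agents drift toward better-performing "opinions" with some random exploration. The update rule is hypothetical, not the library's actual algorithm.

    import numpy as np

    def codo_minimize(f, dim, n_agents=30, steps=200, seed=0):
        # Each agent's 'opinion' (candidate solution) drifts toward the best
        # opinion found so far, plus random exploration. Hypothetical rule,
        # chosen only to convey the flavor of opinion-dynamics optimization.
        rng = np.random.default_rng(seed)
        pos = rng.uniform(-5.0, 5.0, size=(n_agents, dim))
        for _ in range(steps):
            fit = np.apply_along_axis(f, 1, pos)
            best = pos[np.argmin(fit)]
            pos += 0.1 * (best - pos) + 0.05 * rng.standard_normal(pos.shape)
        fit = np.apply_along_axis(f, 1, pos)
        return pos[np.argmin(fit)], fit.min()

    print(codo_minimize(lambda v: float((v ** 2).sum()), dim=2))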

Changes:

A new variant, the Continuous Opinion Dynamics Optimizer (CODO), has been implemented in this version, along with minor changes to the implementation of the objective function.

