All entries.
Showing items 31-40 of 537 (page 4 of 54).

JMLR Java Machine Learning Library 0.1.5

by thomas - August 20, 2009, 23:47:45 CET [ Project Homepage BibTeX Download ] 18477 views, 2615 downloads, 1 subscription

About: Java-ML is a collection of machine learning and data mining algorithms, which aims to be a readily usable and easily extensible API for both software developers and research scientists.

Changes:

new release


JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.4

by hn - November 11, 2013, 14:46:52 CET [ Project Homepage BibTeX Download ] 18288 views, 4360 downloads, 3 subscriptions

Rating: 5.0 / 5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.
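For orientation, a standard textbook sketch of what exact GP regression with a Gaussian likelihood computes (not quoted from the toolbox documentation): given a kernel matrix K over the training inputs, noise variance \sigma_n^2 and targets y, the predictive mean and variance at a test input x_* are

    \bar{f}_* = k_*^\top (K + \sigma_n^2 I)^{-1} y,
    \operatorname{var}[f_*] = k(x_*, x_*) - k_*^\top (K + \sigma_n^2 I)^{-1} k_*,

where k_* collects the covariances between x_* and the training inputs. The toolbox generalises this pattern to other likelihoods via approximate inference methods.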

Changes:
  • derivatives w.r.t. inducing points xu in infFITC, infFITC_Laplace, infFITC_EP so that one can treat the inducing points either as fixed given quantities or as additional hyperparameters
  • new GLM likelihood likExp for inter-arrival time modeling
  • new GLM likelihood likWeibull for extremal value regression
  • new GLM likelihood likGumbel for extremal value regression
  • new mean function meanPoly depending polynomially on the data
  • infExact can deal safely with the zero noise variance limit
  • support of GP warping through the new likelihood function likGaussWarp

MLDemos 0.5.1

by basilio - March 2, 2013, 16:06:13 CET [ Project Homepage BibTeX Download ] 17677 views, 4211 downloads, 2 subscriptions

About: MLDemos is a user-friendly visualization interface for various machine learning algorithms for classification, regression, clustering, projection, dynamical systems, reward maximisation and reinforcement learning.

Changes:

New Visualization and Dataset Features

  • Added 3D visualization of samples and classification, regression and maximization results
  • Added Visualization panel with individual plots, correlations, density, etc.
  • Added Editing tools to drag/magnet data, change class, increase or decrease dimensions of the dataset
  • Added categorical dimensions (indexed dimensions with non-numerical values)
  • Added Dataset Editing panel to swap, delete and rename dimensions, classes or categorical values
  • Several bug-fixes for display, import/export of data, classification performance

New Algorithms and Methodologies

  • Added Projections to pre-process data (which can then be classified/regressed/clustered), with LDA, PCA, KernelPCA, ICA, CCA
  • Added Grid-Search panel for batch-testing ranges of values for up to two parameters at a time
  • Added One-vs-All multi-class classification for non-multi-class algorithms
  • Trained models can now be kept and tested on new data (training on one dataset, testing on another)
  • Added a dataset generator panel for standard toy datasets (e.g. swissroll, checkerboard, ...)
  • Added a number of clustering, regression and classification algorithms (FLAME, DBSCAN, LOWESS, CCA, KMEANS++, GP Classification, Random Forests)
  • Added Save/Load Model option for GMMs and SVMs
  • Added Growing Hierarchical Self Organizing Maps (original code by Michael Dittenbach)
  • Added Automatic Relevance Determination for SVM with RBF kernel (thanks to Ashwini Shukla!)


JMLR Nieme 1.0

by francis - April 2, 2009, 10:57:38 CET [ Project Homepage BibTeX Download ] 17309 views, 2260 downloads, 1 subscription

Rating: 3.5 / 5 (based on 3 votes)

About: Nieme is a C++ machine learning library for large-scale classification, regression and ranking. It provides a simple interface available in C++, Python and Java and a user interface for visualization.

Changes:

Released Nieme 1.0


glm-ie

About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models) as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference.
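As a rough sketch of that common framework (my paraphrase of the usual GLM/SLM setup, not quoted from the package documentation): observations are modelled as y = X u + noise with Gaussian variance \sigma^2, non-Gaussian potentials \mathcal{T}_j act on linear functions s = B u of the latent variables u (extended to affine functions in this release, see the changes below), and MAP estimation solves a penalised least-squares problem of the form

    \min_{u} \; \frac{1}{2\sigma^2} \lVert X u - y \rVert_2^2 \; - \; \sum_j \log \mathcal{T}_j(s_j), \qquad s = B u,

while the variational relaxation bounds the corresponding posterior instead of maximising it.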

Changes:

added factorial mean field inference as a third algorithm complementing expectation propagation and variational Bayes

generalised non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used


JMLR LIBLINEAR 1.32

by biconnect - September 3, 2008, 17:35:24 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 16536 views, 1869 downloads, 2 subscriptions

Rating: 4.5 / 5 (based on 2 votes)

About: LIBLINEAR is a linear classifier for data with millions of instances and features. It supports L2-regularized logistic regression (LR), L2-loss linear SVM, L1-loss linear SVM, and multi-class SVM.
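For orientation, a minimal sketch using the Python wrapper (liblinearutil) distributed with later LIBLINEAR releases; the toy data and option string are illustrative assumptions, not part of this announcement:

    from liblinearutil import train, predict

    # sparse instances as {feature_index: value} dicts with 1-based indices
    y = [+1, -1, +1, -1]
    x = [{1: 0.7, 2: 1.0}, {1: -0.4, 3: 0.6}, {2: 0.9}, {1: -0.2, 2: -0.5}]

    model = train(y, x, '-s 0 -c 1')                 # -s 0: L2-regularized logistic regression
    labels, accuracy, values = predict(y, x, model)  # predict on (here) the training data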

Changes:

Initial Announcement on mloss.org.


MDP Modular toolkit for Data Processing 3.3

by otizonaizit - October 4, 2012, 15:17:33 CET [ Project Homepage BibTeX Download ] 16517 views, 4255 downloads, 1 subscription

Rating: 4.5 / 5 (based on 3 votes)

About: MDP is a Python library of widely used data processing algorithms that can be combined according to a pipeline analogy to build more complex data processing software. The base of available algorithms includes signal processing methods (Principal Component Analysis, Independent Component Analysis, Slow Feature Analysis), manifold learning methods ([Hessian] Locally Linear Embedding), several classifiers, probabilistic methods (Factor Analysis, RBM), data pre-processing methods, and many others.
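To illustrate the pipeline analogy, a minimal sketch (the toy data and node parameters are my own choices, not taken from the MDP documentation) that chains a PCA node and a Slow Feature Analysis node into a Flow, trains it, and executes it on the same data:

    import numpy as np
    import mdp

    x = np.random.rand(1000, 20)                        # 1000 observations, 20 variables (toy data)
    flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),   # keep 5 principal components
                     mdp.nodes.SFANode(output_dim=2)])  # extract the 2 slowest features
    flow.train(x)                                        # train the nodes one after the other
    y = flow(x)                                          # execute the whole pipeline -> shape (1000, 2)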

Changes:

What's new in version 3.3?

  • support sklearn versions up to 0.12
  • cleanly support reload
  • fail gracefully if pp server does not start
  • several bug-fixes and improvements

JMLR LPmade 1.2.2

by rlichten - April 2, 2012, 17:11:59 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 16437 views, 6605 downloads, 1 subscription

About: Link Prediction Made Easy

Changes:

v1.2.2

  • Fixed a MAJOR issue related to the github migration several months ago. The original github commit neglected to import empty folders, which caused parts of the project compilation procedure to fail. Any users of LPmade who downloaded the most recent version from github over the last several months will have encountered this build error and should download the most recent version. This change updates the network library makefile to create the empty folders, which works around the issue. Very sorry to anybody whom this may have inconvenienced, and thanks for hanging in there if you diagnosed and solved it yourself.

  • Fixed an issue with auroc on 32-bit architectures, where integer wraparound produced incorrect results.


Elefant 0.4

by kishorg - October 17, 2009, 08:48:19 CET [ Project Homepage BibTeX Download ] 16270 views, 7274 downloads, 2 subscriptions

Rating: 2.5 / 5 (based on 2 votes)

About: Elefant is an open source software platform for the Machine Learning community licensed under the Mozilla Public License (MPL) and developed using Python, C, and C++. We aim to make it the platform [...]

Changes:

This release contains the Stream module as a first step towards providing C++ library support. Stream aims to be a software framework for the implementation of large-scale online learning algorithms. Large scale, in this context, means data that does not fit in the memory of a standard desktop computer.

Added Bundle Methods for Regularized Risk Minimization (BMRM), allowing the user to choose from a list of loss functions and solvers (linear and quadratic).
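For orientation, the standard objective behind BMRM from the bundle-methods literature (generic notation, not Elefant-specific):

    \min_{w} \; \lambda\,\Omega(w) \; + \; \frac{1}{m} \sum_{i=1}^{m} \ell(x_i, y_i; w),

where \Omega is the regulariser (typically quadratic) and \ell is one of the loss functions listed below; the bundle method replaces the empirical risk term with a cutting-plane lower bound built from subgradients and repeatedly solves the resulting linear or quadratic subproblem, which is where the choice of solver enters.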

Added the following loss classes: BinaryClassificationLoss, HingeLoss, SquaredHingeLoss, ExponentialLoss, LogisticLoss, NoveltyLoss, LeastMeanSquareLoss, LeastAbsoluteDeviationLoss, QuantileRegressionLoss, EpsilonInsensitiveLoss, HuberRobustLoss, PoissonRegressionLoss, MultiClassLoss, WinnerTakesAllMultiClassLoss, ScaledSoftMarginMultiClassLoss, SoftmaxMultiClassLoss, MultivariateRegressionLoss

The graphical user interface now provides extensive documentation for each component, explaining state variables and port descriptions.

Changed saving and loading of experiments to XML (thereby avoiding storage of large input data structures).

Unified automatic input checking via new static typing extending Python properties.

Full support for recursive composition of larger components containing arbitrary statically typed state variables.


r-cran-CoxBoost 1.4

by r-cran-robot - September 1, 2014, 00:00:04 CET [ Project Homepage BibTeX Download ] 16218 views, 3281 downloads, 2 subscriptions

About: Cox models by likelihood-based boosting for a single survival endpoint or competing risks
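For context, a short sketch of the underlying model in standard notation (not quoted from the package manual): the Cox model specifies the hazard for a covariate vector x as

    \lambda(t \mid x) = \lambda_0(t) \, \exp(x^\top \beta),

and likelihood-based boosting estimates \beta in many small componentwise steps, each maximising a penalised Cox partial log-likelihood, which keeps most coefficients at zero and so suits high-dimensional covariate data.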

Changes:

Fetched by r-cran-robot on 2014-09-01 00:00:04.950391

