About: MLDemos is a user-friendly visualization interface for various machine learning algorithms for classification, regression, clustering, projection, dynamical systems, reward maximisation and reinforcement learning. Changes:
New Visualization and Dataset Features
Added 3D visualization of samples and of classification, regression and maximization results
Added a Visualization panel with individual plots, correlations, density, etc.
Added editing tools to drag/magnet data, change class, increase or decrease dimensions of the dataset
Added categorical dimensions (indexed dimensions with non-numerical values)
Added a Dataset Editing panel to swap, delete and rename dimensions, classes or categorical values
Several bug fixes for display, import/export of data, classification performance
New Algorithms and Methodologies
Added projections to pre-process data (which can then be classified/regressed/clustered), with LDA, PCA, KernelPCA, ICA, CCA
Added a Grid-Search panel for batch-testing ranges of values for up to two parameters at a time
Added One-vs-All multi-class classification for non-multi-class algorithms (a sketch of this scheme follows this list)
Trained models can now be kept and tested on new data (training on one dataset, testing on another)
Added a dataset generator panel for standard toy datasets (e.g. swissroll, checkerboard, ...)
Added a number of clustering, regression and classification algorithms (FLAME, DBSCAN, LOWESS, CCA, KMEANS++, GP Classification, Random Forests)
Added a Save/Load Model option for GMMs and SVMs
Added Growing Hierarchical Self-Organizing Maps (original code by Michael Dittenbach)
Added Automatic Relevance Determination for SVM with RBF kernel (thanks to Ashwini Shukla!)
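For illustration only (this is not MLDemos code): a minimal Python sketch of the one-vs-all scheme mentioned above. Each class gets its own binary scorer trained on "this class vs. the rest", and prediction picks the class whose scorer is most confident. The `OneVsAll` name and the learner interface are hypothetical.

```python
# Minimal one-vs-all sketch (illustrative only, not MLDemos code).
import numpy as np

class OneVsAll:
    def __init__(self, make_binary_learner):
        # make_binary_learner() must return an object with fit(X, y in {-1,+1})
        # and decision_function(X) -> real-valued confidence scores.
        self.make_binary_learner = make_binary_learner
        self.models = {}

    def fit(self, X, y):
        for c in np.unique(y):
            target = np.where(y == c, 1, -1)   # relabel: class c vs. the rest
            model = self.make_binary_learner()
            model.fit(X, target)
            self.models[c] = model
        return self

    def predict(self, X):
        classes = list(self.models)
        scores = np.column_stack([self.models[c].decision_function(X) for c in classes])
        return np.asarray(classes)[scores.argmax(axis=1)]
```

With scikit-learn, for instance, `OneVsAll(lambda: LinearSVC()).fit(X, y)` would wrap a binary linear SVM into a multi-class classifier this way.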
|
About: Multi-core non-parametric and bursty topic models (HDP-LDA, DCMLDA, and other variants of LDA) implemented in C using efficient Gibbs sampling, with hyperparameter sampling and other flexible controls. Changes:Corrected the new normalised Gamma model for topics so it works with multicore. Improvements to documentation. Added an asymptotic version of the generalised Stirling numbers so it no longer fails when they run out of bounds on bigger data.
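As a rough illustration of the Gibbs-sampling idea behind such topic models (the package above implements far more elaborate non-parametric and bursty variants in C; this sketch covers only plain collapsed Gibbs sampling for vanilla LDA, and all names are illustrative):

```python
# Minimal collapsed Gibbs sampler for plain LDA (illustrative sketch only).
# docs: list of documents, each a list of word ids in [0, V).
import numpy as np

def lda_gibbs(docs, V, K, alpha=0.1, beta=0.01, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), K))          # doc-topic counts
    nkw = np.zeros((K, V))                  # topic-word counts
    nk = np.zeros(K)                        # topic totals
    z = []                                  # topic assignment per token
    for d, doc in enumerate(docs):          # random initialisation
        zd = rng.integers(K, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                 # remove current assignment
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # full conditional p(z = k | everything else)
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k                 # add new assignment
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return ndk, nkw
```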
|
About: peewit provides services for programming, running and examining the results of machine learning experiments. It does not include any ML algorithms, has no GUI, and presumes a certain uniformity of the experimental layout, but it does not make assumptions about the type of task under study. The current version number is 0.10. Changes:v-cube with side-cubes
|
About: R/Weka interface Changes:Fetched by r-cran-robot on 2012-02-01 00:00:11.330277
|
About: The scikit-learn project is a machine learning library in Python. Changes:Update for 0.18.1
|
About: Variational Bayesian inference tools for Python Changes:
|
About: Python module to ease pattern classification analyses of large datasets. It provides high-level abstraction of typical processing steps (e.g. data preparation, classification, feature selection, [...] Changes:
This release aggregates all the changes that occurred between official
releases in the 0.4 series and various snapshot releases (in the 0.5 and 0.6
series). To get a better overview of the high-level changes, see
:ref:
Also adapts changes from 0.4.6 and 0.4.7 (see corresponding changelogs).
This is a special release, because it has never seen the general public.
A summary of the fundamental changes introduced in this development version
can be seen in the :ref:. Most notably, this version was the first to come with a comprehensive two-day workshop/tutorial.
A bugfix release
A bugfix release
|
About: Kernel-based Learning Platform (KeLP) is a Java framework that supports the implementation of kernel-based learning algorithms, as well as an agile definition of kernel functions over generic data representations, e.g. vectorial data or discrete structures. The framework has been designed to decouple kernel functions and learning algorithms through the definition of specific interfaces. Once a new kernel function has been implemented, it can be automatically adopted in all the available kernel-machine algorithms. KeLP includes different Online and Batch Learning algorithms for Classification, Regression and Clustering, as well as several Kernel functions, ranging from vector-based to structural kernels. It makes it possible to build complex kernel-machine-based systems, leveraging JSON/XML interfaces to instantiate prediction models without writing a single line of code. Changes:In addition to minor improvements and bug fixes, this release includes:
Check out this new version from our repositories. API Javadoc is already available. Your suggestions will be very valuable to us, so download and try KeLP 2.2.2!
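The kernel/algorithm decoupling described above can be sketched in a few lines. KeLP itself is Java; the following Python sketch only illustrates the design idea, and all names (linear_kernel, KernelPerceptron, ...) are hypothetical, not KeLP's API.

```python
# Illustrative sketch of decoupling kernels from kernel-machine algorithms:
# a kernel is a standalone function, and any algorithm takes it as an argument.
import numpy as np

def linear_kernel(x, y):
    return float(np.dot(x, y))

def rbf_kernel(x, y, gamma=0.5):
    d = np.asarray(x) - np.asarray(y)
    return float(np.exp(-gamma * np.dot(d, d)))

class KernelPerceptron:
    """Online kernel machine that works with any kernel function."""
    def __init__(self, kernel):
        self.kernel = kernel
        self.support, self.alphas = [], []

    def decision(self, x):
        return sum(a * self.kernel(s, x) for s, a in zip(self.support, self.alphas))

    def fit(self, X, y, epochs=5):
        for _ in range(epochs):
            for xi, yi in zip(X, y):            # yi in {-1, +1}
                if yi * self.decision(xi) <= 0: # mistake-driven update
                    self.support.append(xi)
                    self.alphas.append(yi)
        return self

# The same algorithm runs unchanged with either kernel:
# KernelPerceptron(linear_kernel).fit(X, y) or KernelPerceptron(rbf_kernel).fit(X, y)
```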
|
About: A Java framework for statistical analysis and classification of biological sequences Changes:New classes and packages:
New features and improvements:
|
About: A mutual information library for C, with Mex bindings for MATLAB. It is aimed at feature selection and provides simple methods to calculate mutual information, conditional mutual information, entropy, conditional entropy, Renyi entropy/mutual information, and weighted variants of Shannon entropies/mutual informations. It works with discrete distributions and expects column vectors of features. Changes:Fixed a Windows compilation bug. MIToolbox v3 should now compile using Visual Studio.
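As a plain-Python illustration of the quantities the toolbox computes on discrete column vectors (this is not MIToolbox's C/MATLAB API, just the standard definitions):

```python
# I(X;Y) = H(X) + H(Y) - H(X,Y) for discrete feature/target vectors.
import numpy as np

def entropy(x):
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def joint_entropy(x, y):
    pairs = np.stack([np.asarray(x), np.asarray(y)], axis=1)
    _, counts = np.unique(pairs, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    return entropy(x) + entropy(y) - joint_entropy(x, y)

x = np.array([0, 0, 1, 1, 2, 2])
y = np.array([0, 0, 1, 1, 1, 1])
print(mutual_information(x, y))   # ~0.918 bits, since x fully determines y here
```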
|
About: The Libra Toolkit is a collection of algorithms for learning and inference with discrete probabilistic models, including Bayesian networks, Markov networks, dependency networks, sum-product networks, arithmetic circuits, and mixtures of trees. Changes:Version 1.1.2d (12/29/2015):
|
About: MSVMpack is a Multi-class Support Vector Machine (M-SVM) package. It is dedicated to SVMs which can handle more than two classes without relying on decomposition methods and implements the four M-SVM models from the literature: Weston and Watkins M-SVM, Crammer and Singer M-SVM, Lee, Lin and Wahba M-SVM, and the M-SVM2 of Guermeur and Monfrini. Changes:
|
About: L1 (lasso and fused lasso) and L2 (ridge) penalized estimation in GLMs and in the Cox model Changes:Fetched by r-cran-robot on 2013-04-01 00:00:06.939105
|
About: Lasso and elastic-net regularized generalized linear models Changes:Fetched by r-cran-robot on 2013-04-01 00:00:05.081872
|
About: Python Framework for Vector Space Modelling that can handle unlimited datasets (streamed input, online algorithms work incrementally in constant memory). Changes:
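A minimal sketch of the streamed-input idea mentioned above: documents are produced one at a time by an iterator, so an online algorithm can make a constant-memory pass over an arbitrarily large file. The class and method names below are illustrative, not the framework's actual API.

```python
# Streamed corpus sketch: never load the whole dataset into memory.
from collections import Counter

class StreamedCorpus:
    """Iterate over a (potentially huge) text file, one bag-of-words per line."""
    def __init__(self, path, vocab):
        self.path = path
        self.vocab = vocab              # token -> integer id

    def __iter__(self):
        with open(self.path, encoding="utf-8") as f:
            for line in f:              # one document at a time
                tokens = line.lower().split()
                counts = Counter(self.vocab[t] for t in tokens if t in self.vocab)
                yield sorted(counts.items())   # [(token_id, count), ...]

# An online model (hypothetical interface) can then be updated incrementally:
# for bow in StreamedCorpus("corpus.txt", vocab): model.partial_update(bow)
```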
|
About: DiffSharp is a functional automatic differentiation (AD) library providing gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products as higher-order functions. It allows exact and efficient calculation of derivatives, with support for nesting. Changes:
Fixed: bug in the forward AD implementation of Sigmoid and ReLU for D, DV, and DM (fixes #16, thank you @mrakgr)
Improved: performance by removing several more Parallel.For and Array.Parallel.map operations, working better with OpenBLAS multithreading
Added: operations involving incompatible dimensions of DV and DM now throw exceptions to warn the user
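To make the "forward AD implementation of Sigmoid and ReLU" concrete, here is a hedged Python sketch of forward-mode AD with dual numbers (DiffSharp itself is written in F#; this is only the underlying idea, not its code):

```python
# Forward-mode AD via dual numbers: each value carries (primal, tangent),
# and elementwise rules like sigmoid and relu propagate the tangent by the chain rule.
import math

class Dual:
    def __init__(self, primal, tangent=0.0):
        self.p, self.t = primal, tangent
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.p + other.p, self.t + other.t)
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.p * other.p, self.p * other.t + self.t * other.p)  # product rule

def sigmoid(x: Dual) -> Dual:
    s = 1.0 / (1.0 + math.exp(-x.p))
    return Dual(s, x.t * s * (1.0 - s))        # d/dx sigmoid(x) = s(1-s)

def relu(x: Dual) -> Dual:
    return Dual(max(x.p, 0.0), x.t if x.p > 0 else 0.0)

def derivative(f, x):
    return f(Dual(x, 1.0)).t                   # seed the tangent with 1

print(derivative(sigmoid, 0.0))   # 0.25
print(derivative(relu, 2.0))      # 1.0
```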
|
About: Link Prediction Made Easy Changes:v1.2.2
|
About: Classification, Regression and Feature Evaluation Changes:Fetched by r-cran-robot on 2018-01-01 00:00:07.164852
|
About: The glm-ie toolbox contains scalable estimation routines for GLMs (generalised linear models) and SLMs (sparse linear models), as well as an implementation of a scalable convex variational Bayesian inference relaxation. We designed the glm-ie package to be simple, generic and easily extensible. Most of the code is written in Matlab, including some MEX files. The code is fully compatible with both Matlab 7.x and GNU Octave 3.2.x. Probabilistic classification, sparse linear modelling and logistic regression are covered in a common algorithmic framework allowing for both MAP estimation and approximate Bayesian inference. Changes:
Added factorial mean field inference as a third algorithm, complementing expectation propagation and variational Bayes.
Generalised the non-Gaussian potentials so that affine instead of linear functions of the latent variables can be used.
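As a small illustration of the MAP-estimation side of such a framework (glm-ie itself is Matlab/Octave; this Python sketch shows only the generic idea of penalised GLM fitting, and the function name is hypothetical):

```python
# MAP estimation for L2-penalised logistic regression by gradient descent:
# minimise sum_i log(1 + exp(-y_i * x_i^T w)) / n + (lam/2) * ||w||^2.
import numpy as np

def map_logistic(X, y, lam=1.0, lr=0.1, iters=500):
    # X: (n, d) design matrix, y: labels in {-1, +1}
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        margins = y * (X @ w)
        # gradient of the average negative log-likelihood plus the Gaussian prior term
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / n + lam * w
        w -= lr * grad
    return w
```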
|
About: Locally Weighted Projection Regression (LWPR) is a recent algorithm that achieves nonlinear function approximation in high dimensional spaces with redundant and irrelevant input dimensions. At its [...] Changes:Version 1.2.4
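For context, a minimal sketch of the basic locally weighted regression idea that LWPR builds on (the actual LWPR algorithm is incremental and uses local receptive fields with partial-least-squares projections, which this Python illustration does not attempt):

```python
# Basic locally weighted linear regression: fit a weighted affine model
# around each query point, using Gaussian distance weights.
import numpy as np

def lwr_predict(Xtr, ytr, xq, bandwidth=0.5):
    """Predict y at query point xq from training data (Xtr, ytr)."""
    d2 = np.sum((Xtr - xq) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))        # Gaussian weights
    A = np.hstack([Xtr, np.ones((len(Xtr), 1))])    # affine model
    WA = A * w[:, None]
    # weighted normal equations: (A^T W A) beta = A^T W y
    beta = np.linalg.lstsq(WA.T @ A, WA.T @ ytr, rcond=None)[0]
    return np.append(xq, 1.0) @ beta
```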
|