About: SVDFeature is a toolkit for developing generic collaborative filtering algorithms by defining features. Changes:JMLR MLOSS version.
|
About: TurboParser is a free multilingual dependency parser based on linear programming, developed by André Martins. It is based on joint work with Noah Smith, Mário Figueiredo, Eric Xing, and Pedro Aguiar. Changes:This version introduces a number of new features:
Note: The runtimes above are approximate, and based on experiments on a desktop machine with an Intel Core i7 CPU at 3.4 GHz and 8GB RAM. To run this software, you need a standard C++ compiler. This software has the following external dependencies: AD3, a library for approximate MAP inference; Eigen, a template library for linear algebra; google-glog, a library for logging; gflags, a library for command-line flag processing. All these libraries are free software and are provided as tarballs in this package. This software has been tested on Linux, but it should run on other platforms with minor adaptations.
|
About: The VLFeat open source library implements popular computer vision algorithms including affine covariant feature detectors, HOG, SIFT, MSER, k-means, hierarchical k-means, agglomerative information bottleneck, SLIC superpixels, and quick shift. It is written in C for efficiency and compatibility, with interfaces in MATLAB for ease of use, and detailed documentation throughout. It supports Windows, Mac OS X, and Linux. The latest version of VLFeat is 0.9.16. Changes:VLFeat 0.9.16: Added VL_COVDET() (covariant feature detectors). This function implements the following detectors: DoG, Hessian, Harris Laplace, Hessian Laplace, Multiscale Hessian, Multiscale Harris. It also implements affine adaptation, estimation of feature orientation, computation of descriptors on the affine patches (including raw patches), and sourcing of custom feature frames. Added the auxiliary function VL_PLOTSS(). This is the second point update supported by the PASCAL Harvest programme. VLFeat 0.9.15: Added VL_HOG() (HOG features). Added VL_SVMPEGASOS() and a vastly improved SVM implementation. Added IHASHSUM (hashed counting). Improved INTHIST (integral histogram). Added VL_CUMMAX(). Improved the implementation of VL_ROC() and VL_PR(). Added VL_DET() (Detection Error Trade-off (DET) curves). Improved the verbosity control of AIB. Added support for Xcode 4.3, and improved support for past and future Xcode versions. Completed the migration of the old test code in toolbox/test, moving the functionality to the new unit tests in toolbox/xtest. Improved credits. This is the first point update supported by the PASCAL Harvest (several more to come shortly).
|
About: The Kernel-Machine Library is a free (released under the LGPL) C++ library to promote the use of and progress of kernel machines. Changes:Updated mloss entry (minor fixes).
|
About: The K-tree is a scalable approach to clustering inspired by the B+-tree and k-means algorithms. Changes:Release of K-tree implementation in Python. This is targeted at a research and rapid prototyping audience.
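Purely as a conceptual illustration (assumptions: Python/NumPy, a top-down recursive k-means split rather than the K-tree's own bottom-up, B+-tree-style insertion; all names are made up), a small sketch of building a tree of cluster means:

```python
import numpy as np

def build_mean_tree(X, leaf_size=50, branching=4, iters=10, rng=None):
    """Top-down sketch: recursively split the data with k-means and keep the
    cluster means as internal nodes. The real K-tree instead grows bottom-up
    as points are inserted, in the manner of a B+-tree."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(0) if rng is None else rng
    if len(X) <= leaf_size:
        return {"mean": X.mean(axis=0), "points": X}
    centres = X[rng.choice(len(X), branching, replace=False)]
    for _ in range(iters):                                   # a few Lloyd iterations
        assign = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1).argmin(1)
        for c in range(branching):
            if np.any(assign == c):
                centres[c] = X[assign == c].mean(axis=0)
    nonempty = [c for c in range(branching) if np.any(assign == c)]
    if len(nonempty) < 2:                                    # degenerate split: stop recursing
        return {"mean": X.mean(axis=0), "points": X}
    children = [build_mean_tree(X[assign == c], leaf_size, branching, iters, rng)
                for c in nonempty]
    return {"mean": X.mean(axis=0), "children": children}
```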
|
About: The source code of the mldata.org site - a community portal for machine learning data sets. Changes:Initial Announcement on mloss.org.
|
About: KReator is an integrated development environment (IDE) for relational probabilistic knowledge representation languages. At the moment, KReator supports Bayesian Logic Programs (BLPs), Markov Logic Networks (MLNs), Relational Maximum Entropy (RME), Relational Bayesian Networks (RBN), and Probabilistic Prolog (ProbLog). Changes:
|
About: Pyriel is a Python system for learning classification rules from data. Unlike other rule learning systems, it is designed to learn rule lists that maximize the area under the ROC curve (AUC) instead of accuracy. Pyriel is mostly an experimental research tool, but it's robust and fast enough to be used for lightweight industrial data mining. Changes:1.5: Changed CF (confidence factor) to do Laplace smoothing of estimates. New flag "--score-for-class C" causes scores to be computed relative to a given (positive) class (for two-class problems). Fixed bug in example sampling code (--sample n). Fixed bug that kept old-style example formats (terminated by a dot) from working. More code restructuring.
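For reference, a minimal sketch of the Laplace correction referred to in the changelog, assuming the standard (p+1)/(p+n+k) form; Pyriel's actual CF computation and the function name here are not taken from the package:

```python
def laplace_cf(pos_covered, neg_covered, n_classes=2):
    """Laplace-smoothed confidence of a rule: the raw precision
    pos/(pos+neg) is pulled toward the uniform prior 1/n_classes."""
    return (pos_covered + 1) / (pos_covered + neg_covered + n_classes)

print(laplace_cf(8, 2))   # 9/12 = 0.75, instead of the raw estimate 0.8
```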
|
About: OpenViBE is an open-source platform for designing, testing and using Brain-Computer Interfaces (BCIs). Broadly speaking, OpenViBE can be used in many real-time Neuroscience applications [...] Changes:New release 0.8.0.
|
About: The SUMO Toolbox is a Matlab toolbox that automatically builds accurate surrogate models (also known as metamodels or response surface models) of a given data source (e.g., simulation code, data set, script, ...) within the accuracy and time constraints set by the user. The toolbox minimizes the number of data points (which it selects automatically) since they are usually expensive. Changes:Incremental update, fixing some cosmetic issues, coincides with JMLR publication.
|
About: Moses is a statistical machine translation system that allows you to automatically train translation models for any language pair. All you need is a collection of translated texts (a parallel corpus). An efficient search algorithm quickly finds the highest-probability translation among an exponential number of choices. Changes:Initial Announcement on mloss.org.
|
About: jblas is a fast linear algebra library for Java. jblas is based on BLAS and LAPACK, the de facto industry standard for matrix computations, and uses state-of-the-art implementations like ATLAS for all its computational routines, making jblas very fast. Changes:Changes from 1.0:
|
About: redsvd is a library for solving several matrix decompositions (SVD, PCA, eigenvalue decomposition). redsvd can handle very large matrices efficiently, and is optimized for truncated SVD of sparse matrices. For example, redsvd can compute a truncated SVD with the top 20 singular values of a 100K x 100K matrix with 10M nonzero entries in about two seconds. Changes:Initial Announcement on mloss.org.
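redsvd is a C++ library with its own command-line tools; purely to illustrate the operation it is optimized for, here is an equivalent truncated SVD of a sparse matrix in SciPy (smaller sizes than the 100K x 100K example above, to keep the demo light):

```python
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import svds

# A large, very sparse random matrix.
A = sparse_random(10000, 10000, density=1e-4, format="csr", random_state=0)

# Truncated SVD keeping only the top 20 singular values/vectors.
u, s, vt = svds(A, k=20)
print(sorted(s, reverse=True)[:5])
```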
|
About: A stochastic variant of the mirror descent algorithm employing Langford and Zhang's truncated gradient idea to solve L1-regularized loss minimization problems for classification and regression. Changes:Fixed major bug in implementation. The components of the iterate where the current example vector is zero were not being updated correctly. Thanks to Jonathan Chang for pointing out the error to us.
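A minimal NumPy sketch of the truncated-gradient idea, assuming squared loss and dense updates; this is an illustration, not the package's own code or interface, and the function names are made up:

```python
import numpy as np

def truncate(w, alpha, theta):
    """Shrink coordinates with |w_j| <= theta toward zero by at most alpha."""
    out = w.copy()
    small = np.abs(w) <= theta
    out[small] = np.sign(w[small]) * np.maximum(np.abs(w[small]) - alpha, 0.0)
    return out

def truncated_gradient_sgd(X, y, lam=0.01, eta=0.1, epochs=5, K=1, theta=np.inf):
    """Stochastic gradient descent on the squared loss, with the L1 penalty
    enforced by a truncation step every K updates (Langford/Li/Zhang style)."""
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in np.random.permutation(n):
            t += 1
            grad = (w @ X[i] - y[i]) * X[i]     # gradient of the squared loss on one example
            w = w - eta * grad
            if t % K == 0:                      # truncation step enforces sparsity
                w = truncate(w, K * eta * lam, theta)
    return w
```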
|
About: PSVM - Support vector classification, regression and feature extraction for non-square dyadic data and non-Mercer kernels. Changes:Initial Announcement on mloss.org.
|
About: A set of Perl programs for generating and manipulating ROC curves. Changes:Initial Announcement on mloss.org.
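The scripts themselves are written in Perl; to make the underlying computation concrete, here is a hedged Python sketch of how ROC points are typically generated by sweeping a threshold over classifier scores (the function name and signature are illustrative, not the scripts' interface):

```python
def roc_points(scores, labels):
    """ROC-curve points (FPR, TPR) obtained by sweeping a decision threshold
    over classifier scores; labels are 1 (positive) or 0 (negative).
    Assumes both classes occur at least once."""
    P = sum(1 for l in labels if l == 1)
    N = len(labels) - P
    pairs = sorted(zip(scores, labels), reverse=True)    # descending score
    points, tp, fp, prev = [(0.0, 0.0)], 0, 0, None
    for score, label in pairs:
        if prev is not None and score != prev:
            points.append((fp / N, tp / P))              # one point per distinct threshold
        prev = score
        if label == 1:
            tp += 1
        else:
            fp += 1
    points.append((fp / N, tp / P))                      # final point is (1.0, 1.0)
    return points
```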
|
About: Given many points in ROC (Receiver Operating Characteristic) space, computes the convex hull. Changes:Initial Announcement on mloss.org.
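A minimal sketch of the computation (an Andrew-style monotone-chain upper hull over ROC points; illustrative Python, not this package's code):

```python
def roc_convex_hull(points):
    """Upper convex hull of (FPR, TPR) points in ROC space, always including
    the trivial classifiers (0,0) and (1,1)."""
    pts = sorted(set(points) | {(0.0, 0.0), (1.0, 1.0)})
    hull = []
    for p in pts:
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            cross = (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1)
            if cross >= 0:       # hull[-1] lies on or below the chord hull[-2] -> p
                hull.pop()
            else:
                break
        hull.append(p)
    return hull

# roc_convex_hull([(0.1, 0.4), (0.3, 0.35), (0.6, 0.9)])
# -> [(0.0, 0.0), (0.1, 0.4), (0.6, 0.9), (1.0, 1.0)]
```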
|
About: A (randomized) coordinate descent procedure to minimize L1 regularized loss for classification and regression purposes. Changes:Fixed some I/O bugs. Lines that ended with whitespace were not read correctly in the previous version.
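For the squared-loss (Lasso) case, each coordinate update has a closed form via soft-thresholding; the following NumPy sketch of a randomized sweep illustrates the general procedure and is not this package's code or interface:

```python
import numpy as np

def soft_threshold(rho, lam):
    return np.sign(rho) * np.maximum(np.abs(rho) - lam, 0.0)

def randomized_cd_lasso(X, y, lam=0.1, sweeps=50, seed=0):
    """Randomized coordinate descent for the Lasso:
    minimize 0.5*||y - Xw||^2 + lam*||w||_1."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)                  # precomputed X[:,j]^T X[:,j]
    residual = y - X @ w
    for _ in range(sweeps):
        for j in rng.permutation(d):
            if col_sq[j] == 0.0:
                continue
            # inner product of column j with the partial residual (coordinate j removed)
            rho = X[:, j] @ residual + col_sq[j] * w[j]
            w_new = soft_threshold(rho, lam) / col_sq[j]
            residual += X[:, j] * (w[j] - w_new)   # keep the residual in sync
            w[j] = w_new
    return w
```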
|
About: SHARK is a modular C++ library for the design and optimization of adaptive systems. It provides various machine learning and computational intelligence techniques. Changes:
|
About: Elefant is an open source software platform for the Machine Learning community licensed under the Mozilla Public License (MPL) and developed using Python, C, and C++. We aim to make it the platform [...] Changes:This release contains the Stream module as a first step in the direction of providing C++ library support. Stream aims to be a software framework for the implementation of large scale online learning algorithms. Large scale, in this context, should be understood as data that does not fit in the memory of a standard desktop computer. Added Bundle Methods for Regularized Risk Minimization (BMRM), allowing the user to choose from a list of loss functions and solvers (linear and quadratic). Added the following loss classes: BinaryClassificationLoss, HingeLoss, SquaredHingeLoss, ExponentialLoss, LogisticLoss, NoveltyLoss, LeastMeanSquareLoss, LeastAbsoluteDeviationLoss, QuantileRegressionLoss, EpsilonInsensitiveLoss, HuberRobustLoss, PoissonRegressionLoss, MultiClassLoss, WinnerTakesAllMultiClassLoss, ScaledSoftMarginMultiClassLoss, SoftmaxMultiClassLoss, MultivariateRegressionLoss. The graphical user interface now provides extensive documentation for each component, explaining its state variables and port descriptions. Changed saving and loading of experiments to XML (thereby avoiding storage of large input data structures). Unified automatic input checking via new static typing extending Python properties. Full support for recursive composition of larger components containing arbitrary statically typed state variables.
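To make concrete what a loss module hands to a BMRM-style bundle solver, here is a hedged NumPy sketch for a hinge-loss binary classifier: at each iterate the solver asks the loss for the empirical risk and one subgradient, from which it builds a cutting-plane lower bound (function names are illustrative, not Elefant's API):

```python
import numpy as np

def hinge_risk_and_subgradient(w, X, y):
    """Empirical hinge risk and one subgradient, the two quantities a
    BMRM-style bundle solver requests from a loss module at each iterate.
    X is an (n, d) NumPy array, y a length-n array with entries in {-1, +1}."""
    margins = y * (X @ w)
    active = margins < 1.0                               # examples with non-zero hinge loss
    risk = np.maximum(0.0, 1.0 - margins).mean()
    subgrad = -(X[active] * y[active][:, None]).sum(axis=0) / len(y)
    return risk, subgrad
```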
|