Projects supporting the binary data format.


Armadillo library 6.500

by cu24gjf - January 27, 2016, 12:11:29 CET [ Project Homepage BibTeX Download ] 74518 views, 15161 downloads, 5 subscriptions

Rating: 4/5 (based on 3 votes)

About: Armadillo is a templated C++ linear algebra library aiming for a good balance between speed and ease of use, with a function syntax similar to MATLAB. Matrix decompositions are provided through optional integration with LAPACK, or one of its high-performance drop-in replacements (e.g. Intel MKL, OpenBLAS).
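A minimal usage sketch of the MATLAB-like syntax (assuming the Armadillo headers are available and the build links against LAPACK/BLAS or one of the drop-in replacements mentioned above):

    #include <armadillo>

    int main()
    {
        arma::mat A = arma::randu<arma::mat>(5, 5);   // uniform random 5x5 matrix
        arma::mat B = A.t() * A;                      // symmetric positive semi-definite

        arma::vec eigval;
        arma::mat eigvec;
        arma::eig_sym(eigval, eigvec, B);             // eigendecomposition, delegated to LAPACK/MKL/OpenBLAS

        arma::vec b = arma::randu<arma::vec>(5);
        arma::vec x = arma::solve(B, b);              // linear solve, roughly MATLAB's B \ b

        x.print("x:");
        return 0;
    }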

Changes:
  • added stand-alone kmeans() function for clustering data (see the usage sketch after this list)
  • added trunc(), ind2sub() and sub2ind()
  • added conv2() for 2D convolution
  • extended conv() to optionally provide central convolution
  • expanded each_col(), each_row() and each_slice() to handle C++11 lambda functions
  • faster handling of multiply-and-accumulate by accu() when using Intel MKL, ATLAS or OpenBLAS
  • fixes for corner cases in gmm_diag class
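A short sketch of how the new functions above might be used, assuming the documented 6.x signatures of kmeans(), conv2() and the lambda-enabled each_col():

    #include <armadillo>

    int main()
    {
        using namespace arma;

        mat data = randu<mat>(4, 1000);   // 4-dimensional samples, one per column
        mat means;                        // cluster centroids, one per column

        // stand-alone k-means clustering
        bool ok = kmeans(means, data, 5, random_subset, 10, false);
        if (!ok) { return 1; }

        // 2D convolution of an image-like matrix with a small averaging kernel
        mat image  = randu<mat>(64, 64);
        mat kernel = ones<mat>(3, 3) / 9.0;
        mat smooth = conv2(image, kernel, "same");

        // C++11 lambda support in each_col(): normalise every column in place
        data.each_col( [](vec& col) { col /= norm(col); } );

        return 0;
    }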

Harry 0.4.1

by konrad - January 3, 2016, 14:56:55 CET [ Project Homepage BibTeX Download ] 6810 views, 1498 downloads, 3 subscriptions

About: A Tool for Measuring String Similarity

Changes:

Minor bug fixes for libarchive code


A Pattern Recognizer In Lua with ANNs v0.4.1

by pakozm - December 3, 2015, 15:01:36 CET [ Project Homepage BibTeX Download ] 6319 views, 1457 downloads, 2 subscriptions

About: APRIL-ANN toolkit (A Pattern Recognizer In Lua with Artificial Neural Networks). This toolkit incorporates ANN algorithms (such as dropout, stacked denoising auto-encoders, and convolutional neural networks) together with other pattern recognition methods such as hidden Markov models (HMMs), among others.

Changes:
  • Updated home repository link to follow the april-org GitHub organization.
  • Improved serialize/deserialize functions, reimplementing the entire serialization procedure.
  • Added exception support to LuaPkg and APRIL-ANN, allowing C++ errors to be captured in Lua code.
  • Added set class.
  • Added series class.
  • Added data_frame class, similar to Python Pandas DataFrame.
  • Serialization and deserialization have been updated with a more robust and reusable API, implemented in the util.serialize() and util.deserialize() functions.
  • Added matrix.ext.broadcast utility (similar to broadcasting in NumPy).
  • Added ProbablisitcMatrixANNComponent, which allows implementing probabilistic mixtures of posteriors and/or likelihoods.
  • Added batch normalization ANN component.
  • matrix.join now allows adding a new axis.
  • Added methods prod(), cumsum() and cumprod() to matrix classes.
  • Added methods count_eq() and count_neq() to matrix classes.
  • The serializable objects API has been augmented with the methods ctor_name() and ctor_params() in Lua, corresponding to luaCtorName() and luaCtorParams() in C++.
  • Added cast.to for dynamically casting C++ objects pushed into Lua, allowing base class objects to be converted into any of their derived classes.
  • Added matrix.sparse as a valid target value in ann.loss.mse and ann.loss.cross_entropy.
  • Changed the matrix metamethods __index and __newindex, allowing matrix objects to be used with the standard Lua operator[].
  • Added matrix.masked_fill and matrix.masked_copy matrix methods.
  • Added matrix.indexed_fill and matrix.indexed_copy matrix methods.
  • Added ann.components.probabilistic_matrix, and its corresponding specializations ann.components.left_probabilistic_matrix and
    ann.components.right_probabilistic_matrix.
  • Added operator[] on the right-hand side of matrix operations.
  • Added ann.components.transpose.
  • Added max_gradients_norm in trainable.supervised_trainer to avoid exploding gradients.
  • Added ann.components.actf.sparse_logistic, a logistic activation function with a sparsity penalty.
  • Simplified math.add, math.sub, ... and other math extensions for reductions; their original behavior can be emulated using the bind function.
  • Added a bind function to freeze any positional argument of any Lua function.
  • Function stats.boot uses multiple_unpack to allow a table of sizes and the generation of multiple index matrices.
  • Added multiple_unpack Lua function.
  • Added __tostring metamethod to numeric memory blocks in Lua.
  • Added dataset.token.sparse_matrix, a dataset which allows traversing a sparse matrix instance by rows.
  • Added matrix.sparse.builders.dok, a builder which uses the Dictionary-of-Keys format to construct a sparse matrix from scratch.
  • Added a data method to numeric matrix classes.
  • Added the methods values, indices and first_index to the sparse matrix class.
  • Fixed bugs when reading badly formed CSV files.
  • Fixed bugs in statistical distributions.
  • Fixed a FloatRGB bug in the compound assignment (+=, -=, ...) operators; this bug affected ImageRGB operations such as resize.
  • Fixed problems when chaining methods in Lua, where some objects ended up being garbage collected.
  • Improved support for strings in auto-completion (rlcompleter package).
  • Fixed a bug in SparseMatrix<T> when reading it from a file.
  • Fixed a bug in the Image<T>::rotate90_cw methods.
  • Fixed a bug in the SparseMatrix::toDense() method.

C/C++

  • Better LuaTable accessors, using the [] operator.
  • Implementation of matrix __index, __newindex and __call metamethods in C++.
  • Implementation of matProd(), matCumSum() and matCumProd() functions.
  • Implementation of matCountEq() and matCountNeq() functions for
    Matrix<T>.
  • Updated matrix_ext_operations.h to change the API of matrix operations. All functions are now overloaded with an in-place version and another version which receives a destination matrix.
  • Added iterators to language models.
  • Added MatrixScalarMap2, which receives a SparseMatrix instance as input2. This function needs to be generalized to work with CPU and CUDA.
  • The method SparseMatrix<T>::fromDenseMatrix() uses a DOKBuilder object to build the sparse matrix.
  • The conversion of a Matrix<T> into a SparseMatrix<T> has been changed from a constructor overload to the static method
    SparseMatrix<T>::fromDenseMatrix().
  • Added support for IPyLua.
  • Optimized matrix access for confusion matrix.
  • Minor changes in class.lua.
  • Improved binding to avoid multiple object copies when pushing C++ objects.
  • Added Git commit hash and compilation time.

Salad 0.6.0

by chwress - December 1, 2015, 16:17:35 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 8952 views, 1688 downloads, 3 subscriptions

About: A Content Anomaly Detector based on n-Grams

Changes:

After a full year of development we proudly present several new features, plenty of bug fixes and better performance :)

  • It is now possible to process data at bit granularity: salad [train|inspect] --binary
  • Performance improvements while preserving and further improving the readability of the source code.
  • Suppress the verbose output of Salad: salad [train|predict] -q
  • Extended the (unit) testing framework to support tests of the overall application and memory checks using valgrind.
  • Testing mode was renamed: salad dbg -> salad test
  • Allow selecting either client- or server-side data when processing network communication.
  • libfoodstoragebox: a library encapsulating advanced data structures such as Bloom filters.
  • Fixes for a critical bug when using group input and several minor issues.
  • An optionally compressed, text-based model file format: salad train -F (txt|archive)
  • The default hashset ('simple2') now makes use of the djb2 hash
  • Flawless builds using gcc, mingw and clang

About: Efficient and Flexible Distributed/Mobile Deep Learning Framework, for Python, R, Julia and more

Changes:

This version comes with Distributed and Mobile Examples


JMLR dlib ml 18.18

by davis685 - October 29, 2015, 01:48:44 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 128758 views, 21278 downloads, 4 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real-world problems.

Changes:

This release has focused on build system improvements, both for the Python API and C++ builds using CMake. This includes adding a setup.py script for installing the dlib Python API as well as a make install target for installing a C++ shared library for non-Python use.


Universal Java Matrix Package 0.3.0

by arndt - July 31, 2015, 14:23:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12885 views, 2451 downloads, 3 subscriptions

About: The Universal Java Matrix Package (UJMP) is a data processing tool for Java. Unlike JAMA and Colt, it supports multi-threading and is therefore much faster on current hardware. It not only supports matrices with double values, but handles every type of data as a matrix through a common interface, e.g. CSV files, Excel files, images, WAVE audio files, tables in SQL databases, and much more.

Changes:

Updated to version 0.3.0


LMW Tree 1.0

by cdevries - May 30, 2015, 11:42:23 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1479 views, 303 downloads, 2 subscriptions

About: Learning M-Way Tree - Web Scale Clustering - EM-tree, K-tree, k-means, TSVQ, repeated k-means, clustering, random projections, random indexing, hashing, bit signatures

Changes:

Initial Announcement on mloss.org.


JMLR Sally 1.0.0

by konrad - March 26, 2015, 17:01:35 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 34440 views, 6665 downloads, 3 subscriptions

About: A Tool for Embedding Strings in Vector Spaces

Changes:

Support for explicit selection of granularity has been added, along with several minor bug fixes. We have reached 1.0.


JMLR SHOGUN 4.0.0

by sonne - February 5, 2015, 09:09:37 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 105018 views, 14894 downloads, 6 subscriptions

Rating: 3/5 (based on 6 votes)

About: The SHOGUN machine learning toolbox focuses on large-scale learning methods, in particular Support Vector Machines (SVM), and provides interfaces to Python, Octave, MATLAB, R and the command line.
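As an illustration only, here is a minimal sketch of training an SVM through the C++ interface; the class names (CDenseFeatures, CGaussianKernel, CLibSVM) are assumed from the 4.x-era modular API, so consult the toolbox documentation for the authoritative usage:

    #include <shogun/base/init.h>
    #include <shogun/features/DenseFeatures.h>
    #include <shogun/labels/BinaryLabels.h>
    #include <shogun/kernel/GaussianKernel.h>
    #include <shogun/classifier/svm/LibSVM.h>

    using namespace shogun;

    int main()
    {
        init_shogun_with_defaults();

        // toy data: 2 features x 4 examples, one example per column
        SGMatrix<float64_t> X(2, 4);
        SGVector<float64_t> y(4);
        for (int i = 0; i < 4; i++)
        {
            X(0, i) = (i < 2) ? -1.0 : 1.0;
            X(1, i) = (i < 2) ? -1.0 : 1.0;
            y[i]    = (i < 2) ? -1.0 : 1.0;
        }

        CDenseFeatures<float64_t>* feats  = new CDenseFeatures<float64_t>(X);
        CBinaryLabels*             labels = new CBinaryLabels(y);
        CGaussianKernel*           kernel = new CGaussianKernel(feats, feats, 2.0);

        CLibSVM* svm = new CLibSVM(1.0, kernel, labels);   // regularization constant C = 1.0
        svm->train();

        CBinaryLabels* predictions = svm->apply_binary(feats);

        SG_UNREF(predictions);
        SG_UNREF(svm);
        exit_shogun();
        return 0;
    }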

Changes:

This release features the work of our 8 GSoC 2014 students [student; mentors]:

  • OpenCV Integration and Computer Vision Applications [Abhijeet Kislay; Kevin Hughes]
  • Large-Scale Multi-Label Classification [Abinash Panda; Thoralf Klein]
  • Large-scale structured prediction with approximate inference [Jiaolong Xu; Shell Hu]
  • Essential Deep Learning Modules [Khaled Nasr; Sergey Lisitsyn, Theofanis Karaletsos]
  • Fundamental Machine Learning: decision trees, kernel density estimation [Parijat Mazumdar; Fernando Iglesias]
  • Shogun Missionary & Shogun in Education [Saurabh Mahindre; Heiko Strathmann]
  • Testing and Measuring Variable Interactions With Kernels [Soumyajit De; Dino Sejdinovic, Heiko Strathmann]
  • Variational Learning for Gaussian Processes [Wu Lin; Heiko Strathmann, Emtiyaz Khan]

It also contains several cleanups and bugfixes:

Features

  • New Shogun project description [Heiko Strathmann]
  • ID3 algorithm for decision tree learning [Parijat Mazumdar]
  • New modes for PCA matrix factorizations: SVD & EVD, in-place or reallocating [Parijat Mazumdar]
  • Add Neural Networks with linear, logistic and softmax neurons [Khaled Nasr]
  • Add kernel multiclass strategy examples in multiclass notebook [Saurabh Mahindre]
  • Add decision trees notebook containing examples for ID3 algorithm [Parijat Mazumdar]
  • Add sudoku recognizer ipython notebook [Alejandro Hernandez]
  • Add in-place subsets on features, labels, and custom kernels [Heiko Strathmann]
  • Add Principal Component Analysis notebook [Abhijeet Kislay]
  • Add Multiple Kernel Learning notebook [Saurabh Mahindre]
  • Add Multi-Label classes to enable Multi-Label classification [Thoralf Klein]
  • Add rectified linear neurons, dropout and max-norm regularization to neural networks [Khaled Nasr]
  • Add C4.5 algorithm for multiclass classification using decision trees [Parijat Mazumdar]
  • Add support for arbitrary acyclic graph-structured neural networks [Khaled Nasr]
  • Add CART algorithm for classification and regression using decision trees [Parijat Mazumdar]
  • Add CHAID algorithm for multiclass classification and regression using decision trees [Parijat Mazumdar]
  • Add Convolutional Neural Networks [Khaled Nasr]
  • Add Random Forests algorithm for ensemble learning using CART [Parijat Mazumdar]
  • Add Restricted Boltzmann Machines [Khaled Nasr]
  • Add Stochastic Gradient Boosting algorithm for ensemble learning [Parijat Mazumdar]
  • Add Deep contractive and denoising autoencoders [Khaled Nasr]
  • Add Deep belief networks [Khaled Nasr]

Bugfixes

  • Fix reference counting bugs in CList when reference counting is on [Heiko Strathmann, Thoralf Klein, lambday]
  • Fix memory problem in PCA::apply_to_feature_matrix [Parijat Mazumdar]
  • Fix crash in LeastAngleRegression for the case D greater than N [Parijat Mazumdar]
  • Fix memory violations in bundle method solvers [Thoralf Klein]
  • Fix fail in library_mldatahdf5.cpp example when http://mldata.org is not working properly [Parijat Mazumdar]
  • Fix memory leaks in Vowpal Wabbit, LibSVMFile and KernelPCA [Thoralf Klein]
  • Fix memory and control flow issues discovered by Coverity [Thoralf Klein]
  • Fix R modular interface SWIG typemap (Requires SWIG >= 2.0.5) [Matt Huska]

Cleanup and API Changes

  • PCA now depends on Eigen3 instead of LAPACK [Parijat Mazumdar]
  • Removing redundant and fixing implicit imports [Thoralf Klein]
  • Hide many methods from SWIG, reducing compile memory by 500MiB [Heiko Strathmann, Fernando Iglesias, Thoralf Klein]

libAGF 0.9.8

by Petey - December 6, 2014, 02:35:39 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 13436 views, 2601 downloads, 2 subscriptions

About: C++ software for statistical classification, probability estimation and interpolation/non-linear regression using variable bandwidth kernel estimation.
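For context, this is the general form of a variable-bandwidth (adaptive) kernel density estimate on which this kind of estimator is built, with a per-sample bandwidth h_i; the textbook form is shown, not necessarily libAGF's exact weighting or normalisation:

    \hat{f}(\mathbf{x}) \;=\; \frac{1}{N} \sum_{i=1}^{N} \frac{1}{h_i^{d}} \, K\!\left( \frac{\lVert \mathbf{x} - \mathbf{x}_i \rVert}{h_i} \right)

Class-conditional probabilities for classification can then be obtained by forming such an estimate per class and normalising across classes.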

Changes:

New in Version 0.9.8:

  • bug fixes: svm file conversion works properly and is more general

  • non-hierarchical multi-borders has 3 options for solving for the conditional probabilities: matrix inversion, voting, and matrix inversion overridden by voting, with re-normalization

  • multi-borders now works with external binary classifiers

  • random numbers resolve a tie when selecting classes based on probabilities

  • a pair of routines, sort_discrete_vectors and search_discrete_vectors, for classification based on n-d binning (still experimental)

  • command options have been changed, with many new additions; see the QUICKSTART file or run the relevant commands for details


About: This library implements the Optimum-Path Forest classifier for unsupervised and supervised learning.

Changes:

Initial Announcement on mloss.org.


MOSIS 0.55

by claasahl - March 9, 2014, 17:35:40 CET [ BibTeX Download ] 5002 views, 1638 downloads, 2 subscriptions

About: MOSIS is a modularized framework for signal processing, stream analysis, machine learning and stream mining applications.

Changes:
  • Moved "flow"-related classes into package "de.claas.mosis.flow" (e.g. Node and Link).
  • Refined and improved "flow"-related tests (e.g. Iterator and Node tests).
  • Refactored tests for data formats (e.g. PlainText and JSON tests).
  • Added visitor design pattern for graph-based functions (e.g. initialization and processing).
  • Documented parameters of Processor implementations.

DRVQ 1.0.1-beta

by iavr - January 18, 2014, 17:26:34 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2327 views, 545 downloads, 1 subscription

About: DRVQ is a C++ library implementation of dimensionality-recursive vector quantization, a fast vector quantization method in high-dimensional Euclidean spaces under arbitrary data distributions. It is an approximation of k-means that is practically constant in data size and applies to arbitrarily high dimensions, but can only scale to a few thousand centroids. As a by-product of training, a tree structure performs either exact or approximate quantization on the trained centroids, the latter being not very precise but extremely fast.
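For reference, the objective that such a quantizer approximates is the standard k-means distortion: given data points x_i and K centroids c_k, minimise the total quantization error (generic formulation, not specific to DRVQ's dimensionality-recursive scheme):

    \min_{c_1, \dots, c_K} \; \sum_{i} \min_{k} \lVert x_i - c_k \rVert^2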

Changes:

Initial Announcement on mloss.org.


Neural network designer 1.1.1

by bragi - December 28, 2012, 11:38:10 CET [ Project Homepage BibTeX Download ] 5177 views, 1234 downloads, 1 subscription

About: A DBMS for resonating neural networks. Create and use different types of machine learning algorithms.

Changes:

AIML compatible (AIML files can be imported); new 'Grid channel' for developing board games; improved topics editor; new demo project: Alice (from AIML); lots of bug fixes and speed improvements.


Isoline Retrieval SVN rev. 7

by Petey - February 21, 2012, 16:56:09 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3400 views, 711 downloads, 1 subscription

About: Software to perform isoline retrieval, i.e. to retrieve isolines of an atmospheric parameter from a nadir-looking satellite.

Changes:

Added screenshot, keywords


sccan 0.0

by stnava - January 13, 2011, 18:14:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4497 views, 1120 downloads, 1 subscription

About: A work in progress

Changes:

Initial Announcement on mloss.org.


OpenViBE 0.8.0

by k3rl0u4rn - October 1, 2010, 16:15:08 CET [ Project Homepage BibTeX Download ] 13203 views, 3651 downloads, 1 subscription

Rating: 3/5 (based on 1 vote)

About: OpenViBE is an open-source platform that enables the design, testing and use of Brain-Computer Interfaces (BCI). Broadly speaking, OpenViBE can be used in many real-time neuroscience applications [...]

Changes:

New release 0.8.0.


JMLR Surrogate Modeling Toolbox 7.0.2

by dgorissen - September 4, 2010, 07:48:59 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 15301 views, 4294 downloads, 1 subscription

About: The SUMO Toolbox is a Matlab toolbox that automatically builds accurate surrogate models (also known as metamodels or response surface models) of a given data source (e.g., simulation code, data set, script, ...) within the accuracy and time constraints set by the user. The toolbox minimizes the number of data points (which it selects automatically) since they are usually expensive.

Changes:

Incremental update fixing some cosmetic issues; coincides with the JMLR publication.


JMLR FastInf 1.0

by arielj - June 4, 2010, 14:04:37 CET [ Project Homepage BibTeX Download ] 9726 views, 3354 downloads, 1 subscription

About: The library focuses on implementations of propagation-based approximate inference methods. Also implemented are clique-tree-based exact inference, Gibbs sampling, and the mean field algorithm.
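For orientation, the core update of propagation-based (loopy belief propagation) inference on a pairwise model is the standard sum-product message, shown here in generic notation; the library's own formulation may differ in details such as factor-graph form or normalisation:

    m_{i \to j}(x_j) \;\propto\; \sum_{x_i} \psi_i(x_i)\, \psi_{ij}(x_i, x_j) \prod_{k \in N(i) \setminus \{j\}} m_{k \to i}(x_i),
    \qquad
    b_i(x_i) \;\propto\; \psi_i(x_i) \prod_{k \in N(i)} m_{k \to i}(x_i)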

Changes:

Initial Announcement on mloss.org.