Projects that are tagged with classification.
Showing items 1-20 of 77.

About: Deep architectures are now very popular in machine learning. Deep Belief Networks (DBNs) are deep architectures that stack Restricted Boltzmann Machines (RBMs) to build a powerful generative model from training data. DBNs offer capabilities such as feature extraction and classification that are used in many applications, including image processing, speech processing and text categorization. This paper introduces a new object-oriented toolbox with the most important features needed for implementing DBNs. Experiments on the MNIST (image), ISOLET (speech) and 20 Newsgroups (text) datasets show that the toolbox automatically learns a good representation of the input from unlabeled data, with better discrimination between classes, and that the resulting classification errors are comparable to those of state-of-the-art classifiers. In addition, the toolbox supports different sampling methods (e.g. Gibbs, CD, PCD and our new FEPCD method), different sparsity methods (quadratic, rate distortion and our new normal method), different RBM types (generative and discriminative), GPU-based computation, etc. The toolbox is user-friendly open-source software written in MATLAB and Octave and is freely available from the project website.
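
The building block described above, an RBM trained with contrastive divergence, can be sketched in a few lines. The NumPy snippet below is only an illustrative sketch of CD-1 for a binary RBM; it is not the toolbox's MATLAB API, and the function name rbm_cd1 and its parameters are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rbm_cd1(V, n_hidden=64, lr=0.05, epochs=10, seed=0):
    """One-step contrastive divergence (CD-1) for a binary RBM.

    V is an (n_samples, n_visible) binary data matrix; returns the
    weight matrix W and the visible/hidden bias vectors b and c."""
    rng = np.random.default_rng(seed)
    n_visible = V.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b = np.zeros(n_visible)   # visible bias
    c = np.zeros(n_hidden)    # hidden bias
    for _ in range(epochs):
        # Positive phase: hidden probabilities and samples given the data.
        ph = sigmoid(V @ W + c)
        h = (rng.random(ph.shape) < ph).astype(float)
        # Negative phase: one Gibbs step back to the visible layer and up again.
        pv = sigmoid(h @ W.T + b)
        ph_neg = sigmoid(pv @ W + c)
        # CD-1 approximation of the log-likelihood gradient.
        W += lr * (V.T @ ph - pv.T @ ph_neg) / len(V)
        b += lr * (V - pv).mean(axis=0)
        c += lr * (ph - ph_neg).mean(axis=0)
    return W, b, c
```

A DBN stacks such RBMs, feeding the hidden activations of one layer to the next as training data before supervised fine-tuning with backpropagation.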

Changes:

New in toolbox

  • Bug fix in changing the learning rate.
  • Extended the generateData function so it can also be used after backpropagation.
  • Extended the reconstructData function so it can also be used after backpropagation.



KeLP 2.0.1

by kelpadmin - January 13, 2016, 12:47:31 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 5711 views, 1416 downloads, 3 subscriptions

About: Kernel-based Learning Platform (KeLP) is a Java framework that supports the implementation of kernel-based learning algorithms, as well as the agile definition of kernel functions over generic data representations, e.g. vectorial data or discrete structures. The framework has been designed to decouple kernel functions and learning algorithms through the definition of specific interfaces: once a new kernel function has been implemented, it can be automatically adopted by all the available kernel-machine algorithms. KeLP includes different online and batch learning algorithms for classification, regression and clustering, as well as several kernel functions, ranging from vector-based to structural kernels. It allows building complex kernel-machine-based systems, leveraging JSON/XML interfaces to instantiate prediction models without writing a single line of code.
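
The decoupling this paragraph describes, kernels behind a common interface so that any kernel machine can consume them unchanged, is easy to illustrate. The sketch below is plain Python rather than KeLP's Java API, and the names Kernel, rbf_kernel and kernel_perceptron are hypothetical, chosen only to show the design idea.

```python
import numpy as np
from typing import Callable

# A "kernel" is any callable mapping two examples to a similarity score.
Kernel = Callable[[np.ndarray, np.ndarray], float]

def linear_kernel(x: np.ndarray, z: np.ndarray) -> float:
    return float(x @ z)

def rbf_kernel(gamma: float = 1.0) -> Kernel:
    def k(x: np.ndarray, z: np.ndarray) -> float:
        return float(np.exp(-gamma * np.sum((x - z) ** 2)))
    return k

def kernel_perceptron(X, y, kernel: Kernel, epochs: int = 5):
    """A kernelized perceptron that only sees the Kernel interface, so
    swapping linear_kernel for rbf_kernel(0.5) needs no other change."""
    alpha = np.zeros(len(X))
    for _ in range(epochs):
        for i, x in enumerate(X):
            score = sum(a * yi * kernel(xi, x)
                        for a, xi, yi in zip(alpha, X, y) if a != 0)
            if y[i] * score <= 0:      # misclassified (or undecided)
                alpha[i] += 1.0
    return alpha
```

The same separation is what lets a framework plug a newly implemented kernel into all of its learning algorithms automatically.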

Changes:

In addition to minor bug fixes, this release includes:

  • Soft Confidence-Weighted Classification algorithm: a brand new online learning algorithm from Wang, J., Zhao, P., Hoi, S.C.: Exact soft confidence-weighted learning. In Proceedings of ICML 2012. ACM, New York, NY, USA (2012)

  • Optimization of the kernel caching mechanism

  • The Smooth Partial Tree Kernel and the Partial Tree Kernel now allow specifying a maximum branching factor (parameter: maxSubseqLeng) for the tree fragments considered by the kernel operation.

Check out this new version from our repositories. The API Javadoc is already available. Your suggestions will be very valuable to us, so download and try KeLP 2.0.1!


NaN toolbox 2.8.5

by schloegl - January 5, 2016, 12:10:15 CET [ Project Homepage BibTeX Download ] 41982 views, 8725 downloads, 3 subscriptions

About: NaN-toolbox is a statistics and machine learning toolbox for handling data with and without missing values.
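
As a language-neutral illustration of the idea, statistics that skip missing entries instead of propagating them, here is a small NumPy sketch; it mirrors the concept only and does not use the toolbox's own MATLAB/Octave functions.

```python
import numpy as np

x = np.array([1.0, 2.0, np.nan, 4.0, np.nan, 6.0])

# Ordinary statistics propagate missing values ...
print(np.mean(x))             # nan
# ... whereas NaN-aware statistics skip them, which is the behaviour
# a toolbox of this kind provides throughout its routines.
print(np.nanmean(x))          # 3.25 (mean of the four observed values)
print(np.nanstd(x, ddof=1))   # sample standard deviation over observed values
```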

Changes:

Changes in v2.8.5:

  • bug fix: trimmean
  • compiler support for gcc-5 and clang
  • fix typos

For details see the CHANGELOG at http://pub.ist.ac.at/~schloegl/matlab/NaN/CHANGELOG


MLweb 0.1.3

by lauerfab - December 17, 2015, 10:29:35 CET [ Project Homepage BibTeX Download ] 2684 views, 659 downloads, 3 subscriptions

About: MLweb is an open source project that aims to bring machine learning capabilities into web pages and web applications while keeping all computations on the client side. It includes (i) a JavaScript library for scientific computing within web pages, (ii) a JavaScript library implementing machine learning algorithms for classification, regression, clustering and dimensionality reduction, and (iii) a web application providing a MATLAB-like development environment.

Changes:
  • Improve NaiveBayes classifier
  • Add online training functions for KNN and NaiveBayes
  • Fix save/load workspace in LALOLab
  • Fix nullspace()
  • Small bug fixes

Probabilistic Classification Vector Machine 0.22

by fmschleif - November 10, 2015, 13:16:19 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3833 views, 852 downloads, 3 subscriptions

About: The PCVM library is a C++/Armadillo implementation of the Probabilistic Classification Vector Machine.

Changes:

30.10.2015: Code has been revised in several places, also fixing some errors; different multiclass schemes and HDF5 file support have been added, along with some speed-ups and memory savings from better handling of intermediate objects.

27.05.2015: Matlab binding under Windows available. Added a solution file for VS 2013 Express to compile a Matlab MEX binding. Cannot yet confirm that the code really uses multiple cores under Windows (under Linux it does).

29.04.2015: Added an implementation of the Nystroem-based PCVM, including Nystroem-based singular value decomposition (SVD), eigenvalue decomposition (EVD) and pseudo-inverse calculation (PINV).

22.04.2015: Implementation of the PCVM released.


Apache Mahout 0.11.1

by gsingers - November 9, 2015, 16:12:06 CET [ Project Homepage BibTeX Download ] 19821 views, 5180 downloads, 3 subscriptions

About: Apache Mahout is an Apache Software Foundation project with the goal of creating both a community of users and a scalable, Java-based framework consisting of many machine learning algorithm [...]

Changes:

Apache Mahout introduces a new math environment we call Samsara, for its theme of universal renewal. It reflects a fundamental rethinking of how scalable machine learning algorithms are built and customized. Mahout-Samsara is here to help people create their own math while providing some off-the-shelf algorithm implementations. At its core are general linear algebra and statistical operations along with the data structures to support them. You can use it as a library or customize it in Scala with Mahout-specific extensions that look something like R. Mahout-Samsara comes with an interactive shell that runs distributed operations on a Spark cluster. This makes prototyping and task submission much easier and allows users to customize algorithms with a whole new degree of freedom. Mahout algorithms include many new implementations built for speed on Mahout-Samsara. They run on Spark 1.3+ and some on H2O, which means as much as a 10x speed increase. You'll find robust matrix decomposition algorithms as well as a Naive Bayes classifier and collaborative filtering. The new spark-itemsimilarity enables the next generation of cooccurrence recommenders that can use entire user click streams and context in making recommendations.


KeBABS 1.4.1

by UBod - November 3, 2015, 11:33:46 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9857 views, 1786 downloads, 3 subscriptions

About: Kernel-Based Analysis of Biological Sequences

Changes:
  • new method to compute prediction profiles from models trained with mixture kernels
  • correction for position specific kernel with offsets
  • corrections for prediction profile of motif kernel
  • additional hint on help page of kbsvm

Cognitive Foundry 3.4.2

by Baz - October 30, 2015, 06:53:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 24788 views, 4198 downloads, 4 subscriptions

About: The Cognitive Foundry is a modular Java software library of machine learning components and algorithms designed for research and applications.

Changes:
  • General:
    • Upgraded MTJ to 1.0.3.
  • Common:
    • Added package for hash function computation, including Eva, FNV-1a, MD5, Murmur2, Prime, SHA1, and SHA2.
    • Added callback-based forEach implementations to Vector and InfiniteVector, which can be faster for iterating through some vector types.
    • Optimized DenseVector by removing a layer of indirection.
    • Added method to compute set of percentiles in UnivariateStatisticsUtil and fixed issue with percentile interpolation.
    • Added utility class for enumerating combinations.
    • Adjusted ScalarMap implementation hierarchy.
    • Added method for copying a map to VectorFactory and moved createVectorCapacity up from SparseVectorFactory.
    • Added method for creating square identity matrix to MatrixFactory.
    • Added Random implementation that uses a cached set of values.
  • Learning:
    • Implemented feature hashing.
    • Added factory for random forests.
    • Implemented uniform distribution over integer values.
    • Added Chi-squared similarity.
    • Added KL divergence.
    • Added general conditional probability distribution.
    • Added interfaces for Regression, UnivariateRegression, and MultivariateRegression.
    • Fixed null pointer exception that can happen in K-means with an empty cluster.
    • Fixed name of maxClusters property on AgglomerativeClusterer (was called maxMinDistance).
  • Text:
    • Improvements to LDA Gibbs sampler.

JMLR dlib ml 18.18

by davis685 - October 29, 2015, 01:48:44 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 128583 views, 21259 downloads, 4 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real world problems.

Changes:

This release has focused on build system improvements, both for the Python API and C++ builds using CMake. This includes adding a setup.py script for installing the dlib Python API as well as a make install target for installing a C++ shared library for non-Python use.


SALSA.jl 0.0.5

by jumutc - September 28, 2015, 17:28:56 CET [ Project Homepage BibTeX Download ] 836 views, 152 downloads, 1 subscription

About: SALSA (Software lab for Advanced machine Learning with Stochastic Algorithms) is an implementation of well-known stochastic algorithms for machine learning, developed in the high-level technical computing language Julia. The SALSA software package is designed to address challenges in sparse linear modelling and in linear and non-linear Support Vector Machines applied to large data samples, with a user-centric and user-friendly emphasis.
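
To give a flavour of the stochastic algorithms such a package builds on, here is a minimal Pegasos-style stochastic sub-gradient solver for a linear SVM, written in Python for illustration; it is not SALSA's Julia API, and the name pegasos_svm and its defaults are hypothetical.

```python
import numpy as np

def pegasos_svm(X, y, lam=0.01, epochs=20, seed=0):
    """Stochastic sub-gradient descent on the linear SVM objective
    lam/2 * ||w||^2 + mean hinge loss, with labels y in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)        # decreasing step size
            margin = y[i] * (w @ X[i])
            w *= 1.0 - eta * lam         # regularizer (shrinkage) step
            if margin < 1.0:             # hinge loss is active
                w += eta * y[i] * X[i]
    return w

# Usage sketch: w = pegasos_svm(X, y); predictions = np.sign(X_new @ w)
```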

Changes:

Initial Announcement on mloss.org.


WEKA 3.7.13

by mhall - September 11, 2015, 04:55:02 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 53855 views, 8000 downloads, 4 subscriptions

Rating: 4 out of 5 stars (based on 6 votes)

About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...]

Changes:

In core weka:

  • Numerically stable implementation of variance calculation in core Weka classes - thanks to Benjamin Weber
  • Unified expression parsing framework (with compiled expressions) is now employed by filters and tools that use mathematical/logical expressions - thanks to Benjamin Weber
  • Developers can now specify GUI and command-line options for their Weka schemes via a new unified annotation-based mechanism
  • ClassConditionalProbabilities filter - replaces the value of a nominal attribute in a given instance with its probability given each of the possible class values
  • GUI package manager's available list now shows both packages that are not currently installed, and those installed packages for which there is a more recent version available that is compatible with the base version of Weka being used
  • ReplaceWithMissingValue filter - allows values to be randomly (with a user-specified probability) replaced with missing values. Useful for experimenting with methods for imputing missing values
  • WrapperSubsetEval can now use plugin evaluation metrics

In packages:

  • alternatingModelTrees package - alternating trees for regression
  • timeSeriesFilters package, contributed by Benjamin Weber
  • distributedWekaSpark package - wrapper for distributed Weka on Spark
  • wekaPython package - execution of CPython scripts and wrapper classifier/clusterer for Scikit Learn schemes
  • MLRClassifier in RPlugin now provides access to almost all classification and regression learners in MLR 2.4

JMLR GPstuff 4.6

by avehtari - July 15, 2015, 15:08:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 28948 views, 6806 downloads, 2 subscriptions

Rating: 5 out of 5 stars (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2015-07-09 Version 4.6

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Use Pareto smoothed importance sampling (Vehtari & Gelman, 2015) for:

    • importance sampling leave-one-out cross-validation (gpmc_loopred.m)
    • importance sampling integration over hyperparameters (gp_ia.m)
    • the importance sampling part of logistic Gaussian process density estimation (lgpdens.m)

  • References:

    • Aki Vehtari and Andrew Gelman (2015). Pareto smoothed importance sampling. arXiv preprint arXiv:1507.02646.
    • Aki Vehtari, Andrew Gelman and Jonah Gabry (2015). Efficient implementation of leave-one-out cross-validation and WAIC for evaluating fitted Bayesian models.

  • New covariance functions:

    • gpcf_additive creates a mixture over products of kernels for each dimension. Reference: Duvenaud, D. K., Nickisch, H., & Rasmussen, C. E. (2011). Additive Gaussian processes. In Advances in Neural Information Processing Systems, pp. 226-234.
    • gpcf_linearLogistic corresponds to a logistic mean function
    • gpcf_linearMichelismenten corresponds to a Michaelis-Menten mean function

Improvements: faster EP moment calculation for lik_logit

Several minor bugfixes


JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.6

by hn - July 6, 2015, 12:31:28 CET [ Project Homepage BibTeX Download ] 30803 views, 7135 downloads, 4 subscriptions

Rating: 5 out of 5 stars (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.

Changes:
  • added a new inference function infGrid_Laplace that allows using non-Gaussian likelihoods on large grids

  • fixed a bug due to Octave evaluating norm([]) to a tiny nonzero value; modified all lik/lik*.m functions (reported by Philipp Richter)

  • small bugfixes in covGrid and infGrid

  • bugfix in predictive variance of likNegBinom due to Seth Flaxman

  • bugfix in infFITC_Laplace as suggested by Wu Lin

  • bugfix in covPP{iso,ard}


Simple Generalized Learning Vector Quantization 1.0

by fmschleif - June 4, 2015, 10:49:49 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1621 views, 408 downloads, 2 subscriptions

About: A simple and, hopefully, clean and easy-to-follow implementation of the Generalized Learning Vector Quantizer (GLVQ), with variants for metric adaptation (RGLVQ, GMLVQ, LiRaM).
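
For readers new to GLVQ, the core of the method is a prototype update: each sample pulls its nearest correct prototype towards it and pushes the nearest wrong prototype away, scaled by the derivative of a cost on the relative distance difference. The sketch below is a simplified Python illustration (constant factors absorbed into the learning rate), not this toolbox's MATLAB interface, and all names are hypothetical.

```python
import numpy as np

def glvq_epoch(X, y, protos, proto_labels, lr=0.05):
    """One pass of simplified Generalized LVQ with squared Euclidean distance.

    protos is a (k, d) float array of prototype positions and proto_labels
    holds their class labels; each sample attracts its nearest correct
    prototype and repels its nearest wrong one."""
    for x, label in zip(X, y):
        d2 = np.sum((protos - x) ** 2, axis=1)       # squared distances
        correct = np.where(proto_labels == label)[0]
        wrong = np.where(proto_labels != label)[0]
        p = correct[np.argmin(d2[correct])]          # nearest correct prototype
        m = wrong[np.argmin(d2[wrong])]              # nearest wrong prototype
        dp, dm = d2[p], d2[m]
        mu = (dp - dm) / (dp + dm + 1e-12)           # relative distance difference
        phi = 1.0 / (1.0 + np.exp(-mu))              # sigmoidal cost
        g = phi * (1.0 - phi)                        # its derivative
        denom = (dp + dm + 1e-12) ** 2
        protos[p] += lr * g * (dm / denom) * (x - protos[p])
        protos[m] -= lr * g * (dp / denom) * (x - protos[m])
    return protos

# Usage sketch: protos = glvq_epoch(X, y, protos.astype(float), proto_labels)
```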

Changes:

Initial Announcement on mloss.org.


java machine learning platform 1.0

by openpr_nlpr - April 2, 2015, 09:02:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1673 views, 288 downloads, 2 subscriptions

About: Jmlp is a Java platform for both machine learning experiments and applications. It has been tested on Windows, but should also run on Linux thanks to the cross-platform nature of the Java language. It contains classical classification algorithms (Discrete AdaBoost.MH, Real AdaBoost.MH, SVM, KNN, MCE, MLP, NB) and feature reduction methods (KPCA, PCA, whitening), etc.

Changes:

Initial Announcement on mloss.org.


Hivemall 0.3

by myui - March 13, 2015, 17:08:22 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 8200 views, 1367 downloads, 3 subscriptions

About: Hivemall is a scalable machine learning library running on Hive/Hadoop.

Changes:
  • Supported Matrix Factorization
  • Added support for TF-IDF computation
  • Supported AdaGrad/AdaDelta
  • Supported AdaGradRDA classification
  • Added normalization scheme

JMLR Mulan 1.5.0

by lefman - February 23, 2015, 21:19:05 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 21391 views, 7510 downloads, 2 subscriptions

About: Mulan is an open-source Java library for learning from multi-label datasets. Multi-label datasets consist of training examples of a target function that has multiple binary target variables. This means that each item of a multi-label dataset can be a member of multiple categories or annotated by many labels (classes). This is the nature of many real-world problems, such as semantic annotation of images and video, web page categorization, direct marketing, functional genomics, and music categorization into genres and emotions.
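
The multi-label setting described above is easy to picture: the targets form a binary matrix with one column per label, and the simplest baseline (binary relevance) trains one binary classifier per column. The sketch below uses Python and scikit-learn purely as an illustration of the data layout and baseline; Mulan itself is a Java library built on top of Weka.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy multi-label data: 6 examples, 4 features, 3 possible labels.
X = np.random.default_rng(0).standard_normal((6, 4))
Y = np.array([[1, 0, 1],
              [0, 1, 0],
              [1, 1, 0],
              [0, 0, 1],
              [1, 0, 0],
              [0, 1, 1]])

# Binary relevance: one independent classifier per label column.
models = [LogisticRegression().fit(X, Y[:, j]) for j in range(Y.shape[1])]
predicted = np.column_stack([m.predict(X) for m in models])
print(predicted)  # (6, 3) binary matrix, one column per label
```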

Changes:

Learners

  • MLCSSP.java: Added the MLCSSP algorithm (from ICML 2013)
  • Enhancements of multi-target regression capabilities
  • Improved CLUS support
  • Added pairwise classifier and pairwise transformation

Measures/Evaluation

  • Providing training data in the Evaluator is unnecessary in the case of specific measures.
  • Examples with missing ground truth are not skipped for measures that handle missing values.
  • Added logistic and squared error losses and measures

Bug fixes

  • IndexOutOfBounds in calculation of MiAP and GMiAP
  • Bug fix in Rcut.java
  • When in rank/score mode, the meta-data contained additional unnecessary attributes. (Newton Spolaor)

API changes

  • Upgrade to Java 7
  • Upgrade to Weka 3.7.10

Miscellaneous

  • Small changes and improvements in the wrapper classes for the CLUS library
  • ENTCS13FeatureSelection.java (new experiment)
  • Enumeration is now used for specifying the type of meta-data. (Newton Spolaor)

Hub Miner 1.1

by nenadtomasev - January 22, 2015, 16:33:51 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 3003 views, 590 downloads, 2 subscriptions

About: Hubness-aware Machine Learning for High-dimensional Data

Changes:
  • BibTeX support for all algorithm implementations, making all of them easy to reference (via the algref package).

  • Two more hubness-aware approaches (meta-metric-learning and feature construction)

  • An implementation of Hit-Miss networks for analysis.

  • Several minor bug fixes.

  • The following instance selection methods were added: HMScore, Carving, Iterative Case Filtering, ENRBF.

  • The following clustering quality indexes were added: Fowlkes-Mallows, Calinski-Harabasz, PBM, G+, Tau, Point-Biserial, Hubert's statistic, McClain-Rao, C-root-k.

  • Some more experimental scripts have been included.

  • Extensions in the estimation of hubness risk.

  • Alias and weighted reservoir methods for weight-proportional random selection.


pyGPs 1.3.2

by mn - January 17, 2015, 13:08:43 CET [ Project Homepage BibTeX Download ] 6471 views, 1559 downloads, 4 subscriptions

About: pyGPs is a Python package for Gaussian process (GP) regression and classification for machine learning.
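
A minimal regression sketch, following the interface shown in the pyGPs demos (GPR, setPrior, optimize, predict); exact signatures may differ between versions, so treat this as an assumption and check the package documentation.

```python
import numpy as np
import pyGPs

# Toy 1-D regression data.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 20).reshape(-1, 1)
y = np.sin(x) + 0.1 * rng.standard_normal((20, 1))

model = pyGPs.GPR()                               # Gaussian process regression model
model.setPrior(mean=pyGPs.mean.Zero(), kernel=pyGPs.cov.RBF())
model.optimize(x, y)                              # fit hyperparameters by marginal likelihood
xs = np.linspace(-4, 4, 50).reshape(-1, 1)
ym, ys2, fm, fs2, lp = model.predict(xs)          # predictive mean and variance at test inputs
```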

Changes:

Changelog pyGPs v1.3.2

December 15th 2014

  • pyGPs added to pip
  • mathematical definitions of kernel functions available in documentation
  • more error messages added

pySPACE 1.2

by krell84 - October 29, 2014, 15:36:28 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 4577 views, 941 downloads, 1 subscription

About: pySPACE is the abbreviation for "Signal Processing and Classification Environment in Python using YAML and supporting parallelization". It is modular software for processing large data streams that has been specifically designed to enable distributed execution and empirical evaluation of signal processing chains. Various signal processing algorithms (so-called nodes) are available within the software, from finite impulse response filters over data-dependent spatial filters (e.g. CSP, xDAWN) to established classifiers (e.g. SVM, LDA). pySPACE incorporates the concept of nodes and node chains from the MDP framework. Due to its modular architecture, the software can easily be extended with new processing nodes and more general operations. Large-scale empirical investigations can be configured using simple text configuration files in the YAML format, executed on different (distributed) computing modalities, and evaluated using an interactive graphical user interface.

Changes:

Improved testing, improved documentation, Windows compatibility, more algorithms.

