Projects tagged with classification.
Showing items 1-20 of 79 (page 1 of 4).

MLweb 0.1.4

by lauerfab - June 28, 2016, 16:00:52 CET [ Project Homepage BibTeX Download ] 4750 views, 1095 downloads, 3 subscriptions

About: MLweb is an open source project that aims to bring machine learning capabilities to web pages and web applications while keeping all computations on the client side. It includes (i) a JavaScript library for scientific computing within web pages, (ii) a JavaScript library implementing machine learning algorithms for classification, regression, clustering and dimensionality reduction, and (iii) a web application providing a MATLAB-like development environment.

Changes:
  • Add Logistic Regression
  • Add support for sparse input in fast training of linear SVM
  • Better support for sparse vectors/matrices
  • Fix plot windows in IE
  • Minor bug fixes

About: Deep architectures are currently very popular in machine learning. Deep Belief Networks (DBNs) are deep architectures that stack Restricted Boltzmann Machines (RBMs) to build a powerful generative model from training data. DBNs support tasks such as feature extraction and classification and are used in many applications, including image processing, speech processing and text categorization. This paper introduces a new object-oriented toolbox with the most important capabilities needed to implement DBNs. In experiments on the MNIST (image), ISOLET (speech) and 20 Newsgroups (text) datasets, the toolbox automatically learned a good representation of the input from unlabeled data, with better discrimination between classes, and on all of these datasets the resulting classification errors are comparable to those of state-of-the-art classifiers. In addition, the toolbox supports different sampling methods (e.g. Gibbs, CD, PCD and our new FEPCD method), different sparsity methods (quadratic, rate distortion and our new normal method), different RBM types (generative and discriminative), GPU-based computation, and more. The toolbox is user-friendly open source software for MATLAB and Octave and is freely available on the project website.
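
For orientation, the building block that such a toolbox stacks is the RBM, typically trained with contrastive divergence (CD). The following is a minimal NumPy sketch of a single CD-1 update for a binary RBM; it is a generic illustration only, not the toolbox's MATLAB/Octave API, and all names and settings are made up for the example.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_update(v0, W, b_vis, b_hid, lr=0.05, rng=np.random.default_rng(0)):
        """One CD-1 step for a binary RBM on a batch of visible vectors v0."""
        # Positive phase: hidden activations and samples given the data.
        p_h0 = sigmoid(v0 @ W + b_hid)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        # Negative phase: one Gibbs step down to the visible layer and back up.
        p_v1 = sigmoid(h0 @ W.T + b_vis)
        p_h1 = sigmoid(p_v1 @ W + b_hid)
        # Parameter update: difference between data and model statistics.
        batch = v0.shape[0]
        W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / batch
        b_vis += lr * (v0 - p_v1).mean(axis=0)
        b_hid += lr * (p_h0 - p_h1).mean(axis=0)
        return W, b_vis, b_hid

A DBN is then built greedily: train one RBM on the data, feed its hidden activation probabilities as input to the next RBM, and repeat layer by layer before any supervised fine-tuning.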

Changes:

New in the toolbox

  • Using GPU in Backpropagation
  • Revision of some demo scripts
  • Function approximation with multiple outputs
  • Feature extraction with GRBM in first layer



JMLR dlib ml 19.0

by davis685 - June 25, 2016, 23:04:08 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 145527 views, 23589 downloads, 4 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real world problems.

Changes:

This release adds a deep learning toolkit to dlib that has a clean and fully documented C++11 API. It also includes CPU and GPU support, binds to cuDNN, can train on multiple GPUs at a time, and comes with a pretrained ImageNet model based on ResNet34.

The release also adds a number of other improvements such as new elastic net regularized solvers and QP solvers, improved MATLAB binding tools, and other usability tweaks and optimizations.


JMLR GPstuff 4.7

by avehtari - June 9, 2016, 17:45:15 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 33985 views, 8149 downloads, 3 subscriptions

Rating: 5/5 stars (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.
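
For context, the core computation such a toolbox automates is Gaussian process posterior prediction. The sketch below is a generic NumPy implementation of exact GP regression with a squared-exponential kernel; it is illustrative only, not GPstuff's MATLAB/Octave interface, and the kernel and noise parameters are arbitrary.

    import numpy as np

    def sqexp_kernel(A, B, lengthscale=1.0, variance=1.0):
        """Squared-exponential covariance between the rows of A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

    def gp_posterior(X, y, Xstar, noise=0.1):
        """Posterior mean and variance of a GP regressor at test inputs Xstar."""
        K = sqexp_kernel(X, X) + noise ** 2 * np.eye(len(X))
        Ks = sqexp_kernel(X, Xstar)
        Kss = sqexp_kernel(Xstar, Xstar)
        L = np.linalg.cholesky(K)                      # stable solve via Cholesky
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        mean = Ks.T @ alpha
        v = np.linalg.solve(L, Ks)
        var = np.diag(Kss - v.T @ v)
        return mean, var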

Changes:

2016-06-09 Version 4.7

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Simple Bayesian Optimization demo

Improvements

  • Improved use of PSIS
  • More options added to gp_monotonic
  • Monotonicity now works for additive covariance functions with selected variables
  • Possibility to use the gpcf_squared.m covariance function with derivative observations/monotonicity
  • Default behaviour made more robust by changing default jitter from 1e-9 to 1e-6
  • LA-LOO uses the cavity method as the default (see Vehtari et al. (2016), Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models, JMLR, accepted for publication)
  • The selected-variables option now works better with monotonicity

Bugfixes

  • small error in derivative observation computation fixed
  • several minor bug fixes

AutoWEKA 2.0

by larsko - May 19, 2016, 19:58:41 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1027 views, 228 downloads, 3 subscriptions

About: Automatically finds the best model with its best parameter settings for a given classification or regression task.

Changes:

Initial Announcement on mloss.org.


WEKA 3.9.0

by mhall - April 15, 2016, 06:35:30 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 59633 views, 8863 downloads, 5 subscriptions

Rating: 4/5 stars (based on 6 votes)

About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...]

Changes:

In core weka:

  • JAMA-based linear algebra routines replaced with MTJ. Faster operation with the option to use native libraries for even more speed
  • General efficiency improvements in core, filters and some classifiers
  • GaussianProcesses now handles instance weights
  • New Knowledge Flow implementation. Engine completely rewritten from scratch with a simplified API
  • New Workbench GUI
  • GUI package manager now has a search facility
  • FixedDictionaryStringToWordVector filter allows the use of an external dictionary for vectorization. DictionarySaver converter can be used to create a dictionary file
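
The idea behind the new FixedDictionaryStringToWordVector filter described in the last item is vectorization against a fixed, externally supplied dictionary rather than one built from the training data, so the vector layout never changes between training and deployment. The snippet below is a tiny generic Python sketch of that idea, not Weka's Java API; the example dictionary is made up.

    def vectorize(document, dictionary):
        """Bag-of-words counts over a fixed word -> index dictionary.
        Words missing from the dictionary are ignored, so the vector
        length is always len(dictionary)."""
        vector = [0] * len(dictionary)
        for word in document.lower().split():
            idx = dictionary.get(word)
            if idx is not None:
                vector[idx] += 1
        return vector

    dictionary = {"machine": 0, "learning": 1, "weka": 2}   # e.g. loaded from a dictionary file
    print(vectorize("Weka is a machine learning workbench", dictionary))  # [1, 1, 1]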

In packages:

  • Packages that were using JAMA are now using MTJ
  • New netlibNativeOSX, netlibNativeWindows and netlibNativeLinux packages providing native reference implementations (and system-optimized implementation in the case of OSX) of BLAS, LAPACK and ARPACK linear algebra
  • New elasticNet package, courtesy of Nikhil Kinshore
  • New niftiLoader package for loading a directory with MRI data in NIfTI format into Weka
  • New percentageErrorMetrics package - provides plugin evaluation metrics for root mean square percentage error and mean absolute percentage error
  • New iterativeAbsoluteErrorRegression package - provides a meta learner that fits a regression model to minimize absolute error
  • New largeScaleKernelLearning package - contains filters for large-scale kernel-based learning
  • discriminantAnalysis package now contains an implementation for LDA and QDA
  • New Knowledge Flow component implementations in various packages
  • newKnowledgeFlowStepExamples package - contains code examples for new Knowledge Flow API discussion in the Weka Manual
  • RPlugin updated to latest version of MLR
  • scatterPlot3D and associationRulesVisualizer packages updated with latest Java 3D libraries
  • Support for pluggable activation functions in the multiLayerPerceptrons package

Java Statistical Analysis Tool 0.0.4

by EdwardRaff - March 5, 2016, 06:28:14 CET [ Project Homepage BibTeX Download ] 935 views, 255 downloads, 2 subscriptions

About: General purpose Java Machine Learning library for classification, regression, and clustering.

Changes:

Initial Announcement on mloss.org.


NaN toolbox 3.0.1

by schloegl - March 3, 2016, 20:35:02 CET [ Project Homepage BibTeX Download ] 47125 views, 9611 downloads, 3 subscriptions

About: NaN-toolbox is a statistics and machine learning toolbox for handling data with and without missing values.

Changes:

Changes in v.3.0.1:
  • Fix packaging for Octave

Changes in v.2.8.5:
  • Bug fix: trimmean
  • Compiler support for gcc-5 and clang
  • Fix typos

For details see the CHANGELOG at http://pub.ist.ac.at/~schloegl/matlab/NaN/CHANGELOG


JMLR Jstacs 2.2

by keili - February 17, 2016, 11:57:56 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 22758 views, 5311 downloads, 3 subscriptions

About: A Java framework for statistical analysis and classification of biological sequences

Changes:

New classes and packages:

  • CorrelationCoefficient: PerformanceMeasure
  • de.jstacs.clustering: package with classes for hierarchical clustering
  • DeBruijnGraphSequenceGenerator and DeBruijnSequenceGenerator for generating De Bruijn sequences (see the sketch after this list)
  • CyclicSequenceAdaptor for representing cyclic sequences
  • PlotGeneratorResult for representing results that plot images to a Graphics2D object
  • TextResult for results that may be stored as text files
  • package de.jstacs.results.savers for generic classes that store results to disk
  • LimitedSparseLocalInhomogeneousMixtureDiffSM_higherOrder for sparse local inhomogeneous mixture (Slim) models
  • PFMWrapperTrainSM for representing position frequency matrices and position weight matrices from databases
  • package de.jstacs.tools with classes for generic Jstacs tools that may be used in different user interfaces (command line, Galaxy, JavaFX)
  • Compression for ZIP compression of Strings
  • package de.jstacs.utils.graphics with generic GraphicsAdaptor using Apache XML commons
  • projects: Dimont, GeMoMa, Slim, TALEN, motif comparison
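
A De Bruijn sequence of order n over an alphabet of size k contains every length-n word exactly once as a cyclic substring. The classic recursive construction is sketched below in plain Python; it is purely illustrative and unrelated to the Jstacs classes named above.

    def de_bruijn(k, n):
        """Return a De Bruijn sequence over alphabet {0, ..., k-1} with subword length n."""
        a = [0] * (k * n)
        sequence = []

        def db(t, p):
            if t > n:
                if n % p == 0:
                    sequence.extend(a[1:p + 1])
            else:
                a[t] = a[t - p]
                db(t + 1, p)
                for j in range(a[t - p] + 1, k):
                    a[t] = j
                    db(t + 1, t)

        db(1, 1)
        return sequence

    print(de_bruijn(2, 3))  # [0, 0, 0, 1, 0, 1, 1, 1]: all 8 binary triples appear cyclically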

New features and improvements:

  • Major restructuring of Alignment for better efficiency
  • Alignment Costs and StringAlignment now Storable
  • New constructor of DataSet allowing a specified percentage of sequences to mismatch the given alphabet
  • BioJavaAdapter ported to BioJava 1.9
  • XMLParser now also allows for storing Sequences
  • New method for parsing HMMer profile HMMs in HMMFactory
  • Several minor improvements and bugfixes in many classes
  • Improvements of documentation of several classes

KeLP 2.0.2

by kelpadmin - February 17, 2016, 09:03:46 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 8420 views, 2155 downloads, 3 subscriptions

About: Kernel-based Learning Platform (KeLP) is a Java framework that supports the implementation of kernel-based learning algorithms, as well as an agile definition of kernel functions over generic data representations, e.g. vectorial data or discrete structures. The framework has been designed to decouple kernel functions and learning algorithms through the definition of specific interfaces. Once a new kernel function has been implemented, it can be automatically adopted in all the available kernel-machine algorithms. KeLP includes different online and batch learning algorithms for classification, regression and clustering, as well as several kernel functions, ranging from vector-based to structural kernels. It allows building complex kernel-machine-based systems, leveraging JSON/XML interfaces to instantiate prediction models without writing a single line of code.

Changes:

In addition to minor bug fixes, this release includes:

  • the Nystrom method for linearizing instances, enabling large-scale kernel learning (a generic sketch follows at the end of this entry)

  • New examples for the usage of the Smoothed Partial Tree Kernel and the Compositionally Smoothed Partial Tree Kernel.

Check out this new version from our repositories. API Javadoc is already available. Your suggestions will be very valuable to us, so download and try KeLP 2.0.2!
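
For reference, the Nystrom method mentioned in the first change approximates the full kernel matrix from a small set of m landmark instances, so that every example can be mapped to an explicit m-dimensional vector and handed to a fast linear learner. The NumPy sketch below is a generic illustration of that mapping, not KeLP's Java API; the RBF kernel and the choice of landmarks are assumptions made for the example.

    import numpy as np

    def rbf(A, B, gamma=0.5):
        """RBF kernel between the rows of A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def nystrom_features(X, landmarks, gamma=0.5):
        """Explicit features whose dot products approximate the kernel on X."""
        W = rbf(landmarks, landmarks, gamma)     # m x m kernel among the landmarks
        C = rbf(X, landmarks, gamma)             # n x m kernel between data and landmarks
        U, s, _ = np.linalg.svd(W)               # W is symmetric PSD, so this is its eigendecomposition
        W_inv_sqrt = U @ np.diag(1.0 / np.sqrt(np.maximum(s, 1e-12))) @ U.T
        return C @ W_inv_sqrt                    # n x m "linearized" instances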


Probabilistic Classification Vector Machine 0.22

by fmschleif - November 10, 2015, 13:16:19 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 5230 views, 1151 downloads, 3 subscriptions

About: The PCVM library is a C++/Armadillo implementation of the Probabilistic Classification Vector Machine.

Changes:

30.10.2015: The code has been revised in several places, also fixing some errors; different multiclass schemes and HDF5 file support have been added. Some speed-ups and memory savings were achieved by better handling of intermediate objects.

27.05.2015: MATLAB binding under Windows available. Added a solution file for VS 2013 Express to compile a MATLAB MEX binding. It is not yet confirmed that the code really uses multiple cores under Windows (under Linux it does).

29.04.2015: Added an implementation of the Nystroem-based PCVM, including Nystroem-based singular value decomposition (SVD), eigenvalue decomposition (EVD) and pseudo-inverse calculation (PINV).

22.04.2015: Implementation of the PCVM released.


Apache Mahout 0.11.1

by gsingers - November 9, 2015, 16:12:06 CET [ Project Homepage BibTeX Download ] 21631 views, 5617 downloads, 3 subscriptions

About: Apache Mahout is an Apache Software Foundation project with the goal of creating both a community of users and a scalable, Java-based framework consisting of many machine learning algorithm [...]

Changes:

Apache Mahout introduces a new math environment we call Samsara, for its theme of universal renewal. It reflects a fundamental rethinking of how scalable machine learning algorithms are built and customized. Mahout-Samsara is here to help people create their own math while providing some off-the-shelf algorithm implementations. At its core are general linear algebra and statistical operations along with the data structures to support them. You can use it as a library or customize it in Scala with Mahout-specific extensions that look something like R. Mahout-Samsara comes with an interactive shell that runs distributed operations on a Spark cluster. This makes prototyping or task submission much easier and allows users to customize algorithms with a whole new degree of freedom. Mahout algorithms include many new implementations built for speed on Mahout-Samsara. They run on Spark 1.3+ and some on H2O, which means as much as a 10x speed increase. You’ll find robust matrix decomposition algorithms as well as a Naive Bayes classifier and collaborative filtering. The new spark-itemsimilarity enables the next generation of cooccurrence recommenders that can use entire user click streams and context in making recommendations.


KeBABS 1.4.1

by UBod - November 3, 2015, 11:33:46 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12871 views, 2350 downloads, 3 subscriptions

About: Kernel-Based Analysis of Biological Sequences

Changes:
  • new method to compute prediction profiles from models trained with mixture kernels
  • correction for position specific kernel with offsets
  • corrections for prediction profile of motif kernel
  • additional hint on help page of kbsvm

Cognitive Foundry 3.4.2

by Baz - October 30, 2015, 06:53:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 27852 views, 4744 downloads, 4 subscriptions

About: The Cognitive Foundry is a modular Java software library of machine learning components and algorithms designed for research and applications.

Changes:
  • General:
    • Upgraded MTJ to 1.0.3.
  • Common:
    • Added package for hash function computation including Eva, FNV-1a, MD5, Murmur2, Prime, SHA1, SHA2
    • Added callback-based forEach implementations to Vector and InfiniteVector, which can be faster for iterating through some vector types.
    • Optimized DenseVector by removing a layer of indirection.
    • Added method to compute set of percentiles in UnivariateStatisticsUtil and fixed issue with percentile interpolation.
    • Added utility class for enumerating combinations.
    • Adjusted ScalarMap implementation hierarchy.
    • Added method for copying a map to VectorFactory and moved createVectorCapacity up from SparseVectorFactory.
    • Added method for creating square identity matrix to MatrixFactory.
    • Added Random implementation that uses a cached set of values.
  • Learning:
    • Implemented feature hashing (see the sketch after this list).
    • Added factory for random forests.
    • Implemented uniform distribution over integer values.
    • Added Chi-squared similarity.
    • Added KL divergence.
    • Added general conditional probability distribution.
    • Added interfaces for Regression, UnivariateRegression, and MultivariateRegression.
    • Fixed null pointer exception that can happen in K-means with an empty cluster.
    • Fixed name of maxClusters property on AgglomerativeClusterer (was called maxMinDistance).
  • Text:
    • Improvements to LDA Gibbs sampler.

SALSA.jl 0.0.5

by jumutc - September 28, 2015, 17:28:56 CET [ Project Homepage BibTeX Download ] 1350 views, 273 downloads, 1 subscription

About: SALSA (Software lab for Advanced machine Learning with Stochastic Algorithms) is an implementation of well-known stochastic algorithms for machine learning, developed in the high-level technical computing language Julia. The SALSA software package is designed to address challenges in sparse linear modelling and in linear and non-linear Support Vector Machines applied to large data samples, with a user-centric and user-friendly emphasis.
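
To give a flavour of the stochastic algorithms involved, the sketch below shows a generic Pegasos-style stochastic sub-gradient solver for a linear SVM in Python. It illustrates the general technique only, not SALSA's Julia interface, and the regularization and epoch settings are arbitrary.

    import numpy as np

    def sgd_linear_svm(X, y, lam=0.01, epochs=10, rng=np.random.default_rng(0)):
        """Train a linear SVM by stochastic sub-gradient descent (Pegasos-style).
        X is an (n, d) data matrix and y holds labels in {-1, +1}."""
        n, d = X.shape
        w = np.zeros(d)
        t = 0
        for _ in range(epochs):
            for i in rng.permutation(n):
                t += 1
                eta = 1.0 / (lam * t)                 # decaying step size
                if y[i] * (w @ X[i]) < 1:             # margin violated: hinge-loss sub-gradient
                    w = (1 - eta * lam) * w + eta * y[i] * X[i]
                else:                                 # only the regularizer contributes
                    w = (1 - eta * lam) * w
        return w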

Changes:

Initial Announcement on mloss.org.


JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.6

by hn - July 6, 2015, 12:31:28 CET [ Project Homepage BibTeX Download ] 34645 views, 7942 downloads, 4 subscriptions

Rating: 5/5 stars (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.

Changes:
  • added a new inference function infGrid_Laplace allowing the use of non-Gaussian likelihoods for large grids

  • fixed a bug due to Octave evaluating norm([]) to a tiny nonzero value by modifying all lik/lik*.m functions, as reported by Philipp Richter

  • small bugfixes in covGrid and infGrid

  • bugfix in predictive variance of likNegBinom due to Seth Flaxman

  • bugfix in infFITC_Laplace as suggested by Wu Lin

  • bugfix in covPP{iso,ard}


Simple Generalized Learning Vector Quantization 1.0

by fmschleif - June 4, 2015, 10:49:49 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2156 views, 532 downloads, 2 subscriptions

About: Simple and hopefully clean and easy to follow implementation of the Generalized Learning Vector Quantizer (GLVQ) with variants for metric adaptation (RGLVQ, GMLVQ, LiRaM).
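
As a reminder of how GLVQ works: each training sample pulls its closest correctly labelled prototype towards it and pushes the closest wrongly labelled prototype away, weighted by the relative distances of the two. The NumPy sketch below shows one such update, with constant factors and the derivative of the GLVQ cost function absorbed into the learning rate; it is a generic illustration, not this package's MATLAB code.

    import numpy as np

    def glvq_step(x, label, prototypes, proto_labels, lr=0.05):
        """One GLVQ update on float array `prototypes` (rows are prototype vectors)."""
        d = ((prototypes - x) ** 2).sum(axis=1)                # squared Euclidean distances
        correct = proto_labels == label
        j = np.where(correct)[0][np.argmin(d[correct])]        # closest prototype with the right label
        k = np.where(~correct)[0][np.argmin(d[~correct])]      # closest prototype with a wrong label
        dj, dk = d[j], d[k]
        denom = (dj + dk) ** 2
        prototypes[j] += lr * (dk / denom) * (x - prototypes[j])   # attract the correct prototype
        prototypes[k] -= lr * (dj / denom) * (x - prototypes[k])   # repel the wrong prototype
        return prototypes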

Changes:

Initial Announcement on mloss.org.


java machine learning platform 1.0

by openpr_nlpr - April 2, 2015, 09:02:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2171 views, 398 downloads, 2 subscriptions

About: Jmlp is a Java platform for both machine learning experiments and applications. It has been tested on Windows, but it should also run on Linux since Java is cross-platform. It contains classical classification algorithms (Discrete AdaBoost.MH, Real AdaBoost.MH, SVM, KNN, MCE, MLP, NB) and feature reduction methods (KPCA, PCA, Whiten), among others.

Changes:

Initial Announcement on mloss.org.


Hivemall 0.3

by myui - March 13, 2015, 17:08:22 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9643 views, 1677 downloads, 3 subscriptions

About: Hivemall is a scalable machine learning library running on Hive/Hadoop.

Changes:
  • Supported Matrix Factorization
  • Added support for TF-IDF computation (see the sketch after this list)
  • Supported AdaGrad/AdaDelta
  • Supported AdaGradRDA classification
  • Added normalization scheme
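
The TF-IDF support mentioned above weights a term by its frequency within a document and the inverse of its document frequency across the corpus. The plain-Python snippet below illustrates only the standard formula; Hivemall itself is used through Hive queries, which are not shown here, and the toy documents are made up.

    import math
    from collections import Counter

    def tf_idf(docs):
        """TF-IDF weights for a list of tokenized documents."""
        n = len(docs)
        df = Counter(term for doc in docs for term in set(doc))   # document frequencies
        weights = []
        for doc in docs:
            tf = Counter(doc)
            weights.append({term: (count / len(doc)) * math.log(n / df[term])
                            for term, count in tf.items()})
        return weights

    docs = [["hive", "machine", "learning"], ["hive", "query"], ["machine", "learning", "scale"]]
    print(tf_idf(docs)[0])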

JMLR Mulan 1.5.0

by lefman - February 23, 2015, 21:19:05 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 24142 views, 8025 downloads, 2 subscriptions

About: Mulan is an open-source Java library for learning from multi-label datasets. Multi-label datasets consist of training examples of a target function that has multiple binary target variables. This means that each item of a multi-label dataset can be a member of multiple categories or annotated by many labels (classes). This is actually the nature of many real world problems such as semantic annotation of images and video, web page categorization, direct marketing, functional genomics and music categorization into genres and emotions.
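
Concretely, a multi-label dataset pairs each example with a binary indicator vector over the label set, and the simplest baseline (binary relevance) trains one independent binary classifier per label. The Python sketch below illustrates that representation with a toy nearest-centroid learner; it is a generic example, not Mulan's Java API, and the data and label names are invented.

    import numpy as np

    # Toy dataset: 3 examples, 2 features, 3 labels; a row of Y may contain several 1s.
    X = np.array([[0.9, 0.1],
                  [0.2, 0.8],
                  [0.5, 0.5]])
    labels = ["sports", "politics", "music"]
    Y = np.array([[1, 0, 1],    # example 0 is tagged both "sports" and "music"
                  [0, 1, 1],
                  [1, 1, 0]])

    def train_binary_relevance(X, Y):
        """Binary relevance: one nearest-centroid model per label column of Y."""
        models = []
        for j in range(Y.shape[1]):
            pos = X[Y[:, j] == 1].mean(axis=0)
            neg = X[Y[:, j] == 0].mean(axis=0)
            models.append((pos, neg))
        return models

    def predict(models, x):
        """Predict the full binary label vector for one example."""
        return np.array([int(np.linalg.norm(x - pos) < np.linalg.norm(x - neg))
                         for pos, neg in models])

    print(predict(train_binary_relevance(X, Y), np.array([0.8, 0.2])))  # -> [1 0 1]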

Changes:

Learners

  • MLCSSP.java: Added the MLCSSP algorithm (from ICML 2013)
  • Enhancements of multi-target regression capabilities
  • Improved CLUS support
  • Added pairwise classifier and pairwise transformation

Measures/Evaluation

  • Providing training data in the Evaluator is unnecessary in the case of specific measures.
  • Examples with missing ground truth are not skipped for measures that handle missing values.
  • Added logistic and squared error losses and measures

Bug fixes

  • IndexOutOfBounds in calculation of MiAP and GMiAP
  • Bug fix in Rcut.java
  • When in rank/score mode the meta-data contained additional unnecessary attributes. (Newton Spolaor)

API changes

  • Upgrade to Java 7
  • Upgrade to Weka 3.7.10

Miscellaneous

  • Small changes and improvements in the wrapper classes for the CLUS library
  • ENTCS13FeatureSelection.java (new experiment)
  • Enumeration is now used for specifying the type of meta-data. (Newton Spolaor)
