Projects tagged with classification.
Showing items 1-20 of 71 (page 1 of 4).

KeBABS 1.2.0

by UBod - April 17, 2015, 21:15:37 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2813 views, 468 downloads, 3 subscriptions

About: Kernel-Based Analysis of Biological Sequences

Changes:
  • inclusion of dense LIBSVM 3.20 for dense kernel matrix support, providing a reliable way to train with kernel matrices
  • new accessors folds and performance for CrossValidationResult
  • removed fold performance from show of CV result
  • adaptations for user-defined sequence kernels, with the new export isUserDefined and an example in inst/examples/UserDefinedKernel
  • correction of errors with position offset for position-specific kernels
  • computation of AUC via the trapezoidal rule (see the sketch after this list)
  • changes for auto mode in CV, grid search, model selection
  • check for non-negative mixing coefficients in spectrum and gappy pair kernel
  • build warnings on Windows removed
  • added definition of performance parameters for binary and multiclass classification to vignette
  • update of citation file and reference section in help pages
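
For readers unfamiliar with the trapezoidal-rule item above: the AUC is obtained by numerically integrating the ROC curve. KeBABS is an R package, so the Python snippet below is only a language-agnostic sketch of the idea; the function name auc_trapezoidal is illustrative and not part of the package.

```python
def auc_trapezoidal(fpr, tpr):
    """Approximate the area under a ROC curve with the trapezoidal rule.

    fpr, tpr: equal-length sequences of ROC points, sorted by increasing
    false positive rate (fpr[0] = 0.0 and fpr[-1] = 1.0 for a full curve).
    """
    area = 0.0
    for i in range(1, len(fpr)):
        width = fpr[i] - fpr[i - 1]           # step along the x-axis
        height = (tpr[i] + tpr[i - 1]) / 2.0  # average of the two y-values
        area += width * height
    return area

# Example: a three-point ROC curve (0,0) -> (0.5, 0.8) -> (1,1)
print(auc_trapezoidal([0.0, 0.5, 1.0], [0.0, 0.8, 1.0]))  # 0.65
```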

Cognitive Foundry 3.4.0

by Baz - April 3, 2015, 08:28:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 18491 views, 2993 downloads, 2 subscriptions

About: The Cognitive Foundry is a modular Java software library of machine learning components and algorithms designed for research and applications.

Changes:
  • General:
    • Now requires Java 1.7 or higher.
    • Improved compatibility with Java 1.8 functions by removing ClonableSerializable requirement from many function-style interfaces.
  • Common Core:
    • Improved iteration speed over sparse MTJ vectors.
    • Added utility methods for more stable log(1+x), exp(x)-1, log(1 - exp(x)), and log(1 + exp(x)) to LogMath (see the stability sketch after this changelog).
    • Added a method for creating partial permutations to Permutation.
    • Added methods for computing standard deviation to UnivariateStatisticsUtil.
    • Added increment, decrement, and list view methods to Vector and Matrix.
    • Added shorter get and set aliases for the Vector and Matrix getElement and setElement methods.
    • Added aliases of dot for dotProduct in VectorSpace.
    • Added utility methods for divideByNorm2 to VectorUtil.
  • Learning:
    • Added a learner for a Factorization Machine using SGD.
    • Added an iterative reporter for validation set performance.
    • Added new methods to statistical distribution classes to allow for faster sampling without boxing, in batches, or without creating extra memory.
    • Made generics for performance evaluators more permissive.
    • ParameterGradientEvaluator changed to not require input, output, and gradient types to be the same. This allows more sane gradient definitions for scalar functions.
    • Added parameter to enforce a minimum size in a leaf node for decision tree learning. It is configured through the splitting function.
    • Added ability to filter which dimensions to use in the random subspace and variance tree node splitter.
    • Added ReLU, leaky ReLU, and soft plus activation functions for neural networks.
    • Added IntegerDistribution interface for distributions over natural numbers.
    • Added a method to get the mean of a numeric distribution without boxing.
    • Fixed an issue in DefaultDataDistribution that caused the total to be off when a value was set to less than or equal to 0.
    • Added property for rate to GammaDistribution.
    • Added method to get standard deviation from a UnivariateGaussian.
    • Added clone operations for decision tree classes.
    • Fixed an issue in the TukeyKramerConfidence interval computation.
    • Fixed serialization issue with SMO output.
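
Regarding the LogMath item under Common Core: a naive log(1 + exp(x)) overflows for large x and loses all precision for very negative x, which is why a dedicated helper is useful (the same function appears as the soft plus activation under Learning). The Cognitive Foundry is a Java library; the following is only a Python sketch of the standard stabilization trick, with an illustrative function name rather than the Foundry's API.

```python
import math

def log1p_exp(x):
    """Numerically stable log(1 + exp(x)) (the "soft plus" function).

    A naive math.log(1.0 + math.exp(x)) overflows for large x and loses
    precision for very negative x; splitting on the sign avoids both.
    """
    if x > 0:
        # log(1 + exp(x)) = x + log(1 + exp(-x)), and exp(-x) cannot overflow
        return x + math.log1p(math.exp(-x))
    # for x <= 0, exp(x) <= 1, so log1p keeps full precision
    return math.log1p(math.exp(x))

print(log1p_exp(1000.0))  # ~1000.0, where the naive formula overflows
print(log1p_exp(-40.0))   # ~4.25e-18, where the naive formula returns 0.0
```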

java machine learning platform 1.0

by openpr_nlpr - April 2, 2015, 09:02:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 323 views, 42 downloads, 2 subscriptions

About: Jmlp is a Java platform for both machine learning experiments and applications. It has been tested on Windows, but should also run on Linux since Java is cross-platform. It contains classical classification algorithms (Discrete AdaBoost.MH, Real AdaBoost.MH, SVM, KNN, MCE, MLP, NB) and feature reduction methods (KPCA, PCA, whitening), among others.

Changes:

Initial Announcement on mloss.org.


Hivemall 0.3

by myui - March 13, 2015, 17:08:22 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 5124 views, 812 downloads, 3 subscriptions

About: Hivemall is a scalable machine learning library running on Hive/Hadoop.

Changes:
  • Supported Matrix Factorization
  • Added support for TF-IDF computation (see the sketch after this list)
  • Supported AdaGrad/AdaDelta
  • Supported AdaGradRDA classification
  • Added normalization scheme
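
For context on the TF-IDF item above: Hivemall exposes the computation through Hive queries and UDFs, so the snippet below is only a plain-Python sketch of the underlying weighting scheme, using one common variant of the formula (relative term frequency times log inverse document frequency); the function name and data layout are illustrative, not Hivemall's interface.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Return per-document TF-IDF weights for a list of tokenized documents.

    Uses relative term frequency and idf(t) = log(N / df(t)), one common
    variant of the weighting scheme.
    """
    n_docs = len(docs)
    df = Counter()                 # in how many documents each term occurs
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        weights.append({
            term: (count / total) * math.log(n_docs / df[term])
            for term, count in tf.items()
        })
    return weights

docs = [["hive", "hadoop", "ml"], ["hive", "sql"], ["ml", "sgd", "ml"]]
print(tf_idf(docs)[0])  # 'hadoop' scores highest: it appears in only one document
```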

JMLR dlib ml 18.14

by davis685 - March 1, 2015, 23:51:06 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 95744 views, 16598 downloads, 3 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real world problems.

Changes:

This release adds an implementation of spectral clustering as well as a few bug fixes and usability improvements.


JMLR Mulan 1.5.0

by lefman - February 23, 2015, 21:19:05 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 16542 views, 6529 downloads, 2 subscriptions

About: Mulan is an open-source Java library for learning from multi-label datasets. Multi-label datasets consist of training examples of a target function that has multiple binary target variables. This means that each item of a multi-label dataset can be a member of multiple categories or annotated by many labels (classes). This is actually the nature of many real world problems such as semantic annotation of images and video, web page categorization, direct marketing, functional genomics and music categorization into genres and emotions.
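
To make the multi-label setting concrete, here is a schematic illustration of a tiny multi-label dataset in which each example carries a binary indicator per label rather than a single class. It is plain Python data for exposition only, not Mulan's ARFF/XML input format or Java API.

```python
# A toy multi-label dataset: three documents, three labels.
# Each row of Y marks which labels apply (several can be 1 at once),
# which is what distinguishes multi-label from multi-class data.
X = [
    [0.9, 0.1, 0.0],   # features of example 1
    [0.2, 0.8, 0.5],   # features of example 2
    [0.4, 0.4, 0.9],   # features of example 3
]
labels = ["sports", "politics", "music"]
Y = [
    [1, 0, 1],   # example 1 is about sports AND music
    [0, 1, 0],   # example 2 is about politics only
    [1, 1, 0],   # example 3 is about sports AND politics
]
# Binary relevance, the simplest reduction, trains one binary classifier per
# column of Y; Mulan provides this and many more advanced transformations.
```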

Changes:

Learners

  • MLCSSP.java: Added the MLCSSP algorithm (from ICML 2013)
  • Enhancements of multi-target regression capabilities
  • Improved CLUS support
  • Added pairwise classifier and pairwise transformation

Measures/Evaluation

  • Providing training data to the Evaluator is unnecessary for certain measures.
  • Examples with missing ground truth are not skipped for measures that handle missing values.
  • Added logistic and squared error losses and measures

Bug fixes

  • IndexOutOfBounds in calculation of MiAP and GMiAP
  • Bug fix in Rcut.java
  • When in rank/score mode, the meta-data contained additional unnecessary attributes. (Newton Spolaor)

API changes

  • Upgrade to Java 7
  • Upgrade to Weka 3.7.10

Miscellaneous

  • Small changes and improvements in the wrapper classes for the CLUS library
  • ENTCS13FeatureSelection.java (new experiment)
  • Enumeration is now used for specifying the type of meta-data. (Newton Spolaor)

NaN toolbox 2.7.1

by schloegl - February 3, 2015, 19:03:36 CET [ Project Homepage BibTeX Download ] 33099 views, 6743 downloads, 2 subscriptions

About: NaN-toolbox is a statistics and machine learning toolbox for handling data with and without missing values.

Changes:

Changes in v2.7.1:

  • API compatibility of mahal and zscore
  • improved support for Cygwin and Mac OS X/Homebrew
  • a number of minor improvements

For details see the CHANGELOG at http://pub.ist.ac.at/~schloegl/matlab/NaN/CHANGELOG


Hub Miner 1.1

by nenadtomasev - January 22, 2015, 16:33:51 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1385 views, 226 downloads, 2 subscriptions

About: Hubness-aware Machine Learning for High-dimensional Data

Changes:
  • BibTeX support for all algorithm implementations, making all of them easy to reference (via the algref package).

  • Two more hubness-aware approaches (meta-metric-learning and feature construction)

  • An implementation of Hit-Miss networks for analysis.

  • Several minor bug fixes.

  • The following instance selection methods were added: HMScore, Carving, Iterative Case Filtering, ENRBF.

  • The following clustering quality indexes were added: Fowlkes-Mallows, Calinski-Harabasz, PBM, G+, Tau, Point-Biserial, Hubert's statistic, McClain-Rao, C-root-k.

  • Some more experimental scripts have been included.

  • Extensions in the estimation of hubness risk.

  • Alias and weighted reservoir methods for weight-proportional random selection.


pyGPs 1.3.2

by mn - January 17, 2015, 13:08:43 CET [ Project Homepage BibTeX Download ] 4077 views, 961 downloads, 4 subscriptions

About: pyGPs is a Python package for Gaussian process (GP) regression and classification for machine learning.

Changes:

Changelog pyGPs v1.3.2

December 15th 2014

  • pyGPs added to pip
  • mathematical definitions of kernel functions available in documentation
  • more error messages added

WEKA 3.7.12

by mhall - December 17, 2014, 03:04:17 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 44582 views, 6619 downloads, 3 subscriptions

Rating: 4/5 (based on 6 votes)

About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...]

Changes:

In core weka:

  • GUIChooser now has a plugin extension point that allows implementations of GUIChooser.GUIChooserMenuPlugin to appear as entries in either the Tools or Visualization menus
  • SubsetByExpression filter now has support for regexp matching
  • weka.classifiers.IterativeClassifierOptimizer - a classifier that can efficiently optimize the number of iterations for a base classifier that implements IterativeClassifier
  • Speedup for LogitBoost in the two class case
  • weka.filters.supervised.instance.ClassBalancer - a simple filter to balance the weight of classes
  • New class hierarchy for stopwords algorithms. Includes new methods to read custom stopwords from a file and apply multiple stopwords algorithms
  • Ability to turn off capabilities checking in Weka algorithms. Improves runtime for ensemble methods that create a lot of simple base classifiers
  • Memory savings in weka.core.Attribute
  • Improvements in runtime for SimpleKMeans and EM
  • weka.estimators.UnivariateMixtureEstimator - new mixture estimator

In packages:

  • New discriminantAnalysis package. Provides an implementation of Fisher's linear discriminant analysis
  • Quartile estimators, correlation matrix heat map and k-means++ clustering in distributed Weka (see the seeding sketch after this list)
  • Support for default settings for GridSearch via a properties file
  • Improvements in scripting with the addition of the official Groovy console (kfGroovy package) from the Groovy project and TigerJython (new tigerjython package) as the Jython console via the GUIChooser
  • Support for the latest version of MLR in the RPlugin package
  • EAR4 package contributed by Vahid Jalali
  • StudentFilters package contributed by Chris Gearhart
  • graphgram package contributed by Johannes Schneider
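
The k-means++ item in the distributed Weka entry refers to the seeding strategy of Arthur and Vassilvitskii, in which each new centre is sampled with probability proportional to the squared distance to the nearest centre chosen so far. Weka's implementation is in Java; the snippet below is only a Python sketch of that seeding step, with illustrative names rather than Weka's API.

```python
import random

def kmeans_pp_seeds(points, k, rng=random.Random(0)):
    """Pick k initial centres with k-means++ seeding.

    Each subsequent centre is drawn with probability proportional to the
    squared distance to the closest centre already chosen.
    """
    centres = [rng.choice(points)]
    while len(centres) < k:
        # squared distance from each point to its nearest current centre
        d2 = [min(sum((p - c) ** 2 for p, c in zip(x, centre))
                  for centre in centres) for x in points]
        total = sum(d2)
        r = rng.uniform(0, total)
        acc = 0.0
        for x, w in zip(points, d2):
            acc += w
            if acc >= r:
                centres.append(x)
                break
    return centres

pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0), (10.0, 0.0)]
print(kmeans_pp_seeds(pts, 3))  # tends to spread centres across the three clumps
```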

JMLR GPML Gaussian Processes for Machine Learning Toolbox 3.5

by hn - December 8, 2014, 13:54:38 CET [ Project Homepage BibTeX Download ] 22895 views, 5305 downloads, 3 subscriptions

Rating: 5/5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave 3.2.x and Matlab 7.x implementation of inference and prediction in Gaussian Process (GP) models.

Changes:
  • mechanism for specifying hyperparameter priors (together with Roman Garnett and José Vallet)
  • new inference method inf/infGrid allowing efficient inference for data defined on a Cartesian grid (together with Andrew Wilson)
  • new mean/cov functions for preference learning: meanPref/covPref
  • new mean/cov functions for non-vectorial data: meanDiscrete/covDiscrete
  • new piecewise constant nearest neighbor mean function: meanNN
  • new mean functions being predictions from GPs: meanGP and meanGPexact
  • new covariance function for standard additive noise: covEye
  • new covariance function for factor analysis: covSEfact
  • new covariance function with varying length scale: covSEvlen
  • made covScale more general, allowing scaling with a function instead of a scalar
  • bugfix in covGabor* and covSM (due to Andrew Gordon Wilson)
  • bugfix in lik/likBeta.m (suggested by Dali Wei)
  • bugfix in solve_chol.c (due to Todd Small)
  • bugfix in FITC inference mode (due to Joris Mooij) where the wrong mode for post.L was chosen when using infFITC and post.L being a diagonal matrix
  • bugfix in infVB marginal likelihood for likLogistic with nonzero mean function (reported by James Lloyd)
  • removed the combination likErf/infVB as it yields a bad posterior approximation and lacks theoretical justification
  • Matlab and Octave compilation for L-BFGS-B v2.4 and the more recent L-BFGS-B v3.0 (contributed by José Vallet)
  • smaller bugfixes in gp.m (due to Joris Mooij and Ernst Kloppenburg)
  • bugfix in lik/likBeta.m (due to Dali Wei)
  • updated use of logphi in lik/likErf
  • bugfix in util/solve_chol.c where a typing issue occurred on OS X (due to Todd Small)
  • bugfix due to Bjørn Sand Jensen noticing that cov_deriv_sq_dist.m was missing in the distribution
  • bugfix in infFITC_EP for ttau->inf (suggested by Ryan Turner)

pySPACE 1.2

by krell84 - October 29, 2014, 15:36:28 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2887 views, 595 downloads, 1 subscription

About: pySPACE is the abbreviation for "Signal Processing and Classification Environment in Python using YAML and supporting parallelization". It is modular software for processing large data streams that has been specifically designed to enable distributed execution and empirical evaluation of signal processing chains. Various signal processing algorithms (so-called nodes) are available within the software, ranging from finite impulse response filters through data-dependent spatial filters (e.g. CSP, xDAWN) to established classifiers (e.g. SVM, LDA). pySPACE incorporates the concept of nodes and node chains from the MDP framework. Due to its modular architecture, the software can easily be extended with new processing nodes and more general operations. Large-scale empirical investigations can be configured using simple text configuration files in the YAML format, executed on different (distributed) computing modalities, and evaluated using an interactive graphical user interface.

Changes:

improved testing, improved documentation, windows compatibility, more algorithms


AugmentedSVM 1.0.0

by ashukla - October 2, 2014, 11:24:14 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1104 views, 218 downloads, 2 subscriptions

About: A MATLAB toolkit for performing generalized regression with equality/inequality constraints on the function value/gradient.

Changes:

Initial Announcement on mloss.org.


Boosted Decision Trees and Lists 1.0.4

by melamed - July 25, 2014, 23:08:32 CET [ BibTeX Download ] 3699 views, 1116 downloads, 3 subscriptions

About: Boosting algorithms for classification and regression, with many variations. Features include: Scalable and robust; Easily customizable loss functions; One-shot training for an entire regularization path; Continuous checkpointing; much more

Changes:
  • added ElasticNets as a regularization option
  • fixed some segfaults, memory leaks, and out-of-range errors that had crept into some corner cases
  • added a couple of I/O optimizations

JMLR GPstuff 4.5

by avehtari - July 22, 2014, 14:03:11 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 18795 views, 4568 downloads, 2 subscriptions

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2014-07-22 Version 4.5

New features

  • Input dependent noise and signal variance.

    • Tolvanen, V., Jylänki, P. and Vehtari, A. (2014). Expectation Propagation for Nonstationary Heteroscedastic Gaussian Process Regression. In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing, accepted for publication. Preprint http://arxiv.org/abs/1404.5443
  • Sparse stochastic variational inference model.

    • Hensman, J., Fusi, N. and Lawrence, N. D. (2013). Gaussian processes for big data. arXiv preprint http://arxiv.org/abs/1309.6835.
  • Option 'autoscale' in gp_rnd.m to get split-normal approximated samples from the posterior predictive distribution of the latent variable.

    • Geweke, J. (1989). Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57(6):1317-1339.

    • Villani, M. and Larsson, R. (2006). The Multivariate Split Normal Distribution and Asymmetric Principal Components Analysis. Communications in Statistics - Theory and Methods, 35(6):1123-1140.

Improvements

  • New unit test environment using the Matlab built-in test framework (the old Xunit package is still also supported).
  • Precomputed demo results (including the figures) are now available in the folder tests/realValues.
  • New demos demonstrating new features etc.
    • demo_epinf, demonstrating the input dependent noise and signal variance model
    • demo_svi_regression, demo_svi_classification
    • demo_modelcomparison2, demo_survival_comparison

Several minor bugfixes


JMLR Waffles 2014-07-05

by mgashler - July 20, 2014, 04:53:54 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 27282 views, 7851 downloads, 2 subscriptions

About: Script-friendly command-line tools for machine learning and data mining tasks. (The command-line tools wrap functionality from a public domain C++ class library.)

Changes:

Added support for CUDA GPU-parallelized neural network layers, and several other new features. Full list of changes at http://waffles.sourceforge.net/docs/changelog.html


RankSVM NC 1.0

by rflamary - July 10, 2014, 15:51:21 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 1499 views, 362 downloads, 1 subscription

About: This package is an implementation of a linear RankSVM solver with non-convex regularization.

Changes:

Initial Announcement on mloss.org.


JMLR MOA Massive Online Analysis Nov-13

by abifet - April 4, 2014, 03:50:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 12880 views, 5102 downloads, 1 subscription

About: Massive Online Analysis (MOA) is a real time analytic tool for data streams. It is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collection of offline and online methods as well as tools for evaluation. In particular, it implements boosting, bagging, and Hoeffding Trees, all with and without Naive Bayes classifiers at the leaves. MOA supports bi-directional interaction with WEKA, the Waikato Environment for Knowledge Analysis, and it is released under the GNU GPL license.

Changes:

New version November 2013


JMLR EnsembleSVM 2.0

by claesenm - March 31, 2014, 08:06:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 6938 views, 2500 downloads, 2 subscriptions

About: The EnsembleSVM library offers functionality to perform ensemble learning using Support Vector Machine (SVM) base models. In particular, we offer routines for binary ensemble models using SVM base classifiers. Experimental results have shown the predictive performance to be comparable with standard SVM models but with drastically reduced training time. Ensemble learning with SVM models is particularly useful for semi-supervised tasks.

Changes:

The library has been updated and features a variety of new functionality as well as more efficient implementations of original features. The following key improvements have been made:

  1. Support for multithreading in training and prediction with ensemble models. Since both of these are embarrassingly parallel, this yields a significant speedup (3-fold on a quad-core machine).
  2. An extensive programming framework for aggregating base model predictions, which allows highly efficient prototyping of new aggregation approaches. Additionally, we provide several predefined strategies, including (weighted) majority voting (see the voting sketch after this list), logistic regression and nonlinear SVMs of your choice -- be sure to check out the esvm-edit tool! The provided framework also allows you to efficiently program your own, novel aggregation schemes.
  3. Full code transition to C++11, the latest C++ standard, which enabled various performance improvements. The new release requires moderately recent compilers, such as gcc 4.7.2+ or clang 3.2+.
  4. Generic implementations of convenient facilities have been added, such as thread pools, deserialization factories and more.
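
Point 2 above mentions (weighted) majority voting among the predefined aggregation strategies. EnsembleSVM is a C++ library, so the snippet below is only a small Python sketch of what weighted majority voting over binary base-model outputs amounts to; the function name is illustrative and not part of the library's API.

```python
def weighted_majority_vote(predictions, weights):
    """Aggregate binary base-model outputs in {-1, +1} by weighted vote.

    predictions: list of labels, one per base model, each -1 or +1
    weights:     non-negative weight per base model
    Returns +1 if the weighted sum is non-negative, else -1.
    """
    score = sum(w * y for w, y in zip(weights, predictions))
    return 1 if score >= 0 else -1

# Three base SVMs: two weak ones vote +1, one strong one votes -1.
print(weighted_majority_vote([+1, +1, -1], [0.5, 0.4, 2.0]))  # -1
```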

The API and ABI have undergone significant changes, many of which are due to the transition to C++11.


Malheur 0.5.4

by konrad - December 25, 2013, 13:20:31 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 14078 views, 2714 downloads, 1 subscription

About: Automatic Analysis of Malware Behavior using Machine Learning

Changes:

Support for new version of libarchive. Minor bug fixes.

