All entries.
Showing items 51-60 of 652 (page 6 of 66).

Java Statistical Analysis Tool 0.0.7

by EdwardRaff - January 15, 2017, 22:21:50 CET [ Project Homepage BibTeX Download ] 3574 views, 881 downloads, 2 subscriptions

About: General purpose Java Machine Learning library for classification, regression, and clustering.

Changes:

See the GitHub releases tab for change information.


FEAST 2.0.0

by apocock - January 8, 2017, 00:49:19 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 48206 views, 8399 downloads, 4 subscriptions

Rating: 5/5 (based on 2 votes)

About: FEAST provides implementations of common mutual-information-based filter feature selection algorithms (MIM, MIFS, mRMR, CMIM, ICAP, JMI, DISR, FCBF, etc.), and an implementation of RELIEF. Written for C/C++ and MATLAB.

Changes:

Major refactoring of FEAST to improve speed and portability.

  • FEAST now clones the input data if it's floating point and discretises it to unsigned ints once in a single pass. This improves the speed by about 30%.
  • FEAST now has unsigned int entry points which avoid this discretisation and are much faster if the data is already categorical.
  • Added weighted feature selection algorithms to FEAST which can be used for cost-sensitive feature selection.
  • Added a Java API using JNI.
  • FEAST now returns the internal score for each feature according to the criterion, available in all three APIs (see the sketch after this list).
  • Rearranged the repository to make it easier to work with. Header files are now in `include`, source in `src`, the MATLAB API is in `matlab/` and the Java API is in `java/`.
  • FEAST now compiles cleanly using `-std=c89 -Wall -Werror`.
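
For readers unfamiliar with this family of methods, the following is a conceptual sketch in Java of the simplest criterion FEAST implements (MIM: rank features by their mutual information with the class label) on already-categorical integer data, roughly the situation the new unsigned int entry points target. The class and method names are invented for illustration and are not part of FEAST's C, MATLAB or Java API.

```java
import java.util.*;

public class MimSketch {

    /** Mutual information I(X;Y) in nats between two categorical variables
     *  (assumes non-negative category codes). */
    static double mutualInformation(int[] x, int[] y) {
        int n = x.length;
        Map<Integer, Integer> countX = new HashMap<>();
        Map<Integer, Integer> countY = new HashMap<>();
        Map<Long, Integer> countXY = new HashMap<>();
        for (int i = 0; i < n; i++) {
            countX.merge(x[i], 1, Integer::sum);
            countY.merge(y[i], 1, Integer::sum);
            // pack the (x, y) pair into one long key for the joint counts
            countXY.merge(((long) x[i] << 32) | (y[i] & 0xffffffffL), 1, Integer::sum);
        }
        double mi = 0.0;
        for (Map.Entry<Long, Integer> e : countXY.entrySet()) {
            int xv = (int) (e.getKey() >> 32);
            int yv = e.getKey().intValue();           // lower 32 bits
            double pxy = e.getValue() / (double) n;
            double px = countX.get(xv) / (double) n;
            double py = countY.get(yv) / (double) n;
            mi += pxy * Math.log(pxy / (px * py));
        }
        return mi;
    }

    /** MIM: return the indices of the k features with the highest I(X_j; Y). */
    static int[] selectTopK(int[][] features, int[] labels, int k) {
        final double[] score = new double[features.length];
        Integer[] order = new Integer[features.length];
        for (int j = 0; j < features.length; j++) {
            order[j] = j;
            score[j] = mutualInformation(features[j], labels);
        }
        Arrays.sort(order, (a, b) -> Double.compare(score[b], score[a]));
        int[] selected = new int[Math.min(k, features.length)];
        for (int j = 0; j < selected.length; j++) selected[j] = order[j];
        return selected;
    }

    public static void main(String[] args) {
        // Toy data: feature 0 mirrors the label, feature 1 is independent of it.
        int[][] features = {
            {0, 0, 1, 1, 0, 1, 1, 0},
            {1, 0, 1, 0, 0, 1, 0, 1}
        };
        int[] labels = {0, 0, 1, 1, 0, 1, 1, 0};
        System.out.println(Arrays.toString(selectTopK(features, labels, 1)));  // prints [0]
    }
}
```

The more elaborate criteria listed above (mRMR, JMI, CMIM, ...) add redundancy and complementarity terms to the score, but the overall filter structure -- score each feature, rank, keep the top k -- is the same.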

About: Hierarchical Recurrent Neural Network for Skeleton Based Action Recognition

Changes:

Initial Announcement on mloss.org.


ADAMS 16.12.1

by fracpete - December 22, 2016, 05:24:00 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 30326 views, 5546 downloads, 3 subscriptions

About: The Advanced Data mining And Machine learning System (ADAMS) is a flexible workflow engine aimed at quickly building and maintaining data-driven, reactive workflows, easily integrated into business processes.

Changes:

Some highlights:

  • Over 80 new actors, nearly 30 new conversions
  • Weka Investigator -- the big brother of the Weka Explorer: work more efficiently, with fewer clicks, across multiple datasets in multiple sessions, with multiple predefined outputs per evaluation run
  • Weka Multi-Experimenter -- simple interface for running Weka and ADAMS experiments.
  • File commander -- dual-pane file manager (inspired by Norton/Midnight Commander) that allows you to manage local and remote files (FTP, SFTP, SMB); usually faster than native file managers (like Windows Explorer, Nautilus, Caja) when handling tens of thousands of files in a single directory
  • experimental deeplearning4j module
  • module for querying/consuming webservices using Groovy
  • basic terminal-based GUI for remote machines (e.g. in the cloud)
  • many interactive actors can now be used in headless environments as well
  • Fixed a memory leak introduced by Java's logging framework
  • Flow editor now has predefined rules for swapping actors, e.g. Trigger with Tee or ConditionalTrigger, maintaining as many options as possible (including any sub-actors).
  • improved imaging and PDF support

WEKA 3.9.1

by mhall - December 19, 2016, 04:44:20 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 74914 views, 15673 downloads, 5 subscriptions

Rating: 4/5 (based on 6 votes)

About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...]

Changes:

In core weka:

  • JAMA-based linear algebra routines replaced with MTJ. Faster operation with the option to use native libraries for even more speed
  • General efficiency improvements in core, filters and some classifiers
  • GaussianProcesses now handles instance weights (see the sketch after this list)
  • New Knowledge Flow implementation. Engine completely rewritten from scratch with a simplified API
  • New Workbench GUI
  • GUI package manager now has a search facility
  • FixedDictionaryStringToWordVector filter allows the use of an external dictionary for vectorization. DictionarySaver converter can be used to create a dictionary file
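
Below is a minimal sketch of exercising two of the items above through the standard Weka Java API: setting per-instance weights (which GaussianProcesses now respects) and running a cross-validated evaluation. The ARFF file name and the choice of weights are hypothetical; this is an illustration under those assumptions, not code taken from the release.

```java
import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.functions.GaussianProcesses;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class GpWeightsSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("regression.arff");  // hypothetical dataset
        data.setClassIndex(data.numAttributes() - 1);          // last attribute is the target

        // Down-weight the second half of the data; GaussianProcesses in 3.9.1
        // takes these per-instance weights into account during training.
        for (int i = data.numInstances() / 2; i < data.numInstances(); i++) {
            data.instance(i).setWeight(0.1);
        }

        GaussianProcesses gp = new GaussianProcesses();
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(gp, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}
```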

In packages:

  • Packages that were using JAMA are now using MTJ
  • New netlibNativeOSX, netlibNativeWindows and netlibNativeLinux packages providing native reference implementations (and system-optimized implementation in the case of OSX) of BLAS, LAPACK and ARPACK linear algebra
  • New elasticNet package, courtesy of Nikhil Kinshore
  • New niftiLoader package for loading a directory with MRI data in NIfTI format into Weka
  • New percentageErrorMetrics package - provides plugin evaluation metrics for root mean square percentage error and mean absolute percentage error
  • New iterativeAbsoluteErrorRegression package - provides a meta learner that fits a regression model to minimize absolute error
  • New largeScaleKernelLearning package - contains filters for large-scale kernel-based learning
  • discriminantAnalysis package now contains an implementation for LDA and QDA
  • New Knowledge Flow component implementations in various packages
  • newKnowledgeFlowStepExamples package - contains code examples for the new Knowledge Flow API discussion in the Weka Manual
  • RPlugin updated to the latest version of MLR
  • scatterPlot3D and associationRulesVisualizer packages updated with latest Java 3D libraries
  • Support for pluggable activation functions in the multiLayerPerceptrons package

JMLR scikit-learn 0.18.1

by fabianp - November 28, 2016, 17:45:27 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 32419 views, 12060 downloads, 5 subscriptions

Rating: 4.5/5 (based on 3 votes)

About: The scikit-learn project is a machine learning library in Python.

Changes:

Update for 0.18.1.


Tools for Regression and Classification 1.0.0

by matloff - October 29, 2016, 08:22:40 CET [ Project Homepage BibTeX Download ] 2112 views, 405 downloads, 3 subscriptions

About: Toolkit for parametric and nonparametric regression and classification.

Changes:

Initial Announcement on mloss.org.


rectools, a Novel Toolbox for Recommender Systems 1.0.0

by matloff - October 29, 2016, 07:41:58 CET [ Project Homepage BibTeX Download ] 1961 views, 420 downloads, 2 subscriptions

Rating: 0/5 (based on 1 vote)

About: Novel R toolbox for collaborative filtering recommender systems.

Changes:

Initial Announcement on mloss.org.


DIANNE 0.5.0

by sbohez - October 25, 2016, 19:51:07 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 2061 views, 429 downloads, 3 subscriptions

About: DIANNE is a modular software framework for designing, training and evaluating artificial neural networks on heterogeneous, distributed infrastructure. It is built on top of OSGi and AIOLOS and can transparently deploy and redeploy (parts of) a neural network on multiple machines, as well as scale up training on a compute cluster.

Changes:

Initial Announcement on mloss.org.


JMLR GPML Gaussian Processes for Machine Learning Toolbox 4.0

by hn - October 19, 2016, 10:15:05 CET [ Project Homepage BibTeX Download ] 47610 views, 10475 downloads, 5 subscriptions

Rating: 5/5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave/MATLAB implementation of inference and prediction with Gaussian process models. The toolbox offers exact inference, approximate inference for non-Gaussian likelihoods (Laplace's method, Expectation Propagation, Variational Bayes), as well as approximations for large datasets (FITC, VFE, KISS-GP). A wide range of covariance, likelihood, mean and hyperprior functions allows the construction of very complex GP models.

Changes:

A major code restructuring took place in this release, unifying certain inference functions and allowing more flexibility in covariance function composition. We also redesigned the whole derivative computation pipeline, which strongly improves the overall runtime. Finally, grid-based covariance approximations are now included natively.

More generic sparse approximation using Power EP:

  • unified treatment of FITC approximation, variational approaches VFE and hybrids
  • inducing input optimisation for all (compositions of) covariance functions, dropping the previous limitation to a few standard examples
  • infFITC is now covered by the more generic infGaussLik function

Approximate covariance object unifying sparse approximations, grid-based approximations and exact covariance computations:

  • implementation in cov/apx, cov/apxGrid, cov/apxSparse
  • generic infGaussLik unifies infExact, infFITC and infGrid
  • generic infLaplace unifies infLaplace, infFITC_Laplace and infGrid_Laplace

Hierarchical structure of covariance functions:

  • clear hierarchical, compositional implementation
  • no more code duplication as present in pairs such as covSEiso and covSEard
  • two mother covariance functions (see the formulas after this list):
    • covDot for dot-product-based covariances and
    • covMaha for Mahalanobis-distance-based covariances
  • a variety of modifiers: eye, iso, ard, proj, fact, vlen
  • more flexibility, as more variants are available and possible
  • all covariance functions offer derivatives w.r.t. inputs
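
As a rough illustration of the two families (using generic symbols rather than GPML's actual hyperparameter conventions), a covMaha-style covariance depends on its inputs only through a squared Mahalanobis distance, while a covDot-style covariance depends only on a weighted dot product:

```latex
% Illustrative sketch only; g, h and P are generic placeholders, not GPML identifiers.
k_{\mathrm{Maha}}(x, x') = g\big( (x - x')^{\top} P \, (x - x') \big),
\qquad
k_{\mathrm{Dot}}(x, x') = h\big( x^{\top} P \, x' \big).
```

The eye, iso, ard and proj modifiers then roughly correspond to different choices of the metric P (the identity, \ell^{-2} I, a diagonal matrix of per-dimension lengthscales, or a low-rank projection L^{\top} L), so that, for example, covSEiso and covSEard share the same outer function and differ only in P -- the kind of duplication the new hierarchy removes.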

Faster derivative computations for mean and cov functions:

  • switched from partial derivatives to directional derivatives
  • simpler and more concise interface of mean and cov functions
  • much faster marginal likelihood derivative computations
  • simpler and more compact code

New mean functions:

  • new mean/meanWSPC (Weighted Sum of Projected Cosines, or Random Kitchen Sink features) following a suggestion by William Herlands
  • new mean/meanWarp for constructing a new mean from an existing one by means of a warping function, adapted from William Herlands

New optimizer:

  • added a new minimize_minfunc, contributed by Truong X. Nghiem

New GLM link function:

  • added the twice logistic link function util/glm_invlink_logistic2

Smaller fixes:

  • two-fold speedup of util/elsympol (used by covADD) by Truong X. Nghiem
  • bugfix in util/logphi as reported by John Darby
