About: Q. Dong, Two-dimensional relaxed representation, Neurocomputing, 121:248-253, 2013, http://dx.doi.org/10.1016/j.neucom.2013.04.044 Changes:Initial Announcement on mloss.org.
|
About: Data-efficient policy search framework using probabilistic Gaussian process models Changes:Initial Announcement on mloss.org.
|
About: PRoNTo is freely available software and aims to facilitate the interaction between the neuroimaging and machine learning communities. The toolbox is based on pattern recognition techniques for the analysis of neuroimaging data. PRoNTo supports the analysis of all image modalities as long as they are provided as NIfTI files. However, only the following modalities have been tested for version 1.1: sMRI, fMRI, PET, FA (fractional anisotropy) and Beta (GLM coefficients) images. Changes:Initial Announcement on mloss.org.
|
About: GPgrid toolkit for fast GP analysis on grid input Changes:Initial Announcement on mloss.org.
|
About: Fast Multidimensional GP Inference using Projected Additive Approximation Changes:Initial Announcement on mloss.org.
|
About: A Matlab implementation of Multilinear PCA (MPCA) and MPCA+LDA for dimensionality reduction of tensor data with sample code on gait recognition Changes:
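A minimal NumPy sketch of the core MPCA idea (not the toolbox's own MATLAB code) may help: each tensor mode gets its own projection matrix, estimated by alternating eigendecompositions of mode-n scatter matrices computed after projecting all other modes. The function names, the simplified truncated-identity initialization, and the fixed iteration count are illustrative assumptions.

    import numpy as np

    def mode_mult(T, A, axis):
        # multiply tensor T along `axis` by matrix A (shape: new_dim x old_dim)
        T = np.moveaxis(T, axis, -1)
        return np.moveaxis(T @ A.T, -1, axis)

    def mpca(samples, ranks, n_iter=5):
        # samples: (M, I1, ..., IN) stack of tensor samples; ranks: (P1, ..., PN)
        X = samples - samples.mean(axis=0)                        # center the samples
        n_modes = X.ndim - 1
        U = [np.eye(X.shape[m + 1])[:, :ranks[m]] for m in range(n_modes)]  # crude init
        for _ in range(n_iter):
            for m in range(n_modes):
                Y = X
                for k in range(n_modes):
                    if k != m:
                        Y = mode_mult(Y, U[k].T, axis=k + 1)      # project all other modes
                Ym = np.moveaxis(Y, m + 1, 1).reshape(Y.shape[0], Y.shape[m + 1], -1)
                Phi = np.einsum('sij,skj->ik', Ym, Ym)            # mode-m scatter matrix
                w, V = np.linalg.eigh(Phi)
                U[m] = V[:, np.argsort(w)[::-1][:ranks[m]]]       # keep top eigenvectors
        return U                                                  # per-mode projections

Low-dimensional features are then obtained by projecting each sample tensor with every U[m]; in the toolbox these features can optionally feed an LDA stage, as in the gait-recognition sample code.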
|
About: Stochastic neighbor embedding originally aims at reconstructing given distance relations in a low-dimensional Euclidean space. This can be regarded as a general approach to multi-dimensional scaling, but the reconstruction is based on the definition of input (and output) neighborhood probabilities alone. The present implementation also allows for handling dissimilarity- or score-induced neighborhood topologies and makes use of quasi-second-order gradient-based (l-)BFGS optimization. Changes:
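As a toy illustration of the neighborhood-probability formulation (and of the (l-)BFGS optimization mentioned), the following NumPy/SciPy sketch builds row-wise neighborhood probabilities in input and output space and minimizes the summed KL divergence with L-BFGS. It uses a fixed kernel bandwidth and omits perplexity calibration and the dissimilarity/score-induced topologies, so it is not the package's own code.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.spatial.distance import pdist, squareform

    def neighbor_probs(D2):
        # row-wise neighborhood probabilities from squared distances (diagonal excluded)
        P = np.exp(-D2)
        np.fill_diagonal(P, 0.0)
        return P / P.sum(axis=1, keepdims=True)

    def sne_cost_grad(y_flat, P, n, dim):
        Y = y_flat.reshape(n, dim)
        Q = neighbor_probs(squareform(pdist(Y, 'sqeuclidean')))
        eps = 1e-12
        cost = np.sum(P * np.log((P + eps) / (Q + eps)))        # sum of row-wise KL terms
        W = (P - Q) + (P - Q).T
        grad = 2.0 * (W.sum(axis=1)[:, None] * Y - W @ Y)       # 2 * sum_j W_ij (y_i - y_j)
        return cost, grad.ravel()

    X = np.random.RandomState(0).randn(60, 10)                  # toy high-dimensional data
    P = neighbor_probs(squareform(pdist(X, 'sqeuclidean')))     # input neighborhood topology
    n, dim = X.shape[0], 2
    y0 = 1e-2 * np.random.RandomState(1).randn(n * dim)
    res = minimize(sne_cost_grad, y0, args=(P, n, dim), jac=True, method='L-BFGS-B')
    Y = res.x.reshape(n, dim)                                   # low-dimensional embedding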
|
About: This evaluation toolkit provides a unified framework for evaluating bag-of-words based encoding methods over several standard image classification datasets. Changes:Initial Announcement on mloss.org.
|
About: Approximate Rank One FACtorization of tensors. An algorithm for factorization of three-way-tensors and determination of their rank, includes example applications. Changes:Initial Announcement on mloss.org.
|
About: The aim is to embed a given data relationship matrix into a low-dimensional Euclidean space such that the point distances / distance ranks correlate best with the original input relationships. Input relationships may be given as (sparse) (asymmetric) distance, dissimilarity, or (negative!) score matrices. Input-output relations are modeled as low-conditioned. (Weighted) Pearson and soft Spearman rank correlation, as well as unweighted soft Kendall correlation, are the supported correlation measures for input/output object neighborhood relationships. Changes:
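A minimal sketch of the idea for one of the supported measures (plain Pearson correlation between input dissimilarities and embedding distances) might look as follows; the weighting, the soft Spearman/Kendall variants, and the conditioning model are omitted, and the optimizer simply relies on finite-difference gradients on a toy problem, so this is an illustration rather than the package's algorithm.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.spatial.distance import pdist

    rng = np.random.RandomState(0)
    X = rng.randn(30, 8)                          # toy data
    D_in = pdist(X)                               # input dissimilarities (condensed form)

    def neg_pearson(y_flat):
        d_out = pdist(y_flat.reshape(-1, 2))      # embedding distances
        return -np.corrcoef(D_in, d_out)[0, 1]    # maximize correlation = minimize negative

    y0 = 1e-2 * rng.randn(X.shape[0] * 2)
    res = minimize(neg_pearson, y0, method='L-BFGS-B')   # finite-difference gradients
    Y = res.x.reshape(-1, 2)                      # 2-D embedding correlating with D_in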
|
About: This is the core MCMC sampler for the nonparametric sparse factor analysis model presented in David A. Knowles and Zoubin Ghahramani (2011). Nonparametric Bayesian Sparse Factor Models with application to Gene Expression modelling. Annals of Applied Statistics. Changes:Initial Announcement on mloss.org.
|
About: Regularization paTH for LASSO problem (thalasso). thalasso solves problems of the following form: minimize (1/2)*||X*beta - y||^2 + lambda*sum_i |beta_i|, where X and y are problem data and beta and lambda are variables. Changes:Initial Announcement on mloss.org.
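For illustration, the objective above can be minimized for a single fixed lambda by coordinate descent with soft-thresholding; the NumPy sketch below shows that problem being solved, not the package's path-following algorithm, and the data and lambda value are made up.

    import numpy as np

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def lasso_cd(X, y, lam, n_iter=200):
        n, p = X.shape
        beta = np.zeros(p)
        col_sq = (X ** 2).sum(axis=0)                    # x_j^T x_j per column
        r = y - X @ beta                                 # current residual
        for _ in range(n_iter):
            for j in range(p):
                r = r + X[:, j] * beta[j]                # remove coordinate j from the fit
                beta[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
                r = r - X[:, j] * beta[j]                # add the updated coordinate back
        return beta

    rng = np.random.RandomState(0)
    X, true_beta = rng.randn(100, 20), np.zeros(20)
    true_beta[:3] = [2.0, -1.5, 1.0]
    y = X @ true_beta + 0.1 * rng.randn(100)
    print(lasso_cd(X, y, lam=5.0)[:5])                   # sparse estimate, first entries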
|
About: The MATLAB toolbox accompanying the paper Near-optimal Experimental Design for Model Selection in Systems Biology (Busetto et al. 2013, submitted). Changes:Initial Announcement on mloss.org.
|
About: Block-Coordinate Frank-Wolfe Optimization for Structural SVMs Changes:Initial Announcement on mloss.org.
|
About: This toolbox implements a novel visualization technique called Sectors on Sectors (SonS), and an extended version called Multidimensional Sectors on Sectors (MDSonS), for improving the interpretation of several data mining algorithms. The MDSonS method makes use of Multidimensional Scaling (MDS) to solve the main drawback of the previous method, namely its inability to represent distances between pairs of clusters. These methods have been applied to visualizing the results of hierarchical clustering, Growing Hierarchical Self-Organizing Maps (GHSOM), classification trees and several manifolds. They make it possible to extract all the existing relationships among the centroids' attributes at any hierarchy level. Changes:Initial Announcement on mloss.org.
|
About: This letter proposes a new multiple linear regression model using regularized correntropy for robust pattern recognition. First, we motivate the use of correntropy to improve the robustness of the classical mean square error (MSE) criterion, which is sensitive to outliers. Then an l1 regularization scheme is imposed on the correntropy to learn robust and sparse representations. Based on the half-quadratic optimization technique, we propose a novel algorithm to solve the nonlinear optimization problem. Second, we develop a new correntropy-based classifier based on the learned regularization scheme for robust object recognition. Extensive experiments over several applications confirm that the correntropy-based l1 regularization can improve recognition accuracy and receiver operating characteristic curves under noise corruption and occlusion. Changes:Initial Announcement on mloss.org.
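A hedged NumPy sketch of this half-quadratic view (not the authors' code): maximizing the correntropy of the residuals yields per-sample Gaussian weights, so each outer iteration reduces to a weighted l1-regularized least-squares problem, solved here with a simple proximal-gradient (ISTA) inner loop. The bandwidth sigma, lambda, and iteration counts are illustrative assumptions.

    import numpy as np

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def weighted_lasso_ista(X, y, w, lam, n_iter=500):
        # minimize 0.5 * sum_i w_i (y_i - x_i^T beta)^2 + lam * ||beta||_1
        Xw = X * w[:, None]
        L = np.linalg.norm(X.T @ Xw, 2)               # Lipschitz constant of the smooth part
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            grad = Xw.T @ (X @ beta - y)
            beta = soft_threshold(beta - grad / L, lam / L)
        return beta

    def correntropy_l1_regression(X, y, lam=0.1, sigma=1.0, n_outer=10):
        beta = np.zeros(X.shape[1])
        for _ in range(n_outer):
            e = y - X @ beta
            w = np.exp(-e ** 2 / (2 * sigma ** 2))    # half-quadratic auxiliary weights:
            beta = weighted_lasso_ista(X, y, w, lam)  # outliers receive small weight
        return beta

    rng = np.random.RandomState(0)
    X = rng.randn(200, 10)
    beta_true = np.r_[np.array([3.0, -2.0]), np.zeros(8)]
    y = X @ beta_true + 0.05 * rng.randn(200)
    y[:20] += 10 * rng.randn(20)                      # gross outliers
    print(correntropy_l1_regression(X, y)[:4])        # robust, sparse estimate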
|
About: Robust sparse representation has shown significant potential in solving challenging problems in computer vision such as biometrics and visual surveillance. Although several robust sparse models have been proposed and promising results have been obtained, they are either for error correction or for error detection, and a general framework that systematically unifies these two aspects and explores their relation is still an open problem. In this paper, we develop a half-quadratic (HQ) framework to solve the robust sparse representation problem. By defining different kinds of half-quadratic functions, the proposed HQ framework is applicable to both error correction and error detection. More specifically, by using the additive form of HQ, we propose an L1-regularized error correction method that iteratively recovers corrupted data from errors caused by noise and outliers; by using the multiplicative form of HQ, we propose an L1-regularized error detection method that learns from uncorrupted data iteratively. We also show that the L1-regularization solved by the soft-thresholding function has a dual relationship to the Huber M-estimator, which theoretically guarantees the performance of robust sparse representation in terms of M-estimation. Experiments on robust face recognition under severe occlusion and corruption validate our framework and findings. Changes:Initial Announcement on mloss.org.
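The additive-form error-correction idea and the soft-thresholding/Huber relation can be illustrated with a short hedged NumPy sketch (not the authors' code): the sparse error term is obtained by soft-thresholding the residual, whose value function is exactly the Huber M-estimator, and the representation coefficients are then refit on the error-corrected data. A plain least-squares refit stands in for the sparse coding step, and the data and lambda are made up.

    import numpy as np

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def l1_error_correction(X, y, lam=0.5, n_iter=50):
        # minimize over (beta, e): 0.5 * ||y - X beta - e||^2 + lam * ||e||_1
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            e = soft_threshold(y - X @ beta, lam)              # estimate sparse gross errors
            beta, *_ = np.linalg.lstsq(X, y - e, rcond=None)   # refit on corrected data
        return beta, e

    # numerical check of the soft-thresholding / Huber duality mentioned above:
    r, lam = 3.0, 1.0
    e = soft_threshold(r, lam)
    huber = lam * abs(r) - 0.5 * lam ** 2 if abs(r) > lam else 0.5 * r ** 2
    assert np.isclose(0.5 * (r - e) ** 2 + lam * abs(e), huber)

    rng = np.random.RandomState(0)
    X = rng.randn(100, 5)
    y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.01 * rng.randn(100)
    y[:10] += 8.0                                              # corrupted observations
    beta, e = l1_error_correction(X, y)
    print(beta)                                                # close to the clean coefficients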
|
About: This toolbox implements models for Bayesian mixed-effects inference on classification performance in hierarchical classification analyses. Changes:In addition to the existing MATLAB implementation, the toolbox now also contains an R package of the variational Bayesian algorithm for mixed-effects inference.
|
About: Solution developed by team Turtle Tamers in the ChaLearn Gesture Challenge (http://www.kaggle.com/c/GestureChallenge2) Changes:Initial Announcement on mloss.org.
|
About: The VLFeat open source library implements popular computer vision algorithms including affine covariant feature detectors, HOG, SIFT, MSER, k-means, hierarchical k-means, agglomerative information bottleneck, SLIC superpixels, and quick shift. It is written in C for efficiency and compatibility, with interfaces in MATLAB for ease of use, and detailed documentation throughout. It supports Windows, Mac OS X, and Linux. The latest version of VLFeat is 0.9.16. Changes:VLFeat 0.9.16: Added VL_COVDET() (covariant feature detectors). This function implements the following detectors: DoG, Hessian, Harris Laplace, Hessian Laplace, Multiscale Hessian, Multiscale Harris. It also implements affine adaptation, estimation of feature orientation, computation of descriptors on the affine patches (including raw patches), and sourcing of custom feature frames. Added the auxiliary function VL_PLOTSS(). This is the second point update supported by the PASCAL Harvest programme. VLFeat 0.9.15: Added VL_HOG() (HOG features). Added VL_SVMPEGASOS() and a vastly improved SVM implementation. Added IHASHSUM (hashed counting). Improved INTHIST (integral histogram). Added VL_CUMMAX(). Improved the implementation of VL_ROC() and VL_PR(). Added VL_DET() (Detection Error Trade-off (DET) curves). Improved the verbosity control to AIB. Added support for Xcode 4.3, improved support for past and future Xcode versions. Completed the migration of the old test code in toolbox/test, moving the functionality to the new unit tests toolbox/xtest. Improved credits. This is the first point update supported by the PASCAL Harvest (several more to come shortly).
|