Project details for Linear SVM with general regularization

Linear SVM with general regularization 1.0

by rflamary - October 5, 2012, 15:34:21 CET


Description:

We provide a general solver for squared hinge loss SVM of the form:

min_{w,b} sum_i max(0, 1 - y_i(x_i^T w + b))^2 + Omega(w)

where Omega(w) can be:

  • l1 : Omega(w)=sum_i |w_i|
  • l2 : Omega(w)=sum_i |w_i|^2
  • l1-l2: Omega(w)=sum_g ||w_g||_2
  • l1-lp: Omega(w)=sum_g ||w_g||_p
  • adaptive l1-l2: Omega(w)=sum_g beta_g||w_g||_2
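As an illustration of the objective above, here is a minimal proximal-gradient sketch in Python/NumPy for the l1-regularized case (Omega(w) = lam * ||w||_1). The function name, step-size heuristic, and iteration count are illustrative assumptions, not the toolbox's actual MATLAB interface:

```python
import numpy as np

def squared_hinge_l1(X, y, lam=0.1, step=None, n_iter=500):
    """Proximal-gradient sketch for
       min_{w,b} sum_i max(0, 1 - y_i (x_i^T w + b))^2 + lam * ||w||_1.
    Illustrative only; not the toolbox's API."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    if step is None:
        # crude step size from a Lipschitz bound on the smooth part
        step = 1.0 / (2 * np.linalg.norm(X, 2) ** 2 + 1e-12)
    for _ in range(n_iter):
        m = 1 - y * (X @ w + b)       # margins
        active = m > 0                # samples violating the margin
        # gradient of the squared hinge loss over active samples
        g_w = -2 * X[active].T @ (y[active] * m[active])
        g_b = -2 * np.sum(y[active] * m[active])
        w -= step * g_w
        b -= step * g_b
        # soft-thresholding: prox of lam * ||.||_1 (b is not regularized)
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w, b

# toy two-cluster data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
w, b = squared_hinge_l1(X, y, lam=0.1)
acc = np.mean(np.sign(X @ w + b) == y)
```

Because the squared hinge loss is differentiable, the smooth part admits a plain gradient step, and each non-smooth regularizer in the list above only changes the proximal step (here, soft-thresholding for l1).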

We also provide a multitask solver in which T tasks are learned simultaneously under joint sparsity constraints (mixed-norm regularization).
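The joint sparsity mentioned above is typically obtained with an l2,1 mixed norm, whose proximal operator is row-wise block soft-thresholding: a feature's weight row (shared across all T tasks) is either shrunk as a group or zeroed entirely. A small sketch (the helper name is ours, not the toolbox's):

```python
import numpy as np

def prox_l21(W, t):
    """Prox of t * sum_g ||W[g, :]||_2 over rows of W (d features x T tasks).
    Rows with norm below t are set to zero, yielding joint sparsity."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
    return scale * W

# feature 0 is strongly shared across 2 tasks; feature 1 is weak
W = np.array([[3.0, 4.0],
              [0.1, 0.1]])
P = prox_l21(W, 1.0)   # row 0 shrunk, row 1 zeroed out
```

The weak row is removed for every task at once, which is exactly the feature-selection behavior the mixed-norm regularizer is designed to produce.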

Note that this toolbox has been designed to be efficient on dense data, whereas most existing linear SVM solvers are designed for sparse datasets.

Changes to previous version:

Initial Announcement on mloss.org.

Supported Operating Systems: Agnostic
Data Formats: Any Format Supported By Matlab
Tags: Large Scale, Kernelmachine, Svm, Bci, Classification, Support Vector Machines, Feature Selection, Linear Svm, Convex Optimization, Gradient Based Learning, Manifold Learning, Optimization, Algorithms, Feature Weighting, Trace Norm, Toolbox, Group Lasso, Lasso, Sparse Learning, Quadratic Programming, Weighting, L1 Regularization, Large Datasets, Regularization, Pattern Recognition, Discriminant Analysis, Linear Model, Generalized Linear Models, Multiclass Support Vector Machine, L1 Minimization, Sparse Representation, L1 Norm, L21 Norm, Dimension Reduction, Multi Task
