SVO_NU

Support Vector Optimizer: NU algorithm

    [V,J,C] = SVO_NU(K,NLAB,NU,PD)

Input
 K     Similarity matrix
 NLAB  Label list consisting of -1/+1
 NU    Regularization parameter (0 < NU < 1): expected fraction of support vectors (optional; default: 0.25)
 PD    Whether or not to check the positive definiteness of K (optional; default: 1 (do the check))

Output
 V     Vector of weights for the support vectors
 J     Index vector pointing to the support vectors
 C     Equivalent C regularization parameter of the SVM-C algorithm

Description

A low-level routine that optimizes the set of support vectors for a 2-class classification problem, based on the similarity matrix K computed from the training set. SVO_NU is called directly from SVC_NU. The labels NLAB should indicate the two classes by +1 and -1. The optimization is carried out by quadratic programming. If available, the QLD function is used, otherwise an appropriate Matlab routine.
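
For illustration, a minimal sketch of a direct call follows (normally SVO_NU is invoked by SVC_NU); the toy data and the linear kernel below are assumptions made only for this example:

    A    = [randn(20,2)-1; randn(20,2)+1];   % two Gaussian classes in 2D
    nlab = [-ones(20,1); ones(20,1)];        % labels must be -1/+1
    K    = A*A';                             % precomputed (linear) similarity matrix
    [V,J,C] = svo_nu(K, nlab, 0.25, 1);      % NU = 0.25, positive definiteness check enabled
    % V holds the weights for the support vectors, J their indices in A,
    % C the equivalent regularization parameter of the SVM-C algorithm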

NU is bounded from above by NU_MAX = (1 - ABS(Lp-Lm)/(Lp+Lm)), where Lp (Lm) is the number of positive (negative) samples. If a NU larger than NU_MAX is supplied to the routine, it is changed to NU_MAX.
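
For example, with Lp = 30 positive and Lm = 10 negative samples, NU_MAX = 1 - 20/40 = 0.5, so a requested NU of 0.8 would be reduced to 0.5. A sketch of this check:

    Lp = sum(nlab == +1);               % number of positive samples
    Lm = sum(nlab == -1);               % number of negative samples
    nu_max = 1 - abs(Lp-Lm)/(Lp+Lm);    % upper bound on NU
    nu = min(nu, nu_max);               % NU > NU_MAX is replaced by NU_MAX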

If NU is smaller than some NU_MIN, which depends on the overlap between the classes, the algorithm will typically take a long time to converge (if it converges at all). It is therefore advisable to set NU larger than the expected overlap.

The weights V are rescaled in such a way that they are equivalent to the weights that would be returned by SVO with the regularization parameter C.
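
As a consistency check, the equivalent C can be passed back to SVO; the sketch below assumes that SVO accepts the equivalent C as its third argument, as in [V,J] = SVO(K,NLAB,C) (see its own manual page):

    [V_nu, J_nu, C] = svo_nu(K, nlab, 0.25);   % NU formulation
    [V_c,  J_c]     = svo(K, nlab, C);         % C formulation with the equivalent C
    % both calls should select (approximately) the same support vectors with comparable weights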

See also

svc_nu, svo, svc
