PRTools contents |
SVO
[V,J,C,NU] = SVO(K,NLAB,C,OPTIONS)
Input | |
K | Similarity matrix |
NLAB | Label list consisting of -1/+1 |
C | Scalar for weighting the errors (optional; default: 1) |
OPTIONS | |
.PD_CHECK | force positive definiteness of the kernel by adding a small constant to the kernel diagonal (default: 1) |
.BIAS_IN_ADMREG | the bias of the SVC (the b term) may be undefined; if BIAS_IN_ADMREG == 1, b is then taken as the midpoint of its admissible region, otherwise (BIAS_IN_ADMREG == 0) the situation is treated as an optimization failure and handled accordingly (default: 1) |
.PF_ON_FAILURE | if the optimization fails (or the bias is undefined and BIAS_IN_ADMREG is 0) and PF_ON_FAILURE == 1, a Pseudo-Fisher classifier is computed instead; otherwise (PF_ON_FAILURE == 0) an error is issued (default: 1) |
Output | |
V | Vector of weights for the support vectors |
J | Index vector pointing to the support vectors |
C | The value of C actually used for the optimization |
NU | NU parameter of the SVC_NU algorithm that yields the same classifier |
A low-level routine that optimizes the set of support vectors for a 2-class classification problem, based on the similarity matrix K computed from the training set. SVO is called directly by SVC. The labels NLAB should indicate the two classes by +1 and -1. Optimization is done by quadratic programming. If available, the QLD function is used; otherwise an appropriate MATLAB routine is applied.
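As an illustration, a typical call to SVO might look like the sketch below. The kernel construction, the label mapping, and the lowercase option field names are assumptions for the example and are not prescribed by this page; only the SVO signature itself comes from the definition above.

```matlab
% Sketch: optimize support vectors for a 2-class problem (assumptions noted).
% X is assumed to be an N x D training set, labels a vector of class ids 1/2.
K    = exp(-distm(X, X) / (s^2));   % hypothetical RBF similarity matrix (s assumed given)
nlab = 2*labels - 3;                % map class ids {1,2} to the required {-1,+1}

opts.pd_check       = 1;            % force positive definiteness of K
opts.bias_in_admreg = 1;            % undefined bias: use midpoint of admissible region
opts.pf_on_failure  = 1;            % fall back to Pseudo-Fisher if the QP fails

[v, J, c_used, nu] = svo(K, nlab, 1, opts);
% v : weights of the support vectors
% J : indices of the support vectors in the training set
```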
See also: SVC