PRTools contents
NUSVC
[W,J,NU] = NUSVC(A,KERNEL,NU)
W = A*NUSVC([],KERNEL,NU)
Input
 A       Dataset
 KERNEL  Untrained mapping to compute the kernel by A*(A*KERNEL) during
         training, or by B*(A*KERNEL) during testing with dataset B.
         Alternatively, a string to compute kernel matrices by
         FEVAL(KERNEL,B,A).
         Default: linear kernel (PROXM([],'p',1)).
 NU      Regularisation parameter (0 < NU < 1): expected fraction of
         support vectors (optional; default: MAX(leave-one-out 1-NN
         error, 0.01)).
Output
 W       Mapping: support vector classifier
 J       Indices of the support objects in A
 NU      Actual NU value used
Optimises a support vector classifier for the dataset A by quadratic programming. The difference with the standard SVC routine is the use and interpretation of the regularisation parameter NU: it is an upper bound on the expected classification error. By default NU is estimated by the leave-one-out error of the 1-NN rule. For NU = NaN an automatic optimisation is performed using REGOPTC.
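As an illustrative sketch of typical usage (assuming PRTools is on the path; GENDATB is the standard PRTools banana-set generator, and the dataset sizes and NU value below are arbitrary choices):

```matlab
% Sketch, assuming PRTools is installed; all values are illustrative.
a = gendatb([50 50]);                     % 2-class training dataset
[w,j,nu] = nusvc(a,proxm([],'p',1),0.1);  % linear kernel, NU = 0.1
d = a*w;                                  % classify the training set
e = testc(d);                             % apparent error estimate

% NU = NaN requests automatic optimisation of NU via REGOPTC
w2 = nusvc(a,proxm([],'p',1),NaN);
```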
If KERNEL = 0 it is assumed that A is already a (square) kernel matrix. In that case a kernel matrix B should also be supplied at evaluation time, by B*W or MAP(B,W).
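A sketch of this precomputed-kernel mode (assuming PRTools; PROXM trained on A yields a mapping that computes proximities to the objects of A):

```matlab
% Sketch of the KERNEL = 0 mode, assuming PRTools is installed.
a  = gendatb([50 50]);          % training data
b  = gendatb([20 20]);          % evaluation data
u  = proxm(a,'r',2);            % kernel mapping fixed on A
k  = a*u;                       % square kernel matrix of A with itself
w  = nusvc(k,0,0.1);            % train directly on the kernel matrix
kb = b*u;                       % kernel matrix between B and A
d  = kb*w;                      % evaluate with the kernel matrix, not B
```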
There are several ways to define KERNEL, e.g. PROXM([],'r',1) for a radial basis kernel, or USERKERNEL for a user-defined kernel.
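For example, alternative kernels can be passed in directly (a sketch, assuming PRTools; the kernel parameters and NU value are arbitrary):

```matlab
% Sketch: alternative kernel definitions, assuming PRTools is installed.
a = gendatb([50 50]);
w_rbf = nusvc(a,proxm([],'r',1),0.05);   % radial basis kernel, width 1
w_pol = nusvc(a,proxm([],'p',3),0.05);   % 3rd-degree polynomial kernel
```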
SEE ALSO
MAPPINGS, DATASETS, SVC, NUSVO, PROXM, USERKERNEL, REGOPTC