LDC
[W,R,S,M] = LDC(A,R,S,M)
W = A*LDC([],R,S,M);
Input
 A      Dataset
 R,S    Regularization parameters, 0 <= R,S <= 1 (optional; default: no regularization, i.e. R,S = 0)
 M      Dimension of subspace structure in covariance matrix (default: K, all dimensions)
Output
 W      Linear Bayes Normal Classifier mapping
 R      Value of regularization parameter R as used
 S      Value of regularization parameter S as used
 M      Value of regularization parameter M as used
Computation of the linear classifier between the classes of the dataset A by assuming normal densities with equal covariance matrices. The joint covariance matrix is the weighted (by a priori probabilities) average of the class covariance matrices. R and S (0 <= R,S <= 1) are regularization parameters used for finding the covariance matrix G by
G = (1-R-S)*G + R*diag(diag(G)) + S*mean(diag(G))*eye(size(G,1))
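For illustration, the same regularization can be written out for a sample covariance matrix in plain MATLAB (a minimal sketch; the data matrix X and the values chosen for R and S below are arbitrary examples, not part of the toolbox code):

X = randn(50,5);                 % example data: 50 objects in 5 dimensions
G = cov(X);                      % sample covariance matrix
R = 0.05; S = 0.05;              % example regularization parameters, 0 <= R,S <= 1
G = (1-R-S)*G + R*diag(diag(G)) + S*mean(diag(G))*eye(size(G,1));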
This covariance matrix is then decomposed as
G = W*W' + sigma^2 * eye(K)
where W is a K x M matrix containing the M leading principal components and sigma^2 is the mean of the K-M smallest eigenvalues. The use of soft labels is supported. The classification A*W is computed by NORMAL_MAP.
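One common way to realize such a structure is the probabilistic-PCA-style construction sketched below; this is only an illustration of the idea and need not match the toolbox internals exactly (G is assumed to be the K x K regularized covariance and M an example subspace dimension):

[V,D] = eig(G);                          % eigendecomposition of the K x K covariance
[d,i] = sort(diag(D),'descend');         % eigenvalues in descending order
K = size(G,1); M = 2;                    % example subspace dimensionality
sigma2 = mean(d(M+1:K));                 % mean of the K-M smallest eigenvalues
W = V(:,i(1:M))*diag(sqrt(max(d(1:M)-sigma2,0)));  % K x M subspace factor
Gs = W*W' + sigma2*eye(K);               % structured approximation of G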
If R, S or M is NaN, the corresponding regularization parameter is optimized by REGOPTC. The best results are usually obtained by R = 0, S = NaN, M = [], or by R = 0, S = 0, M = NaN (which is faster for problems of moderate or low dimensionality). If no regularization is supplied, a pseudo-inverse of the covariance matrix is used in case it is close to singular.
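For instance, requesting this optimization only requires putting NaN in the corresponding position (a usage sketch; the call to GENDATD merely provides some example data):

a = gendatd([50 50],10);   % example: two-class data in 10 dimensions
w1 = ldc(a,0,NaN,[]);      % let REGOPTC optimize S, no subspace structure
w2 = ldc(a,0,0,NaN);       % alternatively, let REGOPTC optimize the subspace dimension M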
Note that A*(KLMS([],N)*NMC) performs a similar operation by first pre-whitening the data in an N-dimensional space, followed by the nearest mean classifier. The regularization controlled by N differs from the one used in LDC above, as it entirely removes low-variance directions.
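A sketch of that alternative, assuming a labeled dataset a and some target dimensionality n (the split by GENDAT and the error estimate by TESTC are only for illustration):

[tr,te] = gendat(a,0.5);   % split the data into a train and a test set
n = 5;                     % number of dimensions kept by the pre-whitening
v = klms([],n)*nmc;        % untrained mapping: KL pre-whitening followed by nearest mean
w = tr*v;                  % train the combined mapping
e = te*w*testc;            % estimate its classification error on the test set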
To some extent LDC is also similar to FISHERC.
prex_plotc
a = gendatd; % generate Gaussian distributed data in two classes
w = ldc(a); % compute a linear classifier between the classes
scatterd(a); % make a scatterplot
plotc(w) % plot the classifier
mappings, datasets, regoptc, nmc, nmsc, udc, quadrc, normal_map, fisherc