PRTools contents

PRTools manual

CONFMAT

Construct confusion matrix

   [C,NE,LABLIST] = CONFMAT(LAB1,LAB2,METHOD,FID)

Input
 LAB1 Set of labels
 LAB2 Set of labels
 METHOD 'count' (default) to count the number of co-occurrences in
 LAB1 and LAB2, 'disagreement' to count the relative non-co-occurrence.
 FID Write text result to file

Output
 C Confusion matrix
 NE Total number of errors (empty labels are neglected)
 LABLIST Unique labels in LAB1 and LAB2

Description

Constructs a confusion matrix C between two sets of labels LAB1 (corresponding to the rows of C) and LAB2 (corresponding to the columns of C). The order of the rows and columns is returned in LABLIST. NE is the total number of errors (the sum of the off-diagonal elements of C).

When METHOD = 'count' (default), the co-occurrences in LAB1 and LAB2 are counted and returned in C. When METHOD = 'disagreement', the relative disagreement is returned in NE and is split over all combinations of labels in C (such that the rows sum to 1). (The total disagreement for a class equals one minus the sensitivity for that class as computed by TESTC.)
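The difference between the two methods can be illustrated on a small pair of label vectors (the numeric labels below are made up for illustration):

```matlab
lab1 = [1 1 1 2 2 3]';                      % true labels
lab2 = [1 1 2 2 2 3]';                      % estimated labels
[C,ne] = confmat(lab1,lab2,'count');        % C counts co-occurrences;
                                            % ne sums the off-diagonal elements
[Cd,d] = confmat(lab1,lab2,'disagreement'); % rows of Cd sum to 1;
                                            % d is the relative disagreement
```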

    [C,NE,LABLIST] = CONFMAT(D,METHOD)

If D is a classification result D = A*W, the labels LAB1 and LAB2 are  internally retrieved by CONFMAT before computing the confusion matrix.
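A minimal sketch of this call; GENDATB and LDC are used here only as illustrative choices of data generator and classifier:

```matlab
A = gendatb([50 50]);   % test set: banana-shaped data (illustrative)
B = gendatb([50 50]);   % independent training set
W = B*ldc;              % train a linear (normal-density based) classifier
D = A*W;                % classification result
[C,ne] = confmat(D);    % LAB1 and LAB2 are retrieved from D internally
```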

    C = CONFMAT(D)

This call also applies when, in D = A*W, the dataset A has soft labels and W is trained by a soft-labeled classifier.

When no output argument is specified, or when FID is given, the confusion matrix is displayed on the screen or written to a text file. It is assumed that LAB1 contains the true labels and LAB2 the estimated labels.

Example(s)

 Typical use of CONFMAT is the comparison of the true and estimated labels
 of a test set A after applying a trained classifier W:
 LAB1 = GETLABELS(A); LAB2 = A*W*LABELD.
 More examples can be found in PREX_CONFMAT, PREX_MATCHLAB.
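The typical use above can be sketched end-to-end; GENDATH and QDC are used here only for illustration:

```matlab
[T,A] = gendat(gendath([100 100]),0.5);  % split Highleyman data: train/test
W = T*qdc;                               % train a quadratic classifier
lab1 = getlabels(A);                     % true labels of the test set
lab2 = A*W*labeld;                       % estimated labels
confmat(lab1,lab2);                      % display the confusion matrix
```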

See also

mappings, datasets, getlabels, labeld
