NAIVEBCC Naive Bayes Combining Classifier
W = A*(WU*NAIVEBCC)
W = WT*NAIVEBCC(B*WT)
D = C*W
Input |
A | Dataset used for training the base classifiers as well as the combiner
B | Dataset used for training the combiner of trained base classifiers
C | Dataset used for testing (executing) the combiner
WU | Set of untrained base classifiers, see STACKED
WT | Set of trained base classifiers, see STACKED
Output |
W | Trained Naive Bayes Combining Classifier
D | Dataset with probability products (over base classifiers) per class
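A minimal usage sketch of the two training routes above; the dataset generators GENDATB and GENDAT and the base classifiers NMC, LDC and KNNC are chosen here for illustration only:

% Route 1: untrained base classifiers, trained together with the combiner on A
A  = gendatb([100 100]);          % training set
C  = gendatb([200 200]);          % test set
WU = [nmc,ldc,knnc];              % set of untrained base classifiers (stacked)
W  = A*(WU*naivebcc);             % train base classifiers and combiner
D  = C*W*classc;                  % execute the combiner and normalise the outcomes
testd(D)                          % classification error
labeld(D)                         % assigned labels

% Route 2: base classifiers trained on A, combiner trained on a separate set B
[A,B] = gendat(gendatb([200 200]),0.5);
WT = A*[nmc,ldc,knnc];            % trained base classifiers
W  = WT*naivebcc(B*WT);           % train the combiner on B, prepend the trained bases
testd(C*W)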
During training the combiner computes the probabilities P(class | classifier outcomes) based on the crisp class assignments made by the base classifiers for the training set. During execution the products of these probabilities are computed, again following the crisp class assignments of the base classifiers. These products are returned as columns in D. Use CLASSC to normalise the outcomes. Use TESTD or LABELD to inspect performance and assigned labels.
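The combining rule itself can be illustrated outside PRTools by the following sketch (variable names are hypothetical and the Laplace smoothing of zero counts is an assumption; the PRTools implementation may differ in such details):

% Crisp training assignments for NCLF base classifiers on a small training set
nclass = 3; nclf = 2;
ytrain = [1 1 2 2 3 3 3]';                      % true labels of the training objects
crisp  = [1 1; 1 2; 2 2; 3 2; 3 3; 3 3; 2 3];   % column j: crisp labels assigned by classifier j

% Training: estimate P(class | classifier j assigns label k) from the counts
P = zeros(nclass,nclass,nclf);
for j = 1:nclf
  for k = 1:nclass
    sel = (crisp(:,j) == k);
    for c = 1:nclass
      P(c,k,j) = (sum(ytrain(sel) == c) + 1)/(sum(sel) + nclass);  % smoothed count
    end
  end
end

% Execution: multiply the probabilities over the base classifiers for a test
% object assigned label 2 by classifier 1 and label 3 by classifier 2
test_crisp = [2 3];
out = ones(nclass,1);
for j = 1:nclf
  out = out.*P(:,test_crisp(j),j);
end
posterior = out/sum(out)                        % normalisation, cf. CLASSC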
NAIVEBCC differs from the classifier NAIVEBC in that the latter uses continuous inputs (no crisp labeling) and does not distinguish between classifiers. Like almost any other classifier, however, NAIVEBC may be used as a trained combiner as well.
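For comparison, a sketch of NAIVEBC used as a trained combiner of the continuous base-classifier outputs (datasets and base classifiers again chosen for illustration only):

[A,B] = gendat(gendatb([200 200]),0.5);  % A trains the bases, B the combiner
C  = gendatb([100 100]);                 % test set
WT = A*[nmc,ldc,knnc];                   % trained base classifiers
V  = WT*naivebc(B*WT);                   % NAIVEBC trained on the continuous outputs
testd(C*V)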
1. Kuncheva, L.I., Combining Pattern Classifiers: Methods and Algorithms, Wiley, 2004, pp. 126-128.
See also: datasets, mappings, stacked, naivebc, classc, testd, labeld