KAPPA estimates Cohen's kappa coefficient
and related statistics
[...] = kappa(d1,d2);
NaNs are treated as missing values and are ignored.
[...] = kappa(d1,d2,'notIgnoreNAN');
NaNs are treated as just another label.
[kap,sd,H,z,ACC,sACC,MI] = kappa(...);
X = kappa(...);
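Example (a minimal sketch; assumes the NaN toolbox is installed and on the path; the ratings are made up for illustration):

   % ratings of ten items by two scorers, using categories 1 and 2
   d1 = [1 1 2 2 1 2 1 1 2 2];
   d2 = [1 1 2 1 1 2 2 1 2 2];
   [kap, sd, H, z, ACC] = kappa(d1, d2);   % kappa, its standard error, confusion matrix, z-score, accuracy
   X = kappa(d1, d2);                      % the same results collected in a struct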
d1 data of scorer 1
d2 data of scorer 2
kap point estimate of Cohen's kappa coefficient
sd standard error of the kappa estimate
H Concordance matrix, i.e. confusion matrix
z z-score
ACC overall agreement (accuracy)
sACC specific accuracy, i.e. the accuracy of each individual class
MI mutual information or transfer information (in bits)
X is a struct containing all the fields above
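For reference, kap is the chance-corrected agreement computed from the concordance matrix H [1]; a sketch of the textbook formula (not necessarily the exact internals of this implementation):

   N   = sum(H(:));                    % total number of rated items
   po  = sum(diag(H)) / N;             % observed agreement (= ACC)
   pe  = sum(H,2)' * sum(H,1)' / N^2;  % expected chance agreement from the marginals
   kap = (po - pe) / (1 - pe);         % Cohen's kappa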
For two classes, a number of additional summary statistics are provided, including
TPR, FPR, FDR, PPV, NPV, F1, d-prime, the Matthews correlation coefficient (MCC) or
Phi coefficient (PHI=MCC), specificity and sensitivity, and the Youden index (YI);
see the sketch below. Note: the positive category must be the larger label (in d1 and d2);
otherwise the confusion matrix is transposed and the summary statistics are incorrect.
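These two-class measures follow the standard definitions for a 2x2 table (see [6]); a sketch, assuming labels 1 and 2 with 2 as the positive class, and the rows of H indexed by d1 and the columns by d2:

   TP = H(2,2);  FN = H(2,1);  FP = H(1,2);  TN = H(1,1);
   TPR = TP / (TP + FN);               % sensitivity (recall)
   FPR = FP / (FP + TN);               % fall-out; specificity = 1 - FPR
   PPV = TP / (TP + FP);               % precision; FDR = 1 - PPV
   NPV = TN / (TN + FN);               % negative predictive value
   F1  = 2*TP / (2*TP + FP + FN);      % F1 score
   YI  = TPR - FPR;                    % Youden index (sensitivity + specificity - 1)
   MCC = (TP*TN - FP*FN) / sqrt((TP+FP)*(TP+FN)*(TN+FP)*(TN+FN));  % Matthews/Phi coefficient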
Reference(s):
[1] Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 37-46.
[2] J. Bortz, G. A. Lienert (1998). Kurzgefasste Statistik für die klassische Forschung. Springer, Berlin/Heidelberg.
Kapitel 6: Übereinstimmungsmaße für subjektive Merkmalsurteile, pp. 265-270.
[3] http://www.cmis.csiro.au/Fiona.Evans/personal/msc/html/chapter3.html
[4] Kraemer, H. C. (1982). Kappa coefficient. In S. Kotz and N. L. Johnson (Eds.),
Encyclopedia of Statistical Sciences. New York: John Wiley & Sons.
[5] http://ourworld.compuserve.com/homepages/jsuebersax/kappa.htm
[6] http://en.wikipedia.org/wiki/Receiver_operating_characteristic
Package: nan