[R] cohen kappa for two-way table

Dylan Beaudette dylan.beaudette at gmail.com
Mon Dec 11 01:00:06 CET 2006


Greetings,

I am a bit confused by the results returned by the functions:
cohen.kappa {concord}
classAgreement {e1071}

when using a two-way table.


For example, suppose I have a matrix A and a similar matrix B (same
dimensions), then:

Matrices A and B can be found here:
http://casoilresource.lawr.ucdavis.edu/drupal/files/a_40.txt
http://casoilresource.lawr.ucdavis.edu/drupal/files/b_40.txt

A <- matrix(unlist(read.table('a_40.txt'), use.names = FALSE), ncol = 14)
B <- matrix(unlist(read.table('b_40.txt'), use.names = FALSE), ncol = 14)
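For anyone without the linked files, here is a minimal sketch (with made-up
2x2 matrices, not the a_40/b_40 data) of what table(A, B) is doing: both
matrices are flattened to vectors and cross-tabulated element-wise, so the
result is the confusion matrix of the two classifications:

```r
# hypothetical small matrices of categorical codes (not the real data)
A <- matrix(c(1, 2, 2, 1), nrow = 2)
B <- matrix(c(1, 2, 1, 1), nrow = 2)

# table() flattens both matrices and cross-tabulates corresponding cells,
# giving a contingency (confusion) matrix of A's classes vs. B's classes
tab <- table(A, B)
tab
```

So the object being passed to cohen.kappa() and classAgreement() below is a
contingency table, not two columns of raw ratings.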

# compute cohen's kappa, default settings:
cohen.kappa(table(A,B))
Kappa test for nominally classified data
9 categories - 90 methods
kappa (Cohen) = 0.97353 , Z = 45.4465 , p = 0
kappa (Siegel) = -0.00744097 , Z = -0.0794501 , p = 0.531663
kappa (2*PA-1) = 0.947061


# compute cohen's kappa - type = counts
cohen.kappa(table(A,B), type='counts')

Different row sums, a no-classification category was added.

Kappa test for nominally classified data
91 categories - 22 methods
kappa (Siegel) = 0.168593 , Z = 2.50298 , p = 0.00615762
kappa (2*PA-1) = 0.71485

It seems like the second call (type = 'counts') is the correct way to
pass a contingency table... but is that right?
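As a sanity check, here is a hand-rolled version of Cohen's kappa computed
directly from a contingency table, using the textbook formula
kappa = (Po - Pe) / (1 - Pe). The 3x3 table below is hypothetical (not the
a_40/b_40 data); the point is just to have a reference value to compare the
two packages against:

```r
# Cohen's kappa straight from a confusion matrix (contingency table)
kappa_from_table <- function(tab) {
  n  <- sum(tab)
  po <- sum(diag(tab)) / n                       # observed agreement
  pe <- sum(rowSums(tab) * colSums(tab)) / n^2   # agreement expected by chance
  (po - pe) / (1 - pe)
}

# hypothetical 3x3 confusion matrix, rows = rater 1, cols = rater 2
tab <- matrix(c(20,  5,  0,
                 3, 15,  2,
                 0,  4, 11), nrow = 3, byrow = TRUE)

kappa_from_table(tab)
```

Whichever function call reproduces this number for a given table is the one
treating its input as counts.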

Secondly, when using the classAgreement() function I get different numbers:

classAgreement(table(A,B))
$diag
[1] 0.03296703

$kappa
[1] 0.02180419

$rand
[1] 0.9874325

$crand
[1] 0.7648124



Perhaps I am misreading the relevant manual pages. Can anyone shed
some light on the proper use, and therefore interpretation, of these two
methods when using a contingency table as input?

Thanks in advance,

Dylan



