Cohen's kappa

Cohen's kappa coefficient is a statistical measure of inter-rater reliability. It is generally thought to be a more robust measure than a simple percent agreement calculation, since it takes into account the agreement that would be expected by chance.

The equation for kappa is:

$$\kappa = \frac{\Pr(a) - \Pr(e)}{1 - \Pr(e)},$$

where $$\Pr(a)$$ is the relative observed agreement among the raters, and $$\Pr(e)$$ is the hypothetical probability of chance agreement, estimated from each rater's observed label frequencies.
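The formula above can be sketched directly in code. The following is a minimal illustration (the function name `cohens_kappa` is my own): $$\Pr(a)$$ is the fraction of items on which the two raters agree, and $$\Pr(e)$$ is computed from each rater's marginal label frequencies.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length sequences of category labels."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Pr(a): relative observed agreement
    pr_a = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Pr(e): chance agreement from each rater's marginal label frequencies
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    pr_e = sum(counts_a[label] * counts_b[label] for label in counts_a) / n**2
    return (pr_a - pr_e) / (1 - pr_e)

# Worked example: 50 items rated yes/no by two raters.
# They agree on 35 items (20 yes/yes, 15 no/no), so Pr(a) = 0.7;
# marginals give Pr(e) = 0.5, hence kappa = (0.7 - 0.5) / (1 - 0.5) = 0.4.
a = ["yes"] * 25 + ["no"] * 25
b = ["yes"] * 20 + ["no"] * 5 + ["yes"] * 10 + ["no"] * 15
print(cohens_kappa(a, b))  # 0.4 (up to floating-point rounding)
```

Note that kappa is 1 when the raters agree on every item and 0 when the observed agreement is exactly what chance alone would predict.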

The seminal paper introducing kappa was published by Jacob Cohen in the journal Educational and Psychological Measurement in 1960.