Table 1. Intra-observer agreement, and agreement between each observer and the 'gold standard', given as kappa values.

From: Reliability of an injury scoring system for horses

| Agreement (kappa range) | Intra-observer, % (N) | Each observer versus 'gold standard', % (N) |
|---|---|---|
| Slight (0.00-0.20) | 0% (0) | 0% (0) |
| Fair (0.21-0.40) | 2% (1) | 2% (1) |
| Moderate (0.41-0.60) | 12% (5) | 30% (13) |
| Substantial (0.61-0.80) | 63% (27) | 60% (26) |
| Almost perfect (0.81-1.00) | 23% (10) | 7% (3) |
  1. The table shows the proportion of observers (in % and number) at each level of kappa-value agreement, according to the standard descriptions of strength of agreement [19].
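
Since the table bins kappa values into the standard strength-of-agreement bands, the classification is straightforward to express in code. Below is a minimal Python sketch, not taken from the paper: the `cohen_kappa` helper and the example ratings are invented for illustration, and `agreement_band` simply maps a kappa value onto the bands used in the table above.

```python
# Minimal sketch: compute Cohen's kappa for two sets of categorical
# ratings and bin the result into the strength-of-agreement bands
# used in Table 1. The rating data below are hypothetical.
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(r1) == len(r2)
    n = len(r1)
    # Observed agreement: fraction of items rated identically
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement: sum over categories of p1(cat) * p2(cat)
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[cat] / n * c2[cat] / n for cat in set(r1) | set(r2))
    if expected == 1:  # avoid division by zero in the degenerate case
        return 1.0
    return (observed - expected) / (1 - expected)

def agreement_band(kappa):
    """Map a kappa value to the strength-of-agreement labels in Table 1."""
    if kappa <= 0.20:
        return "Slight"
    if kappa <= 0.40:
        return "Fair"
    if kappa <= 0.60:
        return "Moderate"
    if kappa <= 0.80:
        return "Substantial"
    return "Almost perfect"

# Hypothetical injury scores (0-3) assigned twice by the same observer
first  = [0, 1, 2, 2, 3, 1, 0, 2, 1, 3]
second = [0, 1, 2, 1, 3, 1, 0, 2, 2, 3]
k = cohen_kappa(first, second)
print(f"kappa = {k:.2f} -> {agreement_band(k)}")  # kappa = 0.73 -> Substantial
```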