Table 2 Kappa values for inter-observer agreement (left side) and agreement with 'golden standard' (right side) given per injury category and overall

From: Reliability of an injury scoring system for horses

          Agreement between observers           Agreement all observers versus 'golden standard'
Category  Kappa value   SE      P (vs > 0)      Kappa value   SE      P (vs > 0)
1         0.80          0.005   <0.0001         0.88          0.024   <0.0001
2         0.54          0.005   <0.0001         0.54          0.024   <0.0001
3         0.49          0.005   <0.0001         0.52          0.024   <0.0001
4         0.38          0.005   <0.0001         0.53          0.024   <0.0001
5         0.73          0.005   <0.0001         0.82          0.024   <0.0001
Overall   0.59          0.003   <0.0001         0.66          0.012   <0.0001
  1. Standard error (SE) and significance (P value testing whether kappa differs from 0, i.e. from the agreement expected by chance) are given.
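
A minimal sketch of how such a P value can be derived from a kappa estimate and its standard error using a normal approximation (z = kappa / SE, upper-tail probability); the function name and the choice of a one-sided test are illustrative assumptions, not necessarily the exact procedure used by the authors.

    from math import erf, sqrt

    def p_value_kappa_vs_zero(kappa: float, se: float) -> float:
        """One-sided P value for the test that kappa exceeds 0 (normal approximation)."""
        z = kappa / se                                   # z-statistic: estimate divided by its SE
        return 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))    # upper-tail standard-normal probability

    # Example using the overall inter-observer row of the table (kappa = 0.59, SE = 0.003):
    # the resulting P value is effectively 0, consistent with the reported P < 0.0001.
    print(p_value_kappa_vs_zero(0.59, 0.003))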