Table 4 Inter-rater agreement for shoulder and knee checklist items

From: Validity evidence for two objective structured clinical examination stations to evaluate core skills of the shoulder and knee assessment

Shoulder

                           Rater 1
                  Performed   Not Performed   Total
Rater 2
  Performed             118               1     119
  Not Performed           4              24      28
  Total                 122              25     147

Knee

                           Rater 1
                  Performed   Not Performed   Total
Rater 2
  Performed             126              14     140
  Not Performed           9              26      35
  Total                 135              40     175

Shoulder:
  1. Observed Agreement = (118 + 24)/147 = 0.97
  2. Chance Agreement = 0.70
  3. Cohen’s kappa = 0.9 (“Almost perfect”)

Knee:
  4. Observed Agreement = (126 + 26)/175 = 0.87
  5. Chance Agreement = 0.66
  6. Cohen’s kappa = 0.6 (“Moderate”)
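
These statistics follow directly from the cell counts: observed agreement is the proportion of items on which both raters agree, chance agreement is computed from each rater's marginal proportions, and kappa is (observed − chance)/(1 − chance). As a minimal sketch (not part of the original article), the following Python code reproduces all three figures from the tables above:

```python
# Recompute observed agreement, chance agreement, and Cohen's kappa
# from the 2x2 tables above. Rows are Rater 2, columns are Rater 1:
# [[both performed, rater 2 only], [rater 1 only, both not performed]].

def kappa_stats(table):
    """Return (observed agreement, chance agreement, kappa) for a 2x2 table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    p_o = (a + d) / n                       # both raters agree
    p_e = ((a + c) / n) * ((a + b) / n) \
        + ((b + d) / n) * ((c + d) / n)     # agreement expected by chance
    return p_o, p_e, (p_o - p_e) / (1 - p_e)

shoulder = [[118, 1], [4, 24]]
knee = [[126, 14], [9, 26]]

for name, table in (("Shoulder", shoulder), ("Knee", knee)):
    p_o, p_e, k = kappa_stats(table)
    # kappa is printed to one decimal to match the article's reporting
    print(f"{name}: observed = {p_o:.2f}, chance = {p_e:.2f}, kappa = {k:.1f}")
```

Running this prints observed = 0.97, chance = 0.70, kappa = 0.9 for the shoulder and observed = 0.87, chance = 0.66, kappa = 0.6 for the knee, matching the footnotes above.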