Official Journal of the Italian Society of Orthopaedics and Traumatology

Table 6 Intraobserver reliability for individual reviewers for each system

From: Reliability and reproducibility of the new AO/OTA 2018 classification system for proximal humeral fractures: a comparison of three different classification systems

| Classification system | Observer | % agreement | Kappa (95% CI) | Judgement |
|---|---|---|---|---|
| Neer | Observer 1 | 87.1 | 0.801 (0.71–0.891) | Almost perfect agreement |
| Neer | Observer 2 | 87.9 | 0.831 (0.75–0.912) | Almost perfect agreement |
| Neer | Observer 3 | 95.7 | 0.941 (0.891–0.991) | Almost perfect agreement |
| Neer | Mean | 90.2 | 0.857 (0.71–0.991) | Almost perfect agreement |
| AO/OTA 2007 | Observer 1 | 63.8 | 0.597 (0.446–0.638) | Moderate agreement |
| AO/OTA 2007 | Observer 2 | 63.8 | 0.602 (0.508–0.696) | Substantial agreement |
| AO/OTA 2007 | Observer 3 | 57.8 | 0.542 (0.446–0.638) | Moderate agreement |
| AO/OTA 2007 | Mean | 61.8 | 0.580 (0.446–0.696) | Moderate agreement |
| AO/OTA 2018 | Observer 1 | 81.9 | 0.770 (0.681–0.858) | Substantial agreement |
| AO/OTA 2018 | Observer 2 | 76.7 | 0.718 (0.627–0.809) | Substantial agreement |
| AO/OTA 2018 | Observer 3 | 81.0 | 0.767 (0.681–0.852) | Substantial agreement |
| AO/OTA 2018 | Mean | 79.9 | 0.751 (0.627–0.858) | Substantial agreement |
| AO/OTA 2018 (with universal modifiers and qualifications) | Observer 1 | 79.3 | 0.770 (0.688–0.852) | Substantial agreement |
| AO/OTA 2018 (with universal modifiers and qualifications) | Observer 2 | 75.9 | 0.724 (0.641–0.815) | Substantial agreement |
| AO/OTA 2018 (with universal modifiers and qualifications) | Observer 3 | 80.2 | 0.777 (0.696–0.858) | Substantial agreement |
| AO/OTA 2018 (with universal modifiers and qualifications) | Mean | 78.5 | 0.752 (0.641–0.858) | Substantial agreement |
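The "Judgement" column maps each kappa value onto a benchmark scale for agreement strength. The table does not print the computation, but a minimal sketch of how percent agreement and Cohen's kappa are obtained from two sets of ratings, and how a kappa is mapped to a verbal judgement, is shown below. The category cutoffs are an assumption based on the Landis & Koch benchmarks, chosen so that they reproduce the judgements in this table (e.g. 0.597 → moderate, 0.602 → substantial):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same cases."""
    n = len(ratings_a)
    # Observed agreement: fraction of cases both raters classified identically
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from each rater's marginal category frequencies
    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(counts_a[c] * counts_b[c]
              for c in set(ratings_a) | set(ratings_b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

def judgement(kappa):
    # Assumed Landis & Koch-style cutoffs, consistent with Table 6
    if kappa > 0.80:
        return "Almost perfect agreement"
    if kappa > 0.60:
        return "Substantial agreement"
    if kappa > 0.40:
        return "Moderate agreement"
    if kappa > 0.20:
        return "Fair agreement"
    if kappa > 0.00:
        return "Slight agreement"
    return "Poor agreement"
```

For intraobserver reliability, `ratings_a` and `ratings_b` would be the same observer's classifications of the same fractures at two sittings; kappa discounts the agreement expected by chance, which is why a high % agreement can coexist with a lower kappa.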