Inter- and intraobserver agreement
Agreement

1) Interobserver agreement (R1, R2, and R3)
   - Agreement among the 3 raters: slight agreement for both reading sessions (Fleiss κ = 0.18 and 0.11)
   - Pairwise agreement across the 2 sessions:
     - R1 × R2: slight agreement (unweighted Cohen κ = 0.20 and 0.04)
     - R2 × R3: fair agreement (unweighted Cohen κ = 0.27 and 0.30)
     - R1 × R3: slight agreement (unweighted Cohen κ = 0.12 and 0.07)

2) Interobserver agreement (prepublication reviewers)
   - No agreement (unweighted Cohen κ = −0.22)

3) Intraobserver agreement (R1, R2, and R3)
   - R1: fair agreement (Cohen κ = 0.23)
   - R2: fair agreement (Cohen κ = 0.38)
   - R3: substantial agreement (Cohen κ = 0.66)
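The qualitative labels above (slight, fair, substantial) correspond to the conventional Landis–Koch bands for κ: 0.00–0.20 slight, 0.21–0.40 fair, 0.41–0.60 moderate, 0.61–0.80 substantial. As a minimal sketch of how the unweighted Cohen κ values in the table are computed, the following pure-Python illustration uses made-up ratings for two hypothetical raters, not the study's data:

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Unweighted Cohen's kappa for two raters scoring the same items."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: fraction of items on which the raters match.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement expected if the raters were independent,
    # based on each rater's marginal category frequencies.
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(count_a) | set(count_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    # Kappa corrects observed agreement for chance agreement.
    return (p_o - p_e) / (1 - p_e)

# Hypothetical binary ratings for illustration only.
r1 = ["yes", "yes", "no", "no", "yes", "no"]
r2 = ["yes", "no", "no", "yes", "yes", "no"]
print(round(cohen_kappa(r1, r2), 2))  # → 0.33
```

Here the raters agree on 4 of 6 items (p_o ≈ 0.67) while chance alone predicts 0.50, giving κ ≈ 0.33, i.e. fair agreement on the Landis–Koch scale. A negative κ, as for the prepublication reviewers above, means the raters agreed less often than chance would predict.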