by Didier Lepelletier, Philippe Ravaud, Gabriel Baron, Jean-Christophe Lucet
Objective
To assess agreement in diagnosing surgical site infection (SSI) among healthcare professionals involved in SSI surveillance.
Methods
Case-vignette study conducted in 2009 among 140 healthcare professionals from seven specialties (20 per specialty: anesthesiologists, surgeons, public health specialists, infection control physicians, infection control nurses, infectious diseases specialists, and microbiologists) in 29 university and 36 non-university hospitals in France. We developed 40 case-vignettes based on cardiac and gastrointestinal surgery patients with suspected SSI. Each participant scored six randomly assigned case-vignettes, before and after reading the SSI definition, in a secure online relational database. The intraclass correlation coefficient (ICC) was used to assess agreement on SSI diagnosis, rated on a seven-point Likert scale, and the kappa coefficient to assess agreement on the classification of SSI as superficial or deep, rated on a three-point scale.
Results
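The kappa coefficient named in the Methods corrects raw agreement for the agreement expected by chance. A minimal sketch of Cohen's kappa for two raters follows; the three-point labels and ratings below are hypothetical illustrations, not data from the study.

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters (Cohen's kappa)."""
    n = len(ratings_a)
    # Observed agreement: fraction of cases where the two raters match.
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement under independence, from each rater's marginal counts.
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    pe = sum(ca[k] * cb[k] for k in ca) / n ** 2
    return (po - pe) / (1 - pe)

# Hypothetical ratings on a three-point scale (no SSI / superficial / deep):
a = ["none", "superficial", "deep", "none", "superficial", "none"]
b = ["none", "superficial", "superficial", "none", "deep", "none"]
print(round(cohen_kappa(a, b), 3))  # → 0.455
```

Multi-rater designs such as this study's typically use a generalization (e.g. Fleiss' kappa), but the chance-correction idea is the same.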
Based on consensus, SSI was present in 21 of 40 vignettes (52.5%). Intraspecialty agreement for SSI diagnosis ranged across specialties from 0.15 (95% confidence interval [CI], 0.00–0.59) (anesthesiologists and infection control nurses) to 0.73 (95% CI, 0.32–0.90) (infectious diseases specialists). Reading the SSI definition improved agreement in the specialties with poor initial agreement. Intraspecialty agreement for classifying SSI as superficial or deep ranged from 0.10 (95% CI, −0.19 to 0.38) to 0.54 (95% CI, 0.25–0.83) (surgeons), and increased after reading the SSI definition only among the infection control nurses, from 0.10 (95% CI, −0.19 to 0.38) to 0.41 (95% CI, −0.09 to 0.72). Interspecialty agreement for SSI diagnosis was 0.36 (95% CI, 0.22–0.54) and increased to 0.47 (95% CI, 0.31–0.64) after reading the SSI definition.
Conclusion
Among healthcare professionals evaluating case-vignettes for possible surgical site infection, diagnostic disagreement was substantial and varied both within and between specialties.