Abstract
Delphi studies in disaster medicine lack consensus on which metrics best capture expert agreement. This study compared several such metrics using a Delphi study on chemical, biological, radiological, and nuclear (CBRN) preparedness in the Middle East and North Africa region. Forty international disaster medicine experts evaluated 133 items across the ten themes of the CBRN Preparedness Assessment Tool (CBRN PAT) on a 5-point Likert scale. Agreement was measured with Kendall's W, the Intraclass Correlation Coefficient, and Cohen's Kappa, and statistical and machine learning techniques were used to compare metric performance. The overall mean agreement score was 4.91 ± 0.71, with 89.21% average agreement. Kappa emerged as the most sensitive metric in both the statistical and the machine learning analyses, with a feature importance score of 168.32, and it varied across CBRN PAT themes, including medical protocols, logistics, and infrastructure. The integrated statistical and machine learning approach offers a promising method for understanding expert consensus in disaster preparedness, with potential for future refinement by incorporating additional contextual factors.
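The agreement metrics named in the abstract are standard and can be illustrated with a short sketch. This is not the authors' code; it is a minimal, assumed implementation of Kendall's W (without tie correction) for a raters-by-items matrix, alongside scikit-learn's `cohen_kappa_score` for a pair of raters.

```python
import numpy as np
from scipy.stats import rankdata
from sklearn.metrics import cohen_kappa_score


def kendalls_w(ratings):
    """Kendall's coefficient of concordance W for an (m raters x n items)
    rating matrix. W ranges from 0 (no agreement) to 1 (perfect agreement).
    Simple form without the correction for tied ranks."""
    m, n = ratings.shape
    # Rank the n items within each rater's row (average ranks for ties).
    ranks = np.apply_along_axis(rankdata, 1, ratings)
    rank_sums = ranks.sum(axis=0)
    # Sum of squared deviations of rank sums from their mean.
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))


# Toy example: three raters who rank four items identically -> W = 1.
ratings = np.array([[1, 2, 3, 4],
                    [1, 2, 3, 4],
                    [1, 2, 3, 4]])
w = kendalls_w(ratings)

# Cohen's Kappa compares two raters' categorical judgements,
# correcting for chance agreement.
kappa = cohen_kappa_score([1, 2, 2, 3], [1, 2, 2, 3])
```

In a Delphi setting such as this study's, each rater's row would hold that expert's Likert scores for the 133 items, and per-theme Kappa would be computed on the item subsets belonging to each CBRN PAT theme.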
| Original language | English |
| --- | --- |
| Article number | e70044 |
| Number of pages | 20 |
| Journal | Journal of Contingencies and Crisis Management |
| Volume | 33 |
| Issue number | 2 |
| Early online date | 6 Apr 2025 |
| DOIs | |
| Publication status | E-pub ahead of print - 6 Apr 2025 |
| Externally published | Yes |
Keywords
- disaster medicine
- expert opinion
- agreement analysis
- MENA
- Delphi study