Literature DB record: PMID 33653111
Yun Tao, William J M Probert, Katriona Shea, Michael C Runge, Kevin Lafferty, Michael Tildesley, Matthew Ferrari.
Abstract
Livestock diseases have devastating consequences economically, socially and politically across the globe. In certain systems, pathogens remain viable after host death, which enables residual transmissions from infected carcasses. Rapid culling and carcass disposal are well-established strategies for stamping out an outbreak and limiting its impact; however, wait-times for these procedures, i.e. response delays, are typically farm-specific and time-varying due to logistical constraints. Failing to incorporate variable response delays in epidemiological models may understate outbreak projections and mislead management decisions. We revisited the 2001 foot-and-mouth epidemic in the United Kingdom and sought to understand how misrepresented response delays can influence model predictions. Survival analysis identified farm size and control demand as key factors that impeded timely culling and disposal activities on individual farms. Using these factors in the context of an existing policy to predict local variation in response times significantly affected predictions at the national scale. Models that assumed fixed, timely responses grossly underestimated epidemic severity and its long-term consequences. As a result, this study demonstrates how general inclusion of response dynamics and recognition of partial controllability of interventions can help inform management priorities during epidemics of livestock diseases.
Keywords: epidemics; foot-and-mouth; livestock diseases; outbreak management; response delay
Year: 2021 PMID: 33653111 PMCID: PMC8086880 DOI: 10.1098/rsif.2020.0933
Source DB: PubMed Journal: J R Soc Interface ISSN: 1742-5662 Impact factor: 4.118
Figure 1. Temporal variations in response delays on IPs during the 2001 foot-and-mouth epidemic in the UK. (a) Incidence time-series based on daily national case reports. (b) Delay intervals from infection report to culling (purple) and from culling to disposal (green) for each farm with respect to its report date. (c) Mean delays to culling (purple) and disposal (green) as a function of report date; the insets show the overall distributions of both delays.
Table 1. Survival analysis of response delays on infected premises (IPs). 1 April 2001 represents the date on which national control policy was strengthened by the inclusion of a 24 h target window for culling IPs following case report.

| case report date | factor | culling delay: hazard ratio (95% CI) | p-value | disposal delay: hazard ratio (95% CI) | p-value |
|---|---|---|---|---|---|
| before 1 April | farm size | 0.973 (0.963–0.984) | 0.00 | 0.984 (0.973–0.994) | 0.002 |
| | control demand | 1.183 (1.132–1.235) | 0.00 | 0.988 (0.973–1.004) | 0.14 |
| | farm density | 1.087 (1.020–1.159) | 0.011 | 0.968 (0.903–1.039) | 0.37 |
| on or after 1 April | farm size | 0.957 (0.947–0.968) | 0.00 | 0.977 (0.968–0.987) | 0.00 |
| | control demand | 0.883 (0.848–0.921) | 0.00 | 0.889 (0.876–0.902) | 0.00 |
| | farm density | 1.031 (0.974–1.093) | 0.29 | 1.012 (0.949–1.078) | 0.73 |
| entire timeline | farm size | 0.966 (0.959–0.973) | 0.00 | 0.978 (0.971–0.985) | 0.00 |
| | control demand | 0.870 (0.852–0.889) | 0.00 | 0.916 (0.907–0.926) | 0.00 |
| | farm density | 1.041 (0.997–1.086) | 0.07 | 0.952 (0.909–0.998) | 0.041 |
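The hazard ratios above come from Cox proportional hazards regression: a ratio below 1 for farm size means larger farms were culled (or carcasses disposed of) more slowly. As a minimal sketch of how such a ratio is estimated, the following fits a one-covariate Cox model by maximizing the partial likelihood on simulated data; the sample size, "farm size" covariate and effect size are all invented for illustration, not the paper's data.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simulated data: larger farms are assumed to have a LOWER culling hazard
# (hazard ratio < 1), i.e. longer delays. All values are hypothetical.
rng = np.random.default_rng(42)
n = 2000
farm_size = rng.normal(size=n)            # standardized farm size (simulated)
beta_true = -0.5                          # assumed log-hazard effect
# Exponential delays with hazard exp(beta * x): scale = 1 / hazard
delay = rng.exponential(np.exp(-beta_true * farm_size))

def neg_log_partial_likelihood(beta):
    order = np.argsort(delay)             # ascending event times, no censoring
    xb = beta * farm_size[order]
    # log of sum_{j in risk set} exp(beta * x_j) via reverse cumulative logsumexp
    log_risk = np.logaddexp.accumulate(xb[::-1])[::-1]
    return -(xb - log_risk).sum()

fit = minimize_scalar(neg_log_partial_likelihood, bounds=(-3, 3), method="bounded")
hazard_ratio = np.exp(fit.x)              # per unit increase in farm size
print(f"estimated hazard ratio: {hazard_ratio:.3f}")  # near exp(-0.5) ~ 0.61
```

In practice a library such as lifelines or R's survival package would be used; the hand-rolled partial likelihood here only shows where the hazard ratio and its direction come from.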
Table 2. Comparisons between the recorded and the predicted delays on infected premises (IPs). Left data columns: survival analysis using two-factor Cox proportional hazards regression models for IPs reported across the specified epidemic timeline. Right data column: parametric bootstrap analysis using predictions generated with resampled covariate values.

| delay | factor | case report date | recorded: hazard ratio (95% CI) | p-value | predicted: mean hazard ratio (s.d.) |
|---|---|---|---|---|---|
| culling | farm size | entire timeline | 0.965 (0.958–0.973) | 0.00 | 0.965 (0.001) |
| | | before 1 April | 0.972 (0.962–0.983) | 0.00 | 0.972 (0.002) |
| | | on or after 1 April | 0.957 (0.946–0.967) | 0.00 | 0.957 (0.002) |
| | control demand | entire timeline | 0.873 (0.855–0.891) | 0.00 | 0.872 (0.013) |
| | | before 1 April | 1.180 (1.130–1.231) | 0.00 | 1.180 (0.017) |
| | | on or after 1 April | 0.885 (0.849–0.923) | 0.00 | 0.885 (0.017) |
| disposal | farm size | entire timeline | 0.979 (0.972–0.986) | 0.00 | 0.979 (0.001) |
| | | before 1 April | 0.984 (0.974–0.995) | 0.003 | 0.984 (0.002) |
| | | on or after 1 April | 0.977 (0.968–0.987) | 0.00 | 0.977 (0.001) |
| | control demand | entire timeline | 0.916 (0.906–0.925) | 0.00 | 0.915 (0.006) |
| | | before 1 April | 0.988 (0.973–1.004) | 0.15 | |
| | | on or after 1 April | 0.889 (0.877–0.902) | 0.00 | 0.889 (0.006) |
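The right-hand column is produced by a parametric bootstrap: simulate new delays from the fitted model, re-estimate, and summarize the resampled hazard ratios by their mean and standard deviation. A stripped-down sketch of that workflow, with a two-group exponential model standing in for the full Cox regression (the group sizes, rates and the "small vs large farm" split are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Invented 'recorded' culling delays (days): large farms wait longer
small = rng.exponential(scale=1.0, size=300)
large = rng.exponential(scale=1.4, size=300)

rate = lambda d: 1.0 / d.mean()           # exponential MLE of the hazard
hr_obs = rate(large) / rate(small)        # crude hazard ratio, large vs small

# Parametric bootstrap: resample delays from the fitted exponentials,
# re-estimate the hazard ratio each time, then report mean and s.d.
B = 1000
hr_boot = np.empty(B)
for b in range(B):
    s = rng.exponential(scale=1.0 / rate(small), size=small.size)
    l = rng.exponential(scale=1.0 / rate(large), size=large.size)
    hr_boot[b] = rate(l) / rate(s)

print(f"recorded HR {hr_obs:.3f}; "
      f"bootstrap mean {hr_boot.mean():.3f} (s.d. {hr_boot.std():.3f})")
```

The bootstrap mean should track the recorded estimate closely (as in the table), with the standard deviation quantifying sampling variability of the predicted delays.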
Figure 2. Simulations of daily FMD incidence time-series and overall management success in the extended Warwick model with variable culling and disposal delays on individual farms. The reference dynamics, shown in black (a–c), represent epidemic profiles conditional on locally heterogeneous delays as a function of farm size, control demand and policy timeframe (time-dependent response). The resulting dynamics of a fixed, idealized response (a) and of randomly drawn, approximated responses (c) are shown in green and blue, respectively. The time-series in orange (b) illustrate how the dynamical pattern changes when the 1 April policy-reinforcement factor is removed from the model description (time-independent response). Two hundred simulations were run per model, each initialized at 1 February 2001 and continued until disease elimination. The management outcomes of the model responses are shown in corresponding colours using violin plots (d) under three standard measures of control effectiveness: epidemic duration, total number of animals culled and total number of farms culled.
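The qualitative point of these simulations, that assuming the 24 h culling target is always met understates epidemic size relative to realistic, heterogeneous delays, can be reproduced in a toy discrete-time farm-level model. This is NOT the Warwick model; the number of farms, transmission rate and delay distributions below are all invented for illustration.

```python
import numpy as np

# Toy outbreak model: each infectious farm exposes each susceptible farm with
# a small daily probability until it is culled after its response delay.
def simulate(delays, n_farms=5000, beta=0.8, seed=1):
    rng = np.random.default_rng(seed)
    susceptible = n_farms - 10
    active = list(delays(rng, 10))        # infectious days left per seeded IP
    total = 10
    while active and susceptible > 0:
        # per-farm daily infection risk given len(active) infectious farms
        p = 1.0 - (1.0 - beta / n_farms) ** len(active)
        new = rng.binomial(susceptible, p)
        susceptible -= new
        total += new
        # age existing IPs by one day (cull at zero), add newly infected farms
        active = [d - 1 for d in active if d > 1] + list(delays(rng, new))
    return total

fixed = lambda rng, k: np.full(k, 1)                 # 24 h target always met
variable = lambda rng, k: 1 + rng.poisson(2.0, k)    # heterogeneous, mean 3 days

print("fixed-target epidemic size:   ", simulate(fixed))
print("variable-delay epidemic size: ", simulate(variable))
```

With a one-day fixed delay the effective reproduction number stays below one and the outbreak fizzles; drawing per-farm delays with a longer mean pushes it well above one and most of the farm population is eventually infected, mirroring the gap between the green and black trajectories in the figure.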