Søren Saxmose Nielsen, Julio Alvarez, Paolo Calistri, Elisabetta Canali, Julian Ashley Drewe, Bruno Garin-Bastuji, José Luis Gonzales Rojas, Christian Gortázar, Mette Herskin, Virginie Michel, Miguel Ángel Miranda Chueca, Barbara Padalino, Paolo Pasquali, Helen Clare Roberts, Hans Spoolder, Karl Ståhl, Antonio Velarde, Arvo Viltrop, Christoph Winckler, Andrea Gervelmeyer, Yves Van der Stede, Dominique Joseph Bicout.
Abstract
The EFSA asked the Panel on Animal Health and Welfare to develop a guidance document on good practice in conducting scientific assessments in animal health using modelling. In previous opinions, the AHAW Panel has responded to two-thirds of animal health-related mandates using some kind of modelling. These models range from simple to complex, employing a combination of scientific, economic, socio-economic or other types of data. Hence, there is strong interest in the development of a guidance document to integrate modelling efforts into the routine process of EFSA working groups. In this document, an 'operating procedure' (OP) for the use of modelling within an AH working group is presented. The OP provides a detailed flowchart enabling modelling to be transparently and consistently integrated in the assessment. The OP is structured into phases. These phases combine the relevant standard operating procedures and working instructions of EFSA with the modelling process. Each phase includes roles and actions to be taken, expected output and the sequence of agreements that need to be made between all partners in the scientific assessment. In conclusion, it is expected that adherence to the OP will improve transparency of models in EFSA outputs, and it is recommended to adopt it as a standard procedure when responding to AHAW mandates.
Keywords: animal health; modelling techniques; operating procedures; standard terminology; transparency
Year: 2022 PMID: 35600270 PMCID: PMC9115711 DOI: 10.2903/j.efsa.2022.7346
Source DB: PubMed Journal: EFSA J ISSN: 1831-4732
Figure 1: Identified purposes of modelling with typical examples
Note: Overlapping ellipses symbolise potentially non‐distinct purposes of practical models.
Responsibilities of the subject and modelling expert(s)
| Responsibilities of the subject expert(s) | Responsibilities of the modelling expert(s) |
|---|---|
| Review the mandate and determine the required relevant scientific literature and data sources | Review the mandate and determine relevant needs for the type of model and its requirements |
| Contribute to laying out the conceptual model within the framework of the objectives of the ToRs and translating the ToRs into specific risk questions | Contribute to laying out the conceptual model within the framework of the objectives of the ToRs |
| Determine the scientific soundness of the proposed modelling approach(es) including assumptions and biological aspects | Determine the relevant modelling options, considering timeframe, data and available resources; propose a structured model plan with required data |
| Assess the reliability of the data sources and the quality of data | Assess the feasibility of the intended modelling |
| Assess the reliability of the practical model implementation | Provide proper implementation of the model tool |
| Assure transparency of the modelling process including reviewing the validity of assumptions, limitations and potential uncertainty together with the model experts | Assure transparency of the modelling process including documentation of the model that identifies to the subject experts the underlying assumptions, limitations and potential uncertainty of the expected model output |
| Provide scientific guidance and information for the justification of the model within the framework of the objectives of the ToRs | Provide sufficient evaluation of the model to demonstrate consistency with scientific guidance and information (e.g. correctness and validity) within the framework of the objectives of the ToRs |
| Assess the scientific validity of the model output | Apply the model to answer the objectives |
| Review and assess critically the practical relevance of the model outputs aiming to derive sound findings as outcome from the modelling | Present, explain and justify the model output, aiming for transparent communication of technical details and respond to questions from the subject experts |
| Draft model findings and conclusions for the scientific output | Draft final model description and model output for the model report |
| Act as an advocate, promoting to the Panel members and others the scientific findings generated by the modelling process | Confirm the conclusions and recommendations of the model report derived from the model output |
Risk Assessment activities and actors including specific tasks related to modelling
| Step | Actions | Output |
|---|---|---|
| 1 Negotiation of the mandate | No action for the Panel; taken care of by EFSA staff | Not applicable |
| 2 Receipt of a mandate | EFSA assigns the mandate to the AHAW Panel; EFSA staff forward the mandate to the Panel | Panel informed of the new mandate for the AHAW Panel |
| 3 Chartering and acceptance | European Commission presents the mandate to the Panel; Panel and European Commission discuss/clarify the mandate (including identification of background, objectives and questions) and the need for modelling at a plenary meeting of the Panel; EFSA staff, WG chair and European Commission discuss the mandate at the kick-off meeting with the European Commission | Mandate preliminarily clarified (ToRs/goal/target/aim/problem/question understood) and accepted by the Panel, with WG chair and other Panel members (including modelling advice) designated; mandate clarified (defined and accomplishable goals, purpose, questions, expected answers and timelines agreed, including deadlines); potential strategic approaches described (including draft roadmap, potential models and their expected contribution, required and available resources, sources of information/data) |
| 4 Workforce mix definition | EFSA staff and Panel discuss whether the RA can be fully or partially outsourced to an Art. 36 organisation or other tenderers/grant beneficiaries; EFSA staff and WG chair propose experts for the WG; EFSA staff invite the selected experts | Decision on outsourcing of the RA; AHAW Panel comments on WG composition (by written procedure); WG established |
| 5 Definition of outsourced tasks | EFSA staff ensure that appropriate contracts and/or grant agreements are put in place | Outsourcing contracts/agreements |
| 6 Protocol development | EC/WG chair present the clarified mandate and its purpose to the WG; EFSA staff and WG discuss strategic approaches to respond to the mandate (determination of specific risk assessment questions and expected answers in relation to timelines); EFSA staff and WG decide whether a quantitative assessment is needed (if not, follow the procedure except for points related to modelling); EFSA staff and WG present, justify and discuss the proposed modelling approach with the Panel; EFSA staff and WG discuss and further develop the modelling approach according to the Panel's comments | Draft protocol proposed, with approach to respond to the ToRs, including the modelling approach, draft work plan, task distribution and action plan; proposed protocol with modelling approach commented on by the Panel |
| 7 Protocol check (Tollgate 1) | EFSA staff and WG inform the Panel and European Commission about the draft protocol; Panel and European Commission check whether the protocol sufficiently describes the methodologies, the scientific (mathematical/statistical/computer) models and the required expertise, data, and the scientific clarity and completeness needed to reply to all relevant questions of the mandate in relation to the scientific value agreed with the requestor | Agreed protocol, including the modelling approach to be followed; tollgate passing recorded in the case management tool |
| 8 Protocol approval | If a public consultation on the draft protocol was agreed during the mandate negotiation, the draft protocol is updated based on the comments received during the public consultation | Agreed protocol; tollgate passing recorded in the case management tool |
| 9 Meetings | No action for the Panel; taken care of by EFSA staff | Not applicable |
| 10 Preparation of first draft output | EFSA staff and WG collect data and expert opinion for the model, implement the model, discuss and revise the model report, and inform the Panel on progress of the model report; EFSA staff and WG demonstrate the model and its suitability (valid, representative, fit for purpose); EFSA staff and WG apply the model and communicate the model output | Applicable model and documentation and agreed draft model report; eventual feedback from the Panel on the model report and modelling follow-up; agreement on the application of the presented model in contributing to the response to the mandate; model output as basis for findings; discussion of uncertainties as basis for transparency |
| 11 Draft output integration (Tollgate 2) | EFSA staff, WG, Panel representatives and European Commission discuss the draft output, including model-based findings | Interpretation of findings (limitations, assumptions and uncertainties) agreed; version of the draft output presented to the Panel and needs for further improvements agreed |
| 12 Draft output finalisation (Tollgate 3) | EFSA staff and WG revise the draft output based on the feedback received from the Panel and European Commission; EFSA staff and WG decide whether the draft output is ready for adoption | Revised draft output for possible adoption |
| 13/14/15 Endorsement/adoption of scientific output | EFSA staff and WG chair present the model report and model outcome/derived findings to the Panel and European Commission; Panel adopts the scientific opinion based on the accepted model report/assessment and model | Adopted scientific opinion and accepted model report on modelling |
| 16 Editorial checks and corrections | No action for the Panel; taken care of by EFSA staff | Not applicable |
| 17/18/19 Publication of scientific output | No action for the Panel; taken care of by EFSA staff | Not applicable |
| 20 Correction of published scientific output | No action for the Panel; taken care of by EFSA staff | Not applicable |
Figure A.1: Representation of various combinations of model characteristics
Sample paths (lines) represent existing models from epidemiological or ecological studies. Horizontally, different forms of the same model feature are shown; overlap represents possible mixtures or hybrid models.
Table B.1 Terms and definitions for ToR 1 on standard terminology
**Accuracy:** Generally, in the modelling context, accuracy describes the degree of agreement between the observation and the model outcome, i.e. how close the model outcome is to the observed value (e.g. R2). In specific statistical models, accuracy of the model output in comparison to the observed data is often expressed as a summary value, and the objective in model fitting is to optimise this value. Depending on the particular summary value (e.g. Chi2, R2, AIC), the smaller or larger it is, the more accurate the model.

**Assumption:** Assumptions, or working hypotheses, are important components of many models. They can be defined as propositions taken for granted on which models may be based, and under which these models will give valid results. The validity of, and therefore the results from, those models partly depend on the plausibility of such propositions. Assumptions are often chosen as the most plausible, reliable or suitable conditions (but often without formal proof).

**Bayes' Theorem:** A theorem developed by Thomas Bayes that is the backbone of Bayesian inference and thus of a Bayesian framework. Bayes' Theorem is a simple mathematical formula used for calculating conditional probabilities. The Theorem relates the 'direct' probability of a hypothesis conditional on a given body of data, P(H|D), to the 'inverse' probability of the data conditional on the hypothesis, P(D|H).

**Bayesian framework:** Within a Bayesian framework, Bayes' Theorem is used as a method to combine new evidence or observations (data) with the prior (to data collection) probability of a certain condition or event into a new (posterior) probability for that condition/event. This is to be contrasted with the frequentist framework, in which the new probability of a condition or event is exclusively derived from the data (and the model used, with its inherent assumptions).

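As an illustration of how Bayes' Theorem combines a prior probability with new evidence, consider a minimal diagnostic-test sketch; the prevalence, sensitivity and specificity values below are invented for illustration and do not come from this opinion:

```python
def posterior_prob(prior, sensitivity, specificity):
    """Bayes' Theorem: P(infected | test positive).

    prior       -- prior probability of infection (e.g. herd prevalence)
    sensitivity -- P(test positive | infected)
    specificity -- P(test negative | not infected)
    """
    # Total probability of a positive test (true positives + false positives)
    p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_pos

# Hypothetical numbers: 5% prevalence, 90% sensitivity, 95% specificity
print(round(posterior_prob(0.05, 0.90, 0.95), 3))
```

Even a fairly accurate test yields a posterior well below 50% here, because the prior (prevalence) is low; this is exactly the prior-times-evidence update the definition describes.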
**Classification of risk:** The division of risk into classes according to specific criteria concerning both its probability of occurrence and its consequences. The classification will depend on the hazard, the risk assessment process as well as the risk management and communication needs.

**Closed-form solution:** Method of analysing a model resulting in a 'closed solution'. A closed-form solution solves a given model in terms of functions and mathematical operations.

**Compartmental model:** These models describe how individual units, such as animals or herds, move between defined compartments (states) of a system on the basis of transition probabilities. One basic assumption is that all entities in a compartment are in an identical status (homogeneous) with regard to the described dynamics.

**Compound distribution:** A secondary probability distribution specified by a first probability distribution in which one or more parameters that define this primary distribution are not fixed values but follow yet another (second) probability distribution. Compound distributions are sometimes used in stochastic models to describe specific probabilities (Oxford Dictionary of Statistical Terms).

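A compound distribution can be sketched by sampling in two stages: draw the uncertain parameter from its own distribution, then draw the observable quantity conditional on it. The Gamma-Poisson combination and all parameter values below are illustrative choices, not taken from this opinion:

```python
import math
import random

def poisson_sample(lam, rng):
    """Draw one Poisson variate via Knuth's multiplication method."""
    limit, k, prod = math.exp(-lam), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def compound_gamma_poisson(shape, scale, n, seed=1):
    """Sample n counts from a compound distribution:
    rate ~ Gamma(shape, scale), then count ~ Poisson(rate)."""
    rng = random.Random(seed)
    return [poisson_sample(rng.gammavariate(shape, scale), rng) for _ in range(n)]

counts = compound_gamma_poisson(shape=2.0, scale=3.0, n=5)
print(counts)
```

The resulting counts are more dispersed than a plain Poisson with a fixed rate, which is why such mixtures are used when the rate itself is uncertain or variable.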
**Conceptual model:** Descriptive representation of a system based on current knowledge as well as on assumptions about its components, their inter-relationships and system boundaries. Conceptual models are often depicted by visual methods (diagrams) that exhibit assumed causal relationships. They form the basis for further modelling approaches.

**Confidence interval:** A range of estimates, with a lower bound and an upper bound, statistically derived from a sample and designed to include (capture) an unknown (true) population parameter with a certain level of confidence.

**Continuous variable:** Quantitative or metric variable measured on a continuous scale. It may take on any value within a given interval, and the meaning of unity does not change along the interval. The interval (valid data range) may be finite or infinite.

**Covariate:** Explanatory variable likely to affect the outcome variable of a model, or the relationship between this outcome variable and other explanatory variables of primary interest.

**Data-driven model:** Quantitative model where the relationships between the factors are directly determined/estimated from observed data. A simple example of a data-driven model is a linear regression model: the coefficients of the regression equation are identified ('trained') based on the existing data.

**Decision tree model:** The model translation of a decision tree or risk pathway diagram. Usually applied as a unidirectional evaluation of a sequence of alternative (stochastic) events that contribute to the final outcome of the tree (end-point calculation).

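A minimal sketch of such an end-point calculation: the probability of the final outcome along one branch of a unidirectional pathway is the product of the conditional event probabilities at each step. The pathway and probabilities below are invented for illustration:

```python
def pathway_probability(step_probs):
    """End-point probability of one branch of a risk pathway:
    the product of the conditional probabilities of each sequential event."""
    p = 1.0
    for prob in step_probs:
        p *= prob
    return p

# Hypothetical import pathway:
# animal infected -> infection not detected -> contact causes exposure
steps = [0.02, 0.30, 0.50]
print(round(pathway_probability(steps), 6))
```

A full decision tree model sums such branch probabilities over all branches leading to the outcome of interest.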
**Deterministic model:** A model (or system) in which no random process is involved in the derivation of future states of the model. Deterministic models thus produce identical outputs (results) for a given unchanged set of input values (starting conditions). (Wikipedia)

**Discrete variable:** Quantitative or metric variable that takes on selected values (typically equally spaced) within an interval; the interval may be finite or infinite.

**Dose-response model:** A dose-response model describes the likelihood of a specified response resulting from exposure to a specified pathogen or hazard in a specified population, as a function of the dose. The result of such a model describes the change in response with changing levels of dose (exposure).

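One common functional form (a sketch, not the form prescribed by this opinion) is the exponential dose-response model, P(response) = 1 − exp(−r·dose); the r value below is purely illustrative:

```python
import math

def exponential_dose_response(dose, r):
    """Probability of response after exposure to a given dose, assuming each
    organism acts independently with per-organism response probability r
    (exponential dose-response model)."""
    return 1.0 - math.exp(-r * dose)

# Illustrative r; response probability rises monotonically with dose
for dose in (1, 10, 100, 1000):
    print(dose, round(exponential_dose_response(dose, r=0.005), 4))
```

The output shows the defining behaviour of the definition above: how the response probability changes with changing levels of exposure.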
**Estimation:** (Statistics) Calculation of the value of an unknown parameter based on observed data from a sample of individual units, using statistical functions and assumptions.

**Evidence:** Includes specific information that is used to demonstrate the truth of an assertion or to allow the estimation of a parameter.

**Expert opinion:** Information on a specific question or the value of a parameter provided by one or more experts based on their personal experience, opinion and (often) assumptions. Expert opinion is important in areas where data are needed but not readily available through other sources.

**Explanatory variable:** Variable which seeks to predict or explain the outcome variable (also known as independent variable, although explanatory variables may not be independent of one another).

**Exploratory data analysis (EDA):** Statistical techniques (mostly graphical), not based on prior assumptions about the data structure, describing the distribution of values within variables and subsequently exploring relevant relationships between factors or differences between population groups of interest. EDA is frequently used to identify potential research questions.

**Exposure assessment:** The quantitative and qualitative evaluation of the likelihood of hazards occurring in a given population as a result of exposure.

**Generic model:** Generalised format of existing models not yet adapted to a specific hazard (e.g. individual pathogen, disease, population, or a combination of these). Generic models incorporate standardised relation types, together with the entities or objects that may be related.

**Hazard characterisation:** The qualitative and/or quantitative evaluation of the nature of the adverse effects associated with the hazard.

**Hazard identification:** The identification of any factor, from birth to end of life, capable of causing adverse effects in a studied subject/population.

**Import risk assessment:** Formal risk assessment to evaluate the probability of importing a specific hazard into a defined (animal) population or (geographic) region (to be checked with other risk assessment glossaries).

**Individual-based model:** Model with individuals as the basic entity. Individuals differ in their status and exchange information with each other or with the environment (e.g. host animals, farms and free-roaming herds). In such models, the history of individually identified units (animals, people) is modelled and thus can be followed.

**Input parameter:** A factor/component in a model which is provided with a value/specification at the beginning of the calculation process (cf. output parameter).

**Intermediate parameter:** Intermediate output of a stepwise (iterative) model analysis that is necessary for the next analysis step but is not a model result.

**Knowledge-driven model:** Model where the system relationships, key parameters and their values are predominantly based on a synthesis of existing knowledge, including published and unpublished data sources as well as expert opinion, but not on sample-derived estimation (see 'data-driven model').

**Likelihood:** Probability. In statistics, often used in the context of estimation, e.g. the 'maximum likelihood estimator', being the estimator of a certain value or model component which gives the highest probability (likelihood) to the observed data given the applied model.

**Linear regression model:** A regression model assuming a linear functional relationship between outcome and explanatory variables, i.e. assuming that there is a linear (straight-line) relationship between those.

**Logistic regression model:** A regression model assuming a linear functional relationship between the logit (log odds) of an event probability, ln(p/(1−p)), as outcome variable and the explanatory variables.

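The logit link named in the definition, and its inverse mapping back to a probability, can be sketched as follows; the coefficients at the end are invented, not from any fitted model:

```python
import math

def logit(p):
    """Log odds of an event probability p (0 < p < 1): ln(p / (1 - p))."""
    return math.log(p / (1.0 - p))

def inv_logit(x):
    """Inverse logit (logistic function): maps any real x back into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# A linear predictor on the logit scale translates into a probability
# (illustrative intercept and slope):
intercept, slope, exposure = -2.0, 0.8, 1.0
print(round(inv_logit(intercept + slope * exposure), 3))
```

This is why logistic regression can model a linear relationship while the fitted probabilities remain between 0 and 1.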
**Mathematical model:** Model that is formulated (can be written down) in mathematical language.

**Meta-analysis:** A statistical analysis that combines the results of several independent studies that have addressed the same research question. As the combination may increase the statistical power of the estimation, results may be a more accurate reflection of the unknown property than those derived from a single study under one set of conditions.

**Metapopulation:** The term metapopulation originates from ecology. A metapopulation is a set of spatially separated (sub)populations of the same species that interact with each other, e.g. through the movement of individuals.

**Model:** A (simplifying) representation of the essentials (parameters, relations, processes or mechanisms) of an existing system (or a system to be constructed) which incorporates existing knowledge and/or assumptions about the relationship between all system components in an explicit form that can be investigated by systematic or manipulative experiments.

**Model input:** Any part of a model which is specified (e.g. by a value/distribution/functional relation/mechanistic rule) before model analysis (cf. model output).

**Model prediction:** A process where models, based on specific input, are used to forecast (predict) results for yet unobserved (unobservable, new or future) situations.

**Modelling techniques:** The methods used to construct, validate and analyse the model, including estimation techniques for the model analysis.

**Monte Carlo simulation:** Iterative technique applied in modelling (with Markov chain Monte Carlo, or MCMC, sampling as a common example) to estimate the range of possible output (i.e. a distribution) by repeatedly drawing random numbers from input (parameter) probability distributions. The technique is usually applied in stochastic models in which the exact parameterisation cannot be taken for granted (substantial uncertainty in input values).

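A minimal Monte Carlo sketch: the uncertain inputs are described by probability distributions, each iteration draws one value per input and propagates it through the model, and the collected outputs approximate the output distribution. The toy model, distributions and parameter values are all illustrative assumptions:

```python
import random
import statistics

def monte_carlo(n_iter=10_000, seed=42):
    """Propagate uncertain inputs through a toy model
    (expected infected animals = within-herd prevalence * herd size)."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_iter):
        prevalence = rng.betavariate(2, 18)     # uncertain within-herd prevalence
        herd_size = rng.randint(50, 500)        # uncertain herd size
        outputs.append(prevalence * herd_size)  # one draw of the model output
    return outputs

draws = monte_carlo()
print(round(statistics.median(draws), 1))
```

Summaries of the draws (median, percentiles) then describe the output distribution rather than a single point value.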
**Multivariable model:** A model in which several explanatory (predictor/risk factor) variables are assessed simultaneously for their relationship to a single outcome variable (a univariate model), thereby allowing control for confounding relationships between the explanatory variables.

**Multivariate model:** A model in which one (univariable) or several (multivariable) explanatory (predictor/risk factor) variables are assessed simultaneously for their relationship to two or more outcome variables; this relationship is often expressed in the form of matrices.

**Outcome variable:** The variable of primary importance in investigations, since the major objective is usually to study the effects of treatment and/or other explanatory variables on this variable and to provide suitable models for the relationship between it and the explanatory variables.

**Output value:** Qualitative or quantitative value of designated output parameters at the end of the model analysis.

**Output parameter:** Factor/component in a model for which the final value is derived or estimated during the calculation process (as a function of the model structure and the model input). Consistent use is only possible if the output structure is pre-specified and has itself 'parameters' to estimate/evaluate (see 'output value').

**Parameter:** Numerical characteristic of a model element, system or function. Parameters can take a range of values, from qualitative classes via single values to probability distributions, depending on their role in a model (input, intermediate or output parameter).

**Point estimate:** The single-valued result of the application of a point estimator to the data. In statistical models, this is often provided by the maximum likelihood estimation (MLE) of the (unknown) true population parameter. Point estimation is usually accompanied by its confidence interval, i.e. the calculation of an interval estimate from the same data.

**Population-based model:** A model that represents dynamic processes of a system at the level of population changes, i.e. proportions of populations or sub-populations that change their 'state'. From these models, population averages can be derived, but no individual's fate can be 'simulated' (see 'individual-based model').

**Prediction interval:** An interval estimate within which future observations will fall, with a certain probability (e.g. 95%), given what has already been observed. Prediction intervals are often used in regression analysis.

**Prediction:** See 'model prediction'.

**Probability distribution:** A model of the occurrence of possible values (probabilities) of a random variable. There are theoretical probability distributions with a defined shape (e.g. normal, exponential, binomial) and empirical distributions, reflecting raw data on occurrence, that have no defined shape.

**P-value:** The probability that a sample characteristic or model output (e.g. the difference between the means of two groups) might have been observed by chance, given that the null hypothesis (of no difference) is true in the population from which the sample was drawn. The p-value can range from 0 to 1. By specifying a threshold level of significance (often 0.05), sample characteristics (differences between means) are judged statistically significant ('not plausible by chance') if the p-value is smaller than the threshold.

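The p-value idea can be illustrated with a small permutation test: the observed difference between two group means is compared with the differences obtained when group labels are shuffled at random, which mimics the null hypothesis of no difference. The data below are invented:

```python
import random

def permutation_p_value(group_a, group_b, n_perm=10_000, seed=0):
    """Two-sided permutation test for a difference in means:
    fraction of label shuffles with a difference at least as extreme
    as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b
    n_a, extreme = len(group_a), 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            extreme += 1
    return extreme / n_perm

# Invented example data for two groups
p = permutation_p_value([3.1, 2.9, 3.4, 3.0], [2.1, 2.4, 2.0, 2.3])
print(p)
```

A small fraction of shuffles reaching the observed difference means the observed difference is 'not plausible by chance' in the sense of the definition above.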
**Qualitative assessment:** An assessment that generates an estimate of a categorical nature or one based on an ordinal scoring system. The outcome of such an assessment is a classification of output into descriptive categories.

**Quantitative assessment:** An assessment that generates an estimate of a numerical nature, directly tied to a measurement or calculation. Depending on the type of model tool used, an indication of the associated uncertainties, expressed either as extreme values, confidence intervals or prediction intervals, is needed.

**Regression model:** A mathematical model that describes the relationship between an outcome variable (y) and one or more explanatory (predictor/risk factor) variables (x1, x2, x3, ...) using a specific functional form of the relation (e.g. linear, logistic, exponential).

**Relative risk:** The comparison of risk estimates from two samples or risk scenarios by dividing the two risks, i.e. expressing one risk as a value relative to the other (often denoted as baseline) risk value. The possible value range is 0 to infinity, with a relative risk of 1 indicating that the two compared risks are identical.

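The calculation is simply the ratio of the two group risks; the counts below are invented for illustration:

```python
def relative_risk(cases_exposed, n_exposed, cases_baseline, n_baseline):
    """Risk in the exposed group divided by risk in the baseline group."""
    risk_exposed = cases_exposed / n_exposed
    risk_baseline = cases_baseline / n_baseline
    return risk_exposed / risk_baseline

# Invented counts: 30/200 diseased among exposed vs 10/200 among unexposed
print(round(relative_risk(30, 200, 10, 200), 6))
```

Here the exposed group's risk is three times the baseline risk, while identical group risks would give a relative risk of 1.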
**Risk:** Epidemiology: likelihood (probability) of a certain event (outcome) occurring in a cohort, where the event is usually considered 'negative'. Risk assessment: a function of the probability of an adverse health effect and the severity of that effect, consequential to a hazard. General: subjective summary for a hazard, covering its probability of occurrence and the severity of its consequences.

**Risk analysis:** A formal process consisting of three components: risk assessment, risk management and risk communication.

**Risk assessment:** A systematic approach to assess the effect of an exposure to a hazard/stressor. The approach formally includes hazard identification, hazard characterisation and consequence assessment. These steps are implemented in the risk assessment model, and respective procedural guidelines are available.

**Risk characterisation:** Element of a risk assessment, determining/describing the effect of a hazard qualitatively or quantitatively, including attendant uncertainties about the occurrence and severity of known or potential adverse effects in a given population.

**Risk factor:** A factor that influences the likelihood of the disease or health event occurring. Risk factors are often identified through epidemiological studies and related risk factor analyses (such as uni- and multivariable regression models).

**Risk matrix:** Diagram technique to prioritise risks according to frequency (alternatively, likelihood) and severity (alternatively, significance). For each risk, the severity is plotted on one axis and the frequency on the other axis.

**Risk map:** Geographical representation of spatial variation in risk.

**Risk pathway:** A (conceptual) representation that illustrates the sequential events considered to lead to the risk outcome. The risk pathway serves as guidance for data collection, logical deductions and any quantification required in the subsequent risk assessment, e.g. using decision tree models.

**Rule-based model:** Model that is constructed from simple and generic rules that reflect expert knowledge as closely as possible. The method is purposeful if limitations due to structural assumptions have to be avoided, and it usually results in more complex models.

**Scenario:** A certain combination of input parameter values that is used in a specific model run. When there is uncertainty about the value of a specific input parameter, a range is considered (selected), and (randomly) chosen representatives are tested in separate model runs ('scenarios'). Alternatively, different hypotheses about the modelled system (e.g. control options) might lead to specific parameterisations of the model, each reflecting a scenario.

**Scenario analysis:** Assessment of the model output depending on specified scenarios. Often the analysis includes at least a worst-case scenario, i.e. with values selected for important input parameters that are assumed to (all) be at the maximum likely negative (adverse, in the risk context) value.

**SEIR model:** Compartmental model incorporating four possible 'states' (compartments) in which subjects can be found: S = susceptible; E = latently infected but not (yet) infectious; I = infectious; R = recovered/immune.

**Semi-quantitative assessment:** Within a risk assessment, probabilities of an event are assessed and described textually on a scale from negligible, indicating that the probability of an event or the estimated risk cannot be differentiated from zero (and in practical terms can be ignored), to extremely high.

**Sensitivity analysis:** A method to qualify the output of a model by measuring the variation in model outputs resulting from changes in inputs. Through this, the 'sensitivity' of a model to the respective changes can be assessed, and work can be focused on those input parameters that have a substantial impact on the model output. Testing changes in model output caused by changing certain structural aspects of the model is usually referred to as robustness analysis.

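One simple form is a one-at-a-time sensitivity analysis: perturb each input in turn and record the change in the model output. The toy model and all values below are illustrative assumptions:

```python
def toy_model(params):
    """Toy risk model: expected introductions =
    imports * prevalence * (1 - detection probability)."""
    return params["imports"] * params["prevalence"] * (1 - params["detection"])

def one_at_a_time(model, baseline, rel_change=0.10):
    """Perturb each input by +10% (one at a time) and report
    the resulting change in model output versus the baseline run."""
    base_out = model(baseline)
    effects = {}
    for name in baseline:
        perturbed = dict(baseline)
        perturbed[name] = baseline[name] * (1 + rel_change)
        effects[name] = model(perturbed) - base_out
    return effects

baseline = {"imports": 1000, "prevalence": 0.02, "detection": 0.8}
for name, delta in one_at_a_time(toy_model, baseline).items():
    print(name, round(delta, 3))
```

Inputs producing the largest output changes are the ones deserving the most data-collection effort, which is the point of the definition above.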
**Simulation model:** A model that is evaluated via explicit (e.g. step-by-step) simulation of the implemented structural processes and their interactions. Simulation as a method of model analysis/model solution allows arbitrary complexity of the model.

**SIR model:** Compartmental model incorporating three possible 'states' (compartments) in which subjects can be found: S = susceptible; I = infectious; R = recovered/immune.

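A discrete-time deterministic SIR model can be sketched in a few lines; the transmission and recovery parameters below are illustrative, not from any fitted model:

```python
def sir_simulation(s0, i0, r0, beta, gamma, n_steps):
    """Discrete-time deterministic SIR model.

    beta  -- transmission parameter per time step
    gamma -- recovery rate per time step
    Units of the population move S -> I -> R; totals are conserved.
    """
    n = s0 + i0 + r0
    s, i, r = float(s0), float(i0), float(r0)
    history = [(s, i, r)]
    for _ in range(n_steps):
        new_infections = beta * s * i / n
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Illustrative parameters: 1 infectious animal in a herd of 1000
traj = sir_simulation(s0=999, i0=1, r0=0, beta=0.3, gamma=0.1, n_steps=200)
print(round(traj[-1][2], 1))  # recovered at the end of the run
```

Adding a latent (E) compartment between S and I would turn the same structure into the SEIR model defined above.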
**Spatial model:** Model that explicitly or implicitly incorporates the effect of spatial heterogeneity, i.e. spatial differences in population density, outcome-related (risk) factors or both.

**Statistical analysis:** Any method applied to explore, describe or model the information contained in a given set of data, in most instances samples derived from larger populations, in order to make inferences from that sample about specific (source) population parameters.

**Statistical significance level:** The a priori fixed (threshold) level of maximum error probability (alpha, type 1 error) that one accepts when concluding, based on the results of a statistical test, that the alternative hypothesis (inequality) is correct. Depending on the nature of the topic, maximum error probabilities of alpha = 0.05, 0.01 or even less are used in statistical hypothesis testing (cf. p-value).

**Stochastic model:** A model in which randomness is involved in the derivation of future states of the model. Stochastic models thus produce distributions as output, even for a given starting condition. Randomness might be incorporated via stochastic parameterisation, i.e. accounting for variability and uncertainty of event occurrence.

**Systematic literature review:** Conducting a literature review using predefined criteria for searching/selecting the relevant literature, with scientific tools to assess the findings from the published studies in a transparent and reproducible way.

**Transmission model:** Specific model in which pathways describing the transmission of (infectious) diseases/agents in populations are constructed, and values for the transmission probabilities along those pathways are either entered (to simulate disease spread) or estimated based on observed population data.

**Uncertainty:** Lack of knowledge of the exact value of a population parameter. Statistical methods derive estimates for that parameter, as well as the associated uncertainty, using fundamental concepts and theories of sampling, probability and randomness. In models, uncertainty can be incorporated via probability distributions, with information coming from either data or expert opinion.

**Uncertainty analysis:** The process of identifying and characterising uncertainty about questions of interest and/or quantities of interest in a scientific assessment.

**Univariable model:** A model in which a single explanatory (predictor/risk factor) variable is assessed for its relationship to one or more outcome variables.

**Univariate model:** A model in which one or more explanatory (predictor/risk factor) variables are assessed for their relationship to a single outcome variable.

**Validation:** The concept of checking the validity of the model formulation with regard to its intended purpose; ideally done with independently observed patterns. Checking correctness is the intended task of model verification.

**Variability:** True (inherent) biological, measurement or system-based variation in the possible values (value range) of a given parameter. In models, variability, similarly to uncertainty, can be incorporated as probability distributions, with information coming from either data (and classic statistics) or expert opinion.

**Verification:** The concept of checking the correctness of the model implementation; ideally done by measuring back any input pattern, code review, or implausible scenarios (e.g. assuming no effect of a proven treatment). Checking appropriateness for purpose and consistency with conceptual thinking is the intended task of model validation.

**Worst-case scenario:** A situation where everything that can go wrong does go wrong. Used in risk assessment to consider the worst predictable outcome by using extreme (risk-increasing) model inputs.