Christopher J. Lynch, Saikou Y. Diallo, Hamdi Kavak, Jose J. Padilla.
Abstract
Verification is a crucial process to facilitate the identification and removal of errors within simulations. This study explores semantic changes to the concept of simulation verification over the past six decades using a data-supported, automated content analysis approach. We collect and utilize a corpus of 4,047 peer-reviewed Modeling and Simulation (M&S) publications dealing with a wide range of studies of simulation verification from 1963 to 2015. We group the selected papers by decade of publication to provide insights and explore the corpus from four perspectives: (i) the positioning of prominent concepts across the corpus as a whole; (ii) a comparison of the prominence of verification, validation, and Verification and Validation (V&V) as separate concepts; (iii) the positioning of the concepts specifically associated with verification; and (iv) an evaluation of verification's defining characteristics within each decade. Our analysis reveals unique characterizations of verification in each decade. The insights gathered helped to identify and discuss three categories of verification challenges as avenues of future research, awareness, and understanding for researchers, students, and practitioners. These categories include conveying confidence and maintaining ease of use; techniques' coverage abilities for handling increasing simulation complexities; and new ways to provide error feedback to model users.
Year: 2020 PMID: 32401795 PMCID: PMC7219780 DOI: 10.1371/journal.pone.0232929
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Fig 1. Methodology for forming the Verification Corpus and conducting analyses.
Corpus constructed to conduct the content analysis study.
| Venue | Venue Type | Year Range Retrieved | Total Articles Obtained | Subset of Articles containing the term “Verification” |
|---|---|---|---|---|
| | Journal | 1996–2014 | 23* | 23 |
| | Journal | 2007–2015 | 45* | 45 |
| | Journal | 1998–2015 | 666 | 92 |
| | Journal | 2004–2015 | 276 | 79 |
| | Journal | 1963–1965 | 375 | 12 |
| | Journal | 1994–2015 | 317* | 317 |
| | Conference | 1968–1971, 1973–2014 | 8,933 | 1,776 |
| | Conference | 1997–2014 | 4,012 | 936 |
| | Conference | 1968–2014 | 77* | 77 |
| | Conference | 1976–2014 | 5,179 | 401 |
| | Conference | 1997–2008 | 557 | 88 |
| | Conference | 2006–2015 | 128* | 128 |
| | Conference | 2014–2015 | 317 | 73 |
| Total | | | 20,905 | 4,047 |
* Indicates that only freely available papers were obtained.
Verification Corpus organized by decade.
| Decade | Years Included | Number of Publication Venues Represented | Number of “Verification” Publications | All Publications |
|---|---|---|---|---|
| 1960s | 1964, 1965, 1968, 1969 | 3 | 27 | 500 |
| 1970s | 1970–1979 | 3 | 100 | 799 |
| 1980s | 1980–1989 | 3 | 253 | 1,584 |
| 1990s | 1990–1999 | 8 | 822 | 5,018 |
| 2000s | 2000–2009 | 11 | 1,724 | 8,353 |
| 2010s | 2010–2015 | 11 | 1,121 | 4,651 |
| Total | | | 4,047 | 20,905 |
Summary of the main goals, analysis method, and the component of the corpus used to provide the necessary data for Step 6 of the methodology.
| Goal–Gain Insight into: | Method | Corpus Component |
|---|---|---|
| Identify the evolution of the Verification Corpus’s concepts (Section 3.1) | Extract the Verification Corpus’s ranked concept list with each publication within the Corpus tagged by its decade of publication. Compare concept rankings across decades. | Verification Corpus |
| Identify prominences of verification, validation, and V&V over the decades (Section 3.2) | Compare the prominence values of the concepts verification, validation, and V&V across the decades. | Verification Corpus |
| Identify the evolution of verification’s concepts over the decades (Section 3.3) | Extract and analyze the ranked concept list for each decade’s content analysis conducted using only the publications within that decade. | Verification Corpus divided into six decades using “verification” as the comparison category |
| Identify the defining characteristics of verification each decade (Section 3.4) | Using the thesaurus generated from each decade’s content analysis, analyze the terms commonly associated with verification. | Verification Corpus Thesauri from each of the six decades using “verification” as the comparison category |
| Identify challenges and future directions for simulation verification research (Section 4) | Explore common themes within the concepts and definitions pertaining to verification over time to identify existing challenges. Then, conduct a literature review to reflect the state-of-the-art. | The evolution of concepts and defining characteristics identified from Sections 3.3 and 3.4 |
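The ranked-concept extraction described in the rows above can be approximated with a simple term-frequency proxy. This is an illustrative sketch only: the paper's prominence scores come from an automated content-analysis tool, and `ranked_concepts`, its whitespace tokenization, and the per-1,000-token scaling are assumptions for the example, not the authors' method.

```python
from collections import Counter

def ranked_concepts(documents, top_fraction=0.10):
    """Rank terms by a simple prominence proxy: occurrences per
    1,000 tokens across the corpus, then keep the top fraction.
    (Illustrative stand-in for an automated content-analysis tool.)"""
    counts = Counter()
    total_tokens = 0
    for doc in documents:
        tokens = doc.lower().split()  # assumed tokenization
        counts.update(tokens)
        total_tokens += len(tokens)
    ranked = [(term, 1000.0 * n / total_tokens)
              for term, n in counts.most_common()]
    cutoff = max(1, int(len(ranked) * top_fraction))
    return ranked[:cutoff]
```

Running the same extraction on per-decade subsets of the corpus, as in Sections 3.1 and 3.3, would yield one ranked list per decade for comparison.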
Verification Corpus’s top 10% most prominent concepts.
| 1960s Concept | Prominence | 1970s Concept | Prominence | 1980s Concept | Prominence |
|---|---|---|---|---|---|
| computer | 5.638 | computer | 2.640 | computer | 1.859 |
| program | 2.830 | program | 2.569 | program | 1.626 |
| group | 2.500 | total | 1.829 | input | 1.479 |
| function | 2.474 | distribution | 1.572 | output | 1.419 |
| output | 2.111 | simulated | 1.527 | production | 1.281 |
| social | 2.058 | rate | 1.448 | techniques | 1.214 |
| total | 1.838 | function | 1.439 | system | 1.193 |
| distribution | 1.773 | input | 1.386 | form | 1.185 |
| input | 1.718 | population | 1.358 | problem | 1.171 |
| value | 1.625 | output | 1.331 | structure | 1.166 |

| 1990s Concept | Prominence | 2000s Concept | Prominence | 2010s Concept | Prominence |
|---|---|---|---|---|---|
| object | 1.810 | HLAᵃ | 1.527 | social | 1.711 |
| HLAᵃ | 1.373 | V&V | 1.302 | population | 1.680 |
| interface | 1.329 | M&S | 1.262 | agent | 1.593 |
| program | 1.287 | architecture | 1.249 | policy | 1.452 |
| event | 1.231 | capabilities | 1.220 | power | 1.393 |
| execution | 1.222 | technology | 1.207 | average | 1.387 |
| verification | 1.182 | training | 1.197 | total | 1.354 |
| test | 1.182 | distributed | 1.193 | dynamics | 1.338 |
| computer | 1.173 | component | 1.179 | algorithm | 1.304 |
| software | 1.172 | effort | 1.176 | study | 1.298 |
ᵃ The High Level Architecture (HLA) is an IEEE Modeling and Simulation Interoperability Standard developed by the Defense Modeling and Simulation Office (DMSO) and adopted by NATO [70]. The HLA facilitates specifying and exchanging information when creating a simulation by federating simulations.
Correlations of each decade's concept prominence values. Color intensity indicates the strength of correlation (green is positive and red is negative).
| Decade | 1960s | 1970s | 1980s | 1990s | 2000s | 2010s |
|---|---|---|---|---|---|---|
| 1960s | - | 0.749 | 0.558 | 0.028 | -0.458 | -0.033 |
| 1970s | | - | 0.704 | 0.007 | -0.663 | 0.064 |
| 1980s | | | - | 0.225 | -0.633 | -0.124 |
| 1990s | | | | - | 0.196 | -0.855 |
| 2000s | | | | | - | -0.556 |
| 2010s | | | | | | - |
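Decade-to-decade values like those in the matrix above can be computed as Pearson correlation coefficients over prominence vectors. A minimal sketch, assuming the two decades' concepts have been aligned over a shared vocabulary with absent concepts scored 0 (the paper's exact alignment procedure is not reproduced here):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length
    prominence vectors (one entry per concept in a shared vocabulary)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A strongly negative value, such as the -0.855 between the 1990s and 2010s, indicates that concepts prominent in one decade tend to be among the least prominent in the other.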
Fig 2. Prominence values of verification, validation, and V&V from the 1960s to the 2010s.
Fig 3. Word clouds of concepts that frequently occur with verification in each decade, with word size indicating relative frequency within the corresponding decade.
Top 10% of prominent verification concepts obtained through independent content analyses.
| 1960s Concept | Prominence | 1970s Concept | Prominence | 1980s Concept | Prominence |
|---|---|---|---|---|---|
| solution | 8.449 | design | 6.072 | validation | 10.795 |
| results | 6.331 | results | 5.188 | programming | 2.980 |
| analog | 6.075 | program | 4.559 | development | 2.912 |
| method | 5.503 | test | 4.518 | computer | 2.834 |
| described | 5.043 | process | 4.497 | application | 2.700 |
| case | 4.950 | software | 4.364 | model | 2.634 |
| given | 4.455 | provide | 4.247 | process | 2.623 |
| logic | 3.819 | development | 3.995 | design | 2.551 |
| work | 3.819 | techniques | 3.869 | analysis | 2.549 |
| order | 3.457 | real | 3.240 | program | 2.424 |

| 1990s Concept | Prominence | 2000s Concept | Prominence | 2010s Concept | Prominence |
|---|---|---|---|---|---|
| validation | 20.908 | validation | 25.043 | validation | 17.138 |
| M&S | 2.933 | effort | 2.577 | M&S | 4.163 |
| test | 2.858 | model | 2.253 | training | 4.026 |
| model | 2.535 | development | 2.243 | requirements | 3.831 |
| development | 2.153 | test | 2.237 | program | 3.280 |
| implementation | 2.112 | M&S | 2.229 | development | 3.116 |
| results | 2.068 | results | 2.221 | test | 3.010 |
| requirements | 2.049 | requirements | 2.199 | tool | 2.486 |
| behavior | 2.018 | process | 2.186 | engineering | 2.384 |
| process | 1.983 | activities | 2.122 | support | 2.381 |