
Defining Cyber Security and Cyber Security Risk within a Multidisciplinary Context using Expert Elicitation.

Mariana G. Cains, Liberty Flora, Danica Taber, Zoe King, Diane S. Henshel.

Abstract

Given the multidisciplinary nature of cyber security and the pervasiveness of cyber security concerns throughout society, it is important to use standardized terminology and to develop a comprehensive common understanding of what is meant by cyber security and cyber security risk. Using expert elicitation methods, collaborating cyber researchers from multiple disciplines and two sectors (academia, government-military) were individually interviewed and asked to define cyber security and cyber security risk. Data-driven thematic analysis was used to identify the most salient themes within each definition, sector, and the cyber expert group as a whole, with results compared to current standards definitions. Network analysis was employed to visualize the interconnection of salient themes within and across sectors and disciplines. When examined as a whole group, "context-driven," "resilient system functionality," and "maintenance of CIA (confidentiality, integrity, availability)" were the most salient themes and influential network nodes for the definition of cyber security, while "impacts of CIA vulnerabilities," "probabilities of outcomes," and "context-driven" were the most salient themes for cyber security risk. We used this expert elicitation process to develop comprehensive definitions of cyber security (cybersecurity) and cyber security risk that encompass the contextual frameworks of all the disciplines represented in the collaboration and explicitly incorporate human factors as significant cyber security risk factors.
© 2021 The Authors. Risk Analysis published by Wiley Periodicals LLC on behalf of Society for Risk Analysis.

Keywords:  Cross-disciplinary; human factors; network analysis; system; thematic analysis

Year:  2021        PMID: 33586204      PMCID: PMC9543401          DOI: 10.1111/risa.13687

Source DB:  PubMed          Journal:  Risk Anal        ISSN: 0272-4332            Impact factor:   4.302


INTRODUCTION

Cyber security risk models have traditionally focused on machine‐based threat, deterrence, mitigation, and recovery. However, human factors contribute to many cyber risks via the creation and deployment of malicious‐acting software, increased attacker use of social engineering, and the lack of protective behaviors, such as password encryption or the use of antivirus software. Humans play a role in the creation, exacerbation, propagation, and mitigation of cyber security risk as users, defenders, and attackers (Henshel, Cains, Hoffman, & Kelley, 2015; Henshel, Sample, Cains, & Hoffman, 2016; King et al., 2018). In recent years, researchers have begun to include human factors such as maliciousness and expertise in cyber security risk models to provide additional insight into the human behaviors that induce or mitigate cyber security breaches (Bowen, Devarajan, & Stolfo, 2011; Cherdantseva et al., 2016; Mittu & Lawless, 2015; Oltramari, Henshel, Cains, & Hoffman, 2015). This human‐inclusive approach challenges the current machine‐focused definition of cyber security as it incorporates fields such as sociology, psychology, risk and decision science, and many others (Cebula, Popeck, & Young, 2014; NIST, 2018; Oltramari & Kott, 2018). However, the increasingly interdisciplinary nature of cyber security research also complicates discussion between stakeholders from diverse disciplines as they define key concepts such as cyber security and cyber security risk differently. This article examines the definitions of cyber security and cyber security risk, as defined by principal investigators, researchers, and practitioners who are a part of the Army Research Laboratory funded Cyber Security Collaborative Research Alliance (CSec CRA). 
The CSec CRA was first formed in 2013 with a mission to:

…develop a fundamental understanding of cyber phenomena, including aspects of human attackers, cyber defenders, and end users, so that fundamental laws, theories, and theoretically grounded and empirically validated models can be applied to a broad range of Army domains, applications, and environments. ARL [Army Research Laboratory] envisions the alliance bringing together government, industry and academia through this basic research program to develop and advance the state of the art of Cyber Security. (ARL, 2013)

The different lenses through which practitioners, researchers, and users interact with the notions of cyber security and cyber security risk influence their perception and conceptualization of either term (Craigen, Diakun-Thibault, & Purse, 2014). Therefore, as experts from the diverse fields of information science, computer science, risk and decision science, organizational management, and behavioral psychology converge to advance the science of cyber security in the face of ever more advanced and persistent threats, it is crucial for all parties to have a shared and cohesive definition of cyber security and cyber security risk. The background section of this article demonstrates the risk communication issues that result from the varying definitions and perceptions surrounding cyber security and cyber security risk. The background section also analyzes similar risk communication and risk perception issues in other interdisciplinary fields to understand how these problems have been identified and addressed. The methods and analysis sections detail the qualitative processes used to identify salient themes from individual cyber experts' definitions of cyber security and cyber security risk, gathered from semistructured interviews with both academic and U.S. Army Research Laboratory (ARL) participants in the CSec CRA.
This article includes five analyses: expert elicitation, thematic analysis, network analysis, a comparative content analysis between these expert elicitation definitions and prior analysis of cyber language in formal cyber ontologies, and a comparison of the derived expert elicitation definition to definitions from national and international standards and best practices. Each analysis presents both visual and tabular data representations. The final discussion section compares the results of these analyses, highlights the challenges associated with expert elicitation and thematic analysis, and concludes with the implications of this research. Using both sectoral and disciplinary subgroupings, the thematic analysis of the expert elicitation-based cyber security and cyber security risk definitions indicated that while there were clear inconsistencies between the subgroupings, the main commonality was the need to maintain confidentiality, integrity, and availability (CIA; cyber security) and reduce CIA vulnerabilities (cyber security risk). Both academia and ARL interviewees expressed that perfect cyber security is unattainable and that cyber security risk is exacerbated by a lack of understanding of human factors in cyber security. None of the reviewed cyber security risk definitions from standards-developing organizations included either human factors or the element of time control, the latter of which is becoming more critical as cyber systems control more and more time-dependent societal functions and processes. In contrast, only the Merriam–Webster definition for "security risk" acknowledges the human element.

BACKGROUND

Current Definitions in Cyber Security Research

The Merriam–Webster dictionary defines cyber security as "measures taken to protect a computer or computer system (as on the Internet) against unauthorized access or attack" (Merriam-Webster, 2020). The International Telecommunications Union (ITU) defines cyber security as "the collection of tools, policies, security concepts, security safeguards, guidelines, risk management approaches, actions, training, best practices, assurance and technologies that can be used to protect the cyber environment and organization and user's assets" within the cyber security foci of confidentiality, availability, and integrity (CIA) objectives (ITU, 2008). The Department of Homeland Security's National Initiative for Cybersecurity Careers and Studies glossary defines cybersecurity as an activity or process that protects and/or defends information and systems against damage, unauthorized use or modification, or exploitation (DHS, 2020). Although these definitions express the need to protect assets, they are hardware and software focused and fail to consider the human aspects of cyber security. Further, cyber security, and the risk associated with it, is shaped by a multidisciplinary and multistressor system. The physical environment and human/social interaction components are omitted from these definitions, which hinders the ability of cyber researchers and risk analysts to holistically assess the risk posed to systems, networks, and users in a cyber domain. This translates to ineffective communication between cyber security experts, whether they are academic researchers or applied practitioners. Cyber security strategies and perspectives also differ between nations. Some nations use a top–down approach, in which the purpose of cyber security is to protect against threats from cyberspace, while other nations use a bottom–up approach, in which security properties must be safeguarded and guaranteed (Luiijf, Besseling, & De Graaf, 2013).
As a response to these international differences, the Russian-US bilateral working group of the East West Institute and Lomonosov Moscow State University created a terminology framework in which they defined cyber security as "a property of cyber space that is an ability to resist intentional and unintentional threats and respond and recover" (ISI, 2014). However, this definition still focuses on hardware and software and fails to address the human aspects of cyber security. While some researchers have investigated trends within the current definitions and uses of risk within cyber security, few have presented formalized definitions of cyber security risk. For example, Oltramari and Kott (2018) suggest practitioners describe cyber risk in terms of the configuration of a system instead of the probability of harm occurring. Researchers have also investigated the process of identifying specific risks for various systems. A study on the risks to supervisory control and data acquisition (SCADA) systems defined risk management as "coordinated activities to direct and control an organization with regard to risk" and risk assessment as the "overall process of risk identification, risk analysis and risk evaluation" (Cherdantseva et al., 2016). There has also been some research on differences in cyber security risk perceptions. Quigley, Burns, and Stallard (2013) suggest there are discrepancies in the way cyber security risk is perceived among individuals in different sectors. Their study consisted of a rhetorical analysis of 10 writing samples to determine how cyber security-related topics are portrayed by "management gurus," a group consisting of journalists as well as professionals in academia and consulting fields. A rhetorical analysis is an evaluation of the relationship between the writer, message, and audience.
The results of this analysis demonstrated that management gurus enhance the perception of cyber security risk by using rhetorical devices (persuasion) and heuristics to create fear among readers and clients. The various definitions of cyber security and the lack of explicit definitions of cyber security risk demonstrate that these key terms are not standardized, although two research teams have recently presented integrated definitions (Craigen et al., 2014; Ramirez & Choucri, 2016). Since both terms can be used in a wide array of disciplines and applied to various systems, a universal or standardized definition of cyber security and cyber security risk may not be possible. However, it is still important to create standardized definitions of cyber security and cyber security risk in order to mitigate the problems that arise when members of diverse disciplines work together.

Multidisciplinary Effects on Communication

Technological, sociological, and organizational challenges associated with risk communication are due to the unfamiliar nature and severity of crises (Kellens, Terpstra, & De Maeyer, 2012; Slovic, Fischhoff, & Lichtenstein, 1982) and to the lack of established, mutually agreed upon terms, which diminishes the ability for information to be efficiently and effectively disseminated (Manoj & Baker, 2007). When faced with uncertainty, humans often use heuristics, such as emotions, to make decisions (Gigerenzer & Gaissmaier, 2011). These decisions are often justified using risk as feelings, rather than quantified risk analysis, which is based in logic and measurement (Quigley et al., 2013). Researchers suggest that humans are unable to interpret risk-relevant numbers if terms are not defined in a clear and conceptual manner (Cox, 2008; Gaissmaier & Gigerenzer, 2008). Cognitive biases also contribute to ineffective risk communication. For example, doctors and nurses often have differing cognitive biases (Blumenthal-Barby & Krieger, 2015; Sladek, Bond, & Phillips, 2010). These biases affect their ability to communicate, as they have different conceptions of patient care goals. Reader et al. (2007) found that the differing conceptions between doctors and nurses are rarely addressed, as nurses may report feeling embarrassed by their differences with senior doctors. These issues contribute to a lack of safety precautions and a decline in the quality of patient care. Cognitive biases can be exacerbated in multidisciplinary work, as different disciplines have different approaches to risk management (Zinn, 2006). It is therefore important to address these cognitive biases as components of the human factor, both for humans as risk assessors and for humans as contributors to or mitigators of cyber risk. Issues in risk communication associated with cognitive biases are worsened by a lack of standardized vocabulary (Manoj & Baker, 2007).
The importance of standardized terminology has been demonstrated across disciplines and in cross-disciplinary work. For example, a lack of standardized vocabulary contributed to reduced innovation in fish reproduction studies because scientists and resource managers could not accurately compare their work due to differences in the terms used (Bowen et al., 2011). Standardization of vocabulary is commonly established by creating a formalized, systematic nomenclature that facilitates communication among stakeholders from various disciplines (Ramirez & Choucri, 2016). Ramirez and Choucri (2016) argue that a standardized cyber security vocabulary starts with increased research efforts focused on identifying trends in terminology standards. Ramirez further suggests there are four subdisciplines of cyber security: public policy, computer science, management, and social science (Ramirez, 2017). In order to facilitate cyber security communication, Ramirez recommends professionals (such as the ARL and academic counterparts in the CSec CRA) initiate change using technical language that is compatible across disciplines. The presented research used expert elicitation, data-driven thematic analysis, network analysis, and content analysis to identify how ARL and academic cyber experts, all members of a multidisciplinary collaborative effort, individually conceptualize cyber security and cyber security risk given their respective disciplines and sectors.

EXPERT ELICITATION AND ANALYSIS

The research methods consisted of four components: expert elicitation through semistructured interviews, data-driven thematic analysis, network analysis, and a content analysis comparison. This study utilized the knowledge of a group of self-selecting cyber experts who were members of the CSec CRA at the time of the interviews. The data collected during the expert elicitation were systematically analyzed using data-driven thematic analysis to identify patterns and salient themes within interviewees' definitions of cyber security and cyber security risk. The salient themes were examined first with the ARL- and academia-derived themes as a single group and then as two subgroups. Network analysis was used to understand the relationships between the identified themes across the two sectors (i.e., academia and government–military) and the interviewee disciplines. Content analysis was used to compare the interviewees' definitions (not the salient themes) to the vocabulary used in cyber ontologies (Oltramari & Kott, 2018).

Expert Elicitation Protocol

Outreach

Forty-four (22 ARL, 22 academia) CSec CRA principal investigators, researchers, and professionals were contacted via email asking for their participation in a semistructured interview to collect cyber security risk assessment goals, risk factors, and risk metrics. All research institutes associated with the CSec CRA (at the time of interviews) were represented among the 27 self-selected interview participants: 10 from ARL and 17 from academia. Participants from both ARL and academia self-identified as having multiple sub-areas of expertise, including security informatics, intrusion detection, social and decision sciences, and risk assessment (see Table I).
Table I

Disciplines and Research Areas of Interviewed Cyber Security Collaborative Research Alliance (CSec CRA) Participants

Behavioral Psychology; Cognitive Neuroscience; Computer Engineering; Computer Science; Computer Security; Human Factors Engineering; Network Science and Engineering; Risk Assessment; Security Informatics; Security Monitoring and Intrusion Detection; Social and Decision Sciences; Software Engineering; Systems and Network Security; Tactical Mobile and Strategic Networks; Wireless Systems and Networks

Semistructured Interviews

The semistructured interview consisted of the interviewer, a scribe, and the interviewee. The same interviewer and scribe were present for all interviews. At the start of the interview, the interviewer described the need for the interview and how it fit into the overall risk assessment process. The first step in any risk assessment is problem formulation and the identification of available information; that is, what question(s) is the cyber security risk assessment trying to answer, and what information is available to answer those questions? The semistructured interview consisted of questions about the tasks and management goals necessary to achieve progress in each interviewee's respective research area and the types of risks posed to each research area. Interviewees were also asked for their definitions of cyber security and cyber security risk. Each semistructured interview lasted approximately 30–60 minutes. The interviews of academic participants were recorded for transcription purposes. However, audio recording was not permitted in Army Research Laboratory buildings, so those interviews were transcribed in situ by the scribe.

Data Analysis and Discussion

Thematic Analysis

Thematic analysis is commonly used in qualitative research to identify overarching patterns, or themes, that are expressed both implicitly and explicitly across qualitative data sets such as narratives and interviews (Braun & Clarke, 2006). The data-driven method of thematic analysis draws on the practice of developing theory from trends arising from the data via systematic investigation, while the theory-driven method examines the data for predetermined trends or theories (Clarke & Braun, 2014). Data-driven thematic analysis was performed on the transcripts of interviewees' responses to the questions "What is your definition of cyber security?" and "What is your definition of cyber security risk?" to identify and organize similar concepts into representative themes. All 27 interview participants were asked the same "definitions" questions; however, two individuals' responses were removed from the "cyber security" thematic analysis due to (1) stated uncertainty about how to answer the question and (2) a demonstrated lack of understanding of the prompt due to variable English proficiency. All 27 interview participants' responses for "cyber security risk" were appropriate for thematic analysis. The cyber security and cyber security risk definitions in the interview transcripts were analyzed using data-driven thematic analysis; Table II is an example of this process.
Table II

Excerpt of Coding Progression via Thematic Analysis of Responses to “What is your Definition of Cyber Security Risk?”

Interviewee | Definition Extract | Code | 1° Theme | 2° Theme | 3° Theme
A | "To me, it is the humans that are the biggest risk." | humans are the biggest risk | Bad user or attacker | Unknowns of human behavior | Uncertainties introduced by human factors
A | "…humans that are the biggest risk." | humans are the biggest risk | Humans are a source of risk | |
B | "…humans will be hard such as insider threat." | humans are hard to quantify | Unknowns of human behavior | |
C | "…attackers don't follow rules in reality…" | attackers don't follow rules | Attackers can be unpredictable | |
B | "Risk in terms of machine may be easier to quantify." | machine risk is easier to quantify than human | Machines are easier than humans | Human complexity |
Using the data-driven thematic analysis methodology detailed by Braun and Clarke (2006), the entirety of each interview (all questions and answers) was independently read and coded by three researchers (doctorate, master's, and undergraduate level). The researchers then coded the responses to the "definitions" questions to identify the core concepts of each answer (e.g., Table II: "Code" column). Once the responses were individually read and coded, the three researchers (with input from other authors of this article) gathered as a group to refine the codes for the "definitions" answers into first-order themes (e.g., Table II: "1° Theme" column). Themes "[reflect] a pattern of shared meaning, organized around a core concept or idea, a central organizing concept" (Braun, Clarke, Hayfield, & Terry, 2018). The first-order themes were then compared to the interviewees' original "definitions" answers to ensure each theme was representative of the response. The researchers then individually collated the first-order themes into second-order themes and regrouped to (1) compare the individually identified second-order themes and (2) come to a consensus on how the first-order themes were refined into second-order themes (Table II: "2° Theme" column). This process was repeated a third time, resulting in third-order themes (Table II: "3° Theme" column). When read left to right, Table II exemplifies the sequential order of the response-code-theme refining process of thematic analysis and illustrates how more than one theme can be identified in a single definition. Table II presents verbatim excerpts from three interviewees (marked A, B, and C). The two entries for interviewee A demonstrate how thematic analysis researchers can identify different themes, in this case first-order themes, from the same definition extract and code.
The two entries for interviewee B illustrate how multiple codes and themes can be identified in a single definition. Through the deliberation process described in the previous paragraph, and in detail by Braun and Clarke (2006) and Braun et al. (2018), the second-order theme and third-order theme columns show how the identified first-order and second-order themes were collated, respectively, into more inclusive (i.e., higher order) themes.
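The collation step described above can be sketched as a simple data transformation: raw codes are grouped under their agreed higher-order theme so each refined theme can be checked back against the original extracts. The rows and the consensus mapping below are illustrative stand-ins loosely based on Table II, not the study's actual coding scheme.

```python
from collections import defaultdict

# Illustrative coding-table rows (codes and first-order themes loosely
# based on Table II; hypothetical, not the study's complete data).
rows = [
    {"interviewee": "A", "code": "humans are the biggest risk",
     "theme1": "Humans are a source of risk"},
    {"interviewee": "B", "code": "humans are hard to quantify",
     "theme1": "Unknowns of human behavior"},
    {"interviewee": "C", "code": "attackers don't follow rules",
     "theme1": "Attackers can be unpredictable"},
]

# Hypothetical consensus mapping from first-order to second-order themes,
# standing in for the group deliberation step.
second_order = {
    "Humans are a source of risk": "Unknowns of human behavior",
    "Unknowns of human behavior": "Unknowns of human behavior",
    "Attackers can be unpredictable": "Unknowns of human behavior",
}

# Collate: group original codes under their refined second-order theme so
# the higher-order theme can be verified against the raw responses.
collated = defaultdict(list)
for row in rows:
    collated[second_order[row["theme1"]]].append(row["code"])
```

Repeating the same grouping over the second-order themes would yield the third-order refinement, mirroring the left-to-right progression of Table II.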

Thematic analysis results

The theme refining process in thematic analysis is visualized in the thematic butterfly diagrams below (Figs. 1 and 2). The systematic consolidation of similar ideas (i.e., patterns of thought) into representative themes is illustrated by the grouping of several themes into a unifying theme (moving from the edges of the diagram inward). The research presented is a comparison of the thematic butterfly diagrams for CSec CRA academics and ARL researchers and practitioners. The results of the refined thematic butterfly diagrams are useful in distilling and relating commonly held perceptions of cyber security and cyber security risk across diverse issues and stakeholder groups.
Fig 1

Thematic butterfly map of first, second, and third‐order themes from thematic analysis of the answers to “What is your definition of cyber security?” Green themes were expressed by both sectors (U.S. Army Research Laboratory [ARL] and academia), blue themes were expressed by only one sector, and yellow themes represent differing first or second‐order themes that consolidate into the same second and third‐order themes, respectively. Image best viewed in color and enlarged via online journal.

Fig 2

Thematic map of first, second, and third‐order themes from thematic analysis of the answers to “What is your definition of cyber security risk?” Green themes were expressed by both sectors (U.S. Army Research Laboratory [ARL] and academia), blue themes were expressed by only one sector, and yellow themes represent differing first or second‐order themes that consolidate into the same second and third‐order themes, respectively. Image best viewed in color and enlarged via online journal.

The thematic butterfly diagrams provide a comparison of the first-, second-, and third-order cyber security and cyber security risk themes identified from interviews across the disciplines of ARL practice and academic research. The outermost themes (far left for ARL and far right for academia) are first-order themes. Moving toward the spine of the thematic butterfly diagram, the themes are refined from first-order to second-order to third-order themes. A handful of second-order themes also serve as third-order themes, as some themes could not be consolidated with other themes without losing the context of the answered definition. Green themes were identified in both sectors (ARL and academia), blue themes were expressed by only one sector, and yellow themes represent differing first or second-order themes that were refined into the same second and third-order themes, respectively.
Eighty-two first-order "cyber security" themes were identified across the 25 cyber security definitions provided by the CSec CRA expert elicitation participants. The 82 first-order themes were refined into 33 second-order themes, which were further refined into 19 third-order themes (Fig. 1; Tables SI–SIII). Eighty first-order "cyber security risk" themes were identified across the 27 cyber security risk definitions provided by the CSec CRA expert elicitation participants. The 80 first-order themes were refined into 36 second-order themes, which were further refined into 26 third-order themes (Fig. 2; Tables SIV–SVI). To build the composite cyber security and cyber security risk definitions, the identified third-order themes were reexamined within the context of the interviewees' stated definitions from the interviews and the greater context of the CSec CRA mission. The themes were then pieced together into coherent, contextually sensible definitions for cyber security and cyber security risk. Using the 19 third-order "cyber security" themes (Fig. 1; Table SIII), the interviewees' composite definition of cyber security … . Fig. 3 illustrates the frequency of the top five third-order themes inferred from the interviewees' answers to the question: "What is your definition of cyber security?" A full list of the first-, second-, and third-order themes and their frequencies is provided in the Supporting Information Tables SI–SIII.
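Theme frequencies like those reported in Figs. 3 and 4 amount to a tally over per-definition theme lists. The sketch below shows that tally with hypothetical theme assignments (the lists are placeholders, not the study's actual data; see Tables SI–SVI for the real themes).

```python
from collections import Counter

# Hypothetical third-order theme assignments, one list per interviewee
# definition (illustrative only).
themes_per_definition = [
    ["Context-driven", "Maintenance of CIA"],
    ["Resilient system functionality", "Context-driven"],
    ["Maintenance of CIA", "Threat prediction and prevention"],
    ["Context-driven"],
]

# Count how many definitions expressed each theme, then rank them.
freq = Counter(theme for themes in themes_per_definition for theme in themes)
top_themes = freq.most_common(5)
```

Ties for the final rank, as occurred for the fifth-place cyber security risk themes (Fig. 4), would appear here as themes with equal counts.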
Fig 3

Top five third‐order themes of cyber security, identified and refined from expert elicitation using data‐driven thematic analysis.

Using the 26 third-order "cyber security risk" themes (Fig. 2; Table SVI), the interviewees' composite definition of cyber security risk … . Fig. 4 illustrates the frequency of the top five third-order themes inferred from the interviewees' answers to the question: "What is your definition of cyber security risk?" A full list of the first-, second-, and third-order themes and their frequencies is provided in the Supporting Information Tables SIV–SVI.
Fig 4

Top five third‐order cyber security risk themes, identified and refined from expert elicitation using data‐driven thematic analysis. Three third‐order themes tied for fifth: “Vulnerabilities (known and unknown),” “Negative consequences,” and “Absolute and relative resource valuation.”


Thematic analysis discussion

The thematic butterfly diagrams (Figs. 1 and 2) visualize the theme refining process across the ARL and academic subgroups of the larger CSec CRA participant pool. All of the respective third-order themes were used to build the composite definitions for cyber security and cyber security risk. The cyber security themes indicate that experts in both sectors recognize that cyber security is context-driven (e.g., what is being secured for whom at what cost) and requires resilient system functionality. System functionality can be preserved through the maintenance of CIA (the traditional cyber security vulnerability triad), threat prediction and prevention, and the protection of resources. The themes identified and refined from the academic expert elicitations suggest defenders are able to mitigate risks associated with system functionality; the themes derived from the ARL expert elicitations suggest cyber security is dependent on the human factor as an attacker, but the defender is not specifically addressed as a risk mitigator. Further, the academics view cyber security as a multidimensional, complex framework. ARL experts, on the other hand, focus on specific resources that need to be protected to ensure a cyber secure environment. Neither sector, ARL nor academia, considers perfect cyber security attainable; however, the academia definitions implied that cyber security is complicated (Fig. 1), while the ARL definitions implied that cyber security risk is complicated (Fig. 2). The thematic analysis suggests that although both ARL and academia experts consider the scope and context of cyber security risk, they approach cyber security risk in different manners. Both sectors think that cyber security risk is exacerbated by a lack of understanding of the effects of human factors on cyber security.
The ARL interviewees expressed that the majority of cyber security risk is captured within the traditional information vulnerability triad of CIA, while academics expressed that the CIA triad does not comprise all risk factors (e.g., human factors, timing). When analyzing and discussing risks, computer scientists focus on the CIA attributes (Confidentiality, or restricting information access to authorized personnel; Integrity, or maintaining the trustworthiness of data; and Availability, or ensuring that authorized personnel can access information when needed) as the sole indicators of risk (Von Solms & Van Niekerk, 2013). Oltramari et al. (2015) suggest a holistic cyber security risk model incorporates variables other than solely CIA, specifically time and humans as crucial factors in evaluating risk to a system, network, or user. Academics attempt to list the multitude of external factors that influence a problem, address the uncertainty, and then attempt to identify the source of uncertainty. ARL experts generalize the uncertainty and focus on agility as a means to overcome risk. Both groups either explicitly stated or implied that cyber security risk is exacerbated by a lack of understanding of humans as risk contributors or mitigators. The academia definitions included the impact on humans, whereas negative impacts on humans were not explicitly specified, and thus could not be inferred, in the ARL analysis. The academia perspective attempts to understand cyber security risk as it relates to cyber security and human factors, both as defenders and attackers. This holistic view corresponds to the cyber security analysis, in which the themes inferred from academic expert language revolve around the characterization of cyber security as a complex framework.

Network Analysis

Network analysis and visualization examine the relationships (e.g., coexistence) between entities (e.g., interviewees' definition themes). Network analysis was performed to understand the relationships between the definition‐derived third‐order themes, analyzed by interviewee sector and discipline. In this setting, the visualizations that result from network analysis of the third‐order themes can answer questions such as: "Which third‐order theme was used most in conjunction with other third‐order themes?"; "Which third‐order theme is isolated from the corpus of third‐order themes, that is, a theme that represents only a solitary definition?"; or "Which third‐order themes were identified across all disciplines?" Network analysis was performed using Gephi (Bastian, Heymann, & Jacomy, 2009) on each of the third‐order themes for cyber security and cyber security risk, comparing across academic and ARL interviewees' individual definitions (see the section on sector‐based network analysis) and across the interviewee corpus by discipline (see the section on discipline‐based network analysis).
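The network construction described above, in which two themes are linked whenever they appear in the same definition, can be sketched in a few lines of Python. This is a hedged illustration, not the study's workflow (the networks were built in Gephi); the interviewee IDs and theme sets below are hypothetical stand-ins for the real data in Table III and Figs. 6, 7, and 8.

```python
from collections import Counter
from itertools import combinations

# Hypothetical stand-in data: each interviewee definition maps to the set of
# third-order themes identified in it by the thematic analysis.
definitions = {
    "academia_01": {"Context-driven", "Maintenance of CIA"},
    "academia_02": {"Context-driven", "Resilient system functionality"},
    "arl_01": {"Context-driven", "Systemic solutions"},
}

def cooccurrence_edges(defs):
    """Count, for each unordered theme pair, the definitions containing both."""
    edges = Counter()
    for themes in defs.values():
        for pair in combinations(sorted(themes), 2):
            edges[pair] += 1
    return edges

edges = cooccurrence_edges(definitions)
```

In this toy example "Context-driven" pairs with every other theme, mirroring the high degree of that theme reported in the sector-based analysis below.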

Sector‐based network analysis

Sector‐based network analysis was used to explore the relationships between the third‐order themes within any given interviewee definition. Given the complex nature of the resulting networks, Fig. 5 is presented as a simplified network to facilitate interpretation of Figs. 6, 7, and 8.
Fig 5

Simplified sector‐based network.

Fig 6

Parent networks of cyber security (left network) and cyber security risk (right network) third‐order themes identified from the interview corpus using thematic analysis. Nodes (i.e., circles) are the third‐order themes, and edges (i.e., connecting lines) signify interviewee definitions from which both connected themes were identified. Nodes of the same color are more densely connected to each other than to other nodes in the network, that is, a community of nodes. The edge color denotes the interviewee's sector, Academia (gold) or U.S. Army Research Laboratory (ARL; blue). Image best viewed in color and enlarged via online journal.

Fig 7

Sector‐parsed networks of cyber security third‐order themes identified from the interview corpus using thematic analysis. Nodes (i.e., circles) are the third‐order themes, and edges (i.e., connecting lines) signify interviewee definitions from which both connected themes were identified. Nodes of the same color are more densely connected to each other than to other nodes in the network, that is, a community of nodes. The edge color denotes the interviewee's sector, Academia (gold; left network) or U.S. Army Research Laboratory (ARL; blue; right network). Image best viewed in color and enlarged via online journal.

Fig 8

Sector‐parsed networks of cyber security risk third‐order themes identified from the interview corpus using thematic analysis. Nodes (i.e., circles) are the third‐order themes, and edges (i.e., connecting lines) signify interviewee definitions from which both connected themes were identified. Nodes of the same color are more densely connected to each other than to other nodes in the network, that is, a community of nodes. The edge color denotes the interviewee's sector, Academia (gold; left network) or U.S. Army Research Laboratory (ARL; blue; right network). Image best viewed in color and enlarged via online journal.

Interpreting sector‐based networks: The nodes (i.e., circles) represent the third‐order themes derived from interviewees' respective definitions, and the size of a node corresponds to its number of degrees (i.e., connections) to other nodes. The color of the nodes represents the modularity of communities within the constructed network; that is, nodes of the same color within a network are more densely connected to each other than to other nodes in the network (Blondel, Guillaume, Lambiotte, & Lefebvre, 2008). Within Figs. 5, 6, 7, and 8, the node colors signify communities of themes frequently expressed together in the same definition. For example, within Fig. 5, themes A, C, and D are all green because they were expressed together in the same or multiple definitions, while B (purple) was expressed only with A in one definition, and E (gray) is an isolated node representing a definition with one theme. An isolated node has no connections to any other node, versus a network component that is connected by at least one path of edges (e.g., Fig. 5: B‐A‐C‐D‐A). An edge (i.e., connecting line) between two nodes (i.e., themes) illustrates the pairing of those two themes within a single interviewee definition. For example, themes A and B were found in the same definition, and themes C and D were found in the same definition. The color of an edge indicates the interviewee's sector (gold for academia and blue for ARL); for example, themes A and B were found in one academia definition, themes C and D in a second academia definition, and themes A, C, and D in either one (C‐A‐D) or two ARL definitions (A‐C, A‐D). Constructed networks: The third‐order theme and interviewee sector networks were constructed using the following Gephi layout protocol: Force Atlas > No Overlap > Label Adjust > Expansion.
The sector‐based network analysis resulted in two undirected parent networks, one for cyber security third‐order themes and one for cyber security risk third‐order themes (Fig. 6). Each parent network was further visualized as two sector‐parsed child networks, one of academic third‐order themes and one of ARL third‐order themes (Figs. 7 and 8). Fig. 6 contains the network structure of the cyber security (left network) and cyber security risk (right network) third‐order themes identified from interviewed cyber experts' definitions. Fig. 7 contains the network structure of cyber security third‐order themes parsed by interviewee sector, academia (left network) and ARL (right network); Fig. 8 contains the corresponding structure for cyber security risk third‐order themes. The emergent structure and interconnectedness of the sector‐based networks (Figs. 6, 7, and 8) were evaluated for the following network properties: node and edge count, connected components, graph density, influential nodes, isolated nodes, modularity, and hanging edges. Connected components: Each of the networks (parent: Fig. 6; sector‐parsed: Figs. 7 and 8) has one main cohesive network structure (i.e., component), with the exception of two isolated nodes in the ARL cyber security risk network (Fig. 8, right network). A single cohesive network means that all of the interviewees' definitions share common themes with those of cyber experts within their sector and across the two sectors. Several fragmented networks would be indicative of wholly dissimilar definitions.
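The connected-component and isolated-node checks described above can be illustrated with a small graph traversal. This is a minimal sketch (the study used Gephi); the node and edge lists below form a hypothetical mini-network echoing the shape of the ARL cyber security risk network, one main component plus two isolated theme nodes.

```python
from collections import defaultdict

def components(nodes, edges):
    """Return the connected components of an undirected graph."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, comps = set(), []
    for start in nodes:
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            n = stack.pop()
            if n not in comp:
                comp.add(n)
                stack.extend(adj[n] - comp)  # visit unexplored neighbors
        seen |= comp
        comps.append(comp)
    return comps

# Hypothetical mini-network: three linked themes plus two isolates,
# as in Fig. 8 (right network).
nodes = ["A", "B", "C", "Scope of risk perception",
         "Absolute and relative resource valuation"]
edges = [("A", "B"), ("B", "C")]
comps = components(nodes, edges)
isolated = sorted(n for c in comps if len(c) == 1 for n in c)
```

A fragmented result (many components) would signal wholly dissimilar definitions, whereas a single dominant component indicates shared themes across interviewees.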
Node and edge count: The academic and ARL participants' cyber security and cyber security risk definitions produced approximately the same number of themes (i.e., nodes), 18 versus 15 and 22 versus 17, respectively (Table III). The number of edges in the academic cyber security network is 94, more than twice that of the ARL cyber security network with 42 edges. Similarly, the number of edges in the academic cyber security risk network is 68, nearly twice that of the ARL cyber security risk network with 35 edges. The academic subgroup tended to explore the multitude of implications more than the ARL researchers, as illustrated by the larger number of identified themes and by fewer academic definitions converging on unifying themes. Fewer themes were identified within any given ARL definition than the multiple themes identified within a majority of the academic definitions. The pontification of the academics compared to the directness of the ARL participants reflects the basic research perspective of academics and the generalizing and operationalizing objectives of the ARL. Additionally, the ARL subgroup themes are more management focused, with a common understanding of the mission within the ARL and an emphasis on the crucial components they believe are required to attain cyber security. Some may argue this is a more practical approach, but all factors must be considered in order to holistically quantify cyber risk.
Table III

Minimum and Maximum Number of Degrees, with the Respective Node (i.e., Third‐Order Theme), for Each Parent Network and Sector‐Parsed Network. The Degree is the Number of Edges (i.e., Connections) to Other Nodes

Network | Number of Comm. | Total N | Total E | GD | Min D | Theme(s) | Max D | Theme
Cyber Security
Corpus | 3 | 19 | 136 | 0.795 | 2 | Systemic solutions | 43 | Context‐driven
Academia | 2 | 18 | 94 | 0.614 | 1 | Verifiable information provenance | 27 | Context‐driven
ARL | 3 | 15 | 42 | 0.4 | 2 | Accurate intrusion detection; Comprehensive system awareness; Resource management; Systemic solutions | 16 | Context‐driven
Cyber Security Risk
Corpus | 3 | 26 | 103 | 0.317 | 1 | Goal dependent | 28 | Impacts of CIA vulnerabilities
Academia | 3 | 22 | 68 | 0.294 | 1 | Goal dependent; Multiple realms of threats; Sociotechnical exploitation | 23 | Impacts of CIA vulnerabilities
ARL | 6 | 17 | 35 | 0.435 | 0 | Absolute and relative resource valuation; Scope of risk perception | 9 | Probability of outcomes

Comm. = communities, N = nodes, E = edges, GD = graph density, D = degrees.

Graph density: Graph density measures, on a scale from 0 to 1, the completeness of a network (Bastian, 2015). If every node were connected to every other node in the network, the graph density would be 1; the less connected (i.e., complete) a network, the closer its graph density is to 0. If a third‐order theme network had a graph density of 1, every definition would have contained all of the identified third‐order themes; if it had a graph density of 0, each definition would have contained a single third‐order theme unique to that definition, that is, no shared understanding. The corpus network for cyber security has the higher graph density of 0.795, while the corpus network for cyber security risk has a graph density of only 0.317 (Table III). The definitions provided for cyber security thus shared more third‐order themes across the interviewees than the cyber security risk definitions. The graph density of the cyber security sector‐parsed networks is 0.4 for ARL and 0.614 for academia; for the cyber security risk sector‐parsed networks, it is 0.435 for ARL and 0.294 for academia. The academia definitions of cyber security shared more third‐order themes than ARL's, while the ARL definitions of cyber security risk shared more third‐order themes than academia's. Influential nodes: Table III details the most and least influential nodes in each network. The most influential node is the node connected to the most other nodes, that is, the third‐order theme that was identified within the most definitions from which more than one third‐order theme was identified.
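The graph density values above can be checked directly: for a simple undirected graph, density is the ratio of actual to possible edges, 2E / (N(N − 1)). A minimal sketch against the corpus rows of Table III (the function name is illustrative):

```python
def graph_density(n_nodes, n_edges):
    """Density of a simple undirected graph: actual edges over possible edges."""
    return 2 * n_edges / (n_nodes * (n_nodes - 1))

# Corpus rows of Table III: N = nodes, E = edges
cs_corpus = round(graph_density(19, 136), 3)   # cyber security corpus -> 0.795
csr_corpus = round(graph_density(26, 103), 3)  # cyber security risk corpus -> 0.317
```

Note that reported densities for sector-parsed networks may differ from this simple formula if, for example, parallel edges (multiple definitions pairing the same themes) are counted.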
Within the corpus and both sector cyber security networks, the third‐order theme "Context‐driven" is the most influential node, with 43 connections (i.e., degrees) in the corpus network, 27 connections in the academia network, and 16 connections in the ARL network. "Impacts of CIA vulnerabilities" is the most influential node in the cyber security risk corpus and academia networks, with 28 and 23 connections, respectively. "Probability of outcomes" is the most influential node in the cyber security risk ARL network, with nine connections. When the cyber experts' definition themes were examined as a group (rather than by sector), "Context‐driven," "Resilient system functionality," and "Maintenance of CIA" were the top three most salient themes and influential network nodes for the definition of cyber security, while "Impacts of CIA vulnerabilities," "Probabilities of outcomes," and "Context‐driven" were the top three themes for cyber security risk. Isolated nodes: Isolated nodes have no connecting edges to any other node; they represent definitions that contained one third‐order theme that was not identified in any other interviewee definition. The ARL network for cyber security risk (Fig. 8) has two isolated nodes: "Scope of risk perception" in the upper left section of the network and "Absolute and relative resource valuation" in the middle right section. Modularity: Modularity is a measure of connectedness within the network and is represented by node color: the more densely interconnected nodes, in comparison to the rest of the network, share the same color. Modularity can be used to identify communities of nodes, which represent themes that share similar pairings or groupings. Hanging edges: Hanging edges represent definitions from which only one third‐order theme was identified.
If a node has a hanging edge, the edge appears to the right of the node, parallel to the legend text, for example, "Maintenance of CIA" in the academia network for cyber security (Fig. 7, left network). Hanging edges are present in all networks except the ARL network for cyber security (Fig. 7, right network).
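The influential-node property used above (the theme with the highest degree) amounts to a max-degree computation over the edge list. A hedged sketch; the theme pairings below are hypothetical, not the study's edge list:

```python
from collections import Counter

# Hypothetical edge list: each edge links two themes identified in the
# same interviewee definition (real edges come from Figs. 6, 7, and 8).
edges = [
    ("Context-driven", "Maintenance of CIA"),
    ("Context-driven", "Resilient system functionality"),
    ("Context-driven", "Systemic solutions"),
    ("Maintenance of CIA", "Resilient system functionality"),
]

degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

# The most influential node is the theme with the highest degree.
most_influential, max_degree = degree.most_common(1)[0]
```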

Discipline‐based network analysis

The discipline‐based network (Fig. 9) visualizes the cross‐tabulation of interviewees per discipline per cyber security (left) and cyber security risk (right) third‐order theme. Tables SVII and SVIII are the tabular form of the data used to produce Fig. 9. The discipline nodes were anchored in the center as the spine of the network, with the cyber security third‐order theme nodes radiating alphabetically to the left and the cyber security risk third‐order theme nodes to the right (Fig. 9). The disciplines and research areas listed in Table I were grouped into six representative disciplines: cognition and human factors (e.g., behavioral psychology, cognitive neuroscience, and human factors engineering); computer and cyber security (e.g., computer engineering, computer science, computer security, security informatics, and security monitoring and intrusion protection); decision sciences (e.g., risk assessment and social and decision sciences); network science, security, and engineering (e.g., network science and engineering, systems and network security, tactical mobile and strategic networks, and wireless systems and networks); software engineering; and user privacy and security.
Fig 9

Discipline‐parsed network of cyber security and cyber security risk third‐order themes identified from the interview corpus using thematic analysis. Center nodes (i.e., colored circles) are the interviewees' disciplines and external nodes (i.e., gray circles) are the third‐order themes. The size of a node corresponds to the number of interviewees per discipline (colored nodes) or the total number of interviewee definitions per theme (gray nodes). The edges (i.e., lines) connect each interviewee discipline with the third‐order themes identified in its definitions. The edge color denotes the interviewee discipline; the edge thickness/weight corresponds to the number of interviewees per discipline per theme. Image best viewed in color and enlarged via online journal.

The size of the discipline nodes corresponds to the number of interviewees per discipline, and the size of the third‐order theme nodes corresponds to the number of interviewees whose definitions contained the theme. The thickness of the edge connecting a representative discipline and a third‐order theme is proportional to the number of interviewees from that discipline whose definitions contain the connected theme, and the color of the edge identifies the interviewees' representative discipline. Since the structure of the discipline‐based network was predetermined (i.e., no algorithm was used to determine nodal relationships), the structural network properties were not analyzed. While 14 of the 19 cyber security third‐order themes and 19 of the 26 cyber security risk third‐order themes were "multidisciplinary" (i.e., identified in more than one discipline), only one theme per definition question encompassed all disciplines.
All six disciplines were represented among the 10 interviewees whose cyber security definition contained the third‐order theme "maintenance of CIA" and the 10 interviewees whose cyber security risk definition contained the third‐order theme "impacts of CIA vulnerabilities." These cross‐disciplinary references to CIA reflect the pervasiveness of, and historical emphasis on, CIA and data protection in the cyber security field. All disciplines except software engineering were represented in the top cyber security third‐order theme "context‐driven" (13 interviewees). Additionally, all disciplines except software engineering and user privacy and security were represented in the top cyber security risk third‐order theme "probability of outcomes" (11 interviewees). Although three of the six disciplines are human focused, and the other three acknowledge the importance of human factors in their work, humans' role in cyber security and cyber security risk is implied rather than directly cited by interviewees in most disciplines.

Content Analysis

In addition to thematic text analysis, semantic text analysis methods can qualitatively and quantitatively evaluate the relationships between vocabulary and themes appearing in text (Roberts, 2000). Oltramari and Kott (2018) conducted a semantic text analysis to evaluate the number of risk‐related concepts incorporated into cyber ontologies. These authors developed a cyber risk vocabulary list consisting of 45 terms found in the literature and determined the frequency with which each term appeared in 10 cyber security risk‐related ontologies. Using the cyber risk vocabulary list compiled by Oltramari and Kott, a content analysis was performed on the original full‐length definitions provided by each expert elicitation participant. The content analysis determined the number of definitions that used each vocabulary term, not the frequency of the vocabulary term within the definitions. The content analysis was then compared to the "frequency of occurrence" reported by Oltramari and Kott (Table III), which measures the number of ontologies containing the specific terms. When two terms were listed together by Oltramari and Kott (2018), for example source/origin, the frequency was determined for each term and then summed, ensuring that no ontologies were counted twice. The percentage of interview participants' definitions that contained each vocabulary term was calculated for cyber security (N = 25) and cyber security risk (N = 27) and compared against the "frequency of occurrence" (represented as a percentage; N = 9) calculated by Oltramari and Kott (2018). For this comparison, the interview participants were not separated into ARL and academia subgroups. For each definition (i.e., cyber security and cyber security risk) and each vocabulary term, the percentage was computed using the number of interview participant definitions that contained the vocabulary term and the total number of CSec CRA expert elicitation participants.
Several cyber risk vocabulary terms and concepts were modified by the adjectives "cyber" or "risk," and consequently the full term, for example "cyber attack," rarely, if ever, appeared in the expert elicitations. An additional content analysis was therefore performed with the adjectives removed, using the stem form of each term, for example "vulnerab" rather than "cyber vulnerability." Using the stem form ensures that all possible forms of the term are captured in the content analysis, for example "vulnerable," "vulnerability," and "vulnerabilities."
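The stem-based matching can be sketched as a case-insensitive substring test over each full-length definition, counting definitions that contain any word built on the stem. A simplified illustration; the definition snippets below are hypothetical stand-ins, not the participants' actual text:

```python
def percent_containing(definitions, stem):
    """Percent of definitions whose text contains any word form of `stem`."""
    hits = sum(1 for text in definitions if stem.lower() in text.lower())
    return round(100 * hits / len(definitions), 1)

# Hypothetical snippets standing in for the full-length definitions.
defs = [
    "risk is the probability that vulnerabilities are exploited",
    "an attack on the network exploits known vulnerability classes",
    "the likelihood of negative outcomes for the mission",
]
pct = percent_containing(defs, "vulnerab")  # matches both inflected forms
```

Each definition is counted at most once per term, matching the study's design of tallying definitions that use a term rather than total term frequency.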

Content analysis results

The content analysis of the interviewees' original definitions based on the cyber risk vocabulary compiled by Oltramari and Kott (2018) is detailed in Table IV. The content analysis comparison determines whether the expert elicitation participants used the same language as cyber ontology developers when defining cyber security and cyber security risk. Table IV provides the vocabulary percentage breakdown for the ontologies identified by Oltramari and Kott (2018) and for the expert elicitation definitions provided for cyber security and cyber security risk.
Table IV

Expert Elicitation Use of Cyber Security Terms and Concepts by Percentage. The Cyber Risk Ontology (CR Ont; N = 9) Column Is the Percentage of Select Ontologies That Contain the Cyber Risk‐Specific Vocabulary Terms, as Determined by Oltramari and Kott. The Cyber Security Expert Elicitation (CS EE; N = 25) and Cyber Security Risk Expert Elicitation (CSR EE; N = 27) Columns Are the Percentage of Experts Who Used the Term Within Their Definition of Cyber Security and Cyber Security Risk, Respectively

Term/Concept | CR Ont | CS EE | CSR EE | Term/Concept | CR Ont | CS EE | CSR EE
Alert | 22.2 | 0.0 | 0.0 | Impact | 44.4 | 0.0 | 7.4
Asset | 66.7 | 0.0 | 3.7 | Intent | 22.2 | 0.0 | 3.7
Benefit | 11.1 | 0.0 | 3.7 | Likelihood | 33.3 | 0.0 | 14.8
Configuration | 33.3 | 0.0 | 0.0 | Mission | 33.3 | 0.0 | 0.0
Consequence | 55.6 | 4.0 | 7.4 | Network | 33.3 | 32.0 | 22.2
Control | 11.1 | 8.0 | 3.7 | Origin/Source | 55.6 | 20.0 | 11.1
Cost | 22.2 | 4.0 | 3.7 | Payload | 22.2 | 0.0 | 0.0
Countermeasure | 44.4 | 0.0 | 0.0 | Report | 22.2 | 0.0 | 0.0
Credential | 22.2 | 4.0 | 0.0 | Risk | 33.3 | 0.0 | 81.5
Cyber attack | 88.9 | 0.0 | 0.0 | Risk assessment | 33.3 | 0.0 | 11.1
Cyber defense | 22.2 | 0.0 | 0.0 | Risk factor | 33.3 | 0.0 | 0.0
Cyber exploitation | 44.4 | 0.0 | 0.0 | Risk identification | 11.1 | 0.0 | 0.0
Cyber incident | 22.2 | 0.0 | 0.0 | Risk metric | 33.3 | 0.0 | 0.0
Cyber operation | 22.2 | 0.0 | 0.0 | Risk mitigation | 11.1 | 0.0 | 0.0
Cyber response | 22.2 | 0.0 | 0.0 | Risk monitoring | 11.1 | 0.0 | 0.0
Cyber risk | 22.2 | 0.0 | 3.7 | Security protocol | 11.1 | 0.0 | 0.0
Cyber threat | 22.2 | 4.0 | 0.0 | Security/Risk Policy | 33.3 | 0.0 | 0.0
Cyber vulnerability | 66.7 | 0.0 | 0.0 | Service | 55.6 | 4.0 | 3.7
Dependability | 22.2 | 0.0 | 0.0 | Situation | 33.3 | 0.0 | 0.0
Detection | 55.6 | 0.0 | 3.7 | Stakeholder | 22.2 | 0.0 | 0.0
Failure | 33.3 | 0.0 | 3.7 | Target | 44.4 | 0.0 | 0.0
Fault | 33.3 | 0.0 | 0.0 | Threat | 44.4 | 12.0 | 18.5
 | | | | Treatment | 11.1 | 0.0 | 0.0
The top cyber risk vocabulary terms for cyber ontology developers were "cyber attack," "asset," and "cyber vulnerability," while the top terms for expert elicitation participants were "network," "control," and "consequences" for cyber security and "risk," "network," and "likelihood" for cyber security risk (Table IV). However, when the expert elicitations were evaluated for the nonadjectival stem forms of the vocabulary terms, the top terms for cyber security were "network," "attack" (stemmed from "cyber attack"), and "identi" (stemmed from "risk identification"); the top terms for cyber security risk were "risk," "attack" (stemmed from "cyber attack"), and "vulnerab" (stemmed from "cyber vulnerability") (Table V). The majority of vocabulary terms appeared in a greater percentage of the cyber risk ontologies than of the expert elicitation definitions.
Table V

Content Analysis of Expert Elicitation for Cyber Risk Vocabulary Without Adjectives, Using the Stem Form of Vocabulary Terms. The Cyber Risk Vocabulary Terms Column Is the Full Form of the Vocabulary Term/Concept Identified by Oltramari and Kott. The Stemmed Vocabulary Terms Are the Stem (i.e., Base) Form of the Word Without Any Adjective. The Cyber Risk Ontology (CR Ont; N = 9) Column Is the Percentage of Select Ontologies That Contain the Cyber Risk‐Specific Vocabulary Terms, as Determined by Oltramari and Kott. The Cyber Security Expert Elicitation (CS EE; N = 25) and Cyber Security Risk Expert Elicitation (CSR EE; N = 27) Columns Are the Percentage of Experts Who Used the Term Within Their Definition of Cyber Security and Cyber Security Risk, Respectively

Cyber Risk Vocabulary Terms | CR Ont | CS EE % | CSR EE % | Stemmed Vocabulary Terms | CS EE % | CSR EE %
Cyber attack | 88.9 | 0.0 | 0.0 | Attack | 24.0 | 29.6
Cyber defense | 22.2 | 0.0 | 0.0 | Defen | 8.0 | 0.0
Cyber exploitation | 44.4 | 0.0 | 0.0 | Exploit | 0.0 | 11.1
Cyber operation | 22.2 | 0.0 | 0.0 | Operat | 16.0 | 7.4
Cyber response | 22.2 | 0.0 | 0.0 | Respon | 4.0 | 0.0
Cyber vulnerability | 66.7 | 0.0 | 0.0 | Vulnerab | 4.0 | 25.9
Dependability | 22.2 | 0.0 | 0.0 | Dependab | 4.0 | 0.0
Risk identification | 11.1 | 0.0 | 0.0 | Identi | 24.0 | 14.8
Risk metric | 33.3 | 0.0 | 0.0 | Metric | 4.0 | 3.7
Risk mitigation | 11.1 | 0.0 | 0.0 | Mitigat | 8.0 | 0.0
Of the 45 cyber risk vocabulary terms and concepts identified by Oltramari and Kott, only 19 terms/concepts were used during the expert elicitation to define cyber security and cyber security risk. This number increased to 29 when the nonadjectival stemmed forms of the terms/concepts were used. The terms/concepts that appear in cyber security risk ontologies but were not found in the expert elicitation are listed in Table VI; the nonadjectival stemmed forms of the terms/concepts marked with asterisks were found in the expert elicitation.
Table VI

Cyber Risk Vocabulary Terms and Concepts from Semantic Analysis of Ontologies (Oltramari & Kott, 2018) Not Used by Expert Elicitation Participants to Define Cyber Security or Cyber Security Risk. Asterisks Denote Terms/Concepts Found in the Expert Elicitation in Their Nonadjectival Stemmed Form

Alert | Cyber exploitation* | Dependability* | Risk factor | Security protocol
Configuration | Cyber incident | Fault | Risk identification* | Security/Risk Policy
Countermeasure | Cyber operation* | Mission | Risk metric* | Situation
Cyber attack* | Cyber response* | Payload | Risk mitigation* | Stakeholder
Cyber defense* | Cyber vulnerability* | Report | Risk monitoring | Target
Treatment

Content analysis discussion

The content analysis compared the cyber security and cyber security risk terms used by CSec CRA expert elicitation participants (from this study) with the cyber security and cyber security risk terms used in cyber ontologies (Oltramari & Kott, 2018). The results of this analysis suggest that ontology developers and cyber experts (researchers and practitioners) do not use the same terms to operationalize cyber security risk. A number of terms used in ontologies to describe cyber security risk were not included in the definitions of cyber security and cyber security risk generated during the expert elicitation (Table VI); however, both groups focused on "attack" as a key term. Other top terms used by ontology developers to describe cyber security risk include "vulnerability" and "target," while other top terms used by expert elicitation interviewees include "network" and "threat." The results of the final analysis indicate that there is reason to believe that experts in different disciplines and sectors, each working in some aspect of cyber security, focus on different aspects of the complex cyber risk universe or use different terms to describe similar concepts (Stemler, 2001).

Comparison to Standards and Best Practices

Cyber security definitions have changed over time in part due to the increasing dependence of global society on cyber systems. The third‐order themes used to build the composite definitions of cyber security (CS) and cyber security risk (CSR) were compared against a selection of national and international standards and best practices (Table VII). The comparisons in Tables VIII and IX illustrate the congruencies and inconsistencies across state‐of‐the‐art and regulatory definitions and emphasize the common themes that are highlighted. The comparisons also highlight expert‐identified important themes that are not included in any of the current definitions. The following sources of definitions were used: Federal Information Security Modernization Act of 2014 (U.S. Congress, 2014), International Organization for Standardization (ISO, 2012), International Telecommunication Union (ITU, 2008; 2011), National Institute of Standards and Technology (NIST, 2018), National Initiative for Cybersecurity Careers and Studies (NICCS; DHS, 2020), Committee on National Security Systems (CNSS, 2015), and the World Economic Forum (WEF, 2012). The Merriam‐Webster Dictionary (MW, 2020) and Craigen et al. (2014) sources are not a standard or best practice, however, they provide context for nontechnical general definitions. Additionally, the European Network and Information Security Agency (ENISA) uses the abbreviated United States Department of Homeland Security (DHS) NICCS glossary definition (DHS, 2020), while the presented comparison evaluates the “extended definition” provided by the glossary (Brookson et al., 2015).
Table VII

Select Definitions of Cyber/Information Security and Risk from National and International Standards and Best Practices. Merriam-Webster (MW) and Craigen et al. (2014) Are Not Standards or Best Practices; however, They Provide Context for Nontechnical General Definitions

Source | Security definition | Risk definition
MW

Cybersecurity:

“Measures taken to protect a computer or computer system (as on the Internet) against unauthorized access or attack.”

Security risk:

1) “someone who could damage an organization by giving information to an enemy or competitor”

2) “someone or something that is a risk to safety”

Craigen et al., 2014

Cybersecurity:

“…the organization and collection of resources, processes, and structures used to protect cyberspace and cyberspace‐enabled systems from occurrences that misalign de jure from de facto property rights.”

No explicit definition for cyber or information security risk.
FISMA

Information security:

“The term ‘information security’ means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide:

(A) integrity, which means guarding against improper information modification or destruction, and includes ensuring information nonrepudiation and authenticity;

(B) confidentiality, which means preserving authorized restrictions on access and disclosure, including means for protecting personal privacy and proprietary information;

and

(C) availability, which means ensuring timely and reliable access to and use of information.”

No explicit definition for cyber or information security risk.
ISO

Cybersecurity:

“Preservation of confidentiality, integrity and availability of information in the Cyberspace. In addition, other properties, such as authenticity, accountability, non‐repudiation, and reliability can also be involved”

No explicit definition for cyber or information security risk
ITU

Cybersecurity:

“The collection of tools, policies, security concepts, security safeguards, guidelines, risk management approaches, actions, training, best practices, assurance and technologies that can be used to protect the cyber environment and organization and user's assets. Organization and user's assets include connected computing devices, personnel, infrastructure, applications, services, telecommunications systems, and the totality of transmitted and/or stored information in the cyber environment.

Cybersecurity strives to ensure the attainment and maintenance of the security properties of the organization and user's assets against relevant security risks in the cyber environment. The general security objectives comprise the following: Availability; Integrity, which may include authenticity and nonrepudiation; Confidentiality”

Security risk:

“The probability that a threat will exploit a vulnerability to breach the security of an asset.”

NICCS

Cybersecurity:

“Strategy, policy, and standards regarding the security of and operations in cyberspace, and encompass[ing] the full range of threat reduction, vulnerability reduction, deterrence, international engagement, incident response, resiliency, and recovery policies and activities, including computer network operations, information assurance, law enforcement, diplomacy, military, and intelligence missions as they relate to the security and stability of the global information and communications infrastructure.”

Risk:

“The potential for an unwanted or adverse outcome resulting from an incident, event, or occurrence, as determined by the likelihood that a particular threat will exploit a particular vulnerability, with the associated consequences.”

NIST

Cybersecurity:

“The process of protecting information by preventing, detecting, and responding to attacks.”

Risk:

“A measure of the extent to which an entity is threatened by a potential circumstance or event, and typically a function of (i) the adverse impacts that would arise if the circumstance or event occurs; and (ii) the likelihood of occurrence.”

CNSS

Cybersecurity:

“Prevention of damage to, protection of, and restoration of computers, electronic communications systems, electronic communications services, wire communication, and electronic communication, including information contained therein, to ensure its availability, integrity, authentication, confidentiality, and nonrepudiation.”

Risk:

“A measure of the extent to which an entity is threatened by a potential circumstance or event, and typically a function of (i) the adverse impacts that would arise if the circumstance or event occurs; and (ii) the likelihood of occurrence.

Note: Information system‐related security risks are those risks that arise from the loss of confidentiality, integrity, or availability of information or information systems and reflect the potential adverse impacts to organizational operations (including mission, functions, image, or reputation), organizational assets, individuals, other organizations, and the Nation. Adverse impacts to the Nation include, for example, compromises to information systems that support critical infrastructure applications or are paramount to government continuity of operations as defined by the Department of Homeland Security.”

WEF

Cybersecurity:

“‘Cybersecurity’ refers to analysis, warning, information sharing, vulnerability reduction, risk mitigation and recovery efforts for networked information systems.”

Cyber risk:

‘Cyber risks’ are defined as the combination of the probability of an event within the realm of networked information systems and the consequences of this event on assets and reputation.”

MW, 2020: Merriam‐Webster Dictionary; FISMA: United States Federal Information Security Modernization Act of 2014 (U.S. Congress, 2014); ISO, 2012: International Organization for Standardization; ITU: International Telecommunication Union, 2008, 2011; NICCS: National Initiative for Cybersecurity Careers and Studies (DHS, 2020); NIST, 2018: United States National Institute of Standards and Technology; CNSS, 2015: United States Committee on National Security Systems; WEF, 2012: World Economic Forum.

Table VIII

Comparison of Third‐Order Thematic Analysis Themes Identified in the Definition of Cyber/Information Security for US and International Standards and Best Practices

Third‐Order Theme (Expert Elicitation Interviewee Count; Percent) | MW | Craigen et al., 2014 | FISMA | ISO | ITU | NICCS | NIST | CNSS | WEF
Context‐driven (13; 52%)(x)
Resilient system functionality (12; 48%)xxxxx
Maintenance of CIA (10; 40%)xxx(x)x
Threat prediction and prevention (7; 28%)(x)(x)(x)(x)xxx(x)
Protection of resources (6; 24%)xxx(x)xxxx(x)
Sociotechnical system (5; 20%)x
Unattainable (4; 16%)
Resource management (4; 16%)(x)(x)(x)(x)xx(x)x(x)
Comprehensive system awareness (3; 12%)(x)(x)
Evolving security standards (3; 12%)
Iterative (and/or active) process (2; 8%)(x)(x)(x)
Accurate intrusion detection (2; 8%)(x)x(x)
Security competence (2; 8%)(x)(x)
Verifiable information provenance (2; 8%)(x)(x)x
Characterization and effects of human factor (2; 8%)
Diverse dimensions and factors (2; 8%)x
Quantifiable security (2; 8%)
Complicated (1; 4%)
Systemic solutions (1; 4%)x(x)(x)

x = Third‐order theme concept and/or words explicitly included in definition; (x) = third‐order theme concept is implicit in definition; ∼ = required assumption for definition context.

Source of definition is not a standard or best practice; however, it provides context for a nontechnical general definition.

MW, 2020: Merriam‐Webster Dictionary; FISMA: United States Federal Information Security Modernization Act of 2014 (U.S. Congress, 2014); ISO, 2012: International Organization for Standardization; ITU: International Telecommunication Union, 2008, 2011; NICCS: National Initiative for Cybersecurity Careers and Studies (DHS, 2020); NIST, 2018: United States National Institute of Standards and Technology; CNSS, 2015: United States Committee on National Security Systems; WEF, 2012: World Economic Forum.

Table IX

Comparison of Third‐Order Thematic Analysis Themes Identified in the Definition of Cyber/Information Security Risk for US and International Standards and Best Practices. Neither FISMA nor ISO Provides an Explicit Definition for Cyber or Information Security Risk

Third‐Order Theme (Expert Elicitation Interviewee Count; Percent) | MW* | Craigen et al., 2014* | FISMA | ISO | ITU | NICCS | NIST | CNSS | WEF
Probability of outcomes (11; 41%)N/AN/AN/Axx(x)xx
Impacts of CIA Vulnerabilities (10; 37%)x
Context‐driven (6; 22%)(x)
Uncertainties introduced by human factors (5; 19%)x
Vulnerabilities (known and unknown) (4; 15%)xx
Negative consequences (4; 15%)x(x)xxx
Absolute and relative resource valuation (4; 15%)
Monetization (3; 11%)
Negative consequences for humans (3; 11%)(x)x
Multiple dimensions and scales (3; 11%)x
Vulnerabilities introduced by human factors (3; 11%)x
Classical elements of risk (3; 11%)xxxx(x)
Sociotechnical exploitation (2; 7%)x
Complicated (2; 7%)
Interference with physical components (2; 7%)x(x)
Interference with cyber security (2; 7%)(x)x(x)
Probability of vulnerability (2; 7%)xx(x)(x)
Scope of risk perception (2; 7%)
Vulnerabilities in first line of defense (1; 4%)(x)
Limits of detection (1; 4%)
Risk Quantification (1; 4%)(x)(x)x
Agility is paramount (1; 4%)
Agility‐dependent (1; 4%)
Beyond CIA (1; 4%)x
Goal dependent (1; 4%)
Multiple realms of threats (1; 4%)xx

x = Third‐order theme concept and/or words explicitly included in definition; (x) = third‐order theme concept is implicit in definition; ∼ = required assumption for definition context.

*Source of definition is not a standard or best practice; however, it provides context for a nontechnical general definition.

MW, 2020: Merriam‐Webster Dictionary; FISMA: United States Federal Information Security Modernization Act of 2014 (U.S. Congress, 2014); ISO, 2012: International Organization for Standardization; ITU: International Telecommunication Union, 2008, 2011; NICCS, 2020: National Initiative for Cybersecurity Careers and Studies; NIST, 2018: United States National Institute of Standards and Technology; CNSS, 2015: United States Committee on National Security Systems; WEF, 2012: World Economic Forum.


Comparison of Definitions Results

The NICCS cyber security definition was the most complete of the nine definitions analyzed: 14 of the 19 cyber security third‐order themes were identified in it (Table VIII). The CNSS cyber security risk definition was the most complete of the six analyzed definitions, with 14 of the 26 cyber security risk third‐order themes identified (Table IX). None of the standards or best practices contained all of the third‐order themes for either cyber security or cyber security risk. For the cyber security definition, “protection of resources” and “resource management” were either explicit or implicit in all nine definitions, while threat prediction and/or prevention was explicit in only three definitions and implied in another five. Only five of nine definitions explicitly referred to or implied “maintenance of confidentiality, integrity, and availability” and “resilient system functionality,” while the critical aspects of intrusion detection, concerns about information provenance, and the need for a systemic solution were each mentioned explicitly in only one definition and implied in two others. The security state of any connected cyber system, or even of any single machine, is constantly in flux; thus any definition of cyber security should reflect this dynamic situation. Yet despite this broadly acknowledged phenomenon (Heard, Heard, & Adams, 2016; Xu, 2019), only three of the nine definitions even implied the ever‐changing nature of functional cyber security, and only one referred in any way to the human elements inherent in cyber networks, and thus in cyber security. The approaches to defining cyber security risk in the six definitions analyzed tended toward considering risk as a probability of harmful outcomes. Of these, only one considered the human impacts of cyber security risk, and none included humans as risk creators or inducers, that is, attackers, malicious users, or incompetent users or defenders. And despite the common reliance on monetization as a means of normalizing the quantified impacts of cyber security breaches (IBM Security, 2019; Verizon, 2020), none of the six cyber security risk definitions included financial valuation.
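The completeness rankings above follow from tallying explicit (“x”) and implicit (“(x)”) theme matches per definition, as in Tables VIII and IX. A minimal sketch of that tally, using a small illustrative subset of the coverage data rather than the full tables:

```python
# Tally third-order theme coverage per standards definition, counting both
# explicit ("x") and implicit ("(x)") matches. The coverage dictionary is an
# illustrative subset, not the complete data from Tables VIII and IX.
coverage = {
    "NICCS": {"context-driven": None, "resilient functionality": "x",
              "maintenance of CIA": "(x)", "protection of resources": "x"},
    "NIST":  {"context-driven": None, "resilient functionality": None,
              "maintenance of CIA": None, "protection of resources": "x"},
}

def completeness(marks):
    """Count themes covered explicitly or implicitly in a definition."""
    return sum(1 for m in marks.values() if m in ("x", "(x)"))

ranked = sorted(coverage, key=lambda s: completeness(coverage[s]), reverse=True)
print(ranked)  # ['NICCS', 'NIST']
```

Distinguishing explicit from implicit matches (weighting “x” above “(x)”) would be a natural refinement, but the study's "most complete" rankings count both equally.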

Comparison of Definitions Discussion

The European Network and Information Security Agency (ENISA) performed a similar “cybersecurity” definition comparison to identify gaps in the standardization of the terminology (Brookson et al., 2015). The gap analysis evaluated the use and definition, or lack thereof, of cyber security as defined by the Oxford dictionary, the Merriam–Webster dictionary, the European Telecommunications Standards Institute (ETSI), ISO, ITU, NIST, the North Atlantic Treaty Organization (NATO), and CNSS. Despite ETSI's cyber‐related activities and initiatives and NATO's Cooperative Cyber Defence Centre of Excellence, neither organization formally defines its use of cyber security. Since cyber security is an “enveloping term,” ENISA does not recommend having a singular definition, but rather that each fit‐for‐purpose definition be unambiguous and explicit in scope and application (Brookson et al., 2015). The computational linguistic analysis of existing definitions by Schatz, Bashroush, and Wall (2017) highlighted the common threads of the CIA of information and the high‐level management of networks by organizations and states; the resulting definition clearly refers to human users (and, by implication, defenders) and the need for training of the human elements. It is an inwardly looking definition, providing a shell of protection around information assets, but it makes no reference to attackers or to any form of malware or “bad actors,” human or not. Over the last 20 years, the use of cyber networks has expanded from communications and control systems to become the underpinning of societal “critical infrastructure” (such as water system controls and the electrical grid) and manufacturing, as well as of social and commercial communications and interactions, and the control of microscale Internet of Things networks. Humans are inevitable parts of the current cyber security framework, as users, defenders, and attackers. Thus, in any definition of cyber security and cyber security risk, the human connection needs to be acknowledged, the inherent complexity needs to be taken into account, and the dynamic nature of cyber technology and its integration into our society and social structures needs to be implied, if not explicitly included, in the definition.

DISCUSSION

Cyber experts from different sectors across multiple disciplines and research areas were asked via expert elicitation to define cyber security and cyber security risk. The results of the expert elicitation were analyzed using data‐driven thematic analysis and content analysis and the results were compared to current national and international standards and best practices. The output of the thematic analysis was visualized using butterfly diagrams and network analysis. The use of network analysis led to additional points of analysis regarding the interconnectedness of themes across (e.g., ARL to academia) and within sectors (e.g., amongst ARL interviewees). The results suggest: (1) definitions of cyber security and cyber security risk vary across and within ARL and academic sectors, and (2) definitions of cyber security and cyber security risk differ between cyber experts and ontology developers. Furthermore, some experts had interdependent definitions of cyber security and cyber security risk, using one term to define the other.
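The theme-interconnectedness finding rests on treating themes that co-occur within an interview as linked nodes. The sketch below approximates that network analysis with a simple co-occurrence graph over hypothetical per-interview theme lists, ranking themes by degree (number of distinct co-occurring themes) as a rough stand-in for the study's centrality measures:

```python
# Build an undirected co-occurrence network of third-order themes and rank
# themes by degree centrality. The interview theme lists are hypothetical;
# the study derived them from its expert elicitation transcripts.
from collections import defaultdict
from itertools import combinations

interviews = [
    ["context-driven", "resilient system functionality", "maintenance of CIA"],
    ["context-driven", "maintenance of CIA"],
    ["resilient system functionality", "sociotechnical system"],
]

edges = defaultdict(int)  # (theme_a, theme_b) -> co-occurrence count
for themes in interviews:
    for a, b in combinations(sorted(set(themes)), 2):
        edges[(a, b)] += 1

degree = defaultdict(int)  # theme -> number of distinct neighbors
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

most_central = max(degree, key=degree.get)
print(most_central)  # resilient system functionality
```

A fuller analysis would weight edges by co-occurrence counts and compute betweenness or eigenvector centrality, but even plain degree identifies which themes bridge the most definitions.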

Analysis and Interpretation Challenges

Interpretation Challenges

Although data‐driven thematic analysis is useful in determining collective concepts associated with cyber security and cyber security risk, the method is subject to several challenges. Semistructured interviews are subject to inconsistencies in communication efficiency and senses of mutual understanding (Margaret, 1994); for example, an expert may feel he or she has expressed a concept that the interviewer does not register or interpret in the analysis. Expert elicitation is a subjective measure; that is, the beliefs of an individual or a group of experts influence the results (O'Hagan, 2019). Thematic analysis is qualitative and vulnerable to analyst subjectivity (Usher & Strachan, 2013). Moreover, the results of the expert elicitation and subsequent thematic analysis represent only those interviewed, not the entirety of the expertise domains of cyber security and cyber security risk. In essence, the analysis of an expert elicitation is a function of both the experts' beliefs and the researchers' interpretation of the data. To reduce researcher bias and enhance the reliability of the analysis, three researchers independently transcribed the interviews from the interview corpus, and the thematic and network analyses used all three sets of transcriptions. Research suggests experts are less susceptible than other individuals to bias exaggerated by popular media or cultural nuances, for example, the influence of peer networks, as experts rely on data analysis rather than inference from hearsay (Quigley et al., 2013). Some differences in the analysis of the two sectors (academia and ARL) were due to differences in the expert elicitation procedures. The ARL interviews were not allowed to be recorded and thus were dually transcribed in situ by the interviewer and a scribe, while the audio of the academic interviews was recorded for later transcription and verification.
This procedural difference may have influenced the amount of detail retained within the definitions provided by the ARL participants; however, the duplicate notetaking was intended to reduce the loss of information. Cyber experts, like other humans, are subject to cognitive biases. Experts typically use experience and intuition to make decisions about cyber security and cyber security risk (Oltramari & Kott, 2018; Tversky & Kahneman, 1974). These biases are affected by personal judgement and experience as well as cultural influence, meaning formulations of subjective probabilities vary between individuals and across disciplines (Henrie, 2013; O'Hagan, 2019; Tversky & Kahneman, 1974). Similarly, humans often fail to incorporate numbers as probabilities into their decision‐making processes, which has resulted in unclear understandings of security risk (Cox, 2008). Along the same lines, Oltramari and Kott (2018) determined that practitioners define cyber risk in terms of the configuration of a system rather than the probability of harm occurring. This helps explain the discrepancy between definitions and the perception of cyber security and cyber security risk. Cyber networks, and therefore cyber security, are now inherent within all disciplines, from science through social science and even into the humanities and arts. Nonstandard definitions result in contrary or oblique uses of the language in different papers and research. When cyber security and cyber security risk are being newly considered within a broadened disciplinary scope, the lack of common definitions is even more problematic. Researchers and people applying the results can find themselves in discussions and disagreements that are based on a lack of language clarity rather than an inherent disagreement, which results in inefficient use of time and effort.
As language is dynamic and evolves over time, these standard definitions can be revisited as new technology, research, application, or governance expand the use of the terms: cyber security and cyber security risk.

Multidisciplinary Communication Challenges

McPhillips et al. (2018) discuss the need for common terminology when addressing phenomena (in their study, extreme events) that affect and are studied by (and within) multiple disciplines, in order to ensure that these complex phenomena can be managed holistically, using effective cross‐disciplinary communication. Disciplinary training involves developing a set of disciplinary heuristics used to view and analyze the world. In order to communicate in a truly cross‐disciplinary way, team members have to get out of their own heads and away from their hard‐built disciplinary heuristics (cf. O'Rourke et al., 2013). Each discipline and each disciplinary language has inherently unique biases and assumptions. For example, engineers look at a system and immediately try to simplify it to a few critical components, or at least a greatly simplified network (Harte, 1988); the working assumption is that interactions are straightforward and easily modeled. Ecologists will also try to identify the key components on which other parts of the system depend, the drivers or keystone nodes (Suter, 2007); however, ecologists also assume that interactions are complex and both time‐ and population (density)‐based. Creating common definitions that encompass the assumptions of the different disciplines creates a communication‐based common ground that acts as the nexus for effective cross‐disciplinary communication (cf. Aagaard‐Hansen, 2007; Li et al., 2008; Pennington, Simpson, McConnell, Fair, & Baker, 2013).

CONCLUSIONS

Terminology standardization facilitates efficient communication. The results of this research suggest there is a communication gap across disciplines which can be partially bridged by developing and applying standardized language. The potential benefits associated with standardizing terminology (in this case cyber‐related vocabulary) include effective laws and policies, “repeatable, mutually intelligible, comparable, and interdisciplinary research” and improved data management that facilitates searchability and usability of cyber‐related research (Ramirez, 2017). Despite the aforementioned challenges of qualitative data and thematic analysis, the present research is the first known effort to determine a cross‐disciplinary working definition of cyber security and cyber security risk. While similarities can be drawn between the composite definitions above and the National Initiative for Cybersecurity Careers and Studies definitions (DHS, 2020) for cyber security and risk, visual analysis of the thematic maps (see Figs. 1 and 2 for example) highlights both the disparate and recurrent themes that make cyber security and cyber security risk complex concepts. Although the composite definitions presented above are not formally accepted by the greater discipline of cyber research, the definitions offer useful information regarding the current expert perceptions of cyber security and cyber security risk. Like other cyber terminology researchers (Ramirez & Choucri, 2016), we believe a standardized cyber security vocabulary starts by identifying terminology standards. The methods to conduct a comparative content analysis described in this article are useful in determining cyber experts’ baseline understanding of cyber security and cyber security risk. Current definitions are disciplinarily narrow and do not take into account how cyber networks and therefore cyber security have infiltrated virtually all aspects of society. 
Multidisciplinary research fields use nuanced language influenced by each researcher's parent discipline. Disagreements rooted in undefined and unspoken vocabulary meanings can arise from otherwise unclarified underlying disciplinary assumptions. Using more comprehensive definitions for key actionable terms within multidisciplinary research therefore facilitates shared understanding, respect for collaborators' disciplinary perspectives, and increased productivity of multidisciplinary collaboration, and reduces unproductive linguistic obstacles to functional research. Additionally, the third‐order themes emphasize both the diversity of thought across disciplines conducting cyber security and cyber security risk research and the ubiquity of fundamental concepts (e.g., CIA). The number of third‐order themes also illustrates that the definitions of cyber security and cyber security risk cannot be distilled down to a single concept. The third‐order themes further reflect cultural differences between the two sectors, academia and ARL. Academic culture pushes the researcher toward analysis and consideration of context, whereas the cultural focus of applied cyber security specialists is on protection against constant and dynamic threats throughout the network. Given the necessary collaboration across these sectors, it is all the more important for common definitions to encompass the use of these terms within both sectors. We argue that: (1) the current definitions of cyber security and cyber security risk are inadequate because they do not include human factors; (2) no standardized cyber security terminology exists across disciplines; and (3) the communication needed to develop interdisciplinary definitions for cyber security and cyber security risk is lacking. Time is also a functional aspect of cyber security and cyber security risk, beyond the long‐recognized CIA vulnerabilities of information.
At the time of the expert elicitation (2014) and of the recent reviews of cyber security definitions (Craigen et al., 2014; Schatz et al., 2017), the critical nature of time dependence and time windows in the cyber control of critical infrastructure and communication had not yet been widely acknowledged, although ENISA guidance had already identified time as a critical cyber vulnerability (ENISA, 2006). A more state‐of‐the‐art definition would therefore expand security objectives beyond CIA to include time control. The formalization of key terms facilitates interdisciplinary cyber risk modeling and risk communication, and ultimately improves cyber security. Given the multidisciplinary nature of cyber security and the complexities associated with quantifying and managing cyber risk, it is important that researchers from collaborating fields share a common understanding of what is meant by cyber security and cyber security risk. This common understanding can help move cyber security from a multidisciplinary effort to a truly interdisciplinary domain.

SUPPORTING INFORMATION

Table SI. First‐order themes from responses to "What is your definition of cyber security?"
Table SII. Second‐order themes from responses to "What is your definition of cyber security?"
Table SIII. Third‐order themes from responses to "What is your definition of cyber security?"
Table SIV. First‐order themes from responses to "What is your definition of cyber security risk?"
Table SV. Second‐order themes from responses to "What is your definition of cyber security risk?"
Table SVI. Third‐order themes from responses to "What is your definition of cyber security risk?"
Table SVII. Cross‐tabulation of interviewees per discipline per cyber security third‐order theme.
Table SVIII. Cross‐tabulation of interviewees per discipline per cyber security risk third‐order theme.