| Literature DB >> 33223952 |
Abstract
With the rise of online platforms where individuals could gather and spread information came the rise of online cybercrimes aimed at taking advantage not just of single individuals but of collectives. In response, researchers and practitioners began trying to understand this digital playground and the ways in which socially and digitally embedded individuals could be manipulated. What is emerging is a new scientific and engineering discipline: social cybersecurity. This paper defines this emerging area, provides case examples of the research issues and the types of tools needed, and lays out a program of research in this area.
Keywords: Dynamic network analysis; Review; Social cybersecurity; Social media analytics; Social network analysis
Year: 2020 PMID: 33223952 PMCID: PMC7668017 DOI: 10.1007/s10588-020-09322-9
Source DB: PubMed Journal: Comput Math Organ Theory ISSN: 1381-298X Impact factor: 2.023
Fig. 1 Network diagram of the interdisciplinary nature of the field of social cybersecurity. Nodes are disciplines and are sized by the number of articles. Links are weighted by the number of articles associated with both disciplines
Fig. 2 Research topic areas in social cybersecurity
BEND communication objectives

| | Manipulating the narrative | Manipulating the social network |
|---|---|---|
| Positive | Messages that bring up a related and relevant topic | Actions that increase the importance of the opinion leader or create a new opinion leader |
| | Messages that provide details on or elaborate the topic | Actions that create a group or the appearance of a group |
| | Messages that elicit a positive emotion such as joy or excitement | Actions that build a connection between two or more groups |
| | Messages that encourage the topic-group to continue with the topic | Actions that grow the size of the group or make it appear that it has grown |
| Negative | Messages about why the topic is not important | Actions that decrease the importance of the opinion leader |
| | Messages that alter the main message of the topic | Actions that lead to a group being dismantled or breaking up, or appearing to be broken up |
| | Messages that elicit a negative emotion such as sadness or anger | Actions that lead to a group becoming sequestered from other groups or marginalized |
| | Messages about a totally different, irrelevant topic | Actions that reduce the size of the group or make it appear that the group has grown smaller |
Types of “disinformation”
| Disinformation types | Example | Potential for AI techniques to detect |
|---|---|---|
| Fake news (story made to look like news) | Naval Destroyer crash in Hurricane Harvey | AI could be used to identify sites and do fact checking |
| Fabrication with visual | Parkland student ripping up constitution | AI could be used to create and identify fake images |
| Fabrication without visual | Opposition peso scam in Philippines | AI might be of some assistance in finding all instances of story |
| Propaganda | Duterte’s helicopter scaring off the Chinese | AI could help classify underlying BEND objectives |
| Conspiracy | Pizzagate | AI could be used to do fact checking |
| Misleading—due to misquoting | Captain Marvel—Brie Larson is a racist/sexist | AI could be used to do fact checking and stance checking |
| Misleading—due to being out of context | Voting makes you lose your hunting license | AI might provide support tools |
| Innuendo and illogic | Anti-vax campaign | AI might provide some support but won’t solve |
Fig. 3 Evolving co-authorship network. The top image shows the central core in 2018 and the bottom image the central core in 2019. Each node is an author and the links are weighted by the number of papers those two authors co-authored
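The Fig. 3 caption describes link weights as the number of papers two authors co-authored. A minimal sketch of how such a weighted co-authorship network could be tallied from per-paper author lists, using hypothetical author names and papers (the data set itself is not given in this record):

```python
from itertools import combinations
from collections import Counter

# Hypothetical author lists, one list per paper (illustrative only).
papers = [
    ["A", "B", "C"],
    ["A", "B"],
    ["B", "C"],
]

# Each unordered author pair on a paper contributes 1 to that link's
# weight; the total per pair is the number of co-authored papers.
edge_weights = Counter()
for authors in papers:
    for pair in combinations(sorted(authors), 2):
        edge_weights[pair] += 1

# A and B share two papers, so their link has weight 2.
print(edge_weights[("A", "B")])  # 2
```

The `Counter` over sorted pairs keeps each link undirected, so ("A", "B") and ("B", "A") accumulate into the same entry.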