Qingchuan Zhang, Menghan Li, Wei Dong, Min Zuo, Siwei Wei, Shaoyi Song, Dongmei Ai.
Abstract
Addressing food safety issues promptly through online public opinion incidents can reduce their impact and effectively protect human health. To this end, entity relationships are extracted from public opinion events in the food domain and used to construct a knowledge graph of the food safety field that reveals connections between food safety issues. To handle sentences about food safety incidents that contain multiple entity relationships under few-shot learning, this paper adopts a pipeline extraction method. Relation types are first extracted by a network joining Bidirectional Encoder Representations from Transformers (BERT) with a Bidirectional Long Short-Term Memory (BLSTM) network, namely the BERT-BLSTM model. Based on the relation types extracted by the BERT-BLSTM model and the introduction of Chinese character features, an entity pair extraction model based on BERT-BLSTM with a conditional random field (CRF) layer, BERT-BLSTM-CRF, is then established. Several common deep neural network models are compared with the BERT-BLSTM-CRF model on a food public opinion events dataset. Experimental results show that the precision of the BERT-BLSTM-CRF entity relationship extraction model is 3.29%∼23.25% higher than that of the other models on this dataset, which verifies the validity and rationality of the proposed model.
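The CRF layer on top of the BLSTM picks the globally best tag sequence rather than classifying each token independently, which is what makes it suitable for entity pair extraction. A minimal NumPy sketch of the Viterbi decoding step, assuming toy emission scores (standing in for BLSTM outputs) and an illustrative transition matrix, not the paper's trained weights:

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence.

    emissions:   (seq_len, num_tags) per-token tag scores (e.g. BLSTM output)
    transitions: (num_tags, num_tags) score of moving from tag i to tag j
    """
    seq_len, num_tags = emissions.shape
    score = emissions[0].copy()                 # best score ending in each tag
    backptr = np.zeros((seq_len, num_tags), dtype=int)
    for t in range(1, seq_len):
        # score of every (prev_tag -> cur_tag) move, plus the new emission
        total = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = total.argmax(axis=0)
        score = total.max(axis=0)
    # follow back-pointers from the best final tag
    best_last = int(score.argmax())
    path = [best_last]
    for t in range(seq_len - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

# tags: 0 = O, 1 = B-NF (begin food name), 2 = I-NF (inside food name)
emissions = np.array([[2.0, 1.0, 0.0],
                      [0.0, 2.0, 1.0],
                      [0.0, 0.5, 2.0]])
transitions = np.array([[0.5, 0.5, -2.0],   # O -> I is discouraged
                        [0.0, -1.0, 1.0],   # B -> I is favoured
                        [0.0, 0.0, 0.5]])
print(viterbi_decode(emissions, transitions))  # → [0, 1, 2], i.e. O, B-NF, I-NF
```

The transition matrix is what lets the model forbid invalid tag sequences such as an I- tag directly after O, which per-token softmax tagging cannot guarantee.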
Year: 2022 PMID: 35528358 PMCID: PMC9071985 DOI: 10.1155/2022/7773259
Source DB: PubMed Journal: Comput Intell Neurosci
Figure 1. Manual annotation instance.
Figure 2. Sequence annotation instance.
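Sequence annotation of the kind shown in Figure 2 is commonly encoded with character-level BIO tags, where each entity span gets a B- (begin) tag followed by I- (inside) tags. A small sketch, with an illustrative English span and the NF (name of food) label from the relation table; the paper itself works on Chinese text:

```python
def bio_tags(text, spans):
    """Convert character-level entity spans into BIO tags.

    spans: list of (start, end, label) tuples, end exclusive.
    """
    tags = ["O"] * len(text)               # default: outside any entity
    for start, end, label in spans:
        tags[start] = f"B-{label}"         # entity begins here
        for i in range(start + 1, end):
            tags[i] = f"I-{label}"         # continuation of the entity
    return tags

# "milk" tagged as a name-of-food (NF) entity; indices are illustrative
print(bio_tags("milk spoiled", [(0, 4, "NF")]))
# → ['B-NF', 'I-NF', 'I-NF', 'I-NF', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O']
```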
Definition of relationships for the BBC dataset.
| ID | Relation name | Abbreviations |
|---|---|---|
| 1 | Time | TI |
| 2 | Scene | SC |
| 3 | Name of food | NF |
| 4 | Food contaminant | FC |
| 5 | Adverse reaction | AR |
| 6 | Manufacturer | MF |
| 7 | Inspection body | IB |
Figure 3. The radical feature construction.
Figure 4. Structure diagram of the BERT-BLSTM-CRF model.
Figure 5. The framework of the relationship extraction model.
Figure 6. Entity extraction model structure diagram.
Experimental dataset.
| Dataset | Training | Validation | Test | Label |
|---|---|---|---|---|
| BBC-DATA | 1500 | 300 | 400 | 7 |
| OP-DATA | 180000 | 50000 | 50000 | 35 |
Figure 7. Details of the BBC experimental dataset.
Figure 8. Different overlap types of relational triples.
BBC experimental dataset division for different entity overlap scenarios.
| Dataset | Training | Validation | Test |
|---|---|---|---|
| Normal | 573 | 96 | 143 |
| EPO | 281 | 35 | 79 |
| SEO | 646 | 169 | 178 |
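The Normal/EPO/SEO split above follows the usual overlap taxonomy for relational triples: EPO (entity pair overlap) means two triples share the same entity pair, SEO (single entity overlap) means triples share one entity but not the whole pair, and Normal means no entity is shared. A sketch of that classification, assuming triples are (head, relation, tail) tuples and resolving sentences that fit both categories as EPO:

```python
def overlap_type(triples):
    """Classify a sentence's triples as 'Normal', 'EPO', or 'SEO'."""
    pairs = [frozenset((head, tail)) for head, _, tail in triples]
    # EPO: some (head, tail) pair appears in more than one triple
    if len(set(pairs)) < len(pairs):
        return "EPO"
    # SEO: two triples share at least one entity (but not the full pair)
    for i in range(len(pairs)):
        for j in range(i + 1, len(pairs)):
            if pairs[i] & pairs[j]:
                return "SEO"
    return "Normal"

# two triples sharing only the head entity (hypothetical example)
print(overlap_type([("brand A", "FC", "melamine"),
                    ("brand A", "AR", "illness")]))  # → SEO
```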
Experimental parameter setting.
| Model | Parameter | Value |
|---|---|---|
| BERT | Layers | 12 |
| BERT | Dimensions | 768 |
| BERT | Learning rate | 5 |
| BERT | Pad size | 128 |
| BERT | Activation function | Tanh |
| BLSTM | Layers | 2 |
| BLSTM | Dimensions | 256 |
| BLSTM | Learning rate | 1 |
| BLSTM | Pad size | 128 |
| BLSTM | Activation function | ReLU, Tanh |
Experimental results of relationship extraction.
| Model | BBC-DATA P (%) | BBC-DATA R (%) | BBC-DATA F1 (%) | OP-DATA P (%) | OP-DATA R (%) | OP-DATA F1 (%) |
|---|---|---|---|---|---|---|
| CNN | 68.77 | 69.22 | 68.99 | 66.37 | 67.65 | 67.00 |
| CNN-ATT | 72.37 | 73.33 | 72.39 | 71.65 | 73.25 | 72.44 |
| BLSTM-ATT | 77.37 | 71.33 | 74.23 | 75.25 | 70.37 | 72.72 |
| BERT-BLSTM-ATT | 86.65 | 86.25 | 87.44 | 84.22 | 83.37 | 83.79 |
| BERT-BLSTM-CRF | 95.48 | 95.12 | 95.30 | 96.15 | 95.82 | 95.98 |
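The F1 column in these tables is the harmonic mean of precision (P) and recall (R). As a check against the CNN row on BBC-DATA:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall (both in percent)."""
    return 2 * precision * recall / (precision + recall)

# CNN row, BBC-DATA: P = 68.77, R = 69.22
print(round(f1_score(68.77, 69.22), 2))  # → 68.99, matching the table
```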
Experimental results of entity extraction.
| Model | BBC-DATA P (%) | BBC-DATA R (%) | BBC-DATA F1 (%) | OP-DATA P (%) | OP-DATA R (%) | OP-DATA F1 (%) |
|---|---|---|---|---|---|---|
| CNN | 69.23 | 69.72 | 69.47 | 66.37 | 67.65 | 67.00 |
| CNN-ATT | 72.89 | 73.88 | 73.38 | 71.65 | 73.25 | 72.44 |
| BLSTM-ATT | 77.85 | 71.87 | 74.74 | 75.25 | 70.37 | 72.72 |
| BERT-BLSTM-ATT | 89.19 | 86.78 | 87.97 | 84.22 | 83.37 | 83.79 |
| BERT-BLSTM-CRF | 92.48 | 91.82 | 92.15 | 93.15 | 92.42 | 92.78 |
Figure 9. Precision, recall, and F1 scores for different numbers of triples.
Figure 10. Precision, recall, and F1 scores of relational triples under different overlapping patterns.
Figure 11. The loss function trend graph.