Travis R Goodwin, Dina Demner-Fushman.
Abstract
Deep neural networks have demonstrated high performance on many natural language processing (NLP) tasks whose answers can be derived directly from text, but have struggled to solve NLP tasks requiring external (e.g., world) knowledge. In this paper, we present OSCR (Ontology-based Semantic Composition Regularization), a method for injecting task-agnostic knowledge from an ontology or knowledge graph into a neural network during pre-training. We evaluated BERT pre-trained on Wikipedia with and without OSCR by fine-tuning on two question answering tasks involving world knowledge and causal reasoning and one requiring domain (healthcare) knowledge, and obtained 33.3%, 18.6%, and 4% improvements in accuracy compared to pre-training BERT without OSCR.
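The abstract names the general pattern (an ontology-derived regularization term added to the pre-training objective) but does not give OSCR's actual formulation. The sketch below is a hypothetical illustration of that pattern only: the squared-distance penalty, the lambda weighting, and all names (ontology_regularizer, related_pairs, mlm_loss) are assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch of ontology-based regularization during pre-training.
# Assumption: the ontology is reduced to pairs of related concept indices,
# and related concepts are encouraged to have nearby embeddings.
import torch

def ontology_regularizer(embeddings: torch.Tensor,
                         related_pairs: torch.Tensor) -> torch.Tensor:
    """Mean squared distance between embeddings of ontologically related concepts.

    embeddings:    (vocab_size, hidden_dim) concept/entity embedding table
    related_pairs: (num_pairs, 2) long tensor of indices linked in the ontology
    """
    left = embeddings[related_pairs[:, 0]]    # (num_pairs, hidden_dim)
    right = embeddings[related_pairs[:, 1]]   # (num_pairs, hidden_dim)
    return ((left - right) ** 2).sum(dim=-1).mean()

# Illustrative use inside a pre-training step (mlm_loss computed elsewhere):
#   reg = ontology_regularizer(model.embeddings.weight, related_pairs)
#   total_loss = mlm_loss + lambda_reg * reg
```

In this reading, the ontology acts as a task-agnostic prior on the embedding space rather than as an input feature, which is consistent with the abstract's description of injecting knowledge "during pre-training".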
Year: 2020 PMID: 33364628 PMCID: PMC7757122 DOI: 10.18653/v1/2020.deelio-1.7
Source DB: PubMed Journal: Proc Conf Empir Methods Nat Lang Process