
Enhancing Question Answering by Injecting Ontological Knowledge through Regularization.

Travis R Goodwin, Dina Demner-Fushman.

Abstract

Deep neural networks have demonstrated high performance on many natural language processing (NLP) tasks that can be answered directly from text, but have struggled to solve NLP tasks requiring external (e.g., world) knowledge. In this paper, we present OSCR (Ontology-based Semantic Composition Regularization), a method for injecting task-agnostic knowledge from an ontology or knowledge graph into a neural network during pre-training. We evaluated BERT pre-trained on Wikipedia with and without OSCR by fine-tuning on two question answering tasks involving world knowledge and causal reasoning and on one requiring domain (healthcare) knowledge, and obtained 33.3%, 18.6%, and 4% improvements in accuracy, respectively, compared to pre-training BERT without OSCR.
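The abstract describes OSCR only at a high level: an ontology-derived regularization term applied during pre-training. Purely as an illustration, a minimal sketch of such a term, assuming a simple distance-based penalty over ontology-linked entity embeddings added to BERT's masked-language-model loss, could look like the following (the names oscr_regularizer, related_pairs, and lam are hypothetical and not taken from the paper):

```python
import torch
import torch.nn.functional as F


def oscr_regularizer(entity_embeddings: torch.Tensor,
                     related_pairs: torch.Tensor,
                     margin: float = 0.5) -> torch.Tensor:
    """Illustrative ontology-based penalty: pulls together the embeddings of
    entity pairs that are linked in the ontology (e.g., by an is-a edge).
    The exact OSCR formulation is not given in the abstract; sketch only."""
    a = entity_embeddings[related_pairs[:, 0]]  # first entity of each pair
    b = entity_embeddings[related_pairs[:, 1]]  # second entity of each pair
    # Only distances beyond the margin contribute to the penalty.
    distance = F.pairwise_distance(a, b)
    return torch.clamp(distance - margin, min=0.0).mean()


def pretraining_loss(mlm_loss: torch.Tensor,
                     entity_embeddings: torch.Tensor,
                     related_pairs: torch.Tensor,
                     lam: float = 0.1) -> torch.Tensor:
    # Total objective = standard masked-language-model loss
    # plus the weighted ontology regularizer.
    return mlm_loss + lam * oscr_regularizer(entity_embeddings, related_pairs)
```

In this sketch the regularizer is simply added to the usual pre-training objective, so the ontological constraint shapes the learned representations without changing the downstream fine-tuning procedure.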


Year:  2020        PMID: 33364628      PMCID: PMC7757122          DOI: 10.18653/v1/2020.deelio-1.7

Source DB:  PubMed          Journal:  Proc Conf Empir Methods Nat Lang Process


