| Literature DB >> 35769927 |
Yiming Cui, Wei-Nan Zhang, Ting Liu.
Abstract
The attention mechanism plays an important role in the machine reading comprehension (MRC) model. Here, we describe a pipeline for building an MRC model with a pretrained language model and visualizing the effect of each attention zone in different layers, which can indicate the explainability of the model. With the presented protocol and accompanying code, researchers can easily visualize the relevance of each attention zone in the MRC model. This approach can be generalized to other pretrained language models. For complete details on the use and execution of this protocol, please refer to Cui et al. (2022).
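As a rough illustration of the idea behind "attention zones": a transformer layer produces a token-by-token attention matrix, and the relevance of a zone (e.g. question-to-passage) can be summarized as the attention mass flowing between two token spans. The sketch below is a minimal, self-contained illustration and not the protocol's actual code; the toy vectors, the `zone_relevance` helper, and the span indices are hypothetical.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_matrix(Q, K):
    """Scaled dot-product attention weights: softmax(QK^T / sqrt(d))."""
    d = len(Q[0])
    scores = [
        [sum(q[i] * k[i] for i in range(d)) / math.sqrt(d) for k in K]
        for q in Q
    ]
    return [softmax(row) for row in scores]

def zone_relevance(attn, query_span, key_span):
    """Mean attention mass flowing from one token span (e.g. the question)
    to another (e.g. the passage) -- one 'attention zone' (hypothetical helper)."""
    total = sum(attn[i][j] for i in query_span for j in key_span)
    return total / len(query_span)

# Toy 4-token sequence: tokens 0-1 play the question, 2-3 the passage.
Q = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]]
K = Q
attn = attention_matrix(Q, K)
q2p = zone_relevance(attn, [0, 1], [2, 3])  # question-to-passage zone score
```

In practice one would read the per-layer, per-head attention tensors out of the pretrained model (rather than computing them from toy vectors) and render each zone's score as a heatmap to compare layers.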
Keywords: Bioinformatics; Cognitive Neuroscience; Computer sciences; Systems biology
Year: 2022 PMID: 35769927 PMCID: PMC9234076 DOI: 10.1016/j.xpro.2022.101481
Source DB: PubMed Journal: STAR Protoc ISSN: 2666-1667