
Publication

CopyBERT: A Unified Approach to Question Generation with Self-Attention

Stalin Varanasi; Saadullah Amin; Günter Neumann
In: Proceedings of the 2nd Workshop on NLP for Conversational AI (NLPConvAI-2020), July 9, pages 25-31, ISBN 978-1-952148-08-8, ACL, 2020.

Abstract

Contextualized word embeddings provide better initialization for neural networks that deal with various natural language understanding (NLU) tasks, including Question Answering (QA) and, more recently, Question Generation (QG). Apart from providing meaningful word representations, pre-trained transformer models such as BERT also provide self-attentions, which encode syntactic information that can be probed for dependency parsing and POS tagging. In this paper, we show that the information from the self-attentions of BERT is useful for language modeling of questions conditioned on paragraph and answer phrases. To control the attention span, we use a semi-diagonal mask and utilize a shared model for encoding and decoding, unlike sequence-to-sequence models. We further employ a copy mechanism over self-attentions to achieve state-of-the-art results for Question Generation on the SQuAD dataset.
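The abstract mentions a semi-diagonal mask that lets one shared BERT model act as both encoder and decoder. The sketch below is only an illustration of that general idea (a UniLM-style mask over a packed context+question sequence), not the authors' implementation; the function name, the argument names, and the exact block layout are assumptions for demonstration.

```python
# Minimal sketch (not the paper's code): a "semi-diagonal" attention mask,
# assuming the input is packed as [context tokens ...][question tokens ...].
# Context positions attend bidirectionally within the context; question
# positions attend to the full context and, causally, to earlier question
# tokens, so the question block of the mask is lower-triangular.
import torch

def semi_diagonal_mask(ctx_len: int, qst_len: int) -> torch.Tensor:
    """Return a (ctx_len+qst_len) x (ctx_len+qst_len) boolean mask,
    True where attention is allowed."""
    total = ctx_len + qst_len
    mask = torch.zeros(total, total, dtype=torch.bool)
    # Context block: full bidirectional attention within the context.
    mask[:ctx_len, :ctx_len] = True
    # Question rows: every question token can see the whole context ...
    mask[ctx_len:, :ctx_len] = True
    # ... plus itself and earlier question tokens (causal lower triangle).
    mask[ctx_len:, ctx_len:] = torch.tril(torch.ones(qst_len, qst_len)).bool()
    return mask

# Example: 5 context tokens followed by 3 question tokens.
print(semi_diagonal_mask(5, 3).int())
```

With a mask of this shape fed to a single transformer, the same set of weights encodes the paragraph/answer and generates the question token by token, which is what allows one shared model to replace a separate encoder-decoder pair.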
