Conference paper

Mark my Word: A Sequence-to-Sequence Approach to Definition Modeling

Timothee Mickus
Université de Lorraine, CNRS, ATILF, France

Denis Paperno
Utrecht University, The Netherlands

Mathieu Constant
Université de Lorraine, CNRS, ATILF, France

In: DL4NLP 2019. Proceedings of the First NLPL Workshop on Deep Learning for Natural Language Processing, 30 September, 2019, University of Turku, Turku, Finland

Linköping Electronic Conference Proceedings 163:1, pp. 1-11

NEALT Proceedings Series 38:1, pp. 1-11

Published: 2019-09-27

ISBN: 978-91-7929-999-6

ISSN: 1650-3686 (print), 1650-3740 (online)

Abstract

Defining words in a textual context is a useful task both for practical purposes and for gaining insight into distributed word representations. Building on the distributional hypothesis, we argue here that the most natural formalization of definition modeling is to treat it as a sequence-to-sequence task, rather than a word-to-sequence task: given an input sequence with a highlighted word, generate a contextually appropriate definition for it. We implement this approach in a Transformer-based sequence-to-sequence model. Our proposal allows us to train contextualization and definition generation in an end-to-end fashion, which is a conceptual improvement over earlier works. We achieve state-of-the-art results both in contextual and non-contextual definition modeling.
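The abstract frames contextual definition modeling as a sequence-to-sequence task: the context sentence is the source sequence, the word to be defined is highlighted within it, and the definition is the target sequence. The sketch below only illustrates that framing and is not the authors' implementation; the marker tokens (`<def>`, `</def>`), the toy vocabulary, and the generic PyTorch `nn.Transformer` configuration are assumptions made for the example, and positional encodings are omitted for brevity.

```python
# Illustrative sketch (not the paper's code): definition modeling as seq2seq
# with the target word highlighted by marker tokens in the source sequence.
import torch
import torch.nn as nn


def mark_word(context_tokens, target_index, open_tok="<def>", close_tok="</def>"):
    """Surround the word to be defined with (hypothetical) marker tokens."""
    marked = list(context_tokens)
    marked.insert(target_index + 1, close_tok)  # close marker after the word
    marked.insert(target_index, open_tok)       # open marker before the word
    return marked


# Example: define "bank" as it appears in this context.
context = ["she", "sat", "on", "the", "bank", "of", "the", "river"]
source = mark_word(context, target_index=4)
target = ["the", "land", "alongside", "a", "river", "or", "lake"]

# Toy vocabulary built from the example alone (assumption for illustration).
vocab = {tok: i for i, tok in enumerate(
    ["<pad>", "<bos>", "<eos>", "<def>", "</def>"] + sorted(set(context + target)))}


def encode(tokens):
    # Wrap a token sequence with <bos>/<eos> and map to a batch of ids.
    return torch.tensor([[vocab["<bos>"]] + [vocab[t] for t in tokens] + [vocab["<eos>"]]])


src = encode(source)  # shape: (1, src_len)
tgt = encode(target)  # shape: (1, tgt_len)

# Generic Transformer encoder-decoder; dimensions here are arbitrary.
d_model = 64
embed = nn.Embedding(len(vocab), d_model)
model = nn.Transformer(d_model=d_model, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       batch_first=True)
out_proj = nn.Linear(d_model, len(vocab))

# Teacher forcing: predict tgt[1:] from tgt[:-1], conditioned on the marked source.
tgt_in, tgt_out = tgt[:, :-1], tgt[:, 1:]
causal_mask = model.generate_square_subsequent_mask(tgt_in.size(1))
hidden = model(embed(src), embed(tgt_in), tgt_mask=causal_mask)
logits = out_proj(hidden)
loss = nn.functional.cross_entropy(logits.reshape(-1, len(vocab)), tgt_out.reshape(-1))
print(source, float(loss))
```

Under this framing, the encoder contextualizes the highlighted word and the decoder generates the definition, so both can be trained jointly on (context, word, definition) examples with a single objective, which is the end-to-end setup the abstract describes.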

Keywords

natural language generation, word embeddings, distributional semantics, definition modeling, word embeddings analysis
