Article | Proceedings of the 22nd Nordic Conference on Computational Linguistics (NoDaLiDa), September 30 - October 2, Turku, Finland | Language Modeling with Syntactic and Semantic Representation for Sentence Acceptability Predictions Linköping University Electronic Press Conference Proceedings

Title:
Language Modeling with Syntactic and Semantic Representation for Sentence Acceptability Predictions
Author:
Adam Ek, Jean-Philippe Bernardy, Shalom Lappin: Centre for Linguistic Theory and Studies in Probability, Department of Philosophy, Linguistics and Theory of Science, University of Gothenburg, Sweden
Download:
Full text (pdf)
Year:
2019
Conference:
Proceedings of the 22nd Nordic Conference on Computational Linguistics (NoDaLiDa), September 30 - October 2, Turku, Finland
Issue:
167
Article no.:
008
Pages:
76--85
No. of pages:
9
Publication type:
Abstract and Fulltext
Published:
2019-10-02
ISBN:
978-91-7929-995-8
Series:
Linköping Electronic Conference Proceedings
ISSN (print):
1650-3686
ISSN (online):
1650-3740
Series:
NEALT Proceedings Series
Publisher:
Linköping University Electronic Press, Linköpings universitet



In this paper, we investigate the effect of enhancing lexical embeddings in LSTM language models (LMs) with syntactic and semantic representations. We evaluate the language models with perplexity, and we assess their performance on the task of predicting human sentence acceptability judgments. We train LSTM language models on sentences automatically annotated with universal syntactic dependency roles (Nivre, 2016), dependency depth, and universal semantic tags (Abzianidze et al., 2017) to predict sentence acceptability judgments. Our experiments indicate that syntactic tags lower perplexity, while semantic tags increase it. They also show that neither syntactic nor semantic tags improve the performance of LSTM language models on the task of predicting sentence acceptability judgments.
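The enrichment strategy described above can be sketched as follows. This is not the authors' code; it is a minimal illustration, assuming hypothetical vocabulary and embedding sizes, of one common way to combine a lexical embedding with embeddings for a token's dependency role, dependency depth, and semantic tag before feeding the result to an LSTM.

```python
import numpy as np

# Hypothetical sizes (not from the paper).
VOCAB, ROLES, SEMTAGS, MAX_DEPTH = 1000, 40, 70, 20
D_WORD, D_TAG = 64, 8

rng = np.random.default_rng(0)
word_emb = rng.standard_normal((VOCAB, D_WORD))
role_emb = rng.standard_normal((ROLES, D_TAG))
semtag_emb = rng.standard_normal((SEMTAGS, D_TAG))
depth_emb = rng.standard_normal((MAX_DEPTH, D_TAG))

def enrich(word_id, role_id, depth, semtag_id):
    """Concatenate lexical and tag embeddings into a single input vector
    for the LSTM language model (depth is clipped to the embedding table)."""
    return np.concatenate([
        word_emb[word_id],
        role_emb[role_id],
        depth_emb[min(depth, MAX_DEPTH - 1)],
        semtag_emb[semtag_id],
    ])

x = enrich(word_id=5, role_id=3, depth=2, semtag_id=7)
print(x.shape)  # (88,) — D_WORD + 3 * D_TAG
```

With this design, the LSTM itself is unchanged; only the dimensionality of its input grows, which is why the syntactic and semantic signals can in principle help or hurt perplexity independently of the model architecture.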

Keywords: sentence acceptability, language modeling, semantic representations, syntactic representations
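The perplexity measure used in the abstract's evaluation is the exponential of the average negative log-probability the model assigns per token. A minimal sketch, assuming natural-log probabilities:

```python
import math

def perplexity(token_log_probs):
    """Perplexity from per-token natural-log probabilities assigned by the LM."""
    avg_nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_nll)

# Sanity check: a uniform model over a 4-word vocabulary has perplexity 4.
logps = [math.log(0.25)] * 10
print(perplexity(logps))  # 4.0
```

Lower perplexity means the model spreads less probability mass over wrong continuations, which is why the paper uses it to compare tag-enriched models against the lexical baseline.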


