Conference article

Capturing Dependency Syntax with "Deep" Sequential Models

Yoav Goldberg
Bar Ilan University, Department of Computer Science, Ramat-Gan, Israel


Part of: Proceedings of the Fourth International Conference on Dependency Linguistics (Depling 2017), September 18-20, 2017, Università di Pisa, Italy

Linköping Electronic Conference Proceedings 139:1, p. 1


Published: 2017-09-13

ISBN: 978-91-7685-467-9

ISSN: 1650-3686 (tryckt), 1650-3740 (online)

Abstract

Neural network (“deep learning”) models are taking machine learning approaches to language by storm. In particular, recurrent neural networks (RNNs), which are flexible non-Markovian models of sequential data, have been shown to be effective for a variety of language processing tasks. Somewhat surprisingly, these seemingly purely sequential models are very capable of modeling syntactic phenomena, and using them results in very strong dependency parsers for a variety of languages. In this talk, I will briefly describe recurrent networks and present empirical evidence for their ability to learn the subject-verb agreement relation in naturally occurring text from relatively indirect supervision. This part is based on my joint work with Tal Linzen and Emmanuel Dupoux. I will then describe bi-directional recurrent networks, a simple extension of recurrent networks, and show how they can be used as the basis of state-of-the-art dependency parsers. This is based on my work with Eliyahu Kiperwasser, but will also touch on work by other researchers in that space.
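To make the parsing idea concrete, the following is a minimal sketch (not the authors' code) of the general approach the abstract refers to: encode a sentence with a bi-directional LSTM and score candidate head-modifier arcs from pairs of the resulting word vectors, in the spirit of Kiperwasser and Goldberg's BiLSTM-based parser. It assumes PyTorch, and all class names, dimensions, and hyperparameters below are hypothetical choices for illustration.

# Illustrative sketch only: BiLSTM sentence encoder + pairwise arc scorer.
# Names and hyperparameters are assumptions, not the published model.
import torch
import torch.nn as nn


class BiLSTMArcScorer(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, lstm_dim=125, mlp_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bi-directional LSTM: each word gets a vector summarizing the
        # whole sentence from both directions.
        self.bilstm = nn.LSTM(emb_dim, lstm_dim, batch_first=True,
                              bidirectional=True)
        # Small MLP that scores a (head, modifier) pair of BiLSTM vectors.
        self.arc_mlp = nn.Sequential(
            nn.Linear(4 * lstm_dim, mlp_dim),
            nn.Tanh(),
            nn.Linear(mlp_dim, 1),
        )

    def forward(self, word_ids):
        # word_ids: (1, n) tensor of token indices for one sentence.
        vecs, _ = self.bilstm(self.embed(word_ids))   # (1, n, 2*lstm_dim)
        vecs = vecs.squeeze(0)                        # (n, 2*lstm_dim)
        n = vecs.size(0)
        # Concatenate every head vector with every modifier vector and score.
        heads = vecs.unsqueeze(1).expand(n, n, -1)
        mods = vecs.unsqueeze(0).expand(n, n, -1)
        pairs = torch.cat([heads, mods], dim=-1)      # (n, n, 4*lstm_dim)
        return self.arc_mlp(pairs).squeeze(-1)        # (n, n) arc scores


# Toy usage: score all candidate arcs for a 5-token sentence.
scorer = BiLSTMArcScorer(vocab_size=1000)
sentence = torch.tensor([[4, 17, 23, 8, 2]])
scores = scorer(sentence)   # scores[h, m] = score of arc from head h to modifier m
print(scores.shape)         # torch.Size([5, 5])

In a full parser, these arc scores would feed a decoder (graph-based, e.g. maximum spanning tree, or transition-based) and be trained against treebank annotations; the sketch only shows the BiLSTM feature-extraction step that the talk highlights.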

Keywords

No keywords available

References

No references available
