License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/DagRep.7.1.129
URN: urn:nbn:de:0030-drops-72489
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2017/7248/


Blunsom, Phil; Cho, Kyunghyun; Dyer, Chris; Schütze, Hinrich
Further contributors (eds. etc.): Phil Blunsom, Kyunghyun Cho, Chris Dyer and Hinrich Schütze

From Characters to Understanding Natural Language (C2NLU): Robust End-to-End Deep Learning for NLP (Dagstuhl Seminar 17042)

PDF: dagrep_v007_i001_p129_s17042.pdf (1 MB)


Abstract

This report documents the program and the outcomes of Dagstuhl Seminar 17042 "From Characters to Understanding Natural Language (C2NLU): Robust End-to-End Deep Learning for NLP". The seminar brought together researchers from different fields, including natural language processing, computational linguistics, deep learning and general machine learning. Thirty-one participants from 22 academic and industrial institutions discussed the advantages and challenges of using characters, i.e., "raw text", as input for deep learning models instead of language-specific tokens. Eight talks provided overviews of different topics, approaches and challenges in current natural language processing research. In five working groups, the participants discussed current natural language processing/understanding topics in the context of character-based modeling, namely morphology, machine translation, representation learning, end-to-end systems and dialogue. In most of the discussions, the need for more detailed model analysis was pointed out. Especially for character-based input, it is important to analyze what a deep learning model is able to learn about language, for example about tokens, morphology or syntax in general. For an efficient and effective understanding of language, it might furthermore be beneficial to share representations learned from multiple objectives, enabling the models to focus on their specific understanding task instead of first having to learn the syntactic regularities of the language. Therefore, the benefits and challenges of transfer learning were an important topic of the working groups as well as of the panel discussion and the final plenary discussion.
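
The seminar's central idea, feeding raw characters rather than language-specific word tokens into a neural model, can be illustrated with a minimal sketch. The code below is not taken from the report; it assumes PyTorch and uses a simple character-level LSTM encoder, with character codes themselves serving as the input vocabulary.

# Minimal, illustrative sketch (assumption: PyTorch is available).
# Raw text is mapped to character codes and encoded directly,
# without any language-specific tokenizer.
import torch
import torch.nn as nn

class CharEncoder(nn.Module):
    def __init__(self, n_chars=256, emb_dim=32, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(n_chars, emb_dim)   # one embedding per character
        self.rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

    def forward(self, char_ids):                      # char_ids: (batch, seq_len)
        emb = self.embed(char_ids)
        _, (h, _) = self.rnn(emb)
        return h[-1]                                  # fixed-size text representation

text = "From Characters to Understanding"
ids = torch.tensor([[ord(c) for c in text]])          # "raw text" as character codes
encoder = CharEncoder()
print(encoder(ids).shape)                             # torch.Size([1, 128])

Because the input vocabulary is simply the set of characters, no tokenizer or word vocabulary has to be built per language; whether such a model actually learns token-like, morphological or syntactic structure is exactly the kind of analysis the seminar discussions called for.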

BibTeX - Entry

@Article{blunsom_et_al:DR:2017:7248,
  author =	{Phil Blunsom and Kyunghyun Cho and Chris Dyer and Hinrich Sch{\"u}tze},
  title =	{{From Characters to Understanding Natural Language (C2NLU): Robust End-to-End Deep Learning for NLP (Dagstuhl Seminar 17042)}},
  pages =	{129--157},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2017},
  volume =	{7},
  number =	{1},
  editor =	{Phil Blunsom and Kyunghyun Cho and Chris Dyer and Hinrich Sch{\"u}tze},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{http://drops.dagstuhl.de/opus/volltexte/2017/7248},
  URN =		{urn:nbn:de:0030-drops-72489},
  doi =		{10.4230/DagRep.7.1.129},
  annote =	{Keywords: Natural Language Understanding, Artificial Intelligence, Deep Learning, Natural Language Processing, Representation Learning}
}

Keywords: Natural Language Understanding, Artificial Intelligence, Deep Learning, Natural Language Processing, Representation Learning
Collection: Dagstuhl Reports, Volume 7, Issue 1
Issue Date: 2017
Date of publication: 08.06.2017

