Séminaire FRIIAM
Natural Language Processing and Machine Translation with Deep Neural Networks
Holger Schwenk, Facebook Research Paris, and Professor of Computer Science


In this talk I'll give an overview of the history and current research in natural language processing using deep neural networks. First, the principles of popular neural network architectures will be reviewed, namely feed-forward and recurrent neural networks, as well as Long Short-Term Memory units (LSTM). The talk will then focus on two areas where deep neural networks are particularly successful: language modeling and machine translation. The different approaches are discussed in detail, and results are presented for several large-scale applications. The talk will conclude with an outlook on possible future research directions.

The talk will be given in French.


Last updated on 2016-04-01