Avalon Seminar: A decade of progress in Deep Learning
When? | On 13/12/2016, from 14:00 to 15:00
---|---
Where? | Amphi D - ENS de Lyon - Site Monod
Abstract
Although artificial neural networks have existed as a concept since the seventies, they were rarely used because of their high training cost and low performance. Recent research has dramatically improved both aspects in just a few years. In this presentation, I will explain the innovations that contributed the most to this improvement.
Media often over-simplify this improvement by attributing it to the use of GPUs and bigger datasets. These are indeed the main factors. However, contributions in network architecture (LSTM, parallel paths, attention mechanisms), training (variants of gradient descent, dropout, batch normalization), and initialization (from pretraining to clever random distributions) also played a big role.
After a brief introduction to deep learning, I will present these innovations, focusing on text-understanding applications. I will conclude with a discussion of the state of the art in text understanding.