Keynote – Vincent Gripon, IMT-Atlantique
– Tuesday, May 7, 2019
Robust Deep Learning Inference with Limited Resources
Abstract: Deep learning architectures are the gold standard for many machine learning problems. Thanks to their large number of trainable parameters, they are able to absorb complex dependencies in the input data and produce correct decisions when trained appropriately. However, this dependence on a very large number of parameters is also a weakness: their computation and memory footprints are considerable, and it is hard — if not impossible — to guarantee their ability to perform well when dealing with corrupted and noisy inputs. In this talk, we shall review the main strategies that have been proposed in the literature to reduce the computation and memory requirements of deep learning systems, including quantization, factorization, and pruning. We shall also discuss how robust these systems are to faulty implementations. Finally, we will discuss the susceptibility of deep learning architectures to deviations in their inputs, which appears to have become a major open question.
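To give a flavor of two of the compression strategies mentioned in the abstract, here is a minimal NumPy sketch of magnitude-based pruning and uniform weight quantization applied to a single hypothetical weight matrix. This is an illustrative toy, not the speaker's method: the layer shape, sparsity level, and bit width are arbitrary choices for the example.

```python
import numpy as np

# Hypothetical weight matrix standing in for one layer of a trained network.
rng = np.random.default_rng(0)
weights = rng.normal(size=(64, 64)).astype(np.float32)

def magnitude_prune(w, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest magnitude."""
    threshold = np.quantile(np.abs(w), sparsity)
    return np.where(np.abs(w) < threshold, 0.0, w)

def uniform_quantize(w, bits):
    """Uniformly quantize weights to 2**bits levels over their value range."""
    levels = 2 ** bits - 1
    lo, hi = float(w.min()), float(w.max())
    scale = (hi - lo) / levels
    return np.round((w - lo) / scale) * scale + lo

pruned = magnitude_prune(weights, sparsity=0.9)      # ~90% of entries set to zero
quantized = uniform_quantize(weights, bits=4)        # at most 16 distinct values
```

Pruning shrinks the effective parameter count (sparse storage and compute), while quantization shrinks the bits per parameter; in practice the two are often combined and followed by fine-tuning to recover accuracy.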
Bio: Vincent Gripon is a permanent researcher with IMT-Atlantique. He obtained his M.S. from École Normale Supérieure Paris-Saclay in 2008 and his PhD from IMT-Atlantique in 2011. He spent one year as a visiting scientist at McGill University between 2011 and 2012, and he is currently an invited Professor at Mila and Université de Montréal. His research mainly focuses on efficient implementation of artificial neural networks, graph signal processing, deep learning robustness, and associative memories. He has co-authored more than 70 papers in these domains in prestigious venues (AAAI, tPAMI, Statistical Physics, TNNLS, TSP…).