
## June 11 – July 6, 2018 » Causal Inference in the Presence of Dependence and Network Structure

#### Organizers: Erica E.M. Moodie (McGill), David A. Stephens (McGill), Alexandra M. Schmidt (McGill)

## April 16, 2018 » Deep Learning for AI by Yoshua Bengio

**Monday, April 16, 11:30 – 12:30**

**Université de Montréal**

**Pavillon André-Aisenstadt**

**Room 1360**

**Opening keynote lecture**

*Deep Learning for AI*

**Yoshua Bengio (Université de Montréal)**

There has been impressive recent progress with brain-inspired statistical learning algorithms based on the idea of learning multiple levels of representation, also known as neural networks or deep learning. They shine in artificial intelligence tasks involving the perception and generation of sensory data such as images or sounds, and to some extent in understanding and generating natural language. We have proposed new generative models whose training frameworks differ sharply from the traditional maximum likelihood framework and instead borrow from game theory. Theoretical understanding of the success of deep learning is work in progress, but it relies on representation aspects as well as optimization aspects, and the two interact. At the heart is the ability of these learning mechanisms to capitalize on the compositional nature of the underlying data distributions: some functions can be represented exponentially more efficiently with deep distributed networks than with approaches, such as standard non-parametric methods, that lack both depth and distributed representations. On the optimization side, we now have evidence that local minima (due to the highly non-convex nature of the training objective) may not be as much of a problem as was thought a few years ago, and that training with variants of stochastic gradient descent actually helps to quickly find solutions that generalize well. Finally, interesting new questions and answers are arising regarding learning theory for deep networks: why even very large networks do not necessarily overfit, and how the representation-forming structure of these networks may give rise to better error bounds that do not depend strictly on the i.i.d. data hypothesis.
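The game-theoretic training framework mentioned in the abstract can be made concrete with a small adversarial example. The sketch below, written in PyTorch, pits a generator against a discriminator in a two-player game instead of maximizing a likelihood; the toy task, network sizes, and hyperparameters are all illustrative assumptions and are not taken from the talk.

```python
# Minimal sketch of adversarial (minimax) training: a generator G and a
# discriminator D play a two-player game rather than fitting a likelihood.
# Toy task (an assumption for illustration): G learns to mimic N(3, 1).
import torch
import torch.nn as nn

torch.manual_seed(0)

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))  # noise -> sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))  # sample -> logit

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = 3.0 + torch.randn(64, 1)      # samples from the target distribution
    fake = G(torch.randn(64, 8))         # samples from the generator

    # Discriminator step: learn to tell real data from generated data.
    d_loss = (bce(D(real), torch.ones(64, 1))
              + bce(D(fake.detach()), torch.zeros(64, 1)))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step: the other side of the game, fool the discriminator.
    g_loss = bce(D(G(torch.randn(64, 8))), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

with torch.no_grad():
    samples = G(torch.randn(1000, 8))
# With these settings the generated mean and std should approach 3 and 1.
print(f"generated mean ~ {samples.mean().item():.2f}, "
      f"std ~ {samples.std().item():.2f}")
```

Note how neither network minimizes a fixed objective on its own: each loss depends on the other player's current parameters, which is precisely the departure from maximum likelihood that the abstract describes.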

## March 12 – 16, 2018 » Workshop in Geometric Analysis

**Workshop organizers: Pengfei Guan (McGill), Alina Stancu (Concordia), Gábor Székelyhidi (University of Notre Dame), Jérôme Vétois (McGill), Ben Weinkove (Northwestern University)**


## March 12 – 16, 2018 » Nirenberg Lectures by Eugenia Malinnikova

**CRM Nirenberg Lectures organizers: Pengfei Guan (McGill), Dima Jakobson (McGill), Iosif Polterovich (Montréal), Alina Stancu (Concordia)**

The CRM Nirenberg Lectures in Geometric Analysis have taken place every year since 2014. The series is named in honour of Louis Nirenberg, one of the most prominent geometric analysts of our time. The 2018 lectures will be delivered by Professor Eugenia Malinnikova from the Norwegian University of Science and Technology in Trondheim. Malinnikova's contributions include groundbreaking joint work with A. Logunov on the nodal geometry of Laplace eigenfunctions, which has led to the proof of two major conjectures in the field, due to Shing-Tung Yau and Nikolai Nadirashvili. Her research achievements have been recognized with the 2017 Clay Research Award and an invitation to speak at the 2018 ICM in Rio de Janeiro.