Conference on Mathematical Theory of Deep Neural Networks

Oct 31 - Nov 1,  2019

Important Dates

Call for submissions:
Submissions closed: Jun 28, 2019
Notifications of decision sent:
Registration open:
Registration deadline: Oct 31 - Nov 1, 2019


Recent advances in deep neural networks (DNNs), combined with open, easily accessible implementations, have made DNNs a powerful, versatile method used widely in both machine learning and neuroscience. These advances in practical results, however, have far outpaced a formal understanding of these networks and their training. The dearth of rigorous analysis for these techniques limits their usefulness in addressing scientific questions and, more broadly, hinders systematic design of the next generation of networks. Recently, long-overdue theoretical results have begun to emerge from researchers in a number of fields. The purpose of this conference is to give visibility to these results, and to those that will follow in their wake, to shed light on the properties of large, adaptive, distributed learning architectures, and to advance our understanding of these systems.

Invited Speakers

Anima Anandkumar
California Institute of Technology

Yasaman Bahri
Google Brain

Minmin Chen

Michael Elad

Surya Ganguli
Stanford University

David Schwab
The Graduate Center,
The City University of New York

Shai Shalev-Shwartz
Hebrew University

Tomaso Poggio
Massachusetts Institute of Technology

Haim Sompolinsky
Hebrew University

Naftali Tishby
Hebrew University