Saturday, January 27, 2018

Mathematical Theory of Deep Neural Networks (Princeton workshop)

This looks interesting. Deep Learning would benefit from a stronger theoretical understanding of why it works so well. I hope they put the talks online!

Mathematical Theory of Deep Neural Networks
Tuesday, March 20th, Princeton Neuroscience Institute
PNI Psychology Lecture Hall 101

Recent advances in deep networks, combined with open, easily accessible implementations, have moved empirical results far faster than formal understanding. The lack of rigorous analysis for these techniques limits their use in addressing scientific questions in the physical and biological sciences, and prevents systematic design of the next generation of networks. Recently, long-overdue theoretical results have begun to emerge. These results, and those that follow in their wake, will begin to shed light on the properties of large, adaptive, distributed learning architectures, and stand to revolutionize how computer science and neuroscience understand these systems.

This intensive one-day technical workshop will focus on state-of-the-art theoretical understanding of deep learning. We aim to bring together researchers from the Princeton Neuroscience Institute (PNI) and the theoretical machine learning group at the Institute for Advanced Study (IAS) who are interested in a more rigorous understanding of deep networks, in order to foster discussion and collaboration between these closely related groups.