Yaoyu Zhang, Institute for Advanced Study at Princeton
Room 305, No.5 Science Building
In this talk, I will present my theoretical work on deep learning as well as ongoing and future projects on its applications. For the theoretical part, I will discuss quantitative theories of the Frequency Principle and the impact of initialization. Together, these works provide an intuitive explanation of a key puzzle in deep learning theory, namely, why overparameterized neural networks often generalize well. For the application part, I will discuss the advantages of DNN-based approaches. Specifically, I will demonstrate how DNNs can be used to assist the modeling and analysis of the nonlinear dynamics of neuronal circuits. I will also discuss potential applications of DNNs to other scientific problems.
Yaoyu Zhang received his B.S. in Applied Physics from Zhiyuan College and his Ph.D. in Applied Mathematics from the INS and the School of Mathematical Sciences of Shanghai Jiao Tong University. He worked as a postdoctoral associate at NYU Abu Dhabi and the Courant Institute from 2016 to 2019. He is currently a member of the Institute for Advanced Study at Princeton. His research interests lie in the theory and application of deep learning, as well as computational neuroscience.