Learning Overparametrized Neural Networks and Statistical Models
Speaker: Pengkun Yang, Tsinghua University
Venue: Room 1114, Science Building No. 1
Modern machine learning has repeatedly exhibited puzzling empirical phenomena that defy classical statistical theory. Learning with overparametrized models is becoming the norm in data-analytic applications, yet the tension with memorization rarely troubles practitioners. In this talk, I will discuss the training of overparametrized neural networks from both the neural tangent kernel and the mean-field perspectives, which guarantee global convergence despite the non-convexity of the optimization landscape. I will also discuss further interesting phenomena in a series of overparametrized statistical problems.
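To give a flavor of the neural-tangent-kernel phenomenon mentioned above, here is a minimal sketch (my own illustration, not material from the talk): plain gradient descent on a wide two-layer ReLU network with fixed output weights. All sizes, the learning rate, and the random seed are illustrative choices; in the large-width regime the training dynamics are nearly linear, so the loss decreases steadily even though the landscape is non-convex.

```python
import numpy as np

# Illustrative NTK-regime sketch: a wide two-layer ReLU network
#   f(x) = (1/sqrt(m)) * sum_j a_j * relu(w_j . x),
# training only the hidden weights W with gradient descent on squared loss.
# Widths, learning rate, and seed are arbitrary choices for this demo.
rng = np.random.default_rng(0)
n, d, m = 8, 3, 2000                      # samples, input dim, hidden width
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

W = rng.standard_normal((m, d))           # trained hidden weights
a = rng.choice([-1.0, 1.0], size=m)       # fixed random output weights

def forward(W):
    return (np.maximum(X @ W.T, 0.0) @ a) / np.sqrt(m)

lr = 0.1
loss0 = 0.5 * np.sum((forward(W) - y) ** 2)
for _ in range(500):
    pre = X @ W.T                         # (n, m) pre-activations
    err = forward(W) - y                  # (n,) residuals
    # gradient of 0.5 * ||f(X) - y||^2 with respect to W
    grad = ((err[:, None] * (pre > 0) * a[None, :]).T @ X) / np.sqrt(m)
    W -= lr * grad
loss1 = 0.5 * np.sum((forward(W) - y) ** 2)
print(loss0, loss1)                       # the loss decreases substantially
```

At large width, the Jacobian of the network barely moves during training, so gradient descent behaves like kernel regression with the (fixed) tangent kernel; that is the mechanism behind the global-convergence guarantees the talk refers to.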
About the Speaker:
Pengkun Yang is an assistant professor in the Center for Statistical Science at Tsinghua University. Prior to joining Tsinghua, he was a Postdoctoral Research Associate in the Department of Electrical Engineering at Princeton University. He received a Ph.D. degree (2018) and a master's degree (2016) from the Department of Electrical and Computer Engineering at the University of Illinois at Urbana-Champaign, and a B.E. degree (2013) from the Department of Electronic Engineering at Tsinghua University. His research interests include statistical inference, learning, optimization, and systems. He is a recipient of the Thomas M. Cover Dissertation Award in 2020 and of the Jack Keil Wolf ISIT Student Paper Award at the 2015 IEEE International Symposium on Information Theory (semi-plenary talk).
Your participation is warmly welcomed!