Abstract: Recent theoretical work has shown that massively overparameterized neural networks are equivalent to kernel regressors whose kernels are called Neural Tangent Kernels (NTK). Experiments indicate that these kernels yield accuracies roughly equal to those obtained with their real neural network counterparts. Our work aims to better understand the properties of NTK and relate them to properties of real neural networks. In particular, I will argue that NTK exhibits a bias toward low-frequency predictions, potentially explaining why overparameterized networks do not overfit their training data. I will further discuss the behavior of NTK when the data is distributed nonuniformly, and finally show that NTK is closely related to the classical Laplace kernel, which has a simple closed form. Our results suggest that much insight about neural networks can be obtained from the analysis of NTK.
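For concreteness, the classical Laplace kernel mentioned above has the simple closed form k(x, x') = exp(-c * ||x - x'||) for inputs x, x' and a bandwidth parameter c > 0. The short Python sketch below is our illustration only, not material from the talk; the bandwidth c, the small ridge term, and the toy data are arbitrary choices. It shows the kind of kernel regressor the abstract refers to, built from this kernel:

import numpy as np

def laplace_kernel(X, Z, c=1.0):
    # Classical Laplace kernel k(x, z) = exp(-c * ||x - z||),
    # evaluated for all pairs of rows of X and Z.
    dists = np.linalg.norm(X[:, None, :] - Z[None, :, :], axis=-1)
    return np.exp(-c * dists)

rng = np.random.default_rng(0)
X_train = rng.uniform(-1, 1, size=(50, 1))   # toy 1-D training inputs
y_train = np.sin(3 * X_train[:, 0])          # smooth, low-frequency target
X_test = np.linspace(-1, 1, 200)[:, None]

# Kernel (ridge) regression: solve (K + lambda*I) alpha = y, then
# predict with the kernel between test and training points.
K = laplace_kernel(X_train, X_train)
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(K)), y_train)
y_pred = laplace_kernel(X_test, X_train) @ alpha

Replacing laplace_kernel with an evaluation of the NTK would give the kernel regressor that, per the theory summarized in the abstract, approximates a trained, very wide network.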
Friday, April 9, 2021 - 10:00am
Weizmann Institute of Science
Bio: Ronen Basri is Professor of Computer Science and Dean of the Faculty of Mathematics and Computer Science at the Weizmann Institute of Science. Additionally, he is the incumbent of the Elaine and Bram Goldsmith Chair of Applied Mathematics. He received his Ph.D. from the Weizmann Institute of Science and was a postdoctoral fellow at the Massachusetts Institute of Technology before assuming a faculty position at Weizmann. He has also held visiting positions at the NEC Research Institute, the Toyota Technological Institute at Chicago, the Howard Hughes Medical Institute's Janelia Farm Campus, and the University of Maryland at College Park. His research interests include computer vision and machine learning. His recent work focuses primarily on shape modeling and 3D reconstruction, as well as theoretical aspects of deep learning.