Seminar: Machine Learning Seminar
Revisiting Key Factors Behind the Success of Deep Learning
Date:
June 11, 2025
Start Time:
11:30 - 12:30
Location:
506, Zisapel Building
Zoom:
Zoom link
Lecturer:
Ofir Lindenbaum
Despite its impressive performance, deep learning remains only partly understood. In this talk, we revisit key factors driving its success, focusing on optimization dynamics, model architecture, and representation learning. We first explore stochastic gradient noise (SGN), providing evidence that it follows heavy-tailed Lévy stable distributions rather than the commonly assumed Gaussian distribution. Our findings demonstrate that distinct neural network parameters exhibit unique noise characteristics, enabling more accurate predictions of escape times from local minima by modeling stochastic gradient descent (SGD) as a Lévy-driven process.
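To illustrate the heavy-tailed noise model discussed above (this is a generic sketch, not the speaker's code), the snippet below draws symmetric α-stable samples via the Chambers-Mallows-Stuck method and compares their tail behavior with Gaussian noise; the tail index α = 1.5 and the exceedance threshold are illustrative assumptions.

```python
import numpy as np

def sym_alpha_stable(alpha, size, rng):
    """Symmetric alpha-stable samples (beta = 0, unit scale)
    via the Chambers-Mallows-Stuck method; valid for alpha != 1."""
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)  # uniform angle
    W = rng.exponential(1.0, size)                # unit exponential
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos(V - alpha * V) / W) ** ((1 - alpha) / alpha))

rng = np.random.default_rng(0)
n = 100_000
levy = sym_alpha_stable(1.5, n, rng)   # alpha = 1.5: illustrative heavy tail
gauss = rng.standard_normal(n)

# Heavy tails show up as far more large deviations, which is what
# makes escape times from local minima behave differently under
# a Levy-driven SGD model than under a Gaussian one.
print("P(|levy| > 5)  =", (np.abs(levy) > 5).mean())
print("P(|gauss| > 5) =", (np.abs(gauss) > 5).mean())
```

Under a Gaussian model, deviations beyond five standard deviations are essentially never observed at this sample size, while the α-stable samples produce them at a rate of a few percent, which is the qualitative signature the talk's SGN analysis relies on.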
Next, we demonstrate how sample permutation can enhance cluster separation in multi-modal data representation methods. Finally, we analyze the effects of overparameterization in unsupervised autoencoders (AEs). Our findings reveal unexpected multiple descent patterns in test loss curves, showing that overparameterized AEs improve both reconstruction capabilities and downstream tasks, such as anomaly detection and domain adaptation.
Ofir Lindenbaum is a senior lecturer in the Faculty of Engineering at Bar-Ilan University. Ofir obtained his Ph.D. and M.Sc. from Tel Aviv University and his B.Sc. in Electrical Engineering and Physics (both summa cum laude) from the Technion. Following his Ph.D., he was a Gibbs assistant professor at Yale University. His research focuses on the theory and practice of machine learning, with the main goal of enabling the practical use of machine learning algorithms for scientific discovery.