Seminar: Machine Learning Seminar

ECE Women's Community

Parameter-Free and Asynchronous Stochastic Optimization

Date: July 15, 2024
Time: 10:30 - 11:30
Location: Room 1061, Meyer Building
Lecturer: Amit Attia
In this talk, we will explore two important aspects of first-order optimization methods for machine learning: tuning optimization parameters and asynchronous training with multiple workers. The first part addresses parameter-free stochastic optimization. Current parameter-free methods are only "partially" parameter-free, relying on non-trivial knowledge of parameters such as bounds on stochastic gradient norms. We will discuss whether fully parameter-free methods exist and under what conditions they can achieve convergence rates competitive with optimally tuned methods.
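As a point of reference for this tuning problem, the following is a minimal sketch (illustrative, not from the talk) of plain SGD on a toy quadratic. The classical step-size choice D/(G*sqrt(T)) assumes knowledge of the initial distance to the optimum D and a gradient-norm bound G, which is exactly the kind of knowledge parameter-free methods seek to avoid; the objective and constants below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgd(step_size, T=1000, noise=1.0):
    """Plain SGD on f(x) = 0.5 * ||x||^2 with Gaussian gradient noise."""
    x = np.full(10, 5.0)
    for _ in range(T):
        g = x + noise * rng.standard_normal(x.shape)  # stochastic gradient of f
        x = x - step_size * g
    return 0.5 * np.dot(x, x)  # suboptimality: the optimal value is 0

# The classically tuned step size D / (G * sqrt(T)) requires knowing the
# initial distance D and a gradient-norm bound G -- without them, one is
# left sweeping step sizes, which parameter-free methods aim to eliminate.
for eta in (1e-3, 1e-2, 1e-1):
    print(f"step size {eta:.0e}: final value {sgd(eta):.4f}")
```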

The second part focuses on asynchronous stochastic optimization, where updates are based on stale stochastic gradients subject to arbitrary delays. Existing methods achieve rates that depend on the average gradient delay. We will introduce new methods whose guarantees depend on more robust properties of the delay distribution, obtained by employing asynchronous mini-batching, filtering stale gradients, and utilizing classical convergence results.
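To make the delay setting concrete, here is a small simulation sketch of asynchronous SGD on a toy quadratic: each arriving gradient was computed at a randomly delayed past iterate, overly stale gradients are dropped, and the survivors are averaged into mini-batches. The delay distribution, filtering threshold, and batch size are illustrative assumptions and not the speaker's algorithms.

```python
import numpy as np

rng = np.random.default_rng(1)

def async_sgd_filtered(T=2000, step_size=0.05, max_delay=20, batch=4):
    """Simulated asynchronous SGD on f(x) = 0.5 * ||x||^2. Each step, a
    worker reports a noisy gradient computed at a randomly delayed past
    iterate; gradients staler than max_delay are filtered out, and the
    rest are averaged into mini-batches before each update (an
    illustrative rule, not the talk's exact method)."""
    x = np.full(10, 5.0)
    history = [x.copy()]                         # past iterates workers may have seen
    buffer = []
    for t in range(T):
        delay = int(rng.integers(0, 50))         # arbitrary per-step delay
        stale_x = history[max(0, t - delay)]     # iterate the worker read
        g = stale_x + rng.standard_normal(10)    # noisy gradient of f at stale_x
        if delay <= max_delay:                   # filter out very stale gradients
            buffer.append(g)
        if len(buffer) == batch:                 # asynchronous mini-batching
            x = x - step_size * np.mean(buffer, axis=0)
            buffer = []
        history.append(x.copy())
    return 0.5 * np.dot(x, x)                    # suboptimality: optimum is 0

print(f"final suboptimality: {async_sgd_filtered():.4f}")
```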

Amit Attia is a PhD student at the Department of Computer Science at Tel Aviv University, advised by Prof. Tomer Koren. He completed his MSc under the supervision of Prof. Koren and, before that, obtained a BSc in Computer Science with a minor in Physics from the Hebrew University of Jerusalem. His research focuses on optimization for machine learning, in particular on the adaptivity, stability, and generalization of first-order methods.

 
