Seminar: Machine Learning Seminar

Women in Electrical and Computer Engineering Community

Leveraging Symmetries for Learning in Deep Weight Spaces

Date: November 20, 2024 | Time: 11:30 - 12:30
Location: 1061, Meyer Building
Lecturer: Aviv Navon
Given the current defense guidelines, we are not allowed to have more than 20 participants in the conference room. Therefore, the upcoming seminar talks will be hybrid (please see the attached Zoom link).
In practice, since we still encourage people to attend physically, I will arrange the room to fit 20 people, and seating will be on a first-come, first-served basis.
Sorry for the inconvenience. Hoping for better times ahead.
Learning to process and analyze the raw weight matrices of neural networks is an exciting emerging research area with promising applications such as editing and analyzing Implicit Neural Representations (INRs), weight pruning, and function editing. A key challenge in this field arises from the inherent permutation symmetries of neural networks: permutations can be applied to the weights to produce different weights that represent the same function. As with other structured data such as graphs and point clouds, these symmetries make learning in weight spaces challenging.
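The permutation symmetry described above can be verified directly. The sketch below (illustrative NumPy code, not from the talk) permutes the hidden units of a small two-layer MLP, permuting the rows of the first layer and the columns of the second accordingly, and checks that the resulting network computes exactly the same function:

```python
import numpy as np

rng = np.random.default_rng(0)
# A small two-layer MLP: f(x) = W2 @ relu(W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2, b2 = rng.normal(size=(3, 8)), rng.normal(size=3)

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0) + b2

# Permute the hidden units: rows of W1 and entries of b1 are permuted,
# and the columns of W2 are permuted with the same permutation.
perm = rng.permutation(8)
W1p, b1p = W1[perm], b1[perm]
W2p = W2[:, perm]

x = rng.normal(size=4)
# The permuted weights represent the same function.
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2))
```

For an L-layer MLP, an independent permutation of each hidden layer yields such an equivalent network, so a single function has a combinatorially large orbit of weight representations.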

The talk will highlight recent progress in designing architectures that effectively operate on weight spaces while respecting their underlying symmetries. We will begin by discussing our ICML 2023 paper, which introduces novel equivariant architectures for learning on multilayer perceptron weight spaces. We will characterize all linear equivariant layers for these symmetries and construct networks using these layers.
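To make "equivariant layer" concrete, here is a minimal sketch of a permutation-equivariant linear layer on a set of elements, in the style of DeepSets-like building blocks (an illustrative example of the concept, not the architecture from the paper): each element is mapped by the same weight plus a term depending on the mean over the set, so permuting the inputs permutes the outputs the same way.

```python
import numpy as np

def equivariant_linear(X, a, b):
    # X has shape (n, d): n set elements with d features.
    # L(X) = a * X + b * mean(X), with the mean broadcast over the set axis.
    return a * X + b * X.mean(axis=0, keepdims=True)

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 3))
P = rng.permutation(5)

out = equivariant_linear(X, 2.0, -0.5)
# Equivariance: permuting rows and then applying the layer
# gives the same result as applying the layer and then permuting.
assert np.allclose(equivariant_linear(X[P], 2.0, -0.5), out[P])
```

Characterizing *all* linear maps with this property for the full weight-space symmetry group is what determines the layers such architectures are built from.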

We will also discuss two ICML 2024 papers: one on efficient weight space augmentations, and another that presents a method for learning to align neural network models. Together, these works make important strides toward building versatile and principled architectures for weight-space learning.
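The symmetry also suggests a natural form of weight-space augmentation: training copies of a network's weights obtained by random hidden-unit permutations are free, functionally equivalent samples. The sketch below is an illustrative toy version of this idea (the helper name `augment` is hypothetical, not from the papers):

```python
import numpy as np

def augment(W1, b1, W2, rng):
    """Sample a functionally equivalent weight configuration by
    applying a random permutation to the hidden units."""
    p = rng.permutation(W1.shape[0])
    return W1[p], b1[p], W2[:, p]

rng = np.random.default_rng(2)
W1, b1 = rng.normal(size=(6, 3)), rng.normal(size=6)
W2 = rng.normal(size=(2, 6))
x = rng.normal(size=3)

def f(W1, b1, W2):
    return W2 @ np.maximum(W1 @ x + b1, 0)

orig = f(W1, b1, W2)
# Each augmented copy has different weights but the same function.
for _ in range(3):
    assert np.allclose(f(*augment(W1, b1, W2, rng)), orig)
```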

Aviv recently completed his PhD at Bar-Ilan University (BIU), where he was supervised by Prof. Ethan Fetaya and Prof. Gal Chechik. His research primarily focuses on multi-task learning and learning in neural weight spaces. Currently, he leads the research team at aiOla.