Seminar: Machine Learning Seminar
Leveraging Symmetries for Learning in Deep Weight Spaces.
Date:
November 20, 2024
Time:
11:30 - 12:30
Location:
1061, Meyer Building
Zoom:
Zoom link
Lecturer:
Aviv Navon
Learning to process and analyze the raw weight matrices of neural networks is an exciting emerging research area, with promising applications such as editing and analyzing Implicit Neural Representations (INRs), weight pruning, and function editing. A key challenge in this field arises from the inherent permutation symmetries of neural networks: permutations can be applied to the weights to produce different weights that represent the same function. As with other types of structured data, such as graphs and point clouds, these symmetries make learning in weight spaces challenging.
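To make the symmetry concrete, here is a minimal NumPy sketch (an illustration, not code from the talk): permuting the hidden units of a two-layer MLP, together with the matching rows and columns of its weights, yields a different set of weights that computes exactly the same function.

import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 3, 5, 2

# A random two-layer MLP: f(x) = W2 @ relu(W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(d_hidden, d_in)), rng.normal(size=d_hidden)
W2, b2 = rng.normal(size=(d_out, d_hidden)), rng.normal(size=d_out)

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Permute the hidden neurons: rows of W1 and b1, columns of W2.
perm = rng.permutation(d_hidden)
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

x = rng.normal(size=d_in)
# Different weights, same function.
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2))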
The talk will highlight recent progress in designing architectures that effectively operate on weight spaces while respecting their underlying symmetries. We will begin by discussing our ICML 2023 paper, which introduces novel equivariant architectures for learning on multilayer perceptron weight spaces. We will characterize all linear equivariant layers for these symmetries and construct networks using these layers. We will also discuss two ICML 2024 papers: one on efficient weight space augmentations, and another that presents a method for learning to align neural network models. Together, these works make important strides toward building versatile and principled architectures for weight-space learning.
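As a rough illustration of what a "linear equivariant layer" means in this context (a generic DeepSets-style example for row permutations, not the paper's actual weight-space layers, which are more involved), the layer L(X) = X A + mean(X) B commutes with any permutation of the rows of X:

import numpy as np

rng = np.random.default_rng(1)
n, d = 6, 4  # n interchangeable elements, d features each

A, B = rng.normal(size=(d, d)), rng.normal(size=(d, d))

def equivariant_layer(X):
    # Mixes each row with a permutation-invariant summary (the row mean),
    # so permuting the input rows simply permutes the output rows.
    return X @ A + X.mean(axis=0, keepdims=True) @ B

X = rng.normal(size=(n, d))
perm = rng.permutation(n)

# Equivariance check: L(P X) == P L(X)
assert np.allclose(equivariant_layer(X[perm]), equivariant_layer(X)[perm])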
Aviv recently completed his PhD at BIU, where he was supervised by Prof. Ethan Fetaya and Prof. Gal Chechik. His research primarily focuses on multi-task learning and learning in neural weight spaces. He currently leads the research team at aiOla.