Seminar: Signal Processing and Systems


Low distortion embeddings with bottom-up manifold learning

Date: March 27, 2024 Time: 13:00 - 14:30
Location: 1061, Meyer Building
Lecturer: Gal Mishne
Manifold learning algorithms aim to map high-dimensional data into lower dimensions while preserving local and global structure; however, popular methods distort distances between points in the low-dimensional space. In this talk, I present a bottom-up manifold learning framework that constructs low-distortion local views of a dataset in lower dimensions and registers these to obtain a global embedding. Our global alignment formulation enables tearing manifolds so as to embed them in their intrinsic dimension, including manifolds without boundary and non-orientable manifolds. To quantitatively evaluate the quality of low-dimensional embeddings, we present new strong and weak notions of global distortion. We show that Riemannian Gradient Descent (RGD) converges to a global embedding with guaranteed low global distortion. Compared to competing manifold learning and data visualization approaches, our framework achieves the lowest local and global distortion, as well as the lowest reconstruction error in downstream decoding tasks, on synthetic and real-world neuroscience datasets.
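As a rough illustration of the bottom-up idea described above (and not the framework presented in the talk), the sketch below builds low-dimensional local views with local PCA and registers them into a single global embedding by alternating orthogonal Procrustes alignment with averaging of shared points. All function names, the neighborhood size, and the registration scheme here are illustrative assumptions.

```python
import numpy as np

def local_views(X, n_neighbors=10, d=2):
    """Build a d-dimensional local view of each point's neighborhood via local PCA."""
    n = X.shape[0]
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    nbrs = np.argsort(dists, axis=1)[:, :n_neighbors]    # each point is its own nearest neighbor
    views = []
    for i in range(n):
        idx = nbrs[i]
        P = X[idx] - X[idx].mean(axis=0)                  # centered neighborhood
        _, _, Vt = np.linalg.svd(P, full_matrices=False)  # top-d principal directions
        views.append((idx, P @ Vt[:d].T))
    return views

def register_views(views, n, d=2, n_iters=50, seed=0):
    """Stitch overlapping local views into one global embedding by alternating
    orthogonal Procrustes alignment of each view with averaging of shared points."""
    rng = np.random.default_rng(seed)
    Y = 1e-3 * rng.standard_normal((n, d))                # initial global embedding
    for _ in range(n_iters):
        acc = np.zeros((n, d))
        cnt = np.zeros(n)
        for idx, V in views:
            A = V - V.mean(axis=0)
            B = Y[idx] - Y[idx].mean(axis=0)
            U, _, Wt = np.linalg.svd(A.T @ B)             # Procrustes: R = U @ Wt minimizes ||A R - B||_F
            aligned = A @ (U @ Wt) + Y[idx].mean(axis=0)  # rigidly place the view in global coordinates
            acc[idx] += aligned
            cnt[idx] += 1
        Y = acc / cnt[:, None]                            # average the views' placements of each point
    return Y

# Example usage on noisy points sampled near a 2-D plane in 3-D:
# X = np.random.randn(300, 3) * np.array([1.0, 1.0, 0.05])
# Y = register_views(local_views(X), n=X.shape[0])
```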

Joint work with Dhruv Kohli, Alex Cloninger, Bas Nieuwenhuis and Devika Narain.

Gal Mishne is an assistant professor in the Halıcıoğlu Data Science Institute (HDSI) and affiliated with the ECE department, the CSE department, the Neurosciences Graduate program and the Institute for Neural Computation at UC San Diego.

 
