Iterative local optimization of normalizing flows

Robust Collaborative Learning

This project focuses on making collaborative learning, particularly across an unreliable and dynamic set of edge devices, robust to imperfect computational conditions. These conditions include faults in the devices or in communication, as well as heterogeneity across the edge devices' datasets. We have explored research directions in robust collaborative learning and localized learning. We are currently exploring methods for handling device elasticity (i.e., adaptability to an increasing number of edge devices) and robustness to the topology of peer-to-peer edge learning systems.
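To make the peer-to-peer setting concrete, here is a minimal sketch of the kind of decentralized update such systems build on: a single gossip-averaging round over a device graph. This is an illustrative example only, not this project's own algorithm; the line topology, the uniform mixing weights, and the function name gossip_step are all assumptions made for the sketch.

```python
# Illustrative sketch (not the project's method): one synchronous
# gossip-averaging round in a peer-to-peer edge learning system.
# Each device replaces its parameter vector with the uniform average
# over itself and its neighbors in the communication graph.
import numpy as np

def gossip_step(params: dict[int, np.ndarray],
                neighbors: dict[int, list[int]]) -> dict[int, np.ndarray]:
    """Return updated parameters after one gossip round."""
    new_params = {}
    for i, theta in params.items():
        # Average device i's parameters with those of its neighbors.
        group = [theta] + [params[j] for j in neighbors[i]]
        new_params[i] = np.mean(group, axis=0)
    return new_params

# Example: 3 devices on a line topology (0 - 1 - 2).
params = {i: np.random.randn(4) for i in range(3)}
neighbors = {0: [1], 1: [0, 2], 2: [1]}
params = gossip_step(params, neighbors)
```

Robustness to topology, in this framing, asks how such updates behave as the graph (here, `neighbors`) changes; device elasticity asks how they behave as new keys are added to `params`.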

I co-organized the ICML 2023 Workshop on Localized Learning. Accepted papers are available on OpenReview.

David I. Inouye
Assistant Professor

I research trustworthy ML methods that are robust to imperfect distributional and computational assumptions, drawing on explainability, causality, and collaborative learning.

Publications

When each edge device of a network only perceives a local part of the environment, collaborative inference across multiple devices is …

While prior federated learning (FL) methods mainly consider client heterogeneity, we focus on the Federated Domain Generalization (DG) …

Spatial reasoning tasks in multi-agent environments such as event prediction, agent type identification, or missing data imputation are …

A central theme in federated learning (FL) is the fact that client data distributions are often not independent and identically …

While normalizing flows for continuous data have been extensively researched, flows for discrete data have only recently been explored. …

The unsupervised task of aligning two or more distributions in a shared latent space has many applications including fair …

We propose a unified framework for deep density models by formally defining density destructors. A density destructor is an invertible …