Seminars and Events
Addressing Heterogeneity in Federated Learning
Event Details
Data relevant to machine learning problems are often distributed across multiple data sources. Frequently, these data cannot be shared for regulatory, security, or privacy reasons, inherently creating data silos. Federated learning has emerged as a standard computational paradigm for training machine and deep learning models over distributed silos. However, the participating silos may have heterogeneous computational capabilities and data specifications. This talk will provide a systematic view of the challenges arising in such heterogeneous federated learning environments. We will present training policies that accelerate the convergence of federated models and enable their secure and private computation. We will show the efficacy of these policies across a wide range of challenging federated environments in simulated and real-world neuroimaging settings. We will introduce our comprehensive federated learning and integration system architecture, MetisFL, which encompasses all required federated training components. We will conclude by benchmarking the scalability and efficiency of various federated learning systems.
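For readers unfamiliar with the paradigm, the canonical aggregation step in federated training is a data-size-weighted average of locally trained model parameters (the FedAvg rule). The sketch below is a minimal illustration of that idea only; it does not reproduce the specific training policies or the MetisFL architecture discussed in the talk, and all names in it are hypothetical.

```python
import numpy as np

def fedavg(client_params, client_sizes):
    """Aggregate per-silo model parameters, weighting each silo by its
    local dataset size (the classic FedAvg rule)."""
    total = float(sum(client_sizes))
    aggregate = np.zeros_like(np.asarray(client_params[0], dtype=float))
    for params, n_examples in zip(client_params, client_sizes):
        aggregate += (n_examples / total) * np.asarray(params, dtype=float)
    return aggregate

# Two silos with heterogeneous data sizes: the larger silo (3 examples)
# pulls the global model closer to its local parameters.
global_params = fedavg([np.array([1.0]), np.array([4.0])], [1, 3])
print(global_params)  # [3.25]
```

In heterogeneous environments, silos may also differ in compute speed and data quality, which is why weighting and scheduling policies beyond this plain average become important.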
Meeting ID: 990 3116 7058
Speaker Bio
Dimitris Stripelis received a PhD in Computer Science from the University of Southern California in May 2023. He holds a BSc in Computer Science from the Athens University of Economics and Business. His research interests focus on federated learning, machine learning systems, data integration, and data management systems. He received the USC Myronis Fellowship (2020) and the A.G. Leventis Foundation Educational Grant (2019, 2020, 2021). He has served as a PC member for the IAAI conference and the FL4NLP workshop, and as a reviewer for the NeurIPS, IAAI, and WISE conferences and the IEEE TMI and IEEE TKDE journals. He organized the first workshop on Federated Learning systems at the MLSys 2023 conference.