Seminars and Events

Scientific Computing at Large

Expanding Federated Learning to Reduce the Communication Cost

Event Details

REMINDER:

Meeting hosts will only admit guests they recognize to the Zoom meeting, so you are strongly encouraged to sign in to Zoom with your USC account.

Abstract: In the era of advanced technologies, mobile devices are equipped with computing and sensing capabilities that gather massive amounts of data, well suited for training learning models. Coupled with advances in Deep Learning (DL), these models power numerous applications, e.g., image processing, speech recognition, healthcare, and vehicular networks. Traditionally, Machine Learning (ML) approaches require data to be centralized in cloud-based data centers. However, this data is often large and privacy-sensitive, which prevents uploading it to those data centers for training and leads to critical issues of high latency and communication inefficiency. Recently, in light of new privacy legislation in many countries, the concept of Federated Learning (FL) has been introduced. In FL, mobile users collaboratively learn a global model by aggregating their locally trained models, without sharing their privacy-sensitive data. These mobile users typically have slow network connections to the data center where the global model is maintained; moreover, in complicated, large-scale networks, heterogeneous devices with diverse energy constraints are involved. This makes communication cost a central challenge when deploying FL at scale. In addition, data privacy can still be compromised when the local models' parameters are shared to update the global model. To this end, in this webinar, I present several strategies to improve communication efficiency in FL; these strategies also leverage encryption techniques to further prevent data leakage.
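The aggregation step described in the abstract, where a global model is built from local models without sharing raw data, can be sketched as a size-weighted parameter average in the style of FedAvg. This is a minimal illustration only, not the speaker's implementation; the function name, client data, and weighting are assumptions for the example.

```python
import numpy as np

def federated_averaging(local_weights, client_sizes):
    """Aggregate client model parameters into a global model by a
    dataset-size-weighted average (FedAvg-style illustration).
    Clients share only parameters, never their raw training data."""
    total = sum(client_sizes)
    global_weights = np.zeros_like(local_weights[0])
    for weights, n_samples in zip(local_weights, client_sizes):
        # Each client's contribution is proportional to its data size.
        global_weights += (n_samples / total) * weights
    return global_weights

# Hypothetical example: three clients with differing dataset sizes.
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 20, 70]
global_model = federated_averaging(clients, sizes)
```

In a real deployment the communication cost comes from repeatedly exchanging these parameter vectors between many devices and the server, which motivates the compression and encryption strategies the talk covers.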

Speaker Bio

Muhammad Asad is currently pursuing a Ph.D. in Computer Science at the Department of Computer Science, Nagoya Institute of Technology, Japan, under the supervision of Prof. Shohei Kato. Earlier, he received a three-year MS degree in Computer Science (MSCS) from Dalian University of Technology, China, under the supervision of Prof. Yao Nianmin in June 2018. He was a recipient of the Chinese Government Scholarship (CSC) from 2015 to 2018 and is a recipient of the Japanese Government Scholarship (MEXT) from 2019 to 2023. In recognition of his research achievements, he received a President's Award from the Nagoya Institute of Technology in February 2022. His primary research interests include Federated Learning, Deep Learning, Wireless Sensor Networks, the Internet of Things, and Software-Defined Networking.