| Time | Activity |
| --- | --- |
| | Opening Remarks |
| | Invited Talk 1: Algorithms for Efficient Federated and Decentralized Learning, by Sebastian U. Stich, Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland |
| | Contributed Oral Presentation Session 1 (15 minutes per talk, including Q&A) |
| | Invited Talk 2: The ML data center is dead: What comes next?, by Nic Lane, the University of Cambridge, UK |
| | Break |
| | Contributed Oral Presentation Session 2 (15 minutes per talk, including Q&A) |
| | Invited Talk 3: Pandemic Response with Crowdsourced Data: The Participatory Privacy Preserving Approach, by Ramesh Raskar, Massachusetts Institute of Technology (MIT), USA |
| | Industrial Panel. Topic: Challenges and Experiences in Practical Federated Learning Systems |
| | Poster Session 1 (for both Oral and Poster Presenters) & Industrial Booths (GatherTown) |
| | Break |
| | Invited Talk 4: Federated Hyperparameter Tuning: Challenges, Baselines, and Connections to Weight-Sharing, by Ameet Talwalkar, Carnegie Mellon University (CMU), USA |
| | Contributed Oral Presentation Session 3 (15 minutes per talk, including Q&A) |
| | Invited Talk 5: Optimization Aspects of Personalized Federated Learning, by Filip Hanzely, Toyota Technological Institute at Chicago (TTIC), USA |
| | Poster Session 2 (for both Oral and Poster Presenters) & Industrial Booths (GatherTown) |
| | Break |
| | Invited Talk 6: Dreaming of Federated Robustness: Inherent Barriers and Unavoidable Tradeoffs, by Dimitris Papailiopoulos, The University of Wisconsin–Madison (UW–Madison), USA |
| | Invited Talk 7: Securing Secure Aggregation: Mitigating Multi-Round Privacy Leakage in Federated Learning, by Salman Avestimehr, University of Southern California (USC), USA |
| | Closing Remarks |
Title: Algorithms for Efficient Federated and Decentralized Learning
Speaker: Sebastian U. Stich, Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland

Title: The ML data center is dead: What comes next?
Speaker: Nic Lane, the University of Cambridge, UK

Title: Pandemic Response with Crowdsourced Data: The Participatory Privacy Preserving Approach
Speaker: Ramesh Raskar, Massachusetts Institute of Technology (MIT), USA

Title: Federated Hyperparameter Tuning: Challenges, Baselines, and Connections to Weight-Sharing
Speaker: Ameet Talwalkar, Carnegie Mellon University (CMU), USA

Title: Optimization Aspects of Personalized Federated Learning
Speaker: Filip Hanzely, Toyota Technological Institute at Chicago (TTIC), USA

Title: Dreaming of Federated Robustness: Inherent Barriers and Unavoidable Tradeoffs
Speaker: Dimitris Papailiopoulos, The University of Wisconsin–Madison (UW–Madison), USA

Title: Securing Secure Aggregation: Mitigating Multi-Round Privacy Leakage in Federated Learning
Speaker: Salman Avestimehr, University of Southern California (USC), USA
Biography
Dr. Avestimehr has received a number of awards for his research, including the James L. Massey Research & Teaching Award from the IEEE Information Theory Society, an IEEE Information Theory Society and Communications Society Joint Paper Award, a Presidential Early Career Award for Scientists and Engineers (PECASE) from the White House (President Obama), a Young Investigator Program (YIP) award from the U.S. Air Force Office of Scientific Research, a National Science Foundation CAREER award, the David J. Sakrison Memorial Prize, and several best paper awards at conferences. He has served as an Associate Editor for the IEEE Transactions on Information Theory and as a General Co-Chair of the 2020 International Symposium on Information Theory (ISIT). He is a Fellow of the IEEE.
Training machine learning models in a centralized fashion often faces significant practical and regulatory challenges in real-world use cases. These challenges include training data that is distributed across many sites, the computational cost of creating and maintaining a central data repository, and regulatory guidelines (e.g., GDPR, HIPAA) that restrict the sharing of sensitive data. Federated learning (FL) is a new machine learning paradigm that mitigates these challenges by training a global model over distributed data, without the need for data sharing. As machine learning is increasingly used to analyze and draw insight from real-world, distributed, and sensitive data, familiarity with and adoption of this relevant and timely topic is essential across the scientific community.
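To make this training paradigm concrete, here is a minimal single-machine sketch in the spirit of federated averaging (FedAvg, McMahan et al., 2017). The helper names (`local_update`, `fedavg_round`) and the toy linear-regression task are illustrative assumptions, not the API of any particular FL framework:

```python
# Minimal simulation of federated averaging (FedAvg-style).
# Illustrative sketch only: helper names and the toy linear-regression
# task are assumptions, not any specific FL framework's API.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=1):
    """A client's local training: a few full-batch gradient steps on
    its private (X, y); the raw data never leaves this function."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # squared-loss gradient
        w -= lr * grad
    return w

def fedavg_round(global_w, client_data):
    """One communication round: every client trains locally and the
    server averages the returned weights by local dataset size."""
    updates = [local_update(global_w, X, y) for X, y in client_data]
    sizes = np.array([len(y) for _, y in client_data], dtype=float)
    return np.average(updates, axis=0, weights=sizes)

# Toy setup: three clients whose private data follows the same model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=50)))

w = np.zeros(2)
for _ in range(20):  # 20 communication rounds
    w = fedavg_round(w, clients)
print(w)  # approaches true_w; only parameters crossed the "network"
```

The point of the sketch is the communication pattern: in each round, only model parameters are exchanged between clients and the server, while every client's raw data stays local.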
Despite the advantages of FL, and its successful application in certain industry settings, the field is still in its infancy: new challenges arise from limited visibility into the training data, a potential lack of trust among the participants training a single model, potential privacy inference attacks, and, in some cases, limited or unreliable connectivity.
The goal of this workshop is to bring together researchers and practitioners interested in FL. This day-long event will facilitate interaction among students, scholars, and industry professionals from around the world to understand the topic, identify technical challenges, and discuss potential solutions, advancing FL and its impact in the community. Indeed, FL has become an increasingly popular topic in the ICML community in recent years.
Topics of interest include, but are not limited to, the following:
The workshop will have invited talks on a diverse set of topics related to FL. In addition, we plan to have an industrial panel (over Zoom) and industrial booths (on GatherTown), where researchers from industry will discuss challenges and solutions from an industrial perspective.
More information on previous workshops can be found here.
Our workshop has no formal proceedings. Accepted papers will be posted on the workshop webpage. We welcome submissions of unpublished papers, including those submitted to or accepted at other venues, provided the other venue allows it. However, we will not accept papers that have already been published, because the goal of the workshop is to share recent results and discuss open problems.
Submissions are recommended (but not required) to be no more than 6 pages long, excluding references, and should follow the ICML-21 template. Reviewing is double-blind (author identities must not be revealed to the reviewers). An optional appendix of arbitrary length is allowed and should be placed at the end of the paper (after the references).
EasyChair submission link: https://easychair.org/conferences/?conf=flicml21
If you have any enquiries, please email us at: flicml21@easychair.org