International Workshop on Federated Learning: Recent Advances and New Challenges
in Conjunction with NeurIPS 2022 (FL-NeurIPS'22)


Final Submission Deadline: September 22, 2022 (23:59:59 AoE)
Notification Due: October 20, 2022
Workshop Date: Friday, December 2, 2022 (08:30-17:00)
Venue: Room 298-299 (2nd Floor), New Orleans Convention Center, New Orleans, LA, USA

Virtual Session

Based on the survey sent out earlier, the authors of 9 papers are interested in presenting virtually. We will hold a 90-minute virtual session of the workshop this Friday (Dec. 9), starting at 11 am EST / 4 pm UTC.

If you are interested in attending this virtual session, please fill out the following form by the end of Thursday (Dec. 8): https://forms.gle/a432U639Sx2sn7Xe8. We will send the meeting information to the email address you provide in the form before the virtual session starts on Friday.


Workshop Program

Please note: due to logistical reasons, this workshop starts at 8:30 am New Orleans local time, an hour earlier than the NeurIPS'22 main conference.

New Orleans Time (UTC-6)    Activity
08:30 – 08:35 Opening Remarks (by Shiqiang Wang)
08:35 – 09:00 Invited Talk 1: Trustworthy Federated Learning, by Bo Li
09:00 – 09:20 Invited Talk 2: Asynchronous Optimization: Delays, Stability, and the Impact of Data Heterogeneity, by Konstantin Mishchenko
09:20 – 10:00 Oral Presentation Session 1 (7 min talk + 3 min Q&A each)
  1. Outstanding Paper Award: Jayanth Reddy Regatti, Songtao Lu, Abhishek Gupta and Ness Shroff. Conditional Moment Alignment for Improved Generalization in Federated Learning
  2. Outstanding Paper Award: Sai Praneeth Karimireddy, Wenshuo Guo and Michael Jordan. Mechanisms that Incentivize Data Sharing in Federated Learning
  3. Hanhan Zhou, Tian Lan, Guru Prasadh Venkataramani and Wenbo Ding. Federated Learning with Online Adaptive Heterogeneous Local Models
  4. Baturalp Buyukates, Jinhyun So, Hessam Mahdavifar and Salman Avestimehr. LightVeriFL: Lightweight and Verifiable Secure Federated Learning
10:00 – 10:30 Coffee Break
10:30 – 11:10 Oral Presentation Session 2 (7 min talk + 3 min Q&A each)
  1. Francesco Pase, Berivan Isik, Deniz Gunduz, Tsachy Weissman and Michele Zorzi. Efficient Federated Random Subnetwork Training
  2. Filippo Galli, Sayan Biswas, Gangsoo Zeong, Tommaso Cucinotta and Catuscia Palamidessi. Group privacy for personalized federated learning
  3. Yae Jee Cho, Divyansh Jhunjhunwala, Tian Li, Virginia Smith and Gauri Joshi. To Federate or Not To Federate: Incentivizing Client Participation in Federated Learning
  4. Marco Bornstein, Tahseen Rabbani, Evan Wang, Amrit Bedi and Furong Huang. SWIFT: Rapid Decentralized Federated Learning via Wait-Free Model Communication
11:10 – 11:15 Award Ceremony
11:15 – 12:00 Poster Session 1
  1. Jaeheon Kim and Bong Jun Choi. FedTH : Tree-based Hierarchical Image Classification in Federated Learning
  2. M. Taha Toghani and Cesar Uribe. Unbounded Gradients in Federated Learning with Buffered Asynchronous Aggregation
  3. Khaoula Chehbouni, Gilles Caporossi, Reihaneh Rabbany, Martine De Cock and Golnoosh Farnadi. Early Detection of Sexual Predators with Federated Learning
  4. Timothy Castiglia, Shiqiang Wang and Stacy Patterson. Self-Supervised Vertical Federated Learning
  5. Pei Fang and Jinghui Chen. On the Vulnerability of Backdoor Defenses for Federated Learning
  6. Mariel AF Werner, Lie He, Sai Praneeth Karimireddy, Michael Jordan and Martin Jaggi. Towards Provably Personalized Federated Learning via Threshold-Clustering of Similar Clients
  7. Xinwei Zhang, Bingqing Song, Mehrdad Honarkhah, Jie Ding and Mingyi Hong. Building Large Machine Learning Models from Small Distributed Models: A Layer Matching Approach
  8. Yuqing Zhu, Xiang Yu, Yi-Hsuan Tsai, Francesco Pittaluga, Masoud Faraki, Manmohan Chandraker and Yu-Xiang Wang. Voting-Based Approaches for Differentially Private Federated Learning
  9. Md Ibrahim Ibne Alam, Koushik Kar, Theodoros Salonidis and Horst Samulowitz. DASH: Decentralized CASH for Federated Learning
  10. Yujia Wang, Pei Fang and Jinghui Chen. Accelerating Adaptive Federated Optimization with Local Gossip Communications
  11. Dimitris Stripelis, Umang Gupta, Greg Ver Steeg and Jose Luis Ambite. Federated Progressive Sparsification (Purge-Merge-Tune)+
  12. Pedro Valdeira, Yuejie Chi, Claudia Soares and Joao Xavier. A Multi-Token Coordinate Descent Method for Vertical Federated Learning
  13. Rajarshi Saha, Michal Yemini, Emre Ozfatura, Deniz Gunduz and Andrea Goldsmith. ColRel: Collaborative Relaying for Federated Learning over Intermittently Connected Networks
  14. Ziwei Li, Hong-You Chen, Han Wei Shen and Wei-Lun Chao. Understanding Federated Learning through Loss Landscape Visualizations: A Pilot Study
  15. Krishna Pillutla, Yassine Laguel, Jérôme Malick and Zaid Harchaoui. Differentially Private Federated Quantiles with the Distributed Discrete Gaussian Mechanism
  16. Chen Dun, Mirian Hipolito Garcia, Dimitrios Dimitriadis, Christopher Jermaine and Anastasios Kyrillidis. Efficient and Light-Weight Federated Learning via Asynchronous Distributed Dropout
  17. Junyi Li and Heng Huang. FedGRec: Federated Graph Recommender System with Lazy Update of Latent Embeddings
  18. Ljubomir Rokvic, Panayiotis Danassis and Boi Faltings. Privacy-Preserving Data Filtering in Federated Learning Using Influence Approximation
  19. John Nguyen, Jianyu Wang, Kshitiz Malik, Maziar Sanjabi and Mike Rabbat. Where to Begin? On the Impact of Pre-Training and Initialization in Federated Learning
  20. Joseph Lavond, Minhao Cheng and Yao Li. Trusted Aggregation (TAG): Model Filtering Backdoor Defense In Federated Learning
  21. Stefanos Laskaridis, Javier Fernandez-Marques and Łukasz Dudziak. Cross-device Federated Architecture Search
  22. Parker Newton, Olivia Choudhury, Bill Horne, Vidya Ravipati, Divya Bhargavi and Ujjwal Ratan. Client-Private Secure Aggregation for Privacy-Preserving Federated Learning
  23. Shanshan Wu, Tian Li, Zachary Charles, Yu Xiao, Ken Liu, Zheng Xu and Virginia Smith. Motley: Benchmarking Heterogeneity and Personalization in Federated Learning
  24. Daniel Lopes, João Nadkarni, Filipe Assunção, Miguel Lopes and Luís Rodrigues. Federated Learning for Predicting the Next Node in Action Flows
  25. Chuan Guo, Kamalika Chaudhuri, Pierre Stock and Mike Rabbat. The Interpolated MVU Mechanism For Communication-efficient Private Federated Learning
  26. Yi Sui, Junfeng Wen, Yenson Lau, Brendan Ross and Jesse Cresswell. Find Your Friends: Personalized Federated Learning with the Right Collaborators
  27. Saeed Vahidian, Mahdi Morafah, Chen Chen, Mubarak Shah and Bill Lin. Rethinking Data Heterogeneity in Federated Learning: Introducing a New Notion and Standard Benchmarks
  28. Afroditi Papadaki, Natalia Martinez, Martin Bertran, Guillermo Sapiro and Miguel Rodrigues. Federated Fairness without Access to Demographics
  29. Mirian Hipolito Garcia, Andre Manoel, Daniel Madrigal, Robert Sim and Dimitrios Dimitriadis. FLUTE: A Scalable, Extensible Framework for High-Performance Federated Learning Simulations
  30. Aleksei Triastcyn, Matthias Reisser and Christos Louizos. Decentralized Learning with Random Walks and Communication-Efficient Adaptive Optimization
  31. Parsa Assadi, Byung Hoon Ahn and Hadi Esmaeilzadeh. Accelerating Federated Learning Through Attention on Local Model Updates
12:00 – 13:30 Lunch Break
13:30 – 14:10 Oral Presentation Session 3 (7 min talk + 3 min Q&A each)
  1. Sharut Gupta, Kartik Ahuja, Mohammad Havaei, Niladri Chatterjee and Yoshua Bengio. FL Games: A Federated Learning Framework for Distribution Shifts
  2. Simone Bottoni, Giulio Zizzo, Stefano Braghin and Alberto Trombetta. Verifiable Federated Machine Learning
  3. Yeojoon Youn, Bhuvesh Kumar and Jacob Abernethy. Accelerated Federated Optimization with Quantization
  4. Xingchen Ma, Junyi Zhu and Matthew Blaschko. Tackling Personalized Federated Learning with Label Concept Drift via Hierarchical Bayesian Modeling
14:10 – 15:00 Panel Discussion
15:00 – 15:30 Coffee Break
15:30 – 15:50 Invited Talk 3: On the Unreasonable Effectiveness of Federated Averaging with Heterogeneous Data, by Jianyu Wang
15:50 – 16:15 Invited Talk 4: Scalable and Communication-Efficient Vertical Federated Learning, by Stacy Patterson
16:15 – 17:00 Poster Session 2
  1. Ali Dadras, Karthik Prakhya and Alp Yurtsever. Federated Frank-Wolfe Algorithm
  2. Atahan Ozer, Kadir Burak Buldu, Abdullah Akgül and Gozde Unal. How to Combine Variational Bayesian Networks in Federated Learning
  3. Chhavi Sharma, Vishnu Narayanan and Balamurugan Palaniappan. Stochastic Gradient Methods with Compressed Communication for Decentralized Saddle Point Problems
  4. Batiste Le bars, Aurélien Bellet, Marc Tommasi, Erick Lavoie and Anne-marie Kermarrec. Refined Convergence and Topology Learning for Decentralized Optimization with Heterogeneous Data
  5. Amr Abourayya, Michael Kamp, Erman Ayday, Jens Kleesiek, Kanishka Rao, Geoffrey Webb and Bharat Rao. AIMHI: Protecting Sensitive Data through Federated Co-Training
  6. Ilias Driouich, Chuan Xu, Giovanni Neglia, Frederic Giroire and Eoin Thomas. A Novel Model-Based Attribute Inference Attack in Federated Learning
  7. Athul Sreemathy Raj, Irene Tenison, Kacem Khaled, Felipe Gohring de Magalhães and Gabriela Nicolescu. FedSHIBU: Federated Similarity-based Head Independent Body Update
  8. Jaewoo Shin, Taehyeon Kim and Se-Young Yun. Revisiting the Activation Function for Federated Image Classification
  9. Zhaozhuo Xu, Luyang Liu, Zheng Xu and Anshumali Shrivastava. Adaptive Sparse Federated Learning in Large Output Spaces via Hashing
  10. Holger R Roth, Yan Cheng, Yuhong Wen, Isaac Yang, Ziyue Xu, YuanTing Hsieh, Kristopher Kersten, Ahmed Harouni, Can Zhao, Kevin Lu, Zhihong Zhang, Wenqi Li, Andriy Myronenko, Dong Yang, Sean Yang, Nicola Rieke, Abood Quraini, Chester Chen, Daguang Xu, Nic Ma, Prerna Dogra, Mona G Flores and Andrew Feng. FLARE: Federated Learning from Simulation to Real-World
  11. Liam Collins, Enmao Diao, Tanya Roosta, Jie Ding and Tao Zhang. PerFedSI: A Framework for Personalized Federated Learning with Side Information
  12. Shengyuan Hu, Jack Goetz, Kshitiz Malik, Hongyuan Zhan, Zhe Liu and Yue Liu. FedSynth: Gradient Compression via Synthetic Data in Federated Learning
  13. Sourasekhar Banerjee, Alp Yurtsever and Monowar H Bhuyan. Personalized Multi-tier Federated Learning
  14. Saeed Vahidian, Mahdi Morafah, Weijia Wang and Bill Lin. FLIS: Clustered Federated Learning via Inference Similarity for Non-IID Data Distribution
  15. Hamid Mozaffari, Virendra Marathe and Dave Dice. Private and Robust Federated Learning using Private Information Retrieval and Norm Bounding
  16. Karthik Prasad, Sayan Ghosh, Graham Cormode, Ilya Mironov, Ashkan Yousefpour and Pierre Stock. Reconciling Security and Communication Efficiency in Federated Learning
  17. Shashi Raj Pandey, Lam Nguyen and Petar Popovski. FedToken: Tokenized Incentives for Data Contribution in Federated Learning
  18. Giulio Zizzo, Ambrish Rawat, Naoise Holohan and Seshu Tirupathi. Federated Continual Learning with Differentially Private Data Sharing
  19. Zhiwei Tang, Yanmeng Wang and Tsung-Hui Chang. z-SignFedAvg: A unified sign-based stochastic compression for federated learning
  20. Kiwan Maeng, Chuan Guo, Sanjay Kariyappa and Edward Suh. Measuring and Controlling Split Layer Privacy Leakage Using Fisher Information
  21. Yuanhao Xiong, Ruochen Wang, Minhao Cheng, Felix Yu and Cho-Jui Hsieh. FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning
  22. Haibo Yang, Peiwen Qiu, Prashant Khanduri and Jia Liu. With a Little Help from My Friend: Server-Aided Federated Learning with Partial Client Participation
  23. Yue Niu, Saurav Prakash, Souvik Kundu, Sunwoo Lee and Salman Avestimehr. Federated Learning of Large Models at the Edge via Principal Sub-Model Training
  24. Yuhang Yao, Mohammad Mahdi Kamani, Zhongwei Cheng, Lin Chen, Carlee Joe-Wong and Tianqiang Liu. FedRule: Federated Rule Recommendation System with Graph Neural Networks
  25. Chulin Xie, Pin-Yu Chen, Ce Zhang and Bo Li. Improving Vertical Federated Learning by Efficient Communication with ADMM
  26. Virendra Marathe, Pallika Kanani and Daniel W. Peterson. Subject Level Differential Privacy with Hierarchical Gradient Averaging
  27. Jingtao Li, Lingjuan Lyu, Daisuke Iso, Chaitali Chakrabarti and Michael Spranger. MocoSFL: Enabling Cross-Client Collaborative Self-Supervised Learning
  28. Marco Schreyer, Hamed Hemati, Damian Borth and Miklos A. Vasarhelyi. Federated Continual Learning to Detect Accounting Anomalies in Financial Auditing
  29. Motasem Alfarra, Juan Camilo Perez, Egor Shulgin, Peter Richtárik and Bernard Ghanem. Certified Robustness in Federated Learning
  30. Sara Babakniya, Souvik Kundu, Saurav Prakash, Yue Niu and Salman Avestimehr. Federated Sparse Training: Lottery Aware Model Compression for Resource Constrained Edge
  31. Mathieu Even, Hadrien Hendrikx and Laurent Massoulié. Asynchronous Speedup in Decentralized Optimization
17:00 End of Workshop


Invited Talks


Title: Trustworthy Federated Learning

Speaker: Bo Li, Assistant Professor, University of Illinois at Urbana–Champaign (UIUC)

Biography
Dr. Bo Li is an assistant professor in the Department of Computer Science at the University of Illinois at Urbana–Champaign. She is the recipient of the IJCAI Computers and Thought Award, the Alfred P. Sloan Research Fellowship, the NSF CAREER Award, the MIT Technology Review TR-35 Award, the Dean's Award for Excellence in Research, the C.W. Gear Outstanding Junior Faculty Award, the Intel Rising Star Award, the Symantec Research Labs Fellowship, a Rising Star Award, research awards from tech companies such as Amazon, Facebook, Intel, and IBM, and best paper awards at several top machine learning and security conferences. Her research focuses on both theoretical and practical aspects of trustworthy machine learning, at the intersection of machine learning, security, privacy, and game theory. She has designed several scalable frameworks for trustworthy machine learning and privacy-preserving data publishing systems. Her work has been featured by major publications and media outlets such as Nature, Wired, Fortune, and the New York Times.


Title: Asynchronous Optimization: Delays, Stability, and the Impact of Data Heterogeneity

Speaker: Konstantin Mishchenko, Research Scientist, Samsung

Biography
Konstantin Mishchenko is a Research Scientist at Samsung in Cambridge, UK, working on optimization theory and federated learning. He received a double-degree MSc from Paris-Dauphine and École normale supérieure Paris-Saclay in 2017, and completed his PhD under the supervision of Peter Richtárik from 2017 to 2021. From December 2021 to October 2022, he was a postdoc in the group of Francis Bach at Inria Paris. Konstantin has held research internships at Google Brain and Amazon, has been recognized as an outstanding reviewer for NeurIPS 2019, ICML 2020, AAAI 2020, ICLR 2021, ICML 2021, NeurIPS 2021, ICLR 2022, and ICML 2022, and served as an Area Chair for ACML 2022. He was named a Rising Star in Data Science by the University of Chicago in 2021 and has published 11 conference papers at ICML, ICLR, NeurIPS, AISTATS, and UAI.


Title: On the Unreasonable Effectiveness of Federated Averaging with Heterogeneous Data

Speaker: Jianyu Wang, Research Scientist, Meta

Biography
Jianyu Wang is a research scientist at Meta. He received his Ph.D. from the ECE department at Carnegie Mellon University in 2022 and his B.Eng. in Electrical Engineering from Tsinghua University in 2017. He was a research intern with Google Research in 2020 and 2021, and with Facebook AI Research in 2019. His research interests are federated learning, distributed optimization, and systems for large-scale machine learning. His awards and honors include the Qualcomm Ph.D. Fellowship (2019), the best student paper award at the NeurIPS 2019 federated learning workshop, and the best poster award at the NSF CEDO workshop (2021).


Title: Scalable and Communication-Efficient Vertical Federated Learning

Speaker: Stacy Patterson, Associate Professor, Rensselaer Polytechnic Institute

Biography
Stacy Patterson is an Associate Professor in the Department of Computer Science at Rensselaer Polytechnic Institute. She received her MS and PhD in computer science from UC Santa Barbara in 2003 and 2009, respectively. From 2009 to 2011, she was a postdoctoral scholar at the Center for Control, Dynamical Systems and Computation at UC Santa Barbara. From 2011 to 2013, she was a postdoctoral fellow in the Department of Electrical Engineering at the Technion - Israel Institute of Technology. Dr. Patterson is the recipient of a Viterbi postdoctoral fellowship, the IEEE CSS Axelby Outstanding Paper Award, and an NSF CAREER award. She serves as an Associate Editor for the IEEE Transactions on Control of Network Systems. Her research interests include distributed algorithms, cooperative control, and edge and cloud computing.


Awards

Outstanding Paper Awards:
  • Jayanth Reddy Regatti, Songtao Lu, Abhishek Gupta and Ness Shroff. Conditional Moment Alignment for Improved Generalization in Federated Learning
  • Sai Praneeth Karimireddy, Wenshuo Guo and Michael Jordan. Mechanisms that Incentivize Data Sharing in Federated Learning

Accepted Papers (Oral Presentation)

  1. Jayanth Reddy Regatti, Songtao Lu, Abhishek Gupta and Ness Shroff. Conditional Moment Alignment for Improved Generalization in Federated Learning
  2. Sai Praneeth Karimireddy, Wenshuo Guo and Michael Jordan. Mechanisms that Incentivize Data Sharing in Federated Learning
  3. Hanhan Zhou, Tian Lan, Guru Prasadh Venkataramani and Wenbo Ding. Federated Learning with Online Adaptive Heterogeneous Local Models
  4. Baturalp Buyukates, Jinhyun So, Hessam Mahdavifar and Salman Avestimehr. LightVeriFL: Lightweight and Verifiable Secure Federated Learning
  5. Francesco Pase, Berivan Isik, Deniz Gunduz, Tsachy Weissman and Michele Zorzi. Efficient Federated Random Subnetwork Training
  6. Filippo Galli, Sayan Biswas, Gangsoo Zeong, Tommaso Cucinotta and Catuscia Palamidessi. Group privacy for personalized federated learning
  7. Yae Jee Cho, Divyansh Jhunjhunwala, Tian Li, Virginia Smith and Gauri Joshi. To Federate or Not To Federate: Incentivizing Client Participation in Federated Learning
  8. Marco Bornstein, Tahseen Rabbani, Evan Wang, Amrit Bedi and Furong Huang. SWIFT: Rapid Decentralized Federated Learning via Wait-Free Model Communication
  9. Sharut Gupta, Kartik Ahuja, Mohammad Havaei, Niladri Chatterjee and Yoshua Bengio. FL Games: A Federated Learning Framework for Distribution Shifts
  10. Simone Bottoni, Giulio Zizzo, Stefano Braghin and Alberto Trombetta. Verifiable Federated Machine Learning
  11. Yeojoon Youn, Bhuvesh Kumar and Jacob Abernethy. Accelerated Federated Optimization with Quantization
  12. Xingchen Ma, Junyi Zhu and Matthew Blaschko. Tackling Personalized Federated Learning with Label Concept Drift via Hierarchical Bayesian Modeling

Accepted Papers (Poster Presentation)

  1. Jaeheon Kim and Bong Jun Choi. FedTH : Tree-based Hierarchical Image Classification in Federated Learning
  2. M. Taha Toghani and Cesar Uribe. Unbounded Gradients in Federated Learning with Buffered Asynchronous Aggregation
  3. Khaoula Chehbouni, Gilles Caporossi, Reihaneh Rabbany, Martine De Cock and Golnoosh Farnadi. Early Detection of Sexual Predators with Federated Learning
  4. Timothy Castiglia, Shiqiang Wang and Stacy Patterson. Self-Supervised Vertical Federated Learning
  5. Pei Fang and Jinghui Chen. On the Vulnerability of Backdoor Defenses for Federated Learning
  6. Mariel AF Werner, Lie He, Sai Praneeth Karimireddy, Michael Jordan and Martin Jaggi. Towards Provably Personalized Federated Learning via Threshold-Clustering of Similar Clients
  7. Xinwei Zhang, Bingqing Song, Mehrdad Honarkhah, Jie Ding and Mingyi Hong. Building Large Machine Learning Models from Small Distributed Models: A Layer Matching Approach
  8. Yuqing Zhu, Xiang Yu, Yi-Hsuan Tsai, Francesco Pittaluga, Masoud Faraki, Manmohan Chandraker and Yu-Xiang Wang. Voting-Based Approaches for Differentially Private Federated Learning
  9. Md Ibrahim Ibne Alam, Koushik Kar, Theodoros Salonidis and Horst Samulowitz. DASH: Decentralized CASH for Federated Learning
  10. Yujia Wang, Pei Fang and Jinghui Chen. Accelerating Adaptive Federated Optimization with Local Gossip Communications
  11. Dimitris Stripelis, Umang Gupta, Greg Ver Steeg and Jose Luis Ambite. Federated Progressive Sparsification (Purge-Merge-Tune)+
  12. Pedro Valdeira, Yuejie Chi, Claudia Soares and Joao Xavier. A Multi-Token Coordinate Descent Method for Vertical Federated Learning
  13. Rajarshi Saha, Michal Yemini, Emre Ozfatura, Deniz Gunduz and Andrea Goldsmith. ColRel: Collaborative Relaying for Federated Learning over Intermittently Connected Networks
  14. Ziwei Li, Hong-You Chen, Han Wei Shen and Wei-Lun Chao. Understanding Federated Learning through Loss Landscape Visualizations: A Pilot Study
  15. Krishna Pillutla, Yassine Laguel, Jérôme Malick and Zaid Harchaoui. Differentially Private Federated Quantiles with the Distributed Discrete Gaussian Mechanism
  16. Chen Dun, Mirian Hipolito Garcia, Dimitrios Dimitriadis, Christopher Jermaine and Anastasios Kyrillidis. Efficient and Light-Weight Federated Learning via Asynchronous Distributed Dropout
  17. Junyi Li and Heng Huang. FedGRec: Federated Graph Recommender System with Lazy Update of Latent Embeddings
  18. Ljubomir Rokvic, Panayiotis Danassis and Boi Faltings. Privacy-Preserving Data Filtering in Federated Learning Using Influence Approximation
  19. John Nguyen, Jianyu Wang, Kshitiz Malik, Maziar Sanjabi and Mike Rabbat. Where to Begin? On the Impact of Pre-Training and Initialization in Federated Learning
  20. Joseph Lavond, Minhao Cheng and Yao Li. Trusted Aggregation (TAG): Model Filtering Backdoor Defense In Federated Learning
  21. Stefanos Laskaridis, Javier Fernandez-Marques and Łukasz Dudziak. Cross-device Federated Architecture Search
  22. Parker Newton, Olivia Choudhury, Bill Horne, Vidya Ravipati, Divya Bhargavi and Ujjwal Ratan. Client-Private Secure Aggregation for Privacy-Preserving Federated Learning
  23. Shanshan Wu, Tian Li, Zachary Charles, Yu Xiao, Ken Liu, Zheng Xu and Virginia Smith. Motley: Benchmarking Heterogeneity and Personalization in Federated Learning
  24. Daniel Lopes, João Nadkarni, Filipe Assunção, Miguel Lopes and Luís Rodrigues. Federated Learning for Predicting the Next Node in Action Flows
  25. Chuan Guo, Kamalika Chaudhuri, Pierre Stock and Mike Rabbat. The Interpolated MVU Mechanism For Communication-efficient Private Federated Learning
  26. Yi Sui, Junfeng Wen, Yenson Lau, Brendan Ross and Jesse Cresswell. Find Your Friends: Personalized Federated Learning with the Right Collaborators
  27. Saeed Vahidian, Mahdi Morafah, Chen Chen, Mubarak Shah and Bill Lin. Rethinking Data Heterogeneity in Federated Learning: Introducing a New Notion and Standard Benchmarks
  28. Afroditi Papadaki, Natalia Martinez, Martin Bertran, Guillermo Sapiro and Miguel Rodrigues. Federated Fairness without Access to Demographics
  29. Mirian Hipolito Garcia, Andre Manoel, Daniel Madrigal, Robert Sim and Dimitrios Dimitriadis. FLUTE: A Scalable, Extensible Framework for High-Performance Federated Learning Simulations
  30. Aleksei Triastcyn, Matthias Reisser and Christos Louizos. Decentralized Learning with Random Walks and Communication-Efficient Adaptive Optimization
  31. Parsa Assadi, Byung Hoon Ahn and Hadi Esmaeilzadeh. Accelerating Federated Learning Through Attention on Local Model Updates
  32. Ali Dadras, Karthik Prakhya and Alp Yurtsever. Federated Frank-Wolfe Algorithm
  33. Atahan Ozer, Kadir Burak Buldu, Abdullah Akgül and Gozde Unal. How to Combine Variational Bayesian Networks in Federated Learning
  34. Chhavi Sharma, Vishnu Narayanan and Balamurugan Palaniappan. Stochastic Gradient Methods with Compressed Communication for Decentralized Saddle Point Problems
  35. Batiste Le bars, Aurélien Bellet, Marc Tommasi, Erick Lavoie and Anne-marie Kermarrec. Refined Convergence and Topology Learning for Decentralized Optimization with Heterogeneous Data
  36. Amr Abourayya, Michael Kamp, Erman Ayday, Jens Kleesiek, Kanishka Rao, Geoffrey Webb and Bharat Rao. AIMHI: Protecting Sensitive Data through Federated Co-Training
  37. Ilias Driouich, Chuan Xu, Giovanni Neglia, Frederic Giroire and Eoin Thomas. A Novel Model-Based Attribute Inference Attack in Federated Learning
  38. Athul Sreemathy Raj, Irene Tenison, Kacem Khaled, Felipe Gohring de Magalhães and Gabriela Nicolescu. FedSHIBU: Federated Similarity-based Head Independent Body Update
  39. Jaewoo Shin, Taehyeon Kim and Se-Young Yun. Revisiting the Activation Function for Federated Image Classification
  40. Zhaozhuo Xu, Luyang Liu, Zheng Xu and Anshumali Shrivastava. Adaptive Sparse Federated Learning in Large Output Spaces via Hashing
  41. Holger R Roth, Yan Cheng, Yuhong Wen, Isaac Yang, Ziyue Xu, YuanTing Hsieh, Kristopher Kersten, Ahmed Harouni, Can Zhao, Kevin Lu, Zhihong Zhang, Wenqi Li, Andriy Myronenko, Dong Yang, Sean Yang, Nicola Rieke, Abood Quraini, Chester Chen, Daguang Xu, Nic Ma, Prerna Dogra, Mona G Flores and Andrew Feng. FLARE: Federated Learning from Simulation to Real-World
  42. Liam Collins, Enmao Diao, Tanya Roosta, Jie Ding and Tao Zhang. PerFedSI: A Framework for Personalized Federated Learning with Side Information
  43. Shengyuan Hu, Jack Goetz, Kshitiz Malik, Hongyuan Zhan, Zhe Liu and Yue Liu. FedSynth: Gradient Compression via Synthetic Data in Federated Learning
  44. Sourasekhar Banerjee, Alp Yurtsever and Monowar H Bhuyan. Personalized Multi-tier Federated Learning
  45. Saeed Vahidian, Mahdi Morafah, Weijia Wang and Bill Lin. FLIS: Clustered Federated Learning via Inference Similarity for Non-IID Data Distribution
  46. Hamid Mozaffari, Virendra Marathe and Dave Dice. Private and Robust Federated Learning using Private Information Retrieval and Norm Bounding
  47. Karthik Prasad, Sayan Ghosh, Graham Cormode, Ilya Mironov, Ashkan Yousefpour and Pierre Stock. Reconciling Security and Communication Efficiency in Federated Learning
  48. Shashi Raj Pandey, Lam Nguyen and Petar Popovski. FedToken: Tokenized Incentives for Data Contribution in Federated Learning
  49. Giulio Zizzo, Ambrish Rawat, Naoise Holohan and Seshu Tirupathi. Federated Continual Learning with Differentially Private Data Sharing
  50. Zhiwei Tang, Yanmeng Wang and Tsung-Hui Chang. z-SignFedAvg: A unified sign-based stochastic compression for federated learning
  51. Kiwan Maeng, Chuan Guo, Sanjay Kariyappa and Edward Suh. Measuring and Controlling Split Layer Privacy Leakage Using Fisher Information
  52. Yuanhao Xiong, Ruochen Wang, Minhao Cheng, Felix Yu and Cho-Jui Hsieh. FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning
  53. Haibo Yang, Peiwen Qiu, Prashant Khanduri and Jia Liu. With a Little Help from My Friend: Server-Aided Federated Learning with Partial Client Participation
  54. Yue Niu, Saurav Prakash, Souvik Kundu, Sunwoo Lee and Salman Avestimehr. Federated Learning of Large Models at the Edge via Principal Sub-Model Training
  55. Yuhang Yao, Mohammad Mahdi Kamani, Zhongwei Cheng, Lin Chen, Carlee Joe-Wong and Tianqiang Liu. FedRule: Federated Rule Recommendation System with Graph Neural Networks
  56. Chulin Xie, Pin-Yu Chen, Ce Zhang and Bo Li. Improving Vertical Federated Learning by Efficient Communication with ADMM
  57. Virendra Marathe, Pallika Kanani and Daniel W. Peterson. Subject Level Differential Privacy with Hierarchical Gradient Averaging
  58. Jingtao Li, Lingjuan Lyu, Daisuke Iso, Chaitali Chakrabarti and Michael Spranger. MocoSFL: Enabling Cross-Client Collaborative Self-Supervised Learning
  59. Marco Schreyer, Hamed Hemati, Damian Borth and Miklos A. Vasarhelyi. Federated Continual Learning to Detect Accounting Anomalies in Financial Auditing
  60. Motasem Alfarra, Juan Camilo Perez, Egor Shulgin, Peter Richtárik and Bernard Ghanem. Certified Robustness in Federated Learning
  61. Sara Babakniya, Souvik Kundu, Saurav Prakash, Yue Niu and Salman Avestimehr. Federated Sparse Training: Lottery Aware Model Compression for Resource Constrained Edge
  62. Mathieu Even, Hadrien Hendrikx and Laurent Massoulié. Asynchronous Speedup in Decentralized Optimization

Call for Papers

Training machine learning models in a centralized fashion often faces significant challenges in real-world use cases: the training data is distributed across many parties, creating and maintaining a central data repository requires substantial computational resources, and regulatory guidelines (e.g., GDPR, HIPAA) restrict the sharing of sensitive data. Federated learning (FL) is a new paradigm in machine learning that can mitigate these challenges by training a global model over distributed data, without the need for data sharing. The extensive application of machine learning to analyze and draw insight from real-world, distributed, and sensitive data makes familiarity with and adoption of this relevant and timely topic essential for the scientific community.
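To make the paradigm concrete, the sketch below illustrates one common realization, federated averaging (FedAvg): each client performs a few local training steps on its private data and shares only its model parameters, which a server aggregates into the global model. This is a minimal, illustrative sketch only; the simulated clients, the linear-regression objective, and the NumPy gradient step are assumptions made for demonstration, not a method prescribed by the workshop.

import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, lr=0.1, epochs=5):
    # A few epochs of gradient descent on one client's private data
    # (mean-squared-error objective, chosen purely for illustration).
    w = w.copy()
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Simulated clients: each holds its own (X, y); the raw data never leaves the client.
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 3))
    y = X @ w_true + 0.1 * rng.normal(size=50)
    clients.append((X, y))

w_global = np.zeros(3)
for _ in range(20):  # communication rounds
    # Each client starts from the current global model and trains locally.
    local_models = [local_update(w_global, X, y) for X, y in clients]
    # The server aggregates only model parameters, weighting each client
    # by the size of its local dataset (no raw data is exchanged).
    sizes = [len(y) for _, y in clients]
    w_global = np.average(local_models, axis=0, weights=sizes)

print("Global model after federated training:", w_global)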

Despite the advantages of FL and its successful application in certain industry settings, the field is still in its infancy, owing to new challenges imposed by limited visibility into the training data, potential lack of trust among the participants training a single model, the risk of privacy inference, and, in some cases, limited or unreliable connectivity.

The goal of this workshop is to bring together researchers and practitioners interested in FL. This day-long event will facilitate interaction among students, scholars, and industry professionals from around the world to understand the topic, identify technical challenges, and discuss potential solutions, thereby advancing FL and broadening its impact. FL has become an increasingly popular topic in the machine learning community in recent years.

Topics of interest include, but are not limited to, the following:
  • Adversarial attacks on FL
  • Applications of FL
  • Blockchain for FL
  • Beyond first-order methods in FL
  • Beyond local methods in FL
  • Communication compression in FL
  • Data heterogeneity in FL
  • Decentralized FL
  • Device heterogeneity in FL
  • Fairness in FL
  • Hardware for on-device FL
  • Variants of FL like split learning
  • Local methods in FL
  • Nonconvex FL
  • Operational challenges in FL
  • Optimization advances in FL
  • Partial participation in FL
  • Personalization in FL
  • Privacy concerns in FL
  • Privacy-preserving methods for FL
  • Resource-efficient FL
  • Systems and infrastructure for FL
  • Theoretical contributions to FL
  • Uncertainty in FL
  • Vertical FL

The workshop will have invited talks on a diverse set of topics related to FL. In addition, we plan to have an industrial panel and booth, where researchers from industry will talk about challenges and solutions from an industrial perspective.

More information on previous workshops can be found here.


Submission Instructions

Submissions should be no more than 6 pages long, excluding references, and should follow the NeurIPS'22 template. Reviewing is double-blind (author identities will not be revealed to the reviewers), so the submitted PDF should not include any identifying author information. An optional appendix of any length is allowed and should be placed at the end of the paper (after the references).

Submissions are collected on OpenReview at the following link: https://openreview.net/group?id=NeurIPS.cc/2022/Workshop/Federated_Learning.
Accepted papers and their review comments will be posted publicly on OpenReview. Due to the short timeline, there will be no rebuttal period, but authors are encouraged to interact and discuss with the reviewers on OpenReview after the acceptance notifications are sent out. Rejected papers and their reviews will remain private and will not be posted publicly.

For questions, please contact: fl-neurips-2022@googlegroups.com


Proceedings and Dual Submission Policy

Our workshop does not have formal proceedings, i.e., it is non-archival. Accepted papers will be publicly available on OpenReview together with the reviewers' comments. Revisions to accepted papers will be allowed until shortly before the workshop date.

We welcome submissions of unpublished papers, including those submitted to other venues, provided that venue permits it. However, papers that have been accepted to an archival venue as of Sept. 21, 2022 should not be resubmitted to this workshop, because the goal of the workshop is to share recent results and discuss open problems. In particular, papers that have been accepted to the NeurIPS'22 main conference should not be resubmitted to this workshop.


Presentation Format

The workshop will take place primarily in person. For presenters who cannot attend in person, we plan to make it possible to connect remotely over Zoom for the oral talks; the poster sessions, however, will be in-person only. Depending on the situation, we may include a lightning talk session for accepted poster presentations whose presenters cannot attend in person, or organize a separate virtual session after the official workshop date. If a paper is accepted as an oral talk, the NeurIPS organizers require a pre-recording of the presentation by early November, which will be made available for virtual participants to view. All accepted papers will be posted on OpenReview and linked on our webpage.


Organizing Committee


Program Committee

  • Aditya Balu (Iowa State University)
  • Ali Anwar (University of Minnesota)
  • Alp Yurtsever (Umea University)
  • Ambrish Rawat (IBM Research)
  • Anastasios Kyrillidis (Rice University)
  • Andre Manoel (Microsoft)
  • Andrew Silva (Georgia Institute of Technology)
  • Ang Li (Duke University)
  • Anran Li (Nanyang Technological University)
  • Ashkan Yousefpour (Meta)
  • Aurélien Bellet (INRIA)
  • Berivan Isik (Amazon)
  • Bing Luo (Duke University)
  • Bingsheng He (National University of Singapore)
  • Carlee Joe-Wong (Carnegie Mellon University)
  • Chao Ren (Nanyang Technological University)
  • Chaoyang He (University of Southern California)
  • Chuan Xu (INRIA)
  • Chuizheng Meng (University of Southern California)
  • Chulin Xie (University of Illinois, Urbana Champaign)
  • Dianbo Liu (University of Montreal)
  • Dimitrios Dimitriadis (Microsoft Research)
  • Divyansh Jhunjhunwala (Carnegie Mellon University)
  • Egor Shulgin (KAUST)
  • Enmao Diao (Duke University)
  • Farzin Haddadpour (Yale University)
  • Feng Yan (University of Houston)
  • Giovanni Neglia (INRIA)
  • Giulio Zizzo (IBM Research)
  • Grigory Malinovsky (KAUST)
  • Haibo Yang (Ohio State University)
  • Hongyi Wang (Carnegie Mellon University)
  • Hongyuan Zhan (Meta)
  • Javier Fernandez-Marques (Samsung AI)
  • Jayanth Reddy Regatti (Ohio State University)
  • Jesse C Cresswell (Layer 6 AI)
  • Jia Liu (Ohio State University)
  • Jiankai Sun (ByteDance Inc.)
  • Jianyu Wang (Facebook)
  • Jiayi Wang (University of Utah)
  • Jihong Park (Deakin University)
  • Jinghui Chen (Pennsylvania State University)
  • Jinhyun So (University of Southern California)
  • John Nguyen (Facebook)
  • Junyi Li (University of Pittsburgh)
  • Kshitiz Malik (University of Illinois, Urbana-Champaign)
  • Kai Yi (KAUST)
  • Kallista Bonawitz (Google)
  • Kamalika Chaudhuri (Facebook)
  • Kevin Hsieh (Microsoft)
  • Konstantin Mishchenko (Ecole Normale Supérieure de Paris)
  • Lie He (Swiss Federal Institute of Technology Lausanne)
  • Lingjuan Lyu (Sony AI)
  • Mathieu Even (INRIA)
  • Matthias Reisser (Qualcomm)
  • Mehrdad Mahdavi (Pennsylvania State University)
  • Mi Zhang (Ohio State University)
  • Michael Kamp (Institute for AI in Medicine IKIM)
  • Michael Rabbat (McGill University)
  • Michal Yemini (Princeton University)
  • Mingyi Hong (Iowa State University)
  • Mingzhe Chen (University of Miami)
  • Minhao Cheng (Hong Kong University of Science and Technology)
  • M. Taha Toghani (Rice University)
  • Nikola Konstantinov (ETH Zurich)
  • Ningning Ding (Northwestern University)
  • Pranay Sharma (Carnegie Mellon University)
  • Paulo Abelha Ferreira (Dell Technologies)
  • Pengchao Han (Chinese University of Hong Kong, Shenzhen)
  • Peter Kairouz (Google)
  • Pierre Stock (Facebook)
  • Prashant Khanduri (Wayne State University)
  • Radu Marculescu (University of Texas, Austin)
  • Rui Lin (Chalmers University of Technology)
  • Ruihan Wu (Cornell University)
  • Saeed Vahidian (University of California, San Diego)
  • Sai Praneeth Karimireddy (University of California, Berkeley)
  • Samuel Horváth (Mohamed bin Zayed University of Artificial Intelligence)
  • Satoshi Hara (Osaka University)
  • Sayak Mukherjee (Pacific Northwest National Laboratory)
  • Se-Young Yun (KAIST)
  • Sebastian U Stich (CISPA Helmholtz Center for Information Security)
  • Shangwei Guo (Chongqing University)
  • Songtao Lu (IBM Research)
  • Songze Li (Hong Kong University of Science and Technology)
  • Stefanos Laskaridis (Samsung AI Center Cambridge)
  • Swanand Kadhe (IBM Research)
  • Tahseen Rabbani (University of Maryland, College Park)
  • Tara Javidi (University of California, San Diego)
  • Theodoros Salonidis (IBM Research)
  • Tianyi Chen (Rensselaer Polytechnic Institute)
  • Victor Valls (Trinity College, Dublin)
  • Virendra Marathe (Oracle)
  • Wenshuo Guo (University of California, Berkeley)
  • Xiang Yu (NEC)
  • Xiaoyong Yuan (Michigan Technological University)
  • Yae Jee Cho (Carnegie Mellon University)
  • Yang Liu (Tsinghua University)
  • Yi Zhou (IBM Research)
  • Zachary Charles (Google)
  • Zehui Xiong (Singapore University of Technology and Design)
  • Zhanhong Jiang (Johnson Controls Inc.)
  • Zhaozhuo Xu (Rice University)
  • Zheng Xu (Google)

Sponsored by


Organized by