18-898C: Special Topics in Signal Processing: Distributed and Federated Learning Algorithms
In this seminar-style class, students will read, critique, and present papers on distributed and federated optimization algorithms. Topics to be covered include, but are not limited to, mini-batch SGD and its convergence analysis, synchronous and asynchronous SGD, local-update SGD and federated optimization, gradient compression/quantization in distributed SGD, privacy/security in federated learning, and decentralized SGD. This course is closely related to 18-667, which is a lecture- and exam-based course on distributed and federated optimization. In contrast, this will be a reading group-style class featuring student presentations and discussions, and it is targeted towards Ph.D. students conducting research in this area.
Last Modified: 2021-11-11 10:19AM
- Spring 2022