Efficient Algorithms for Distributed Networks with Applications in Communication and Learning
2024-08
Authors
Reisizadeh, Hadi
Type
Thesis or Dissertation
Abstract
Distributed networks connect independent nodes to act as one powerful system. The participating machines pool resources (data, storage, and processing) to handle large tasks efficiently. Such networks have a wide range of applications in communication and learning systems, including distributed caching and distributed machine learning. Distributed caching improves performance by storing frequently accessed data closer to users, reducing retrieval times and network load. Distributed machine learning leverages the combined computing power of multiple machines to train complex models on massive datasets, leading to faster and more scalable results. However, this distributed nature introduces its own set of challenges, including (1) communication efficiency, due to the exchange of large amounts of data over noisy communication links, (2) time-varying networks, because of the dynamic nature of networks, and (3) privacy concerns, due to the presence of eavesdropper nodes. This dissertation studies two key distributed systems: distributed caching and distributed machine learning. The goal is to make them more efficient, reliable, and adaptable by tackling these challenges. To achieve this, the research leverages techniques from several fields, including distributed optimization, statistical learning theory, probability theory, and communication and coding theory.
In the first part of the thesis, we study a distributed caching setup in a fast-fading environment where each user has storage holding a fraction of the transmitter's files. We focus on uncoded cache placement and the challenge of assigning signal levels without knowledge of the channel conditions. We study the asymptotic behavior of the system, characterize the maximum achievable source rate in several scenarios, and provide an upper bound. We also propose a scheme based on linear programming that shows the characterization is loose in the general case.
In the second part, we consider distributed machine learning paradigms, focusing on methods to mitigate communication costs, time-varying links, and privacy concerns. To address the first two challenges, we propose a two-time-scale algorithm: one time scale suppresses the imperfect incoming information from neighboring agents, while the other operates on the gradients of the local cost functions. To address the privacy concern, we employ a differential privacy mechanism. Numerical experiments corroborate the theoretical results in both parts and demonstrate significant improvements.
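To make the two-time-scale idea concrete, the following is a minimal toy sketch, not the dissertation's actual algorithm: agents on a ring graph minimize simple quadratic local costs, share Gaussian-perturbed copies of their states (a stand-in for a differential-privacy mechanism), and use a fast-decaying gradient step alongside a slower-decaying consensus step so that the ratio of the two step sizes vanishes. The network size, mixing weights, cost functions, noise level, and step-size exponents are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative, not the thesis formulation): n agents on a
# ring graph, local costs f_i(x) = 0.5 * (x - a_i)^2, so the minimizer
# of the global cost sum_i f_i is mean(a).
n = 8
a = rng.normal(size=n)
x = np.zeros(n)  # each agent's current estimate

# Doubly stochastic mixing weights: self plus two ring neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

sigma = 0.5  # stddev of noise added before sharing (privacy stand-in)

for k in range(1, 5001):
    beta = k ** -0.6   # consensus time scale (decays slowly)
    alpha = k ** -0.9  # gradient time scale; alpha/beta -> 0
    y = x + rng.normal(scale=sigma, size=n)  # noisy shared states
    grad = x - a                             # local gradients
    # Consensus on imperfect neighbor information + local gradient step.
    x = x + beta * (W @ y - x) - alpha * grad

print(np.max(np.abs(x - a.mean())))  # all agents near the global optimum
```

Because the consensus step size decays, the injected privacy noise is progressively suppressed, while the even faster decay of the gradient step keeps the gradient flow from being dominated by disagreement between agents.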
Description
University of Minnesota Ph.D. dissertation. August 2024. Major: Electrical/Computer Engineering. Advisors: Soheil Mohajer, Mohammad Ali Maddah-Ali. 1 computer file (PDF); xii, 240 pages.
Suggested citation
Reisizadeh, Hadi. (2024). Efficient Algorithms for Distributed Networks with Applications in Communication and Learning. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/269608.