
Pushing the Boundaries of Federated Learning: Super-Linear Convergence and Reinforcement Learning Over Wireless / DAL FABBRO, Nicolò. - (2024 Mar 21).

Pushing the Boundaries of Federated Learning: Super-Linear Convergence and Reinforcement Learning Over Wireless

DAL FABBRO, NICOLÒ
2024

Abstract

In an age defined by the explosive growth of information technology, data generation, storage, and transmission have increased dramatically. This data fuels the core of machine learning and artificial intelligence. At the same time, increasingly pressing questions are being raised about data ownership and privacy, given the pivotal role of individuals as data generators. In this context, research efforts in distributed machine learning, particularly Federated Learning (FL), have recently gained momentum. FL enables multiple agents, each with a private dataset, to collaborate on machine learning tasks without sharing their data. In recent years, the design of communication-efficient FL methods has garnered significant attention, given the inherent need for frequent information exchange among agents when training distributed machine learning algorithms. Against this backdrop, this thesis explores the boundaries of FL, focusing on two aspects. First, we study second-order methods with superlinear convergence rates that can effectively deal with ill-conditioned problems while remaining communication efficient. To this end, we introduce SHED (Sharing Hessian Eigenvectors for Distributed learning), a novel Newton-type algorithm for FL that excels in terms of communication efficiency and convergence guarantees, with state-of-the-art empirical performance. Second, we study the theoretical foundations of Federated Reinforcement Learning (FRL) under communication constraints, with special emphasis on wireless networks. In these settings, we provide finite-time convergence rates for FRL and show the beneficial effect of cooperation, establishing convergence speedups in the number of agents across different configurations.
Files in this item:

thesis_pdfA.pdf
Description: thesis
Type: Doctoral thesis
Access: open access
Size: 8.34 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3511487