ZO-JADE: Zeroth-Order Curvature-Aware Distributed Multi-Agent Convex Optimization

Alessio Maritan, Luca Schenato
2023

Abstract

In this letter we address the problem of convex optimization in a multi-agent setting, where the objective is to minimize the average of local cost functions whose derivatives are not available (e.g., black-box models). Moreover, agents can only communicate with local neighbors according to a connected network topology. Zeroth-order (ZO) optimization has recently gained increasing attention in federated learning and multi-agent scenarios; it exploits finite-difference approximations of the gradient that require from 2 (directional gradient) to 2d (central-difference full gradient) evaluations of the cost function, where d is the dimension of the problem. The contribution of this letter is to extend ZO distributed optimization by also estimating the curvature of the local cost functions via finite differences. In particular, we propose a novel algorithm, named ZO-JADE, which, by adding just one extra point, i.e., 2d + 1 evaluations in total, simultaneously estimates the gradient and the diagonal of the local Hessian; these are then combined via average tracking consensus to obtain an approximate Jacobi descent. Guarantees of semi-global exponential stability are established via separation of time scales. Extensive numerical experiments on real-world data confirm the efficiency and superiority of our algorithm over several other distributed zeroth-order methods in the literature that rely on gradient estimates alone.
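For intuition, the 2d + 1 evaluation scheme mentioned in the abstract can be sketched with standard central finite differences. The snippet below is only an illustrative reconstruction under that assumption, not the authors' implementation; the function name zo_gradient_and_diag_hessian and the smoothing step mu are hypothetical. For each coordinate i, the gradient entry is approximated by (f(x + mu*e_i) - f(x - mu*e_i)) / (2*mu) and the Hessian diagonal entry by (f(x + mu*e_i) - 2*f(x) + f(x - mu*e_i)) / mu^2, so the single extra evaluation f(x) is what enables curvature estimation on top of the gradient.

    import numpy as np

    def zo_gradient_and_diag_hessian(f, x, mu=1e-4):
        """Illustrative sketch (not the paper's exact code): central
        finite-difference estimates of the gradient and the diagonal of
        the Hessian of f at x, using 2d + 1 evaluations in total
        (the 2d central-difference points plus f(x) itself)."""
        d = x.size
        g = np.zeros(d)   # gradient estimate
        h = np.zeros(d)   # diagonal-Hessian estimate
        fx = f(x)         # the single extra evaluation
        for i in range(d):
            e = np.zeros(d)
            e[i] = mu
            f_plus, f_minus = f(x + e), f(x - e)
            g[i] = (f_plus - f_minus) / (2 * mu)        # central difference
            h[i] = (f_plus - 2 * fx + f_minus) / mu**2  # second difference
        return g, h

Each agent could then take a Jacobi-type step, dividing each gradient entry by the corresponding curvature estimate, with g and h replaced by network-wide averages maintained via tracking consensus; step sizes, consensus weights, and safeguards for non-positive curvature estimates are specified in the paper itself.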
Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3494612