Thursday, 18 May 2023
17:00 - 18:00

Stochastic Mirror Descent for Convex Optimization with Consensus Constraints

Talk by Panos Parpas, Imperial College London

The mirror descent algorithm is known to be effective in applications where it is beneficial to adapt the mirror map to the underlying geometry of the optimization model.

However, the effect of mirror maps on the geometry of distributed optimization problems has not been previously addressed. 

In this talk we will study an exact distributed mirror descent algorithm in continuous time under additive noise and present the settings that enable linear convergence rates.
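For context, a minimal sketch of (discrete-time, single-agent) stochastic mirror descent with the negative-entropy mirror map on the probability simplex, which yields exponentiated-gradient updates. This is a standard textbook instance, not the continuous-time, consensus-constrained method presented in the talk; the function and parameter names are illustrative.

```python
import numpy as np

def stochastic_mirror_descent(grad, x0, steps=500, lr=0.1, noise=0.01, seed=0):
    """Stochastic mirror descent on the probability simplex with the
    negative-entropy mirror map. The gradient step is taken in the dual
    (log) space, and the mirror map brings the iterate back to the
    simplex; additive Gaussian noise models the stochastic gradient."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x) + noise * rng.standard_normal(x.shape)
        x = x * np.exp(-lr * g)   # dual-space gradient step
        x /= x.sum()              # map back onto the simplex
    return x

# Illustrative use: minimize f(x) = ||x - p||^2 over the simplex,
# where p is an interior point of the simplex (so the minimizer is p).
p = np.array([0.5, 0.3, 0.2])
x_star = stochastic_mirror_descent(lambda x: 2.0 * (x - p), np.ones(3) / 3)
```

The multiplicative update keeps iterates strictly inside the simplex without any projection step, which is the sense in which the mirror map adapts the method to the geometry of the feasible set.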

Our analysis draws motivation from the augmented Lagrangian and its relation to gradient tracking. 

To further explore the benefits of mirror maps in a distributed setting we present a preconditioned variant of our algorithm with an additional mirror map over the Lagrangian dual variables. This allows our method to adapt to the geometry of the consensus manifold and leads to faster convergence. 

We illustrate the performance of the algorithms in convex settings both with and without constraints.

This is joint work with Anastasia Borovykh, Nikolas Kantas, and Grigorios A. Pavliotis.


Find out more about Panos Parpas on his profile page at Imperial College London.

Sign up

Sign up below to receive a reminder 24 hours before the event.