Gradient descent with a general cost

Seminar
Speaker's institution (or team, for internal seminars)
MOKAPLAN (INRIA), CEREMADE (Dauphine)
Speaker name
Flavien Léger
Abstract

In this talk I will present an approach to iteratively minimizing a given objective function using minimizing movement schemes built on general cost functions. I will introduce an explicit method, gradient descent with a general cost (GDGC), as well as an implicit, proximal-like scheme and an explicit-implicit (forward-backward) method.
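
As a rough sketch of the type of scheme involved (the exact formulation is the speaker's; the following is one standard way to write minimizing movement schemes for a cost function $c(x, y)$, with notation chosen here for illustration), the implicit, proximal-like step and the explicit step take the form

$$x_{k+1} \in \operatorname*{arg\,min}_{x} \; f(x) + c(x, x_k) \quad \text{(implicit, proximal-like)},$$
$$x_{k+1} \in \operatorname*{arg\,min}_{x} \; \langle \nabla f(x_k), x \rangle + c(x, x_k) \quad \text{(explicit)},$$

and the forward-backward variant splits the objective as $f = g + h$, linearizing $g$ and keeping $h$ exact in the same minimization.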

GDGC unifies several standard gradient descent-type methods: gradient descent, mirror descent, Newton's method, and Riemannian gradient descent. I will explain how the notion of nonnegative cross-curvature, originally developed in the regularity theory of optimal transport, provides tractable conditions for proving convergence rates for GDGC.
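
For illustration, and as a hedged reading of the abstract rather than the speaker's exact statements, natural choices of the cost $c$ in the explicit scheme above recover each of the listed methods:

- $c(x, y) = \frac{1}{2\tau}\lVert x - y\rVert^2$ gives gradient descent, $x_{k+1} = x_k - \tau\,\nabla f(x_k)$;
- $c(x, y) = \frac{1}{\tau}\,D_\phi(x, y)$ for a Bregman divergence $D_\phi$ gives mirror descent, $\nabla\phi(x_{k+1}) = \nabla\phi(x_k) - \tau\,\nabla f(x_k)$;
- $c(x, y) = \frac{1}{2}\langle x - y,\, \nabla^2 f(y)\,(x - y)\rangle$ gives Newton's method, $x_{k+1} = x_k - [\nabla^2 f(x_k)]^{-1}\nabla f(x_k)$;
- taking $c$ to be a squared Riemannian distance, $c(x, y) = \frac{1}{2\tau}\,d_M(x, y)^2$, similarly leads to a Riemannian analogue of gradient descent.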

Direct byproducts of this framework include: (1) a new nonsmooth mirror descent, (2) global convergence rates for Newton's method, and (3) a clear picture of the type of convexity needed for convergent schemes in the Riemannian setting.

Location
Amphi C2.0.37