Kernelization of Natural Gradient Methods for Physics-Informed Machine Learning


Seminar
Speaker's organization (or team for internal seminars)
LISN
Speaker name
Nilo Schwencke
Abstract

We present two contributions:

  • Through a thorough analysis of the training dynamics of Physics-Informed Neural Networks (PINNs) under natural gradient descent, we introduce the notion of the Natural Neural Tangent Kernel (NNTK). We leverage this to define the empirical Tangent Space and empirical Natural Gradient, yielding the AnaGRAM algorithm, which scales as O(min(PS², SP²)), where P is the number of parameters and S the number of samples (a rough complexity sketch follows this list). We also prove connections between the natural gradient of PINNs and Green's function theory.

  • Building on a detailed empirical analysis of the training dynamics of the AnaGRAM algorithm, we propose an adaptive cutoff-regularization scheme, denoted AMStraMGRAM, which improves AnaGRAM's accuracy up to machine precision on simple problems (see the second sketch below).
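
To make the quoted complexity concrete, here is a minimal sketch of one way to read an empirical natural-gradient step: as a least-squares solve through the Jacobian of the PDE residuals, whose thin SVD costs O(min(PS², SP²)). The function name, shapes, and NumPy solver are illustrative assumptions, not the authors' AnaGRAM implementation.

```python
import numpy as np

def empirical_natural_gradient_step(J, r):
    """One empirical natural-gradient step, read as a least-squares solve.

    J : (S, P) Jacobian of the PDE residuals with respect to the P parameters,
        evaluated at S collocation samples (illustrative assumption).
    r : (S,) residual vector at those samples.

    Solving this least-squares problem through a thin SVD of J costs
    O(min(P*S**2, S*P**2)), matching the scaling quoted in the abstract.
    """
    step, *_ = np.linalg.lstsq(J, r, rcond=None)  # minimum-norm least-squares solution
    return step                                   # (P,) update direction for the parameters
```

A training iteration would then subtract a multiple of `step` from the parameters, i.e. a discretized natural gradient flow; this is only one plausible reading of the abstract, not a statement of how AnaGRAM is actually coded.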
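
In the same spirit, a hedged sketch of cutoff regularization on that SVD: singular values below a threshold are discarded before applying the pseudoinverse. The fixed `rel_cutoff` below is a hypothetical placeholder; per the abstract, AMStraMGRAM's point is to adapt this cutoff during training.

```python
import numpy as np

def cutoff_regularized_step(J, r, rel_cutoff=1e-8):
    """Pseudoinverse step where small singular values are discarded.

    Singular values below rel_cutoff * s_max are dropped before inversion,
    taming the ill-conditioned directions of the residual Jacobian J.
    The fixed relative threshold is a placeholder for an adaptive scheme.
    """
    U, s, Vt = np.linalg.svd(J, full_matrices=False)  # thin SVD of the (S, P) Jacobian
    keep = s > rel_cutoff * s[0]                      # s is sorted in decreasing order
    coeffs = (U[:, keep].T @ r) / s[keep]             # invert only the retained modes
    return Vt[keep].T @ coeffs                        # truncated pseudoinverse applied to r
```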

Location
Amphi A0.04
Date
Workshop end date