We present two main contributions:
Through a thorough analysis of the training dynamics of Physics-Informed Neural Networks (PINNs) under natural gradient descent, we introduce the notion of the Natural Neural Tangent Kernel (NNTK). We leverage this kernel to define the empirical tangent space and the empirical natural gradient, yielding the AnaGRAM algorithm, which scales as O(min(PS², SP²)), where P is the number of parameters and S the number of samples. We also prove connections between the natural gradient of PINNs and Green's function theory.
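To make the complexity claim concrete, the sketch below poses a generic empirical natural-gradient step as a least-squares solve over the sampled residuals; the cost is dominated by an SVD of the S × P residual Jacobian, hence O(min(PS², SP²)). The function name, the learning-rate argument, and the NumPy-based implementation are illustrative assumptions, not the actual AnaGRAM code.

```python
import numpy as np

def empirical_natural_gradient_step(jacobian, residuals, lr=1.0):
    """One empirical natural-gradient step as a least-squares solve (illustrative sketch).

    jacobian : (S, P) array, Jacobian of the S sampled residuals w.r.t. the P parameters
    residuals: (S,) array, PDE and boundary residuals at the sample points

    The least-squares solve is dominated by an SVD of the S x P Jacobian,
    which gives the O(min(P S^2, S P^2)) scaling quoted in the text.
    """
    # Solve J @ delta = r in the least-squares sense (pseudo-inverse applied to residuals).
    delta, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
    # Gauss-Newton-style update direction for minimizing 0.5 * ||r(theta)||^2.
    return -lr * delta
```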
Building on a detailed empirical analysis of the training dynamics of AnaGRAM, we propose an adaptive cutoff-regularization scheme, AMStraMGRAM. This scheme improves the performance of AnaGRAM up to machine precision on simple problems.
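As an illustration of what cutoff regularization means here, the sketch below truncates small singular values of the residual Jacobian before inverting, rather than inverting the full spectrum. The specific adaptive rule of AMStraMGRAM is not reproduced; the fixed relative cutoff, the function name, and its parameters are assumptions made for illustration.

```python
import numpy as np

def cutoff_regularized_step(jacobian, residuals, cutoff=1e-6, lr=1.0):
    """Natural-gradient step with a singular-value cutoff (illustrative sketch).

    Singular values below `cutoff * s_max` are discarded instead of inverted,
    which regularizes the pseudo-inverse. An adaptive scheme would update
    `cutoff` during training; that rule is not shown here.
    """
    # Thin SVD of the residual Jacobian: J = U diag(s) V^T.
    U, s, Vt = np.linalg.svd(jacobian, full_matrices=False)
    # Keep only directions whose singular value exceeds the relative cutoff.
    keep = s > cutoff * s[0]
    inv_s = np.zeros_like(s)
    inv_s[keep] = 1.0 / s[keep]
    # Truncated pseudo-inverse applied to the residuals.
    delta = Vt.T @ (inv_s * (U.T @ residuals))
    return -lr * delta
```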