Learning Multiple Initial Conditions using Physics-Informed Neural Networks
Abstract
Physics-Informed Neural Networks (PINNs) and their variants have emerged as a tool for solving differential equations in the past few years. Although several variants of PINNs have been proposed, the majority of these physics-based approaches solve a problem for a single set of initial/boundary conditions. In this work, we consider one-dimensional time-dependent problems and focus on solving multiple initial value problems with a single network. We leverage Fourier features to mitigate the spectral bias that arises when the initial conditions contain different frequencies, allowing for faster convergence. We further extend this approach to the FastVPINNs framework to solve multiple initial value problems using Variational Physics-Informed Neural Networks (VPINNs). We also present results on activation functions and their influence on FastVPINNs for some standard problems. Lastly, we present an ablation study of how the network's parameters affect each initial value problem.
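To make the idea concrete, below is a minimal sketch (not the authors' implementation) of a network that takes both the space-time coordinates and a parameter describing the initial condition as input, with a random Fourier-feature embedding to mitigate spectral bias. PyTorch is assumed, and all names here (FourierFeatures, MultiICPINN, the frequency parameter omega, the scale sigma) are illustrative.

```python
import torch
import torch.nn as nn

class FourierFeatures(nn.Module):
    """Map inputs v to [cos(2*pi*Bv), sin(2*pi*Bv)] with a fixed Gaussian matrix B."""
    def __init__(self, in_dim: int, num_features: int, sigma: float = 10.0):
        super().__init__()
        # B is sampled once and kept fixed (not trained).
        self.register_buffer("B", torch.randn(num_features, in_dim) * sigma)

    def forward(self, v: torch.Tensor) -> torch.Tensor:
        proj = 2 * torch.pi * v @ self.B.T
        return torch.cat([torch.cos(proj), torch.sin(proj)], dim=-1)

class MultiICPINN(nn.Module):
    """A single PINN for (x, t, omega), where omega parameterizes the initial condition."""
    def __init__(self, num_features: int = 64, width: int = 128):
        super().__init__()
        self.embed = FourierFeatures(in_dim=3, num_features=num_features)
        self.net = nn.Sequential(
            nn.Linear(2 * num_features, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor, omega: torch.Tensor) -> torch.Tensor:
        # x, t, omega are 1-D tensors of equal length; the IC parameter omega
        # is fed alongside the coordinates so one network covers many ICs.
        return self.net(self.embed(torch.stack([x, t, omega], dim=-1)))
```

Training such a network would proceed as for a standard PINN, with the PDE residual and initial-condition losses sampled over a range of omega values rather than a single one.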
Training a PINN with multiple initial conditions also raises a question: how do we incorporate new initial conditions into an already trained model? A brute-force retraining on all of the old and new initial conditions is one approach, but it may not always be computationally feasible. A more practical alternative is to use a continual learning approach to train the network on a new initial condition. In our work, we use a regularization-based strategy, namely elastic weight consolidation (EWC), and show how a new initial condition can be incorporated into the network.
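The EWC penalty itself is a quadratic regularizer that anchors parameters deemed important for the previously learned initial conditions. The following is a minimal sketch under the assumption of a PyTorch model; fisher_diag (a diagonal Fisher-information estimate) and old_params would be computed from the previously trained network, and lam is a hypothetical regularization strength.

```python
import torch
import torch.nn as nn

def ewc_penalty(model: nn.Module,
                old_params: dict[str, torch.Tensor],
                fisher_diag: dict[str, torch.Tensor],
                lam: float = 1.0) -> torch.Tensor:
    """Return (lam/2) * sum_i F_i * (theta_i - theta_i_old)^2 over all parameters."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for name, p in model.named_parameters():
        penalty = penalty + (fisher_diag[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

# During fine-tuning on the new initial condition, the penalty is simply
# added to the usual PINN loss, e.g.:
#   loss = pde_residual_loss + ic_loss + ewc_penalty(model, old_params, fisher_diag)
```

The diagonal Fisher entries weight how strongly each parameter is pulled back toward its old value, so parameters that matter little for the old initial conditions remain free to adapt to the new one.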