Talk by Vaiva Vasiliauskaite, ETH Zurich
A framework for approximating complex network dynamics with graph neural networks and identifying the limits of a model's generalisation
Date: 11.04.24 Time: 12.15 - 13.45 Room: Y27H12
Abstract:
Differential equations are a ubiquitous tool for studying dynamics, from physical systems to complex systems in which a large number of agents interact through a graph. Data-driven approximations of differential equations offer a promising alternative to classical methods of discovering a dynamical system model, particularly in complex systems that lack explicit first principles. One popular machine learning tool for studying dynamics is the neural network, which can approximate either the solutions of known governing differential equations or the governing equations themselves. For the latter task in particular, deploying deep learning models in unfamiliar settings, such as predicting dynamics in unobserved regions of state space or on novel graphs, can lead to spurious predictions. Focusing on complex systems whose dynamics are described by a system of first-order differential equations coupled through a graph, we show that extending a model's generalisability beyond the limits of traditional statistical learning theory is feasible. Achieving this level of generalisation, however, requires that neural network models conform to fundamental assumptions about the dynamical model. Additionally, we propose a statistical significance test that assesses prediction quality during inference, making it possible to quantify a neural network's confidence in its predictions even without ground truth. This framework offers a robust method for accurate and reliable deep learning approximations of high-dimensional, nonlinear dynamical systems.
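As a concrete illustration of the class of systems the abstract refers to, the sketch below Euler-integrates graph-coupled dynamics of the form dx_i/dt = f(x_i) + Σ_j A_ij g(x_i, x_j). This is a minimal stand-in, not the speaker's code: in the framework the talk discusses, f and g would be neural networks fitted to trajectory data, whereas here they are closed-form functions (linear diffusion) chosen only to make the structure explicit. All function and variable names are illustrative assumptions.

```python
import numpy as np

def simulate(A, x0, f, g, dt=0.01, steps=5000):
    """Euler-integrate dx_i/dt = f(x_i) + sum_j A_ij * g(x_i, x_j).

    A  : (n, n) adjacency matrix of the interaction graph
    x0 : (n,) initial node states
    f  : self-dynamics of a node (here a closed-form stand-in;
         in the talk's setting this would be a learned network)
    g  : pairwise coupling function (likewise a stand-in)
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        self_term = f(x)
        # Evaluate g on all (x_i, x_j) pairs via broadcasting -> (n, n)
        pair = g(x[:, None], x[None, :])
        # Weight by the adjacency matrix and sum over neighbours j
        coupling = (A * pair).sum(axis=1)
        x = x + dt * (self_term + coupling)
    return x

# Example: linear diffusion on a 3-node path graph.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
x0 = np.array([1.0, 0.0, -1.0])
xT = simulate(A, x0,
              f=lambda x: np.zeros_like(x),       # no self-dynamics
              g=lambda xi, xj: xj - xi)           # diffusive coupling
# For symmetric A, diffusion conserves the total state and relaxes
# all nodes toward the common mean (0 for this initial condition).
```

Swapping the lambdas for trained networks changes nothing structurally, which is the point: the coupled-ODE form is the "fundamental assumption about the dynamical model" that the abstract says the neural architecture must respect.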