Graphs and Differential Equations in Machine Learning
Speaker: Yves van Gennip
Abstract: At first glance the discrete world of graphs, with its nodes and edges, and the continuum world of differential equations, in which quantities change over continuous time or space, may seem far apart. In this talk we will discover some connections between these worlds that are not only of mathematical interest but also give rise to useful methods in machine learning and image processing, such as data classification and image segmentation. This talk will give a (very incomplete) overview of an area of mathematical research that has been highly active and steadily growing over the past decade and a half.
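One concrete bridge between the two worlds (not taken from the talk itself, but a standard example in this area) is the graph Laplacian L = D - A, a discrete analogue of the continuum Laplacian, which turns the heat equation into an ODE system du/dt = -Lu on the nodes. The toy sketch below, with an assumed five-node path graph, diffuses two labelled values over the edges; the sign of the diffused signal then classifies the unlabelled nodes, a minimal instance of the kind of classification method the abstract mentions.

```python
# Sketch: the graph Laplacian L = D - A is a discrete analogue of the
# continuum Laplacian, so du/dt = -L u is a heat equation on a graph.
import numpy as np

# Assumed example: a path graph 0-1-2-3-4 with adjacency matrix A.
A = np.array([
    [0, 1, 0, 0, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)
D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # combinatorial graph Laplacian

# Initial signal: node 0 labelled +1, node 4 labelled -1, rest unknown (0).
u = np.array([1.0, 0.0, 0.0, 0.0, -1.0])

# Explicit Euler steps of du/dt = -L u (dt small enough for stability).
dt = 0.1
for _ in range(50):
    u = u - dt * (L @ u)

# The sign of the diffused signal labels each node by proximity to the seeds.
print(np.sign(u))
```

After diffusion, nodes near the +1 seed end up positive and nodes near the -1 seed end up negative (the middle node stays at 0 by symmetry), illustrating how a continuum-style evolution equation on a graph yields a data classification rule.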
Inferring Time-Varying Signals over Graphs via SPDEs
Speaker: Mohammad Sabbaqi
Abstract: Inference of time-varying data over graphs is important in real-world applications such as urban water networks, economics, and brain recordings. It typically relies on identifying a computationally affordable joint spatiotemporal method that can leverage the patterns in the data. While this per se is a challenging task, it becomes even more so when the network comes with uncertainties, which, if not accounted for, can lead to unpredictable consequences. To target this setting, we model graph uncertainties as Gaussian noise on the edges and design a stochastic partial differential equation (SPDE) based on it. We use this SPDE as a state equation to model the time-varying signal evolution and extend it further to a state-space model in which the observations are graph-filtered versions of the state. This allows us to have a joint spatiotemporal expressive kernel that can be estimated online via Kalman filtering and whose parameters can also be estimated online via maximum likelihood principles, ultimately reducing the computational cost.
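The state-space construction in this abstract can be sketched in code. The following is a minimal illustrative version, not the speaker's actual model: the state equation is an assumed Euler-discretized graph-diffusion SPDE x_{t+1} = (I - dt L) x_t + w_t with Gaussian process noise standing in for the edge uncertainty, the observations are filtered through an assumed simple graph filter H = I - 0.5 L, and the standard Kalman filter recursion tracks the time-varying signal online.

```python
# Illustrative sketch (assumed model, not the talk's exact SPDE or kernel):
#   state:        x_{t+1} = (I - dt*L) x_t + w_t,   w_t ~ N(0, Q)
#   observation:  y_t     = H x_t + v_t,            v_t ~ N(0, R)
# with H a graph filter, estimated online by a standard Kalman filter.
import numpy as np

rng = np.random.default_rng(0)

# Assumed example: a cycle graph on 4 nodes.
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian
n = A.shape[0]

dt = 0.1
F = np.eye(n) - dt * L                  # discretized diffusion dynamics
H = np.eye(n) - 0.5 * L                 # graph filter acting on the state
Q = 0.01 * np.eye(n)                    # process noise (edge-uncertainty proxy)
R = 0.05 * np.eye(n)                    # measurement noise

# Simulate a ground-truth trajectory and noisy graph-filtered observations.
x_true = rng.standard_normal(n)
xs, ys = [], []
for _ in range(100):
    x_true = F @ x_true + rng.multivariate_normal(np.zeros(n), Q)
    ys.append(H @ x_true + rng.multivariate_normal(np.zeros(n), R))
    xs.append(x_true.copy())

# Kalman filter: predict with the SPDE dynamics, correct with each observation.
x_hat = np.zeros(n)
P = np.eye(n)
for y in ys:
    # Predict step.
    x_hat = F @ x_hat
    P = F @ P @ F.T + Q
    # Update step.
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_hat = x_hat + K @ (y - H @ x_hat)
    P = (np.eye(n) - K @ H) @ P

err = np.linalg.norm(x_hat - xs[-1])
print(f"final estimation error: {err:.3f}")
```

The per-step cost is a handful of n-by-n matrix operations, which hints at why an online Kalman recursion (optionally with online maximum-likelihood updates of the noise parameters, omitted here) is computationally attractive compared with batch spatiotemporal kernel estimation.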