When: September 14, 2023 at 1:30 p.m.
Where: Gilman 132
3400 North Charles St.
Baltimore, MD 21218

Title: Manifold Neural Networks for Large-Scale Geometric Information Processing

Abstract: We discuss the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from points sampled from the manifold, thus encoding geometric information. Manifold filters are defined in terms of the Laplace-Beltrami operator exponential and admit a spectral representation that generalizes the spectral representations of both graph filters and standard continuous-time convolutional filters. We analyze the stability of manifold filters and MNNs to smooth deformations of the manifold. This analysis generalizes the known stability properties of graph filters and GNNs, as well as those of standard convolutional filters and neural networks. We observe that manifold filters, like graph and continuous-time filters, have difficulty discriminating high-frequency components in the presence of deformations. Subsequently, we study the convergence of graph filters and GNNs to their manifold counterparts. Using appropriate kernels, we analyze both dense and moderately sparse graphs. We prove non-asymptotic error bounds showing that graph filters and GNNs on these graphs converge to manifold filters and MNNs. As a byproduct, we observe a trade-off between the discriminability of graph filters and their ability to approximate the desired behavior of manifold filters. We conclude with a discussion of how both the stability and convergence trade-offs of convolutions are ameliorated in neural networks by the frequency-mixing property of nonlinearities.
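
A minimal sketch of the filter construction described above, assuming the standard exponential form for manifold filters; the symbols below (the Laplace-Beltrami operator \mathcal{L}, the impulse response \tilde{h}, the manifold signal f, and the eigenpairs (\lambda_i, \phi_i)) are illustrative and not notation fixed by the talk:

    h(\mathcal{L}) f = \int_0^{\infty} \tilde{h}(t) \, e^{-t \mathcal{L}} f \, dt .

Expanding f in the eigenbasis \mathcal{L} \phi_i = \lambda_i \phi_i gives the spectral representation

    h(\mathcal{L}) f = \sum_{i=1}^{\infty} \hat{h}(\lambda_i) \, \langle f, \phi_i \rangle \, \phi_i ,
    \qquad
    \hat{h}(\lambda) = \int_0^{\infty} \tilde{h}(t) \, e^{-t \lambda} \, dt ,

so the filter acts on each frequency through the response \hat{h}, just as a graph filter acts on the eigenvalues of a graph Laplacian. Replacing \mathcal{L} with the Laplacian of a graph built on points sampled from the manifold yields a graph filter with the same response \hat{h}, which is presumably the sense in which the abstract's convergence bounds relate the two objects.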

Zoom link: https://wse.zoom.us/j/94601022340