Data Science Seminar: Tyrus Berry (George Mason University) @ Hodson 203

November 15, 2017 @ 3:00 pm – 4:00 pm

What geometries can we learn from data?

In the field of manifold learning, the foundational theoretical results of Coifman and Lafon (Diffusion Maps, 2006) showed that for data sampled near an embedded manifold, certain graph Laplacian constructions are consistent estimators of the Laplace-Beltrami operator on the underlying manifold. Since these operators determine the Riemannian metric, they completely describe the geometry of the manifold (as inherited from the embedding). It was later shown that different kernel functions could be used to recover any desired geometry, at least in terms of pointwise estimation of the associated Laplace-Beltrami operator. In this talk I will first briefly review the above results and then introduce new results on the spectral convergence of these graph Laplacians. These results reveal that not all geometries are accessible in the stronger spectral sense. However, when the data set is sampled from a smooth density, there is a natural conformally invariant geometry which is accessible on all compact manifolds, and even on a large class of non-compact manifolds. Moreover, the kernel which estimates this geometry has a very natural construction which we call Continuous k-Nearest Neighbors (CkNN).
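The CkNN construction mentioned at the end of the abstract has a simple form in the published literature (Berry and Sauer): two points are connected when their distance is small relative to the geometric mean of their local k-nearest-neighbor scales. The sketch below is an illustrative implementation under that description, not the speaker's code; the names `cknn_graph`, `graph_laplacian`, and the parameters `k` and `delta` are choices made here for the example.

```python
import numpy as np

def cknn_graph(X, k=5, delta=1.0):
    """Continuous k-Nearest Neighbors adjacency for points X (n x d).

    Connect points i and j when
        d(x_i, x_j) < delta * sqrt(d_k(x_i) * d_k(x_j)),
    where d_k(x) is the distance from x to its k-th nearest neighbor.
    The local rescaling by d_k compensates for variations in the
    (unknown) sampling density.
    """
    # Pairwise Euclidean distances.
    diff = X[:, None, :] - X[None, :, :]
    D = np.sqrt((diff ** 2).sum(axis=-1))
    # d_k: distance to the k-th nearest neighbor (column 0 is the point itself).
    dk = np.sort(D, axis=1)[:, k]
    # Symmetric connectivity rule with local scaling.
    A = (D < delta * np.sqrt(np.outer(dk, dk))).astype(float)
    np.fill_diagonal(A, 0.0)
    return A

def graph_laplacian(A):
    """Unnormalized graph Laplacian L = D - A (rows sum to zero)."""
    return np.diag(A.sum(axis=1)) - A

# Example: points sampled from a circle, a simple compact 1-manifold.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
X = np.c_[np.cos(theta), np.sin(theta)]
L = graph_laplacian(cknn_graph(X, k=5, delta=1.2))
```

In spirit, the eigenvectors of such a graph Laplacian approximate eigenfunctions of a Laplace-Beltrami operator on the manifold; the abstract's point is that the CkNN scaling singles out a conformally invariant geometry for which this spectral approximation is well behaved.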
