
Somdatta Goswami, assistant professor in Johns Hopkins University’s Department of Civil and Systems Engineering, will present “Foundation models for physics: The neural operator paradigm.”
Abstract
Neural operators are emerging as powerful tools for learning mappings between infinite-dimensional function spaces, offering a paradigm shift in the modeling of complex physical systems. Unlike traditional machine learning models, neural operators are discretization-invariant and can generalize across domains with varying geometries and resolutions. Among these, the Deep Operator Network (DeepONet) has gained significant attention for its architectural flexibility and has established itself as a foundational architecture, capable of approximating nonlinear operators with theoretical guarantees and strong empirical performance. DeepONet employs a dual-network design, in which a branch network encodes input functions and a trunk network encodes spatial coordinates, enabling it to learn rich solution manifolds across diverse partial differential equations (PDEs). Complementing DeepONet, architectures such as the Fourier Neural Operator (FNO), the Wavelet Neural Operator, and the Laplace Neural Operator leverage integral kernel parameterizations, spectral convolutions, and multi-scale structures to further improve efficiency and generalizability. These models not only achieve orders-of-magnitude speedups over traditional solvers but also extrapolate well in space and time, making them particularly suitable for solving forward and inverse problems in computational physics, fluid dynamics, and materials science. We propose that such models, especially when trained on diverse families of PDEs and physical systems, can serve as foundation models for scientific computing: pre-trained, adaptable, and generalizable across tasks, boundary conditions, and discretizations. The composability, differentiability, and resolution-agnostic nature of neural operators position them at the frontier of next-generation scientific machine learning. This presentation will synthesize recent developments in operator learning, architecture design, and cross-domain applications, illustrating how neural operators can underpin foundation-scale models that accelerate discovery and decision-making in complex physical systems.
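To make the dual-network design concrete: DeepONet approximates an operator G by G(u)(y) ≈ Σ_k b_k(u) t_k(y), where the branch network maps an input function u, sampled at m fixed sensor locations, to coefficients b_k(u), and the trunk network maps a query coordinate y to basis values t_k(y). Below is a minimal illustrative sketch in PyTorch; the class name, layer widths, and sensor count are assumptions made for this example, not code from the talk.

import torch
import torch.nn as nn

def mlp(sizes):
    # Plain fully connected network with Tanh activations between layers.
    layers = []
    for i in range(len(sizes) - 1):
        layers.append(nn.Linear(sizes[i], sizes[i + 1]))
        if i < len(sizes) - 2:
            layers.append(nn.Tanh())
    return nn.Sequential(*layers)

class DeepONet(nn.Module):
    # Illustrative dual-network operator model (hypothetical sizes).
    def __init__(self, m_sensors=100, coord_dim=1, p=64):
        super().__init__()
        self.branch = mlp([m_sensors, 128, 128, p])  # encodes u(x_1..x_m)
        self.trunk = mlp([coord_dim, 128, 128, p])   # encodes query point y
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u_sensors, y):
        # u_sensors: (batch, m_sensors); y: (batch, coord_dim)
        b = self.branch(u_sensors)            # coefficients b_k(u)
        t = torch.tanh(self.trunk(y))         # basis values t_k(y)
        return (b * t).sum(dim=-1, keepdim=True) + self.bias

# Usage: predict G(u)(y) for a batch of input functions and query points.
model = DeepONet()
u = torch.randn(8, 100)   # 8 input functions sampled at 100 sensors
y = torch.rand(8, 1)      # one query coordinate per function
print(model(u, y).shape)  # torch.Size([8, 1])

Because the trunk network accepts arbitrary coordinates y, a trained model can be queried at resolutions never seen during training, which is the discretization-invariance property mentioned above.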
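Similarly, the spectral convolutions at the heart of the FNO can be sketched in a few lines: transform the input to Fourier space, apply learned complex weights to a truncated set of low-frequency modes, and transform back. The layer below is a minimal 1D illustration under the same PyTorch assumption; the channel count and mode truncation are arbitrary choices for the example.

import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    # Illustrative 1D Fourier layer: pointwise multiplication of the
    # lowest `modes` Fourier coefficients by learned complex weights.
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weights = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))

    def forward(self, x):
        # x: (batch, channels, n_grid) real-valued samples on a uniform grid
        x_ft = torch.fft.rfft(x)                  # to Fourier space
        out_ft = torch.zeros_like(x_ft)           # high modes stay zero
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weights)
        return torch.fft.irfft(out_ft, n=x.size(-1))  # back to physical space

layer = SpectralConv1d(channels=16, modes=12)
x = torch.randn(4, 16, 64)   # batch of functions on a 64-point grid
print(layer(x).shape)        # torch.Size([4, 16, 64])

Because the learned weights live on Fourier modes rather than grid points, the same layer can be applied to inputs of different resolutions, again reflecting the resolution-agnostic character the abstract emphasizes.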
Bio
Somdatta Goswami is an assistant professor in Johns Hopkins University’s Department of Civil and Systems Engineering and a member of the Institute for Data Intensive Engineering and Science and the Hopkins Extreme Materials Institute. Her research focuses on advancing scientific computing, computational mechanics, and machine learning, encompassing both fundamental and applied aspects. She earned her bachelor’s degree in civil engineering from the Birla Institute of Technology, Mesra, India, and her master’s degree in structural engineering from the Indian Institute of Engineering Science and Technology, India. Funded by the Deutscher Akademischer Austauschdienst (DAAD), Goswami earned her PhD in civil engineering and structural mechanics from Bauhaus-Universität Weimar, Germany.