Calendar

Sep
3
Thu
AMS Weekly Zoom Seminar- Meet & Greet
Sep 3 @ 1:30 pm – 2:30 pm

It is a pleasure for me to kick off this semester’s AMS seminar series this coming Thursday, Sept 3 at 1:30 pm. This first seminar will be a meet-and-greet, with a few announcements about how the seminar will run this semester in a fully online format. We still intend to have speakers virtually visit Hopkins and meet and talk with people; it will just happen in a different format. More details on Thursday!

The following is the passcode-protected link for accessing the Zoom meeting. This is a recurring meeting, so the same link should be used every Thursday this fall. For students, this information is also available on the Blackboard page for the department seminar, EN.553.801.01.F20.

https://wse.zoom.us/j/98200438645?pwd=d3M3WEljc0sxd3BRQldUU3dudzhvdz09

 

In case it does not work, please use the following information:

Meeting ID: 982 0043 8645

Passcode: 374212

To avoid Zoom-bombing, please do not share the link above with anyone else.

 

Important: We still have not been able to fill our Sept 10 slot for the seminar. So if one of you can save the day and give a cool scientific talk to kick us off next week, that would be awesome. Please email me if you are interested and available.

See you all this Thursday at 1:30pm!

 

Sep
10
Thu
AMS Weekly Seminar w/ Zachary Lubberts (AMS) on Zoom
Sep 10 @ 1:30 pm – 2:30 pm

Title: Numerical tolerance for spectral decompositions of random matrices

 

Abstract: The computation of parametric estimates often involves iterative numerical approximations, which introduce numerical error. But when these estimates depend on random observations, they necessarily involve statistical error as well. The common approach of minimizing numerical error without accounting for the inherent statistical error can therefore be both costly and wasteful, since the extra computation yields no improvement in the estimator’s accuracy. We quantify this tradeoff between numerical and statistical error in the problem of estimating the eigendecomposition of the mean of a random matrix from its observed value, and show that one can save significant computation by terminating the iterative procedure early, with no loss of accuracy. We demonstrate this in the setting of estimating the latent positions of a random network from its observed adjacency matrix, on both real and simulated data.
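As a rough illustration of the numerical-versus-statistical tradeoff (not the speaker's algorithm), here is a minimal Python sketch: power iteration for the leading eigenvector of a spiked random matrix, stopped either at a tight numerical tolerance or at a tolerance on the order of the statistical error. The spiked model, the crude 1/lambda error scale, and all parameter choices are assumptions made only for this demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 500, 8.0

# Spiked model: the "population" matrix lam * u u^T is observed with symmetric noise.
u = rng.normal(size=n); u /= np.linalg.norm(u)
G = rng.normal(size=(n, n))
A = lam * np.outer(u, u) + (G + G.T) / np.sqrt(2 * n)   # observed matrix

def power_iter(A, tol, max_iter=10_000):
    """Power iteration for the leading eigenvector, stopped once successive
    iterates change (up to sign) by less than tol."""
    v = rng.normal(size=A.shape[0]); v /= np.linalg.norm(v)
    for k in range(1, max_iter + 1):
        w = A @ v
        w /= np.linalg.norm(w)
        if np.linalg.norm(w - np.sign(w @ v) * v) < tol:
            return w, k
        v = w
    return v, max_iter

def error_vs_truth(v):
    # estimation error up to the eigenvector's sign ambiguity
    return min(np.linalg.norm(v - u), np.linalg.norm(v + u))

# (i) tight numerical tolerance vs (ii) a tolerance on the order of the
# statistical error (crudely taken as 1/lam here): same accuracy, fewer steps.
for label, tol in [("tight tolerance", 1e-12), ("statistical-scale tolerance", 1.0 / lam)]:
    v, k = power_iter(A, tol)
    print(f"{label:>28}: {k:4d} iterations, error vs truth = {error_vs_truth(v):.3f}")
```

Both stopping rules should land at essentially the same estimation error, with the looser, statistically calibrated tolerance using far fewer iterations.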

 


Meeting recording:
https://wse.zoom.us/rec/share/4AKeQRT7O46d3cCsr-82-YqVzqfi58sHJ42n-zFBIQscU7jFBSIzNelTMzVA7GXP.IR-GocHrS2lpCmpH

Passcode: L+58iB^b

Sep
17
Thu
AMS Seminar w/ Fabio Mercurio (Bloomberg) on Zoom
Sep 17 @ 1:30 pm – 2:30 pm

Title: Looking Forward to Backward-Looking Rates: A Modeling Framework for Term Rates Replacing LIBOR

 

Abstract: LIBOR and other similar IBOR rates represent the cost of short-term funding among large global banks, and are the reference rates in millions of financial contracts with a total market exposure worldwide of 400 trillion dollars. Lack of liquidity in the unsecured short-term lending market, as well as evidence of LIBOR manipulation during the 2007-09 credit crisis, led regulators to identify new rate benchmarks. In this talk, we introduce and model the new interest-rate benchmarks and their compounded setting-in-arrears term rates, which will be replacing IBORs globally. We show that the classic interest-rate modeling framework can be naturally extended to describe the evolution of both the forward-looking (IBOR-like) and backward-looking (setting-in-arrears) term rates using the same stochastic process. We then introduce an extension of the LIBOR Market Model to backward-looking rates. Applications will be presented and numerical examples showcased.
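For concreteness, here is a minimal sketch of the compounded setting-in-arrears convention mentioned in the abstract, computed from a hypothetical path of daily overnight (SOFR-like) rates; the rate path, the ACT/360 day count, and the 90-day accrual period are illustrative assumptions, not anything from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical path of daily overnight (SOFR-like) rates, annualized, ACT/360,
# over a 90-day accrual period; weekends and holidays are ignored for simplicity.
days = 90
daily_rates = 0.02 + 0.0005 * rng.standard_normal(days)     # roughly 2%

# Backward-looking compounded setting-in-arrears term rate:
#   R = ( prod_i (1 + r_i * d_i / 360) - 1 ) * 360 / D,
# with d_i = 1 day here and D the number of days in the period.
growth = np.prod(1.0 + daily_rates / 360.0)
compounded_rate = (growth - 1.0) * 360.0 / days
print(f"compounded in-arrears term rate over {days} days: {compounded_rate:.4%}")

# A forward-looking (IBOR-like) term rate is fixed at the START of the period;
# the backward-looking rate above is only fully known at the END.
```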

 


Meeting recording:
https://wse.zoom.us/rec/share/CBAf80Hb_1ZlYLpz8DoKhdOwx7k9F1zOsmr4EUdXV9LTgmF5TNou-ugp9RkERWlP.bTMc0SwGWnbz4dqY

Passcode: uL5&[email protected]!1

Sep
24
Thu
AMS Seminar w/ Dustin Mixon (Ohio State University) on Zoom
Sep 24 @ 1:30 pm – 2:30 pm

Title: Ingredients matter: Quick and easy recipes for estimating clusters, manifolds, and epidemics

Abstract: Data science resembles the culinary arts in the sense that better ingredients allow for better results. We consider three instances of this phenomenon. First, we estimate clusters in graphs, and we find that more signal allows for faster estimation. Here, “signal” refers to having more edges within planted communities than across communities. Next, in the context of manifolds, we find that an informative prior allows for estimates of lower error. In particular, we apply the prior that the unknown manifold enjoys a large, unknown symmetry group. Finally, we consider the problem of estimating parameters in epidemiological models, where we find that a certain diversity of data allows one to design estimation algorithms with provable guarantees. In this case, data diversity refers to certain combinatorial features of the social network. Joint work with Jameson Cahill, Charles Clum, Hans Parshall, and Kaiying Xie.
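As a toy illustration of the "more signal" setting in the first part of the abstract (not the estimators from the talk), here is a minimal sketch: a two-block planted-partition graph, clustered by the sign of the second eigenvector of its adjacency matrix. The block sizes and edge probabilities are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400                                   # nodes, two planted communities of n/2
labels = np.repeat([0, 1], n // 2)
p_in, p_out = 0.10, 0.02                  # the "signal": p_in > p_out

# Sample a symmetric adjacency matrix from the planted-partition model.
probs = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = np.triu(rng.random((n, n)) < probs, 1).astype(float)
A = A + A.T

# Cluster by the sign of the eigenvector of the second-largest eigenvalue.
vals, vecs = np.linalg.eigh(A)
pred = (vecs[:, -2] > 0).astype(int)

# Accuracy up to a global label swap.
acc = max(np.mean(pred == labels), np.mean(pred != labels))
print(f"recovered {acc:.1%} of the planted labels")
```

Widening the gap between p_in and p_out (the "signal") makes this toy spectral recovery markedly more accurate.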

 


Meeting recording:
https://wse.zoom.us/rec/share/fChPLSraWeF5AhXKbY0jkOOfv0zAhnX4d6qWeWVa9_Goyup0aLcKi0VETt7T2Wan.xDWyUYFDujlhPvqt

Passcode: 79W*[email protected]

Oct
1
Thu
AMS Seminar w/ Aude Genevay (Massachusetts Institute of Technology) on Zoom
Oct 1 @ 1:30 pm – 2:30 pm

Title: Learning with entropy-regularized optimal transport

Abstract: Entropy-regularized optimal transport (EOT) was first introduced by Cuturi in 2013 as a solution to the computational burden of optimal transport (OT) for machine learning problems. In this talk, after studying the properties of EOT, we will introduce a new family of losses between probability measures called Sinkhorn Divergences. Based on EOT, this family of losses interpolates between OT (no regularization) and maximum mean discrepancy (MMD; infinite regularization). We will illustrate these theoretical claims on a set of learning problems formulated as minimizations over the space of measures.
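For concreteness, here is a minimal sketch of the entropy-regularized OT problem behind these losses, solved with plain Sinkhorn iterations between two small discrete measures; the cost, regularization strength eps, and iteration count are arbitrary choices for illustration, not the talk's setup.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, eps = 50, 60, 0.05                  # support sizes, entropic regularization

# Two discrete measures on the line with uniform weights.
x = rng.normal(0.0, 1.0, size=n); a = np.full(n, 1.0 / n)
y = rng.normal(0.5, 1.2, size=m); b = np.full(m, 1.0 / m)

C = (x[:, None] - y[None, :]) ** 2        # squared-distance cost matrix
C = C / C.max()                           # normalize so eps is on a sensible scale
K = np.exp(-C / eps)                      # Gibbs kernel

# Sinkhorn iterations: alternately match the row and column marginals.
u, v = np.ones(n), np.ones(m)
for _ in range(1000):
    u = a / (K @ v)
    v = b / (K.T @ u)

P = u[:, None] * K * v[None, :]           # entropy-regularized transport plan
print(f"max marginal violation: {np.abs(P.sum(axis=1) - a).max():.2e}")
print(f"regularized OT cost <P, C>: {np.sum(P * C):.4f}")
```

The Sinkhorn divergence discussed in the talk additionally subtracts the self-transport terms, roughly OT_eps(a, b) - 0.5 OT_eps(a, a) - 0.5 OT_eps(b, b), to remove the entropic bias.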

 


Meeting recording:
https://wse.zoom.us/rec/share/cuYXVU99jAdaLuq4FfIew8x7dxjZ40hORkqQyQpfPCAB_B69q1XeDJmLFw5yuZrb.QIj2wn6azpc4V96E

Passcode: *$xMJcX6

Oct
8
Thu
AMS Seminar w/ Kevin Pratt (Carnegie Mellon University) on Zoom
Oct 8 @ 1:30 pm – 2:30 pm

Title: Subgraph isomorphism via partial differentiation

Abstract: In this talk I will discuss a recent approach to the algorithmic problem of subgraph isomorphism: given a host graph G and a target graph H, decide whether G contains a subgraph isomorphic to H. For simplicity, I will illustrate the approach in the case when H is a path. I will describe an algorithm whose runtime comes close to the state of the art, using a new approach based on identifying polynomials with prescribed combinatorial supports (i.e., monomials appearing with nonzero coefficients) whose partial derivatives (of all orders) span a vector space of small dimension. Connections to previous approaches and avenues for further improvement will also be discussed.

Part of this talk is based on joint work with Cornelius Brand.
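To make the polynomial point of view concrete, here is a minimal brute-force Python sketch of the standard characterization this line of work builds on: the walk-generating polynomial of a graph has a multilinear monomial exactly when the graph contains a simple path on k vertices. The tiny example graph is made up, and the explicit expansion below is exponential; the talk's algorithm is precisely about avoiding such an expansion.

```python
import sympy as sp

# Tiny host graph on 5 vertices; does it contain a simple path on k vertices?
adj = [[0, 1, 1, 0, 0],
       [1, 0, 1, 0, 0],
       [1, 1, 0, 1, 0],
       [0, 0, 1, 0, 1],
       [0, 0, 0, 1, 0]]
n, k = 5, 4
x = sp.symbols(f"x0:{n}")

# Walk-generating polynomial: with M[i, j] = A[i, j] * x_j, the entry
# (M^(k-1))[i, j] enumerates k-vertex walks from i to j weighted by
# x_{v2}...x_{vk}; multiplying by x_i tags the start vertex.  The polynomial
# has a multilinear monomial (no variable squared) iff a simple k-vertex path exists.
M = sp.Matrix(n, n, lambda i, j: adj[i][j] * x[j])
Mpow = M ** (k - 1)
walk_poly = sp.expand(sum(x[i] * Mpow[i, j] for i in range(n) for j in range(n)))

has_path = any(all(e <= 1 for e in mono)
               for mono in sp.Poly(walk_poly, *x).monoms())
print(f"simple path on {k} vertices exists: {has_path}")
```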

 

Here is the link and the meeting info:

https://wse.zoom.us/j/98200438645?pwd=d3M3WEljc0sxd3BRQldUU3dudzhvdz09

Meeting ID: 982 0043 8645

Passcode: 374212

 

Oct
15
Thu
AMS Seminar w/ David Gu (Stony Brook University) on Zoom
Oct 15 @ 1:30 pm – 2:30 pm

Title: A Geometric Understanding of Deep Learning

Abstract: This work introduces an optimal transportation (OT) view of generative adversarial networks (GANs). Natural datasets have intrinsic patterns, which can be summarized as the manifold distribution principle: the distribution of a class of data is close to a low-dimensional manifold. GANs mainly accomplish two tasks: manifold learning and probability distribution transformation. The latter can be carried out using the classical OT method. From the OT perspective, the generator computes the OT map, while the discriminator computes the Wasserstein distance between the generated data distribution and the real data distribution; both can be reduced to a convex geometric optimization process. Furthermore, OT theory reveals the intrinsic collaborative, rather than competitive, relation between the generator and the discriminator, as well as the fundamental reason for mode collapse. We also propose a novel generative model, which uses an autoencoder (AE) for manifold learning and an OT map for probability distribution transformation. This AE–OT model improves theoretical rigor and transparency, as well as computational stability and efficiency; in particular, it eliminates mode collapse. The experimental results validate our hypothesis and demonstrate the advantages of our proposed model.
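As a small, self-contained illustration of the discrete OT computation underlying this view (not the AE-OT model itself, which solves a semi-discrete convex problem), here is a sketch that computes the optimal transport plan between a tiny "generated" and "real" point cloud by linear programming with SciPy; the point clouds, cost, and sample sizes are made-up assumptions.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
n = 8                                        # tiny point clouds in 2-D

src = rng.normal(size=(n, 2))                # "generated" samples
tgt = rng.normal(loc=2.0, size=(n, 2))       # "real" samples
a = np.full(n, 1.0 / n); b = np.full(n, 1.0 / n)

# Cost matrix (squared Euclidean distances) and the OT linear program:
#   minimize <P, C>   subject to   P 1 = a,   P^T 1 = b,   P >= 0.
C = ((src[:, None, :] - tgt[None, :, :]) ** 2).sum(-1)

A_eq = np.zeros((2 * n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1.0         # row-marginal constraints
for j in range(n):
    A_eq[n + j, j::n] = 1.0                  # column-marginal constraints

res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]), bounds=(0, None))
P = res.x.reshape(n, n)                      # optimal transport plan
print(f"transport cost (squared W2): {res.fun:.4f}")
print("source point i is sent to target point:", P.argmax(axis=1))
```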

 

Meeting recording:

https://wse.zoom.us/rec/share/NmcAgaDnXT0YgkEVAa5vX2TaEDXq28gpdwBxve9QXRXfoi9vlqG_9IyqV8d337Fq.4piGXPQfnZi1oDCI

Access Passcode: Fc1=nKmE

Oct
29
Thu
AMS Seminar w/ Vince Lyzinski (University of Maryland, College Park) on Zoom
Oct 29 @ 1:30 pm – 2:30 pm

Title: The Importance of Being Correlated: Implications of Dependence in Joint Spectral Inference across Multiple Networks

Abstract: Spectral inference on multiple networks is a rapidly-developing subfield of graph statistics. Recent work has demonstrated that joint, or simultaneous, spectral embedding of multiple independent network realizations can deliver more accurate estimation than individual spectral decompositions of those same networks. Little attention has been paid, however, to the network correlation that such joint embedding procedures necessarily induce. In this paper, we present a detailed analysis of induced correlation in a generalized omnibus embedding for multiple networks. We show that our embedding procedure is flexible and robust, and, moreover, we prove a central limit theorem for this embedding and explicitly compute the limiting covariance. We examine how this covariance can impact inference in a network time series, and we construct an appropriately calibrated omnibus embedding that can detect changes in real biological networks that previous embedding procedures could not discern. Our analysis confirms that the effect of induced correlation can be both subtle and transformative, with import in theory and practice.
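For readers unfamiliar with the construction, here is a minimal sketch of the classical two-graph omnibus embedding that the generalized version in this talk extends, in the simple setting of a one-dimensional random dot product graph; the latent-position model and the final correlation check are illustrative assumptions, meant only to show that joint embedding couples the two graphs' estimates.

```python
import numpy as np

rng = np.random.default_rng(5)
n, d = 200, 1                                    # nodes, embedding dimension

# Two independent realizations of the same one-dimensional RDPG.
X = rng.uniform(0.2, 0.8, size=(n, d))           # latent positions
P = X @ X.T

def sample(P):
    A = np.triu(rng.random(P.shape) < P, 1).astype(float)
    return A + A.T

A1, A2 = sample(P), sample(P)

# Classical two-graph omnibus matrix and its adjacency spectral embedding.
M = np.block([[A1, (A1 + A2) / 2],
              [(A1 + A2) / 2, A2]])
vals, vecs = np.linalg.eigh(M)
top = np.argsort(vals)[::-1][:d]
Z = vecs[:, top] * np.sqrt(vals[top])            # (2n, d) joint embedding
Z1, Z2 = np.abs(Z[:n]), np.abs(Z[n:])            # |.| resolves sign flip (d = 1, X > 0)

# Jointly embedding the two graphs correlates their estimation errors.
r = np.corrcoef((Z1 - X).ravel(), (Z2 - X).ravel())[0, 1]
print(f"correlation between the two graphs' embedding errors: {r:.2f}")
```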

 


Meeting recording:
https://wse.zoom.us/rec/share/1fETcswYJGsGY6HgXmvs4Xd1EaAI1ThzZceI3AhmxD6c1g-0dkyxc1QLJ5BJFUd4.bXWPzF3wH4zpoKm6

Passcode: ?a9s6%xR

Nov
5
Thu
AMS Seminar w/ Eric Vanden-Eijnden (New York University) on Zoom
Nov 5 @ 1:30 pm – 2:30 pm

Title: Trainability and accuracy of artificial neural networks

Abstract: The methods and models of machine learning (ML) are rapidly becoming de facto tools for the analysis and interpretation of large data sets. Complex classification tasks such as speech and image recognition, automatic translation, and decision making, which were out of reach a decade ago, are now routinely performed by computers with a high degree of reliability using (deep) neural networks. These performances suggest that DNNs may approximate high-dimensional functions with controllably small errors, potentially outperforming standard interpolation methods based, e.g., on Galerkin truncation or finite elements that have been the workhorses of scientific computing. In support of this prospect, in this talk I will present results about the trainability and accuracy of neural networks, obtained by mapping the parameters of the network to a system of interacting particles relaxing on a potential determined by the loss function. This mapping can be used to prove a dynamical variant of the universal approximation theorem, showing that the optimal neural network representation can be attained by (stochastic) gradient descent, with an approximation error scaling as the inverse of the network size. I will also show how these findings can be used to accelerate the training of networks and optimize their architecture, using, e.g., nonlocal transport involving birth/death processes in parameter space.
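As a toy illustration of the particle picture (not the speaker's formulation, and not a verification of the scaling result), here is a minimal sketch of a shallow network in the mean-field parametrization, trained by full-batch gradient descent; the target function, learning-rate scaling, and network sizes are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
xs = np.linspace(-1.0, 1.0, 200)
ys = np.sin(np.pi * xs)                          # target function to approximate

def train(n_units, steps=2000, base_lr=0.2):
    """Shallow net f(x) = (1/n) * sum_i c_i relu(a_i x + b_i), full-batch GD.
    Each triple (a_i, b_i, c_i) is one 'particle'; the loss plays the role
    of the potential on which the particles relax."""
    a, b, c = (rng.normal(size=n_units) for _ in range(3))
    lr = base_lr * n_units                       # step size scaled with n (mean-field time)
    K = len(xs)
    for _ in range(steps):
        pre = np.outer(xs, a) + b                # (K, n)
        act = np.maximum(pre, 0.0)
        err = act @ c / n_units - ys             # residuals, shape (K,)
        gc = act.T @ err / (K * n_units)         # gradients of 0.5 * mean(err^2)
        gpre = (pre > 0) * err[:, None] * c[None, :] / (K * n_units)
        a -= lr * (gpre.T @ xs); b -= lr * gpre.sum(axis=0); c -= lr * gc
    return 0.5 * np.mean(err ** 2)

for n_units in (8, 32, 128):
    print(f"n = {n_units:4d}   final training loss = {train(n_units):.5f}")
```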

 

Meeting recording link:
https://wse.zoom.us/rec/share/WO_nf9zgmnKfPniZsBSzECdAdNBp5wiyMP34tsMNAbb1jgVtgqQAV4YtrJjCGPY7.S1xykYLxepdibbZQ

Passcode: MH!7JDN2

Nov
12
Thu
AMS Seminar w/ Mete Soner (Princeton University) on Zoom
Nov 12 @ 1:30 pm – 2:30 pm

Title: Monte-Carlo methods for high-dimensional problems in quantitative finance

Abstract: Stochastic optimal control has been an effective tool for many problems in quantitative finance and financial economics. Although it provides much-needed quantitative modeling for such problems, until recently it has been intractable in high-dimensional settings. However, several recent studies report impressive numerical results: Cheridito et al. studied the optimal stopping problem (a problem closely connected to pricing American-type options in quantitative finance), providing tight error bounds and an efficient algorithm for problems in up to 100 dimensions. Buehler et al., on the other hand, consider the problem of hedging and again report results for high-dimensional problems that were previously intractable. These papers use a Monte Carlo type algorithm combined with deep neural networks, as proposed by E, Han, and Jentzen. In this talk I will outline this approach and discuss its properties. Numerical results, while validating the power of the method in high dimensions, also show the dependence on the dimension and the size of the training data. This is joint work with Max Reppen of Boston University.
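For context, here is a minimal regression Monte Carlo sketch of the optimal stopping problem the abstract refers to (an American-style put priced in the spirit of Longstaff and Schwartz); a low-degree polynomial regression stands in for the deep networks used in the papers discussed, and all market parameters are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Bermudan put under Black-Scholes dynamics, priced by regression Monte Carlo.
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_paths, n_steps = 100_000, 50
dt = T / n_steps
disc = np.exp(-r * dt)

# Simulate risk-neutral GBM paths (column t is the price at time (t+1)*dt).
Z = rng.standard_normal((n_paths, n_steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z, axis=1))
payoff = np.maximum(K - S, 0.0)

# Backward induction: regress the discounted continuation value on the stock
# price (in-the-money paths only) and exercise when immediate payoff beats it.
V = payoff[:, -1]
for t in range(n_steps - 2, -1, -1):
    V = V * disc
    itm = payoff[:, t] > 0
    if itm.sum() > 10:
        coef = np.polyfit(S[itm, t], V[itm], deg=3)
        cont = np.polyval(coef, S[itm, t])
        ex = payoff[itm, t] > cont
        idx = np.where(itm)[0][ex]
        V[idx] = payoff[idx, t]

print(f"Bermudan put price estimate: {disc * V.mean():.3f}")
```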

 

Here is the link and the meeting info:

https://wse.zoom.us/j/98200438645?pwd=d3M3WEljc0sxd3BRQldUU3dudzhvdz09

Meeting ID: 982 0043 8645

Passcode: 374212

Enjoy.
