BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Department of Applied Mathematics and Statistics - ECPv6.5.0//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Department of Applied Mathematics and Statistics
X-ORIGINAL-URL:https://engineering.jhu.edu/ams
X-WR-CALDESC:Events for Department of Applied Mathematics and Statistics
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20220313T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20221106T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20220915T133000
DTEND;TZID=America/New_York:20220915T143000
DTSTAMP:20240516T021824Z
CREATED:20220825T193151Z
LAST-MODIFIED:20220906T141430Z
UID:42410-1663248600-1663252200@engineering.jhu.edu
SUMMARY:AMS Weekly Seminar w/ Haihao Lu (University of Chicago) @ Shaffer 303 or Zoom
DESCRIPTION:Title: First Order Methods for Linear Programming: Theory\, Computation\, and Applications \nAbstract: Linear programming (LP) is a fundamental tool in operations research with wide applications in practice. The state-of-the-art LP solvers are essentially based on either the simplex method or the barrier method\, both of which are mature and reliable at delivering highly accurate solutions. However\, it is highly challenging to scale these two methods further. The computational bottleneck of both is the matrix factorization used to solve linear systems\, which requires significantly more memory and does not map well onto modern computing resources such as distributed computing and/or GPUs. In contrast\, first-order methods (FOMs) require only matrix-vector multiplications\, which work very well on these modern computing infrastructures and have massively accelerated machine-learning training over the last 15 years. In this talk\, I’ll present new FOMs for LP. On the computational side\, we have built a new LP solver based on the proposed FOMs\, and I’ll present a comprehensive numerical study of it. The solver has been open-sourced through Google OR-Tools. On the theory side\, I’ll present new techniques that improve the existing complexity bounds of FOMs for LP and show that the proposed algorithms achieve the optimal convergence rate in the class of FOMs. I’ll conclude the talk with open questions and new directions for this line of research. Part of this research was done at Google. \n \nBio: Haihao (Sean) Lu is an assistant professor of Operations Management at the University of Chicago Booth School of Business. His research interests are in extending the computational and mathematical boundaries of methods for solving the large-scale optimization problems that arise in data science\, machine learning\, and operations research.
 Before joining Booth\, he was a visiting researcher on the large-scale optimization team at Google Research\, where he primarily worked on designing and implementing a huge-scale linear programming solver. He obtained his Ph.D. in Operations Research and Mathematics from MIT in 2019. He is the winner of the INFORMS Optimization Society Young Researchers Prize (2021). \nJoin via Zoom link: \nhttps://wse.zoom.us/j/95738965246 \n
URL:https://engineering.jhu.edu/ams/event/ams-weekly-seminar-w-haihao-lu-university-of-chicago-shaffer-303-or-zoom/
END:VEVENT
END:VCALENDAR