BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Department of Applied Mathematics and Statistics - ECPv6.0.13.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Department of Applied Mathematics and Statistics
X-ORIGINAL-URL:https://engineering.jhu.edu/ams
X-WR-CALDESC:Events for Department of Applied Mathematics and Statistics
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20220313T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20221106T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20220303T133000
DTEND;TZID=America/New_York:20220303T143000
DTSTAMP:20230601T193901Z
CREATED:20220120T174001Z
LAST-MODIFIED:20220225T182736Z
UID:39304-1646314200-1646317800@engineering.jhu.edu
SUMMARY:AMS Weekly Seminar w/ Mike Dinitz (JHU) @ Krieger 205 or on Zoom
DESCRIPTION:Title: Faster Matchings via Learned Duals \nAbstract: A recent line of research investigates how algorithms can be augmented with machine-learned predictions to overcome worst-case lower bounds. This area has revealed interesting algorithmic insights into problems\, with particular success in the design of competitive online algorithms. However\, the question of improving algorithm running times with predictions has largely been unexplored. \nWe take a first step in this direction by combining the idea of machine-learned predictions with the idea of “warm-starting” primal-dual algorithms. We consider one of the most important primitives in combinatorial optimization: weighted bipartite matching and its generalization to b-matching. We identify three key challenges when using learned dual variables in a primal-dual algorithm. First\, predicted duals may be infeasible\, so we give an algorithm that efficiently maps predicted infeasible duals to nearby feasible solutions. Second\, once the duals are feasible\, they may not be optimal\, so we show that they can be used to quickly find an optimal solution. Finally\, such predictions are useful only if they can be learned\, so we show that the problem of learning duals for matching has low sample complexity. We validate our theoretical findings through experiments on both real and synthetic data. As a result\, we give a rigorous\, practical\, and empirically effective method to compute bipartite matchings. \nAppeared as an oral presentation at NeurIPS 2021. Joint work with Sungjin Im\, Thomas Lavastida\, Benjamin Moseley\, and Sergei Vassilvitskii. \nZoom link: https://wse.zoom.us/j/95448608570
URL:https://engineering.jhu.edu/ams/event/ams-weekly-seminar-w-mike-dinitz-jhu-on-zoom/
END:VEVENT
END:VCALENDAR