Wed 15 Jun 2022 16:30 - 16:50 at Macaw - Neural Networks and Numbers Chair(s): Madan Musuvathi

In this paper, we give a simple and efficient implementation of reverse-mode automatic differentiation that both extends easily to higher-order functions and has run time and memory consumption linear in the run time of the original program. In addition to a formal description of the translation, we describe an implementation of the algorithm and prove its correctness by means of a logical-relations argument.
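
The talk's actual translation and its linear-time argument are in the paper itself; as a rough illustration of the general idea of reverse-mode automatic differentiation only, the following self-contained Haskell sketch pairs each value with a "backpropagator" that pushes the output sensitivity back to tagged input variables. All names here (D, var, grad2, and so on) are hypothetical, and this naive closure-based formulation does not achieve the asymptotic efficiency the paper establishes.

-- Minimal reverse-mode AD sketch (illustrative only, not the paper's method).
-- Each value carries its primal and a backpropagator: given the sensitivity
-- of the final result with respect to this value, it returns gradient
-- contributions for the tagged input variables.
data D = D { primal :: Double, backprop :: Double -> [(Int, Double)] }

-- An input variable, tagged with an index so its gradient can be collected.
var :: Int -> Double -> D
var i x = D x (\s -> [(i, s)])

-- A constant contributes nothing to the gradient.
constD :: Double -> D
constD x = D x (\_ -> [])

addD :: D -> D -> D
addD (D x bx) (D y by) = D (x + y) (\s -> bx s ++ by s)

mulD :: D -> D -> D
mulD (D x bx) (D y by) = D (x * y) (\s -> bx (s * y) ++ by (s * x))

-- Gradient contributions of a scalar function of two variables,
-- seeded with output sensitivity 1. Contributions for the same
-- variable index must be summed by the caller.
grad2 :: (D -> D -> D) -> Double -> Double -> [(Int, Double)]
grad2 f x y = backprop (f (var 0 x) (var 1 y)) 1

-- Example: f (x, y) = x * y + x at (3, 4) yields contributions
-- [(0,4.0),(1,3.0),(0,1.0)], i.e. df/dx = 5 and df/dy = 3 after summing.
main :: IO ()
main = print (grad2 (\x y -> addD (mulD x y) x) 3 4)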

Wed 15 Jun

Displayed time zone: Pacific Time (US & Canada)

15:30 - 16:50
Neural Networks and Numbers (SIGPLAN Track) at Macaw
Chair(s): Madan Musuvathi (Microsoft Research)
15:30
20m
Talk
(OOPSLA 2021) FPL: fast Presburger arithmetic through transprecision
SIGPLAN Track
Arjun Pitchanathan (University of Edinburgh), Christian Ulmann (ETH Zurich), Michel Weber (ETH Zurich), Torsten Hoefler (ETH Zurich), Tobias Grosser (University of Edinburgh)
15:50
20m
Talk
(PLDI 2021) Provable Repair of Deep Neural Networks
SIGPLAN Track
Matthew Sotoudeh (University of California, Davis), Aditya V. Thakur (University of California, Davis)
16:10
20m
Talk
(POPL 2022) One Polynomial Approximation to Produce Correctly Rounded Results of an Elementary Function for Multiple Representations and Rounding Modes
SIGPLAN Track
Jay P. Lim (Yale University), Santosh Nagarakatte (Rutgers University)
16:30
20m
Talk
(POPL 2022) Provably Correct, Asymptotically Efficient, Higher-Order Reverse-Mode Automatic Differentiation
SIGPLAN Track
Faustyna Krawiec (University of Cambridge), Simon Peyton Jones (Microsoft Research), Neel Krishnaswami (University of Cambridge), Tom Ellis (Microsoft Research), Richard A. Eisenberg (Tweag), Andrew Fitzgibbon (Graphcore)