(OOPSLA 2020) Just-in-Time Learning for Bottom-up Enumerative Synthesis
A key challenge in program synthesis is the astronomical size of the search space the synthesizer has to explore. In response to this challenge, recent work proposed to guide synthesis using learned probabilistic models. Obtaining such a model, however, might be infeasible for a problem domain where no high-quality training data is available. In this work we introduce an alternative approach to guided program synthesis: instead of training a model *ahead of time*, we show how to bootstrap one *just in time*, during synthesis, by learning from partial solutions encountered along the way. To make the best use of the model, we also propose a new program enumeration algorithm we dub *guided bottom-up search*, which extends the efficient bottom-up search with guidance from probabilistic models.
We implement this approach in a tool called Probe, which targets problems in the popular syntax-guided synthesis (SyGuS) format. We evaluate Probe on benchmarks from the literature and show that it achieves significant performance gains both over unguided bottom-up search and over a state-of-the-art probability-guided synthesizer, which had been trained on a corpus of existing solutions. Moreover, we show that these performance gains do not come at the cost of solution quality: programs generated by Probe are only slightly more verbose than the shortest solutions and perform no unnecessary case-splitting.
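To make the idea of guided bottom-up search concrete, here is a minimal sketch (not Probe's actual implementation) of cost-guided bottom-up enumeration: each grammar production is assigned a cost that can be thought of as a rounded negative log-probability, programs are enumerated in order of total cost by combining cheaper programs from a bank, and programs that are observationally equivalent on the examples are pruned. The tiny grammar, the example costs, and the target specification below are all illustrative assumptions.

```python
# Hypothetical mini-instance of cost-guided bottom-up enumerative
# synthesis (illustrative; not Probe's actual code). Programs are
# enumerated in order of total cost, where a production's cost stands
# in for its rounded negative log-probability, and observationally
# equivalent programs (same outputs on all examples) are pruned.

EXAMPLES = [1, 2, 3]          # inputs x
TARGET = [3, 5, 7]            # desired outputs, i.e. 2*x + 1

# Assumed productions: leaves and binary operators with made-up costs.
LEAF_COST = {"x": 1, "1": 1, "2": 2}
OP_COST = {"+": 1, "*": 2}

def eval_prog(prog, x):
    """Evaluate a program (a leaf string or an (op, left, right) tuple)."""
    if prog == "x":
        return x
    if prog in ("1", "2"):
        return int(prog)
    op, left, right = prog
    a, b = eval_prog(left, x), eval_prog(right, x)
    return a + b if op == "+" else a * b

def synthesize(max_cost=10):
    bank = {}    # cost -> programs of exactly that cost
    seen = set() # output vectors already produced (equivalence pruning)
    for cost in range(1, max_cost + 1):
        level = []
        # Leaves whose cost equals the current level.
        for leaf, c in LEAF_COST.items():
            if c == cost:
                level.append(leaf)
        # Combine strictly cheaper programs with each operator.
        for op, oc in OP_COST.items():
            for lc in range(1, cost - oc):
                rc = cost - oc - lc
                for left in bank.get(lc, []):
                    for right in bank.get(rc, []):
                        level.append((op, left, right))
        kept = []
        for prog in level:
            outs = tuple(eval_prog(prog, x) for x in EXAMPLES)
            if outs in seen:
                continue  # equivalent to a cheaper program; prune
            seen.add(outs)
            if list(outs) == TARGET:
                return prog
            kept.append(prog)
        bank[cost] = kept
    return None

print(synthesize())  # finds a low-cost program equivalent to 2*x + 1
```

In Probe the interesting part is that the costs are not fixed up front: the probabilistic model (and hence the cost function) is updated just in time, from partial solutions discovered during the search, which reorders subsequent enumeration toward more promising productions.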
Thu 16 Jun (displayed time zone: Pacific Time, US & Canada)
10:40 - 12:00
10:40 | 20m Talk | (PLDI 2021) DreamCoder: Bootstrapping inductive program synthesis with wake-sleep library learning | SIGPLAN Track | Kevin Ellis (Cornell University), Lionel Wong (Massachusetts Institute of Technology), Maxwell Nye (Massachusetts Institute of Technology), Mathias Sablé-Meyer (PSL University; Collège de France; NeuroSpin), Lucas Morales (Massachusetts Institute of Technology), Luke Hewitt (Massachusetts Institute of Technology), Luc Cary (Massachusetts Institute of Technology), Armando Solar-Lezama (Massachusetts Institute of Technology), Joshua B. Tenenbaum (MIT)
11:00 | 20m Talk | (POPL 2021) egg: Fast and Extensible Equality Saturation | SIGPLAN Track | Max Willsey (University of Washington), Chandrakana Nandi (Certora, Inc.), Yisu Remy Wang (University of Washington), Oliver Flatt (University of Utah), Zachary Tatlock (University of Washington), Pavel Panchekha (University of Utah)
11:20 | 20m Talk | (POPL 2022) Relational E-Matching | SIGPLAN Track | Yihong Zhang (University of Washington), Yisu Remy Wang (University of Washington), Max Willsey (University of Washington), Zachary Tatlock (University of Washington)
11:40 | 20m Talk | (OOPSLA 2020) Just-in-Time Learning for Bottom-up Enumerative Synthesis | SIGPLAN Track | Shraddha Barke (University of California at San Diego), Hila Peleg (Technion), Nadia Polikarpova (University of California at San Diego)