SPLASH 2025
Sun 12 - Sat 18 October 2025 Singapore
co-located with ICFP/SPLASH 2025
Sat 18 Oct 2025 10:45 - 11:00 at Orchid Plenary Ballroom - Synthesis 1 Chair(s): Yingfei Xiong

Large code models (LCMs), pre-trained on vast code corpora, have demonstrated remarkable performance across a wide array of code-related tasks. Supervised fine-tuning (SFT) plays a vital role in aligning these models with specific requirements and enhancing their performance in particular domains. However, synthesizing high-quality SFT datasets poses a significant challenge due to the uneven quality of existing datasets and the scarcity of domain-specific ones.

Inspired by the observation that APIs are high-level abstractions of code, encapsulating rich semantic information in a concise structure, we propose DataScope, an API-guided dataset synthesis framework designed to enhance the SFT process for LCMs in both general and domain-specific scenarios. DataScope comprises two main components: Dsel and Dgen. On the one hand, Dsel employs API coverage as a core metric, enabling efficient dataset synthesis in general scenarios by selecting subsets of existing (uneven-quality) datasets with higher API coverage. On the other hand, Dgen recasts domain dataset synthesis as a process of using API-specified high-level functionality and deliberately constructed code skeletons to synthesize concrete code.
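The Dsel idea of selecting a subset that maximizes API coverage can be illustrated with the standard greedy algorithm for maximum coverage. The sketch below is purely illustrative and not drawn from the paper's artifact; the function name, data layout, and selection criterion are assumptions, shown only to make the coverage metric concrete.

```python
# Hypothetical sketch of API-coverage-guided subset selection in the
# spirit of DataScope's Dsel component. All names here are illustrative
# assumptions, not the paper's actual implementation.
def select_by_api_coverage(samples, budget):
    """Greedily pick up to `budget` samples maximizing API coverage.

    Each sample is a (sample_id, set_of_apis) pair. At each step the
    classic greedy max-coverage heuristic picks the sample whose APIs
    add the most not-yet-covered APIs.
    """
    covered = set()
    selected = []
    remaining = list(samples)
    for _ in range(min(budget, len(remaining))):
        # Sample contributing the largest number of new APIs.
        best = max(remaining, key=lambda s: len(s[1] - covered))
        if not (best[1] - covered):
            break  # no remaining sample adds new APIs
        selected.append(best)
        covered |= best[1]
        remaining.remove(best)
    return selected, covered


samples = [
    ("a", {"os.path.join", "open"}),
    ("b", {"open"}),
    ("c", {"json.loads", "open", "re.sub"}),
]
subset, apis = select_by_api_coverage(samples, budget=2)
# Picks "c" first (3 new APIs), then "a" (adds "os.path.join").
```

Greedy selection is a natural fit because maximum coverage is NP-hard, and the greedy heuristic gives a well-known (1 - 1/e) approximation while scanning a large corpus only a few times.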

Extensive experiments demonstrate DataScope’s effectiveness, with models fine-tuned on its synthesized datasets outperforming those tuned on unoptimized datasets five times larger. Furthermore, a series of analyses on model internals, relevant hyperparameters, and case studies provide additional evidence for the efficacy of our proposed methods. These findings underscore the significance of dataset quality in SFT and advance the field of LCMs by providing an efficient, cost-effective framework for constructing high-quality datasets, which in turn lead to more powerful and tailored LCMs for both general and domain-specific scenarios.

Sat 18 Oct

Displayed time zone: Perth

10:30 - 12:15
Synthesis 1 (OOPSLA) at Orchid Plenary Ballroom
Chair(s): Yingfei Xiong Peking University
10:30
15m
Talk
Abstraction Refinement-guided Program Synthesis for Robot Learning from Demonstrations
OOPSLA
Guofeng Cui (Rutgers University), Yuning Wang (Rutgers University), Wensen Mao (Rutgers University), Yuanlin Duan (Rutgers University), He Zhu (Rutgers University, USA)
10:45
15m
Talk
API-guided Dataset Synthesis to Finetune Large Code Models
OOPSLA
Li Zongjie (Hong Kong University of Science and Technology), Daoyuan Wu (Lingnan University), Shuai Wang (Hong Kong University of Science and Technology), Zhendong Su (ETH Zurich)
11:00
15m
Talk
Fast Constraint Synthesis for C++ Function Templates
OOPSLA
Shuo Ding (Georgia Institute of Technology), Qirun Zhang (Georgia Institute of Technology)
11:15
15m
Talk
Hambazi: Spatial Coordination Synthesis for Augmented Reality
OOPSLA
Yi-Zhen Tsai (University of California, Riverside), Jiasi Chen (University of Michigan), Mohsen Lesani (University of California, Santa Cruz)
11:30
15m
Talk
Inductive Synthesis of Inductive Heap Predicates
OOPSLA
Ziyi Yang (National University of Singapore), Ilya Sergey (National University of Singapore)
11:45
15m
Talk
LOUD: Synthesizing Strongest and Weakest Specifications
OOPSLA
Kanghee Park (University of Wisconsin-Madison), Xuanyu Peng (University of California, San Diego), Loris D'Antoni (University of California, San Diego)
12:00
15m
Talk
Metamorph: Synthesizing Large Objects from Dafny Specifications
OOPSLA
Aleksandr Fedchin (Tufts University), Alexander Bai (New York University), Jeffrey S. Foster (Tufts University)