API-guided Dataset Synthesis to Finetune Large Code Models
Large code models (LCMs), pre-trained on vast code corpora, have demonstrated remarkable performance across a wide array of code-related tasks. Supervised fine-tuning (SFT) plays a vital role in aligning these models with specific requirements and enhancing their performance in particular domains. However, synthesizing high-quality SFT datasets poses a significant challenge due to the uneven quality of existing datasets and the scarcity of domain-specific data.
Inspired by APIs as high-level abstractions of code that encapsulate rich semantic information in a concise structure, we propose DataScope, an API-guided dataset synthesis framework designed to enhance the SFT process for LCMs in both general and domain-specific scenarios. DataScope comprises two main components: Dsel and Dgen. On the one hand, Dsel employs API coverage as a core metric, enabling efficient dataset synthesis in general scenarios by selecting subsets of existing (uneven-quality) datasets with higher API coverage. On the other hand, Dgen recasts domain dataset synthesis as a process of using API-specified high-level functionality and deliberately-constituted code skeletons to synthesize concrete code.
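The core idea behind Dsel, selecting a subset of an uneven-quality dataset that maximizes API coverage, resembles a classic greedy set-cover procedure. The sketch below illustrates one plausible way such a metric could drive selection; the regex-based API extractor and all function names are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch (NOT the paper's implementation): greedily select
# training samples so that cumulative API coverage grows fastest.
import re
from typing import List, Set

# Crude matcher for call-like tokens such as `foo(` or `pkg.bar(` (assumption).
API_CALL = re.compile(r"\b([A-Za-z_][\w.]*)\s*\(")

def extract_apis(code: str) -> Set[str]:
    """Approximate the set of APIs a code sample invokes."""
    return set(API_CALL.findall(code))

def select_by_api_coverage(samples: List[str], budget: int) -> List[int]:
    """Greedily pick up to `budget` sample indices maximizing API coverage."""
    apis = [extract_apis(s) for s in samples]
    covered: Set[str] = set()
    chosen: List[int] = []
    for _ in range(min(budget, len(samples))):
        best, gain = -1, 0
        for i, a in enumerate(apis):
            if i not in chosen and len(a - covered) > gain:
                best, gain = i, len(a - covered)
        if best < 0:  # remaining samples add no new APIs; stop early
            break
        chosen.append(best)
        covered |= apis[best]
    return chosen
```

Under this framing, a small subset with high API coverage can substitute for a much larger unfiltered dataset, which is consistent with the efficiency claims above.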
Extensive experiments demonstrate DataScope’s effectiveness, with models fine-tuned on its synthesized datasets outperforming those tuned on unoptimized datasets five times larger. Furthermore, a series of analyses on model internals, relevant hyperparameters, and case studies provide additional evidence for the efficacy of our proposed methods. These findings underscore the significance of dataset quality in SFT and advance the field of LCMs by providing an efficient, cost-effective framework for constructing high-quality datasets, which in turn lead to more powerful and tailored LCMs for both general and domain-specific scenarios.
Sat 18 Oct (displayed time zone: Perth)
10:30 - 12:15

- 10:30 (15m talk) Abstraction Refinement-guided Program Synthesis for Robot Learning from Demonstrations. OOPSLA. Guofeng Cui, Yuning Wang, Wensen Mao, Yuanlin Duan, He Zhu (Rutgers University, USA)
- 10:45 (15m talk) API-guided Dataset Synthesis to Finetune Large Code Models. OOPSLA. Li Zongjie (Hong Kong University of Science and Technology), Daoyuan Wu (Lingnan University), Shuai Wang (Hong Kong University of Science and Technology), Zhendong Su (ETH Zurich)
- 11:00 (15m talk) Fast Constraint Synthesis for C++ Function Templates. OOPSLA.
- 11:15 (15m talk) Hambazi: Spatial Coordination Synthesis for Augmented Reality. OOPSLA. Yi-Zhen Tsai (University of California, Riverside), Jiasi Chen (University of Michigan), Mohsen Lesani (University of California at Santa Cruz)
- 11:30 (15m talk) Inductive Synthesis of Inductive Heap Predicates. OOPSLA.
- 11:45 (15m talk) LOUD: Synthesizing Strongest and Weakest Specifications. OOPSLA. Kanghee Park (University of Wisconsin-Madison), Xuanyu Peng (University of California, San Diego), Loris D'Antoni (University of California at San Diego)
- 12:00 (15m talk) Metamorph: Synthesizing Large Objects from Dafny Specifications. OOPSLA. Aleksandr Fedchin (Tufts University), Alexander Bai (New York University), Jeffrey S. Foster (Tufts University)