Towards a Theoretically-Backed and Practical Framework for Selective Object-Sensitive Pointer Analysis
Context sensitivity is a foundational technique in static analysis: it is essential for improving precision but often comes at significant cost in analysis efficiency. Recent advances focus on selective context-sensitive analysis, in which only a subset of program elements, such as methods or heap objects, is analyzed context-sensitively while the rest is analyzed context-insensitively, aiming to balance precision with efficiency. Unfortunately, although numerous selective context-sensitive analysis approaches have been proposed, they are usually based on specific code patterns and therefore lack a comprehensive theoretical foundation for systematically identifying the code scenarios that benefit from context sensitivity. This paper presents a novel and foundational theory that establishes a sound over-approximation of the ground truth, i.e., the set of heap objects that de facto improve precision under context sensitivity. The theory reformulates the identification of this upper bound as three sub-graph reachability problems on typical pointer flow graphs, each of which can be solved efficiently under context insensitivity. Notably, our theory selects all heap objects that improve precision, and our approximation is carefully designed to balance precision and scalability. Building on this theoretical foundation, we introduce MOON, our selective context-sensitive analysis approach. MOON performs both backward and forward traversals of pointer flow graphs, enabling it to systematically capture all heap objects that improve precision under context sensitivity. Our theoretical foundation, together with carefully designed trade-offs within our approach, allows MOON to limit the set of selected heap objects, striking an effective balance between analysis precision and efficiency.
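To give a flavor of the kind of computation involved, the following is a minimal, illustrative sketch of forward and backward reachability on a toy pointer flow graph. The graph encoding, node names, and helper functions here are our own assumptions for illustration only; they do not reproduce MOON's actual three reachability formulations or its selection criteria.

```python
from collections import deque

def reachable(graph, starts):
    """BFS: all nodes reachable from `starts` along edges in `graph`
    (an adjacency dict mapping node -> set of successor nodes)."""
    seen = set(starts)
    work = deque(starts)
    while work:
        n = work.popleft()
        for m in graph.get(n, ()):
            if m not in seen:
                seen.add(m)
                work.append(m)
    return seen

def invert(graph):
    """Reverse every edge, so backward traversal is forward
    traversal of the inverted graph."""
    rev = {}
    for src, dsts in graph.items():
        for dst in dsts:
            rev.setdefault(dst, set()).add(src)
    return rev

# Toy pointer flow graph (hypothetical): heap objects flow to variables.
# Two distinct heaps merge at variable x, a classic point where
# context sensitivity could recover precision.
pfg = {
    "new A()": {"x"},
    "new B()": {"x"},
    "x": {"y", "z"},
}

# Forward: which variables does heap "new A()" flow into?
forward = reachable(pfg, {"new A()"})
# Backward: which heaps can reach variable y?
backward = reachable(invert(pfg), {"y"})
```

Here `forward` contains x, y, and z, while `backward` contains both heap objects, revealing that y's points-to set conflates "new A()" and "new B()". Both traversals are context-insensitive and linear in the graph size, which is what makes this style of pre-analysis cheap.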
Extensive experiments with MOON on 30 Java programs demonstrate that MOON achieves 37.2X and 382.0X speedups for 2-object-sensitive and 3-object-sensitive analyses, respectively, with negligible precision losses of only 0.1% and 0.2%. These results highlight that MOON achieves a balance between efficiency and precision that significantly outperforms all previous approaches.