Leveraging a Corpus of Natural Language Descriptions for Program Similarity
Program similarity is a central challenge in many programming-related applications, such as code search, clone detection, automatic translation, and programming education.
We present a novel approach for establishing the similarity of code fragments by:
(i) obtaining textual descriptions of code fragments captured in millions of posts on question-answering sites, blogs and other sources, and
(ii) using natural language processing techniques to establish similarity between textual descriptions, and thus between their corresponding code fragments.
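Step (ii) can be illustrated with a minimal sketch: represent each textual description as a TF-IDF vector and compare descriptions by cosine similarity. This is only an illustrative stand-in for the NLP techniques the approach uses; the tokenization, weighting scheme, and example descriptions below are all assumptions, not the paper's actual pipeline.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Turn each description into a sparse TF-IDF vector (term -> weight)."""
    tokenized = [d.lower().split() for d in docs]
    n = len(docs)
    df = Counter()                      # document frequency of each term
    for toks in tokenized:
        df.update(set(toks))
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        # smoothed IDF so terms occurring everywhere still get nonzero weight
        vec = {t: (c / len(toks)) * (math.log((1 + n) / (1 + df[t])) + 1)
               for t, c in tf.items()}
        vectors.append(vec)
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * v[t] for t, w in u.items() if t in v)
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical descriptions of three code fragments: the first two describe
# the same task, so their similarity should exceed that of the unrelated pair.
descriptions = ["reverse a string in place",
                "how to reverse a string",
                "parse json from a url"]
vecs = tfidf_vectors(descriptions)
```

Under this sketch, `cosine(vecs[0], vecs[1])` exceeds `cosine(vecs[0], vecs[2])`, mirroring how similar descriptions are taken as evidence that the corresponding code fragments are similar.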
To improve precision, we use a simple static analysis that extracts type signatures, and combine textual similarity with signature similarity.
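One simple way to realize this combination, sketched below under stated assumptions: measure signature similarity as the Jaccard overlap of the type names a static analysis extracts, and blend it with the textual score via a weighted sum. The Jaccard measure and the weight `alpha` are illustrative choices, not the paper's actual combination rule.

```python
def signature_similarity(types_a, types_b):
    """Jaccard overlap between the sets of types in two extracted signatures."""
    a, b = set(types_a), set(types_b)
    return len(a & b) / len(a | b) if a | b else 1.0

def combined_score(text_sim, sig_sim, alpha=0.7):
    """Weighted blend of textual and signature similarity.

    alpha = 0.7 is a hypothetical weight for illustration only.
    """
    return alpha * text_sim + (1 - alpha) * sig_sim

# e.g. a fragment with signature str -> str vs. one with (str, int) -> str:
sig_sim = signature_similarity({"str"}, {"str", "int"})   # 1/2
score = combined_score(0.9, sig_sim)
```

The intuition is that two fragments with near-identical descriptions but incompatible type signatures are less likely to be true matches, so the signature term filters out such false positives.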
Because our notion of code similarity is based on similarity of textual descriptions, our approach can determine semantic relatedness and similarity of code across different libraries and even across different programming languages, a task considered extremely difficult using traditional approaches.
To evaluate our approach, we use data obtained from the popular question-answering site Stack Overflow. To obtain a ground truth to compare against, we developed a crowdsourcing system, Like2Drops, that allows users to label the similarity of code fragments. We used the system to collect similarity classifications for a massive corpus of 6,500 program pairs. Our results show that our technique is effective in determining similarity, achieving more than 85 percent precision, recall, and accuracy.
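The reported metrics follow the standard definitions over the crowdsourced labels; a small sketch, assuming binary similar/not-similar labels and predictions (the variable names and sample data are hypothetical):

```python
def evaluate(labels, preds):
    """Precision, recall, and accuracy for binary similarity judgments.

    labels: ground-truth labels from the crowdsourced corpus (1 = similar).
    preds:  the technique's predictions for the same pairs.
    """
    tp = sum(1 for y, p in zip(labels, preds) if y and p)
    fp = sum(1 for y, p in zip(labels, preds) if not y and p)
    fn = sum(1 for y, p in zip(labels, preds) if y and not p)
    tn = sum(1 for y, p in zip(labels, preds) if not y and not p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    accuracy = (tp + tn) / len(labels)
    return precision, recall, accuracy

# Toy example with four labeled pairs (not data from the paper):
p, r, a = evaluate([1, 1, 0, 0], [1, 0, 0, 1])
```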
Fri 4 Nov (GMT+02:00) Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna