Comparing Programming Language Models for Design Pattern Recognition
Format: Conference Proceeding
Language: English
Summary: Design patterns (DPs) facilitate effective software architecture and design, and they must be maintained and enforced in existing complex software products such as automotive software. Implementing DPs in source code facilitates the development of high-quality software products with less effort. However, recognizing DPs in program code is challenging, and this makes it difficult to keep architectural evolution under control in large software products over time. Because DPs are abstract solutions, the programs used to recognize them in source code have significant limitations. In this paper, we employ four programming language models based on Bidirectional Encoder Representations from Transformers (BERT) to study to what extent these models can recognize an exemplar DP, in this case, Singleton. We compare four language representation models (OpenAI CodeX, Facebook AI TransCoder, ACoRA/BERT, and CCFlex/bag-of-words) and compare the models' rankings against a simple baseline metric. We found a discrepancy between the models in identifying Singletons and observed that the models are inconsistently sensitive to name and semantic changes. Specifically, CodeX recognizes the existence of Singletons better than the other models, while only ACoRA shows some signs of recognizing DP semantics.
ISSN: 2768-4288
DOI: 10.1109/ICSA-C63560.2024.00041
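The summary above uses the Singleton pattern as the exemplar design pattern that the language models are asked to recognize. For reference, below is a minimal sketch of a classic lazily-initialized Singleton in Java; it is an illustrative example only, not taken from the paper or its evaluation data, and the class name ConfigurationManager is hypothetical.

```java
// Illustrative example only: a classic lazily-initialized, thread-safe Singleton.
// It exhibits the structural cues a recognizer would look for: a private
// constructor, a static instance field, and a global access point.
public final class ConfigurationManager {

    // Single shared instance, created on first use.
    private static volatile ConfigurationManager instance;

    // Private constructor prevents external instantiation.
    private ConfigurationManager() { }

    // Global access point using double-checked locking for thread safety.
    public static ConfigurationManager getInstance() {
        if (instance == null) {
            synchronized (ConfigurationManager.class) {
                if (instance == null) {
                    instance = new ConfigurationManager();
                }
            }
        }
        return instance;
    }
}
```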