Data Dependence Analysis Techniques for Increased Accuracy and Extracted Parallelism

Bibliographic Details
Published in: International Journal of Parallel Programming, 2004-08, Vol. 32 (4), p. 317-359
Main Authors: Kyriakopoulos, Konstantinos; Psarris, Kleanthis
Format: Article
Language: English
Description
Summary: Parallelizing compilers rely on data dependence information in order to produce valid parallel code. Traditional data dependence analysis techniques, such as the Banerjee test and the I-Test, can efficiently compute data dependence information for simple instances of the data dependence problem. However, in more complex cases involving triangular or trapezoidal loop regions, symbolic variables, and multidimensional arrays with coupled subscripts, these tests, including the triangular Banerjee test, ignore or simplify many of the constraints and thus introduce approximations, especially when testing for data dependence under direction vector constraints. The Omega test can accurately handle such complex cases, but at a higher computation cost. In this paper we extend the ideas behind the I-Test and present new techniques to handle complex instances of the dependence problem, which are frequently found in actual source code. In particular, we provide polynomial-time techniques that can prove or disprove data dependences, subject to any direction vector, in loops with triangular or trapezoidal bounds, symbolic variables, and multidimensional arrays with coupled subscripts. We also investigate the impact of the proposed data dependence analysis techniques in practice. We perform an extensive experimental evaluation of the data dependence analysis tests, including the I-Test, the Omega test and the proposed new techniques. We compare these tests in terms of data dependence accuracy, compilation efficiency and effectiveness in program parallelization. We run several experiments using the Perfect Club benchmarks and the scientific library LAPACK. We analyze the trade-off between accuracy and efficiency and the reasons for any approximation of each data dependence test. We determine the impact of the dependence analysis phase on the total compilation time and we measure the number of loops parallelized by each test. We conclude that we can employ polynomial-time techniques to improve data dependence accuracy and increase program parallelization at a reasonable computation cost.
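To illustrate the kind of "simple instance" the abstract says traditional tests handle efficiently, the sketch below implements the classical Banerjee bounds test for one pair of linear array subscripts under rectangular loop bounds, with no direction-vector constraints, triangular regions, or coupled subscripts. This is a textbook version of the test for illustration only; the function name and interface are not taken from the paper.

```python
def _pos(x):
    # Positive part: max(x, 0)
    return max(x, 0)

def _neg(x):
    # Negative part: max(-x, 0)
    return max(-x, 0)

def banerjee_independent(a, b, a0, b0, bounds):
    """Banerjee bounds test for the dependence equation

        sum_i a[i]*I[i] - sum_i b[i]*J[i] = b0 - a0,

    where the write accesses A[a0 + sum a[i]*I[i]], the read accesses
    A[b0 + sum b[i]*J[i]], and each index variable ranges over the
    rectangular bounds L <= I[i], J[i] <= U given in `bounds`.

    Computes the minimum and maximum of the left-hand side over the
    (real-relaxed) iteration space; if b0 - a0 falls outside that
    interval, no integer solution exists either, so independence is
    proved. Returns True only when independence is *proved*; False
    means "possibly dependent" (the test is conservative).
    """
    lo = hi = 0
    for ai, bi, (L, U) in zip(a, b, bounds):
        # Contribution of ai*I[i]: in [ai+*L - ai-*U, ai+*U - ai-*L]
        # Contribution of -bi*J[i]: same formulas with coefficient -bi
        hi += _pos(ai) * U - _neg(ai) * L + _pos(-bi) * U - _neg(-bi) * L
        lo += _pos(ai) * L - _neg(ai) * U + _pos(-bi) * L - _neg(-bi) * U
    rhs = b0 - a0
    return not (lo <= rhs <= hi)

# A[i] written, A[i+200] read, 1 <= i <= 100:
# I - J = 200 has no solution in bounds, so independence is proved.
print(banerjee_independent([1], [1], 0, 200, [(1, 100)]))  # True

# A[i] written, A[i+5] read, 1 <= i <= 100: dependence cannot be
# disproved (and indeed exists), so the test returns False.
print(banerjee_independent([1], [1], 0, 5, [(1, 100)]))  # False
```

The complex cases the paper targets (triangular bounds such as `1 <= j <= i`, symbolic bounds, and coupled subscripts across dimensions) are exactly where this simple interval reasoning loses precision, which is the gap the proposed polynomial-time techniques and the Omega test address.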
ISSN: 0885-7458, 1573-7640
DOI: 10.1023/B:IJPP.0000035817.01263.d0