
Minimizing Synchronization and Communication

 

Minimizing data communication is a fundamental problem on all parallel machines, whether they have a distributed or a shared address space. A popular approach is to leave the responsibility to the user: the user specifies the data-to-processor mapping in a language such as HPF[17], and the compiler derives the computation mapping using the owner-computes rule[18], under which each processor executes exactly those statement instances that write data it owns. Recently, a number of algorithms for automatically finding data and/or computation decompositions have been proposed[2,4,5,7,15,25,26]. In keeping with our observation that communication in efficient parallel programs is infrequent, our algorithm is unique in offering a simple procedure to find the largest available degree of parallelism that requires no major data communication[4]. We present a brief overview of the algorithm here; readers are referred to [4] for more details.
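To make the owner-computes rule concrete, the following is a minimal sketch, not from the paper, of how a compiler-generated SPMD loop might apply it to a 1-D block-distributed array: every processor iterates over the full index space but executes only those iterations whose left-hand-side element it owns. All names (`block_owner`, `owner_computes`) are illustrative.

```python
def block_owner(i, n, num_procs):
    """Return the id of the processor that owns element i of an
    n-element array under a block distribution."""
    block = (n + num_procs - 1) // num_procs  # ceiling division
    return i // block

def owner_computes(a, b, my_id, num_procs):
    """SPMD body for the loop a[i] = b[i] + 1: each processor
    performs only the iterations writing elements it owns."""
    n = len(a)
    for i in range(n):
        if block_owner(i, n, num_procs) == my_id:
            a[i] = b[i] + 1

# Simulate 4 processors cooperating on one shared array.
n, p = 10, 4
a = [0] * n
b = list(range(n))
for pid in range(p):
    owner_computes(a, b, pid, p)
```

Note that each element of `a` is written by exactly one processor; communication arises only when the right-hand side reads data owned by another processor, which is why the choice of data decomposition determines the communication cost.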





Saman Amarasinghe
Fri Apr 7 11:22:17 PDT 1995