Since the beginning of distributed computing, overlapping communication with computation has been an attractive technique for masking high communication costs. Although such opportunities are often easy for a human to spot, exploiting them effectively requires knowledge of architectural and network details.
When low-level details influence performance and productivity, compilers and run-time libraries play the critical role of translating high-level statements understandable by humans into efficient commands suitable for machines.
With the advent of PGAS languages, parallelism becomes part of the programming language and communication can be expressed with simple variable assignments. As with serial programs, PGAS compilers should be able to optimize all aspects of the language, communication included; unfortunately, this is not yet the case.
In this work we consider parallel scientific programs written in Coarray Fortran and focus on how to build a PGAS compiler capable of optimizing communication, in particular by automatically exploiting opportunities for communication/computation overlap. We also sketch an extension of the Fortran language that allows one to express the relation between data and synchronization events, and we show how the compiler can use this relation to perform additional communication optimizations.
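To make the setting concrete, the following is a minimal Coarray Fortran sketch (all names are illustrative, not taken from the paper) of the pattern the abstract refers to: communication written as a plain coarray assignment, with independent computation placed between the remote write and the synchronization point, so that a compiler or runtime is free to overlap the transfer with the work.

```fortran
! Hypothetical sketch: communication as assignment, with room for overlap.
program overlap_sketch
  implicit none
  real :: buf(1000)[*]        ! coarray: one copy of buf per image
  real :: local(1000), acc
  integer :: i, me, np

  me = this_image()
  np = num_images()
  local = real(me)

  ! Communication expressed as a variable assignment:
  ! push our data to the buffer on the next image.
  if (np > 1) buf(:)[mod(me, np) + 1] = local(:)

  ! Independent computation that does not touch buf; a compiler that
  ! proves this independence may let the transfer proceed in the
  ! background instead of blocking on the remote write.
  acc = 0.0
  do i = 1, size(local)
     acc = acc + local(i) * local(i)
  end do

  sync all                    ! the transfer must be complete past this point
  if (me == 1) print *, 'local norm^2 on image 1:', acc
end program overlap_sketch
```

In a naive translation the assignment to `buf(:)[...]` blocks until the data is delivered; the optimization discussed in the paper amounts to moving the completion of that transfer down to the synchronization statement, letting the loop run in the meantime.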