Using Collective Communication
Christopher Cameron, Steve Lantz, Brandon Barker, CAC Staff (original)
Cornell Center for Advanced Computing
Revisions: 5/2022, 1/2014, 2001 (original)
This topic explains how to use collective communication effectively.
Objectives
After you complete this segment, you should be able to:
- Explain how the MPI implementation affects performance
- Demonstrate two ways to broadcast
- Demonstrate two ways to scatter
- Distinguish between MPI_Scatter and MPI_Scatterv
Prerequisites
- A basic knowledge of parallel programming and MPI. These prerequisites are covered in other topics (Parallel Programming Concepts and High-Performance Computing, MPI Basics).
- Ability to program in a high-level language such as Fortran or C.