Using Collective Communication
Christopher Cameron, Steve Lantz, Brandon Barker, CAC Staff (original)
Cornell Center for Advanced Computing
Revisions: 8/2025, 5/2022, 1/2014, 2001 (original)
This topic explains how to use collective communications effectively.
Objectives
After you complete this segment, you should be able to:
- Explain how the choice of MPI implementation affects the performance of collective communications
- Demonstrate two ways to broadcast
- Demonstrate two ways to scatter
- Distinguish between scatter and scatterv
Prerequisites
- A basic knowledge of parallel programming and MPI; these prerequisites are covered in other topics (Parallel Programming Concepts and High-Performance Computing, MPI Basics).
- Ability to program in a high-level language such as Fortran or C.
CVW material development is supported by NSF OAC awards 1854828, 2321040, 2323116 (UT Austin) and 2005506 (Indiana University)