Communication Calls
Christopher Cameron, Steve Lantz (original)
Cornell Center for Advanced Computing
Revisions: 8/2025, 5/2022, 4/2013 (original)
This topic explains the three RMA communication calls for one-sided communication in MPI.
Objectives
After you complete this topic, you should be able to:
- List the three RMA communication calls supported by MPI
- Recognize the arguments used with MPI_Get, MPI_Put, and MPI_Accumulate
- List the operations supported by MPI_Accumulate
Prerequisites
- A basic knowledge of parallel programming and MPI. Background on these prerequisites is available in the Parallel Programming Concepts and High-Performance Computing and MPI Basics topics.
- Ability to program in a high-level language such as Fortran or C.
- MPI Collective Communications logically precedes this topic but is not a prerequisite.
© Cornell University | Center for Advanced Computing
CVW material development is supported by NSF OAC awards 1854828, 2321040, 2323116 (UT Austin) and 2005506 (Indiana University)