Introduction to MPI
Christopher Cameron, Peter Vaillancourt, CAC Staff (original)
Cornell Center for Advanced Computing
Revisions: 8/2025, 5/2022, 3/2019, 6/2017, 2/2001 (original)
This topic provides an overview of the features of the Message Passing Interface (MPI), a specification that is the de facto standard for distributed memory computing. It describes the differences among the various MPI versions and implementations and presents a brief sketch of how to incorporate MPI into your development workflow.
Objectives
After you complete this topic, you should be able to:
- Describe why people use MPI and identify appropriate use cases
- Name four characteristics of high-quality MPI implementations
- Choose among available MPI implementations
- Plan how to incorporate MPI into your workflow
Prerequisites
- A working knowledge of general programming concepts
- Ability to program in a high-level language such as Fortran, C, or C++
- A basic familiarity with parallel programming concepts
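As a preview of what is to come, here is a minimal MPI program in C (an illustrative sketch; the calls shown are introduced in detail in later topics). Each process started by the MPI launcher runs the same program and learns its own identity (rank) and the total number of processes from the MPI library. With a typical implementation, it might be compiled and launched with commands such as `mpicc hello_mpi.c -o hello_mpi` followed by `mpiexec -n 4 ./hello_mpi`, though the exact tool names vary by implementation.

```c
/* hello_mpi.c — a minimal MPI "hello world" sketch. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size;

    MPI_Init(&argc, &argv);                  /* start the MPI runtime           */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);    /* this process's ID (rank)        */
    MPI_Comm_size(MPI_COMM_WORLD, &size);    /* total number of processes       */

    printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();                          /* shut down MPI cleanly           */
    return 0;
}
```

Each of the four launched processes prints its own line, so the output lines may appear in any order.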
© Cornell University | Center for Advanced Computing
CVW material development is supported by NSF OAC awards 1854828, 2321040, 2323116 (UT Austin) and 2005506 (Indiana University)