Parallel Computing: Lecture 8
Learning objectives
After this class, you should be able to:
- Explain the message passing programming model using MPI for communication.
- Given a problem, write parallel code to solve it, and evaluate the performance of your code, using the following MPI functions: (i) MPI_Init, (ii) MPI_Comm_rank, (iii) MPI_Comm_size, (iv) MPI_Reduce, (v) MPI_Finalize, (vi) MPI_Barrier, (vii) MPI_Wtime, and (viii) MPI_Wtick. (A short sketch using all eight appears after this list.)
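
The following is a minimal sketch, in C, that exercises all eight of these functions by summing the process ranks onto rank 0 and timing the reduction. The file name, the choice of reduction (MPI_SUM over ranks), and the mpicc/mpirun commands in the comment are illustrative assumptions; the actual compile and run procedure for the NCSA p690 is covered in the reading assignment below.

```c
/* sum_ranks.c -- illustrative sketch exercising the eight MPI calls above.
 * It sums the process ranks onto rank 0 and reports the elapsed time.
 * A typical (system-dependent) build and run might be:
 *     mpicc sum_ranks.c -o sum_ranks
 *     mpirun -np 4 ./sum_ranks
 */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int rank, size, sum;
    double start, elapsed;

    MPI_Init(&argc, &argv);                   /* (i)   start up MPI             */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);     /* (ii)  this process's rank      */
    MPI_Comm_size(MPI_COMM_WORLD, &size);     /* (iii) total number of processes */

    MPI_Barrier(MPI_COMM_WORLD);              /* (vi)  synchronize before timing */
    start = MPI_Wtime();                      /* (vii) wall-clock start time     */

    /* (iv) reduce: add every process's rank; the result lands on rank 0 */
    MPI_Reduce(&rank, &sum, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);

    elapsed = MPI_Wtime() - start;

    if (rank == 0)
        printf("sum of ranks = %d, time = %g s (timer resolution %g s)\n",
               sum, elapsed, MPI_Wtick());    /* (viii) clock resolution         */

    MPI_Finalize();                           /* (v)   shut down MPI             */
    return 0;
}
```

Note that only rank 0 holds the reduced value; making the result available everywhere is the subject of the review question at the end of this page.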
Reading assignment
- Chapter 4, NCSA p690 documentation, especially on compiling and running MPI programs.
- MPI 1.1 standard, section on collective communication.
Exercises and review questions
- Questions on current lecture's material
  - (Programming) Exercise 4.6.
  - (Programming) Exercise 4.9.
- Questions on next lecture's material
  - If you wanted the result of a reduce operation to be available on all processors, which MPI function would you use?