ECE 1747H: Parallel Programming
Explicit Parallelism: Distributed Memory, Message Passing

Author: Cory Holland
Message Passing (MPI)

• Message passing plays the same role for distributed memory that multithreading plays for shared memory.
• Explicit parallelism is more common with message passing.
– The user has explicit control over processes.
– Good: that control can be exploited for performance.
– Bad: the user has to deal with it.
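The "explicit control over processes" point can be sketched in plain C. This is an illustrative POSIX example, not from the slides: the programmer explicitly creates a worker process, waits for it, and collects its result, much as an MPI programmer explicitly manages ranks and their communication.

```c
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

/* Hypothetical sketch: with explicit parallelism the user creates and
 * destroys processes by hand. Here we use POSIX fork()/waitpid(); an MPI
 * program would instead be launched as a set of ranks via mpirun. */
int run_worker_and_collect(void) {
    pid_t pid = fork();          /* explicit process creation */
    if (pid == 0) {
        /* child: do some work, report a result through the exit status */
        _exit(7);
    }
    int status = 0;
    waitpid(pid, &status, 0);    /* explicit destruction / cleanup */
    return WEXITSTATUS(status);  /* the child's result */
}
```

The benefit is exactly what the slide claims: the user decides where each process runs and when it communicates, which can be tuned for performance, but all of that bookkeeping is the user's burden.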

Distributed Memory - Message Passing

[Figure: processors proc1 … procN, each with its own private memory mem1 … memN, connected by a network]

Distributed Memory - Message Passing
• A variable x, a pointer p, or an array a[] refers to different memory locations, depending on the processor.
• In this course, we discuss message passing as a programming model (it can run on any hardware).
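The point that the "same" variable names different memory on different processors can be demonstrated with ordinary POSIX processes. This is a sketch under that assumption, not MPI: after fork(), each process has a private copy of x, so the child's update is invisible to the parent until it is sent as an explicit message over a pipe (the analogue of MPI_Send/MPI_Recv).

```c
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

/* Sketch (POSIX, not MPI): each process has its own address space, so
 * writes by one process must be communicated explicitly as messages. */
int message_passing_demo(void) {
    int fd[2];
    if (pipe(fd) != 0)
        return 0;
    int x = 1;                        /* each process gets its own x */
    pid_t pid = fork();
    if (pid == 0) {
        x = 42;                       /* modifies only the child's copy */
        write(fd[1], &x, sizeof x);   /* send: analogous to MPI_Send */
        _exit(0);
    }
    int received = 0;
    read(fd[0], &received, sizeof received); /* receive: like MPI_Recv */
    waitpid(pid, NULL, 0);
    /* The parent's x is still 1; the child's value arrived only
     * because it was explicitly passed as a message. */
    return (x == 1 && received == 42);
}
```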



What does the user have to do?
• This is what we said for shared memory:
– Decide how to decompose the computation into parallel parts.
– Create (and destroy) processes to support that decomposition.
– Add synchronization to make sure dependences are covered.
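The three shared-memory steps can be sketched with POSIX threads. The decomposition, function names, and problem (summing 0..99 over 4 threads) are illustrative assumptions, not from the slides: the range is decomposed into chunks, threads are created and joined, and a mutex synchronizes the one shared dependence (the running total).

```c
#include <pthread.h>

#define NTHREADS 4

static long total = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg) {
    long id = (long)arg;
    long partial = 0;
    /* Step 1: decomposition -- each thread sums 25 of the 100 values. */
    for (long i = id * 25; i < (id + 1) * 25; i++)
        partial += i;
    /* Step 3: synchronization covers the dependence on 'total'. */
    pthread_mutex_lock(&lock);
    total += partial;
    pthread_mutex_unlock(&lock);
    return NULL;
}

long parallel_sum_0_to_99(void) {
    pthread_t tid[NTHREADS];
    /* Step 2: create processes (here, threads) for the decomposition... */
    for (long t = 0; t < NTHREADS; t++)
        pthread_create(&tid[t], NULL, worker, (void *)t);
    /* ...and destroy them (join) when the work is done. */
    for (long t = 0; t < NTHREADS; t++)
        pthread_join(tid[t], NULL);
    return total;  /* 0 + 1 + ... + 99 = 4950 */
}
```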

• Is the same true for message passing?

[Figure: data elements 1–4 partitioned across proc1, proc2, proc3, with a temp copy]

for some number of timesteps/iterations { for (i=0; i