Monday 25 June 2012

Shared Queue Pattern

In many parallel algorithms, a queue is a very common data structure that must
be shared among multiple units of execution (UEs). For example, many
graph traversal algorithms maintain a queue of vertices to be visited
in the next level of the traversal, and parallelizing the traversal requires
efficient sharing of that queue among concurrent UEs. A shared task queue
is also commonly used as the basis of scheduling and load‐balancing frameworks,
such as those implementing the Master/Worker pattern.
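As a concrete illustration of the Master/Worker use case, here is a minimal sketch in Python (a language choice assumed here, since the text names no implementation): several worker threads (the UEs) drain one shared task queue. The `worker` function and the squaring "work" are hypothetical stand-ins, not part of the pattern itself.

```python
import threading
import queue

def worker(tasks: queue.Queue, results: list, lock: threading.Lock) -> None:
    # Each worker (UE) repeatedly takes the next task from the shared queue.
    while True:
        try:
            n = tasks.get_nowait()
        except queue.Empty:
            return  # queue drained; this worker is done
        with lock:
            results.append(n * n)  # stand-in for real work

# Master fills the shared queue with a fixed batch of tasks.
tasks: queue.Queue = queue.Queue()
for i in range(100):
    tasks.put(i)

results: list = []
lock = threading.Lock()
threads = [threading.Thread(target=worker, args=(tasks, results, lock))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(results))  # each of the 100 tasks processed exactly once
```

Here `queue.Queue` already provides the concurrency control internally, so the workers never see a corrupted queue; the forces below are about how that internal control should be designed.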

Forces

1. Simple protocols vs. increased concurrency: Simple concurrency‐control
protocols built on a single synchronization construct are easy to implement
correctly and hence less error prone. However, if such a protocol places
too much of the shared queue under one synchronization
construct, it increases the chance that UEs block while waiting to
access the queue, limiting the available concurrency. Conversely, finely
tuned protocols increase the available concurrency, but the
tuning requires more synchronization constructs and a more complicated
design, which is itself more error‐prone.
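One classic step from the simple protocol toward more concurrency is the two-lock queue of Michael and Scott: instead of one queue-wide lock, a linked list with a dummy node carries separate head and tail locks, so an enqueuer and a dequeuer never block each other. A sketch (Python chosen here for illustration; the class and method names are assumptions):

```python
import threading

class TwoLockQueue:
    """Linked-list queue with separate head and tail locks.

    A dummy node keeps head and tail from ever pointing at the same
    "live" node, so put() and get() can proceed concurrently; only
    enqueuers contend with enqueuers, and dequeuers with dequeuers.
    """

    class _Node:
        __slots__ = ("value", "next")
        def __init__(self, value=None):
            self.value = value
            self.next = None

    def __init__(self):
        dummy = self._Node()
        self._head = dummy
        self._tail = dummy
        self._head_lock = threading.Lock()
        self._tail_lock = threading.Lock()

    def put(self, value):
        node = self._Node(value)
        with self._tail_lock:      # only other enqueuers contend here
            self._tail.next = node
            self._tail = node

    def get(self):
        with self._head_lock:      # only other dequeuers contend here
            first = self._head.next
            if first is None:
                return None        # queue is empty
            self._head = first
            return first.value
```

The price of the extra concurrency is exactly the force described above: two locks and a dummy-node invariant to keep correct instead of one coarse lock.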

2. Single queue abstraction vs. multiple queue abstraction: Depending on
the memory hierarchy of the underlying hardware and on the number of UEs,
maintaining a single queue can cause excess contention and increase parallel
overhead. Solutions may need to break with the single‐queue abstraction and
use multiple or distributed queues.
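One common way to break with the single-queue abstraction is to give each UE its own local queue and let idle UEs steal from others (work stealing). The sketch below, with assumed names and a deliberately simple splitting/stealing policy, shows the idea for a fixed batch of tasks:

```python
import threading
import collections
import random

def run_with_local_queues(tasks, num_workers=4):
    """Distribute tasks over per-worker deques; a worker whose own deque
    runs dry steals from the back of another worker's deque.
    Illustrative sketch only; real frameworks use lock-free deques."""
    local_queues = [collections.deque() for _ in range(num_workers)]
    locks = [threading.Lock() for _ in range(num_workers)]
    for i, t in enumerate(tasks):
        local_queues[i % num_workers].append(t)  # round-robin initial split

    results = []
    results_lock = threading.Lock()

    def worker(me):
        while True:
            item = None
            with locks[me]:
                if local_queues[me]:
                    item = local_queues[me].popleft()  # own work: take from front
            if item is None:
                victims = [v for v in range(num_workers) if v != me]
                random.shuffle(victims)
                for v in victims:
                    with locks[v]:
                        if local_queues[v]:
                            item = local_queues[v].pop()  # steal from the back
                            break
                if item is None:
                    return  # no work anywhere; safe here since tasks are never re-enqueued
            with results_lock:
                results.append(item * 2)  # stand-in for real work

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Most of the time each UE touches only its own lock, so contention drops compared with a single shared queue; the steal path is the exceptional, slower case.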


