
net.sf.ehcache.constructs.refreshahead
Class ThreadedWorkQueue<W>

java.lang.Object
  extended by net.sf.ehcache.constructs.refreshahead.ThreadedWorkQueue<W>
Type Parameters:
W - the type of work unit handled by this queue

public class ThreadedWorkQueue<W>
extends Object

This class implements a work queue backed by a pool of threads. You can offer a stream of objects to the backing pool of threads, which consumes them and hands them to the BatchWorker as a collection to be processed (batched).

Essentially, it uses BatchWorker as Callable/Future with a collection argument.
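
For illustration, a minimal sketch of typical use (not taken from the ehcache sources): it assumes the BatchWorker callback exposes a single process(Collection) method, as described in the nested class summary below, and uses arbitrary values for the thread count, queue bound, and batch size.

import java.util.Collection;
import java.util.concurrent.Executors;

import net.sf.ehcache.constructs.refreshahead.ThreadedWorkQueue;

public class WorkQueueExample {
    public static void main(String[] args) throws InterruptedException {
        // Hypothetical dispatcher that "processes" each batch by printing it.
        // The process(Collection) method name is an assumption based on the
        // BatchWorker description below.
        ThreadedWorkQueue.BatchWorker<String> dispatcher =
                new ThreadedWorkQueue.BatchWorker<String>() {
                    public void process(Collection<? extends String> batch) {
                        for (String workUnit : batch) {
                            System.out.println("processing " + workUnit);
                        }
                    }
                };

        // 2 worker threads, at most 1000 queued work units, handed to the
        // dispatcher in batches of up to 50.
        ThreadedWorkQueue<String> queue = new ThreadedWorkQueue<String>(
                dispatcher, 2, Executors.defaultThreadFactory(), 1000, 50);

        queue.offer("refresh-key-1");
        queue.offer("refresh-key-2");

        // Give the workers a moment to drain the queue, then shut down
        // (shutdown() interrupts the worker threads).
        Thread.sleep(100);
        queue.shutdown();
    }
}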

Author:
cschanck

Nested Class Summary
static interface ThreadedWorkQueue.BatchWorker<WW>
          Callback interface; think of it as a Runnable that takes a Collection argument.
 
Constructor Summary
ThreadedWorkQueue(ThreadedWorkQueue.BatchWorker<W> dispatcher, int numberOfThreads, ThreadFactory factory, int maximumQueueSize, int batchSize)
          Create a work queue where work is dispatched through the given dispatcher, using the specified number of threads.
 
Method Summary
 long getBacklogCount()
          Get the current backlog count.
 int getBatchSize()
          Get the batch size.
 ThreadedWorkQueue.BatchWorker<W> getDispatcher()
          Get the dispatcher being used for this queue.
 int getDroppedCount()
          Gets dropped counter.
 int getOfferedCount()
          Gets offer counter.
 int getProcessedCount()
          Gets processed count.
 boolean isAlive()
          Is this work queue still accepting work?
 void offer(W workUnit)
          Offer a work unit to queue.
 void shutdown()
          Shut down this queue.
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

ThreadedWorkQueue

public ThreadedWorkQueue(ThreadedWorkQueue.BatchWorker<W> dispatcher,
                         int numberOfThreads,
                         ThreadFactory factory,
                         int maximumQueueSize,
                         int batchSize)
Create a work queue where work is dispatched through the given dispatcher, using the specified number of threads.

Parameters:
dispatcher - Thread-safe dispatcher used to dispatch work
numberOfThreads - Number of parallel threads used to process work from this queue
factory - ThreadFactory used to create the threads
maximumQueueSize - maximum backlog of work items that can be queued before items get dropped
batchSize - maximum number of items to send to the dispatcher at a time.
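
As a sketch of how the factory parameter is typically used (the thread name, daemon flag, and sizes here are arbitrary choices, and myDispatcher is a hypothetical, thread-safe BatchWorker implementation):

import java.util.concurrent.ThreadFactory;
import java.util.concurrent.atomic.AtomicInteger;

// A named, daemon ThreadFactory so the worker threads are easy to identify
// in thread dumps and do not keep the JVM alive on their own.
ThreadFactory factory = new ThreadFactory() {
    private final AtomicInteger counter = new AtomicInteger();
    public Thread newThread(Runnable r) {
        Thread t = new Thread(r, "refresh-ahead-worker-" + counter.incrementAndGet());
        t.setDaemon(true);
        return t;
    }
};

// 4 worker threads; drop work beyond 10000 queued items; batches of up to 100.
ThreadedWorkQueue<Object> queue = new ThreadedWorkQueue<Object>(
        myDispatcher, 4, factory, 10000, 100);
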
Method Detail

offer

public void offer(W workUnit)
Offer a work unit to the queue. If the queue is full, this might push prior work units off of the work queue and drop them forever.

Parameters:
workUnit - work unit to enqueue

isAlive

public boolean isAlive()
Is this work queue still accepting work?

Returns:
true if still alive

getBacklogCount

public long getBacklogCount()
Get the current backlog count. An approximation, by necessity.

Returns:
count of items yet to be processed.

getOfferedCount

public int getOfferedCount()
Gets the offer counter: the cumulative number of work units offered to this queue.

Returns:
the offer counter

getDroppedCount

public int getDroppedCount()
Gets dropped counter.

Returns:
the dropped counter

getProcessedCount

public int getProcessedCount()
Gets processed count.

Returns:
the processed count
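
Together with getOfferedCount(), getDroppedCount() and getBacklogCount(), this counter can be polled for a rough picture of queue health. A minimal sketch, assuming a queue built as in the earlier example (the output format is arbitrary):

// Values are approximations, since the worker threads update the counters
// concurrently with this read.
System.out.printf("offered=%d dropped=%d processed=%d backlog=%d%n",
        queue.getOfferedCount(), queue.getDroppedCount(),
        queue.getProcessedCount(), queue.getBacklogCount());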

getDispatcher

public ThreadedWorkQueue.BatchWorker<W> getDispatcher()
Get the dispatcher being used for this queue.

Returns:
dispatcher

getBatchSize

public int getBatchSize()
Get the batch size.

Returns:
batch size

shutdown

public void shutdown()
Shut down this queue. Propagates an interrupt to any currently executing ThreadedWorkQueue.BatchWorker threads.
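
A minimal sketch of the teardown step, assuming (per the isAlive() description above) that the queue no longer accepts work once it has been shut down:

// Dispose of the queue when the owning component is torn down.
// shutdown() interrupts any BatchWorker threads that are still running.
if (queue.isAlive()) {
    queue.shutdown();
}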



Copyright 2001-2017, Terracotta, Inc.