.net threadpool workload

Optimal number of items to keep queued for the thread pool in .NET?


I am building a background processing engine which supports discarding both to-be-processed and is-being-processed items. This is for use in a WinForms application that will require heavy processing of some input elements, so I'm building a queue engine where I can enqueue workload items and get notified with the results when they're processed.

The question is, this queue will almost always contain a lot of items to begin with, and I thought that instead of just dumping everything into the threadpool, I'd place only the first N items into the threadpool and keep backfilling as they are processed. The reason I want to do this is that once I dump them into the threadpool, they will be processed, and even if they're flagged as discarded, they will still take up queue time.

With the backfill implementation I've made, I can remove items from the queue if they become discarded, and only put them into the queue when it's their turn, so to speak.

So the question is: how would I go about calculating this number N, the number of items to place into and keep in the thread pool queue?
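
Here's a rough sketch of the kind of backfill I mean (not my actual implementation; BackfillQueue and maxInFlight are just illustrative names): at most N items are handed to the ThreadPool at any time, and each completion pulls in the next pending item.

    using System;
    using System.Collections.Generic;
    using System.Threading;

    class BackfillQueue
    {
        private readonly Queue<Action> _pending = new Queue<Action>();
        private readonly object _sync = new object();
        private readonly int _maxInFlight;   // this is the "N" in question
        private int _inFlight;

        public BackfillQueue(int maxInFlight)
        {
            _maxInFlight = maxInFlight;
        }

        public void Enqueue(Action work)
        {
            lock (_sync)
            {
                _pending.Enqueue(work);
                TryDispatch();
            }
        }

        // Caller must hold _sync. Hands items to the ThreadPool until
        // N items are in flight or the pending queue is empty.
        private void TryDispatch()
        {
            while (_inFlight < _maxInFlight && _pending.Count > 0)
            {
                Action work = _pending.Dequeue();
                _inFlight++;
                ThreadPool.QueueUserWorkItem(_ =>
                {
                    try { work(); }
                    finally
                    {
                        lock (_sync)
                        {
                            _inFlight--;
                            TryDispatch();   // backfill with the next pending item
                        }
                    }
                });
            }
        }
    }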

Issues I've considered:

What do you think?

New: OK, due to one of the answers, I'll explain a bit more. Every item put into the queue is keyed by something unique. If I dump another item into the queue with the same key as an existing item, the old item is considered "discarded" and should be removed. If the old item is already being processed, an "IsDiscarded" property on the workload item is set to true, and the processing method is responsible for checking it. If it detects a discarded item, it should quit early and return no results.
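
For reference, the keyed-discard rule looks roughly like this (a simplified sketch, not my actual class; WorkItem and KeyedQueue are illustrative names, and removal of pending items is omitted):

    using System.Collections.Generic;

    class WorkItem
    {
        public string Key;
        public volatile bool IsDiscarded;   // set when a newer item with the same key arrives

        public object Process()
        {
            // The processing method is responsible for checking the flag
            // and quitting early with no results.
            if (IsDiscarded)
                return null;

            // ... lengthy processing (1-10 seconds) ...
            return new object();
        }
    }

    class KeyedQueue
    {
        private readonly Dictionary<string, WorkItem> _byKey = new Dictionary<string, WorkItem>();
        private readonly object _sync = new object();

        public void Enqueue(WorkItem item)
        {
            lock (_sync)
            {
                WorkItem old;
                if (_byKey.TryGetValue(item.Key, out old))
                    old.IsDiscarded = true;   // the old item is superseded
                _byKey[item.Key] = item;
            }
        }
    }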

Perhaps I should experiment a bit more, and try to just dump everything into the threadpool.

New question: Is there a limit to the number of items I can queue up? If not, then this would easily simplify my class a lot.

Note: When I say "lengthy processing", I mean on the order of 1-10 seconds. Is the threadpool even the best for this? I see notes all over the web saying "the processing should be quick", but what "quick" means is never specified. Is quick on the order of milliseconds here?


Solution

  • Do you know Ami Bar's Smart Thread Pool?

    It seems its implementation allows you to cancel an unprocessed item and dynamically increases the number of threads as required, up to a hard limit; I personally use 100 * Environment.ProcessorCount
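
    A rough sketch of what that could look like; I'm going from memory here, so the exact names (the Amib.Threading namespace, STPStartInfo, QueueWorkItem, IWorkItemResult.Cancel) should be checked against the Smart Thread Pool documentation:

        using System;
        using Amib.Threading;   // Ami Bar's Smart Thread Pool

        class Example
        {
            static object DoWork(object state)
            {
                // lengthy processing (1-10 seconds) goes here
                return null;
            }

            static void Main()
            {
                var startInfo = new STPStartInfo
                {
                    MaxWorkerThreads = 100 * Environment.ProcessorCount   // the hard limit mentioned above
                };
                var pool = new SmartThreadPool(startInfo);

                // Keep the handle so the item can be cancelled while it is
                // still waiting in the pool's queue.
                IWorkItemResult result = pool.QueueWorkItem(new WorkItemCallback(DoWork), "some input");

                // If the item is superseded before it starts, cancel it:
                result.Cancel();
            }
        }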