
Sunday, October 4, 2015

Throttles - Delay vs Semaphore

A while back I talked about how to await an interval with a simple Throttle class that I had made. This is an easy way to control how often you start an operation, but it does not limit how many operations are in flight at a given time.

Problem: You want to only make 5 requests per second to a remote service.

With the throttle class set to allow only one new call to start every 200 milliseconds, you will only be able to start 5 new requests per second. However, if those calls take longer than one second to complete, then during the next second you will have 10 requests in flight at the same time.

Solution: Set your throttle to 200 milliseconds and add a semaphore with a count of 5.

You can solve this problem by combining your throttle with a semaphore. This ensures that you only start a new operation at your scheduled frequency, and also that you never have more than a predetermined number in flight at the same time.

Please note that a semaphore alone would not solve the problem: if, in the same example, the calls complete in under 200 milliseconds, then more than 5 requests would start each second.

Below is a set of helper classes (and tests) to help extend the throttle class to support this functionality within a using block.

Interfaces

public interface IUsableSemaphore : IDisposable
{
    // Waits for both the throttle interval and a free semaphore slot,
    // returning a wrapper that releases the slot when disposed.
    Task<IUsableSemaphoreWrapper> WaitAsync();
}
 
public interface IUsableSemaphoreWrapper : IDisposable
{
    // How long the operation has held its semaphore slot.
    TimeSpan Elapsed { get; }
}
 
public interface IThrottle
{
    // Completes when the next operation is allowed to start.
    Task WaitAsync();
}

Sunday, December 15, 2013

Throttling Dataflow and the Task Parallel Library

The Task Parallel Library is an amazingly powerful and versatile library. However, knowledge of how dataflow blocks process their data is vital to using them correctly. Trying to link source and target blocks to each other without fully understanding them is like throwing a live grenade into your app domain; at some point it will tear it down!

I recently experienced a bug where linking two blocks together without managing their BoundedCapacity caused them to queue messages at an unsustainable rate, and eventually the dataflow consumed all of the available memory on the server. This could have been easily avoided by throttling the source and target blocks.

How do you throttle your source block based on your target block?

Once linked together, a source block will produce messages as fast as its target block can consume them. To prevent a source block from being too greedy, you want to restrict the bounded capacity for both it and its consumer. Even then, you still need to understand that setting a bounded capacity can cause message producers to either block or fail to post.
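As a minimal sketch of the idea above, assuming the System.Threading.Tasks.Dataflow package: both linked blocks get a BoundedCapacity, and the producer uses SendAsync to wait for room instead of letting Post silently fail. The block names and capacities here are illustrative, not from the original post.

```csharp
using System;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

class BoundedPipeline
{
    static async Task Main()
    {
        // Bound both blocks so neither can buffer unbounded work.
        var options = new ExecutionDataflowBlockOptions { BoundedCapacity = 5 };

        var source = new TransformBlock<int, int>(
            async x => { await Task.Delay(10); return x * 2; }, options);

        var target = new ActionBlock<int>(
            async x => { await Task.Delay(20); }, options);

        source.LinkTo(target, new DataflowLinkOptions { PropagateCompletion = true });

        for (int i = 0; i < 100; i++)
        {
            // Post returns false when the block is full;
            // SendAsync asynchronously waits for room instead.
            if (!source.Post(i))
                await source.SendAsync(i);
        }

        source.Complete();
        await target.Completion;
        Console.WriteLine("done");
    }
}
```

The key design point is that bounding only the source is not enough: once linked, a full target pushes back on the source, and the source's own bound then pushes back on the producer via SendAsync.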

I'm sorry, but this subject is very complex, and I think code will explain the details best! Below is a detailed set of tests to take you through all the basic scenarios for throttling a dataflow:
