
Wednesday, May 4, 2016

IResourceLoader: Balancing Semaphores

Recently I needed to balance getting resources from a restricted number of sources. So, for example...

I am getting resource R, and I have factories A, B, and C creating those Rs. Each of those factories has very limited capacity and can only create two Rs at a time. It is easy to put each factory behind a semaphore and limit how many threads can request resources from it at a time.

The challenge is evenly balancing the workload across all three factories. Also, please note that you can't just round-robin the semaphores, because there is no way to ensure that each operation will complete in the same amount of time.

To do this I created a generic IResourceLoader interface, and made two implementations: one to wrap a semaphore, and the other to wrap and balance a collection of IResourceLoaders. Below is the implementation, complete with unit tests; let's take a look!

Interface

public interface IResourceLoader<T>
{
    int Available { get; }
    int Count { get; }
    int MaxConcurrency { get; }
 
    Task<T> GetAsync(CancellationToken cancelToken = default(CancellationToken));
    bool TryGet(out Task<T> resource, CancellationToken cancelToken = default(CancellationToken));
}
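
The full implementations and unit tests follow in the original post. As a rough illustration of the first piece, a semaphore-backed loader might look something like the sketch below; the SemaphoreResourceLoader name, the factory delegate, and the exact meaning of the Available and Count properties are my own assumptions, not the post's actual code.

public class SemaphoreResourceLoader<T> : IResourceLoader<T>
{
    private readonly Func<CancellationToken, Task<T>> _factory;
    private readonly SemaphoreSlim _semaphore;
    private readonly int _maxConcurrency;
 
    private int _count;
 
    public SemaphoreResourceLoader(Func<CancellationToken, Task<T>> factory, int maxConcurrency)
    {
        _factory = factory;
        _maxConcurrency = maxConcurrency;
        _semaphore = new SemaphoreSlim(maxConcurrency, maxConcurrency);
    }
 
    // Free slots on the underlying semaphore right now (assumed semantics).
    public int Available { get { return _semaphore.CurrentCount; } }
 
    // Total number of resources requested through this loader (assumed semantics).
    public int Count { get { return _count; } }
 
    public int MaxConcurrency { get { return _maxConcurrency; } }
 
    public async Task<T> GetAsync(CancellationToken cancelToken = default(CancellationToken))
    {
        await _semaphore.WaitAsync(cancelToken).ConfigureAwait(false);
        try
        {
            Interlocked.Increment(ref _count);
            return await _factory(cancelToken).ConfigureAwait(false);
        }
        finally
        {
            _semaphore.Release();
        }
    }
 
    public bool TryGet(out Task<T> resource, CancellationToken cancelToken = default(CancellationToken))
    {
        // Best effort: only start the work if a slot appears to be free right now.
        if (_semaphore.CurrentCount == 0)
        {
            resource = null;
            return false;
        }
 
        resource = GetAsync(cancelToken);
        return true;
    }
}

The balancing loader can then hold a collection of these and hand each request to whichever child currently has the most availability.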

Friday, March 18, 2016

How to Release a Semaphore with a Using Block

I love that .NET has so many useful utilities available in the base framework. I often use SemaphoreSlim, and I love that it supports async/await. However, I don't like always having to create a try/finally block around every use to ensure that the Release method gets called.

Below is a simple little extension method that allows you to place the result of the semaphore wait into a using block, thus ensuring that Dispose will always release the lock.

SemaphoreSlim Extensions

public static class SemaphoreSlimExtensions
{
    public static async Task<IDisposable> UseWaitAsync(
        this SemaphoreSlim semaphore, 
        CancellationToken cancelToken = default(CancellationToken))
    {
        await semaphore.WaitAsync(cancelToken).ConfigureAwait(false);
        return new ReleaseWrapper(semaphore);
    }
 
    private class ReleaseWrapper : IDisposable
    {
        private readonly SemaphoreSlim _semaphore;
 
        private bool _isDisposed;
 
        public ReleaseWrapper(SemaphoreSlim semaphore)
        {
            _semaphore = semaphore;
        }
 
        public void Dispose()
        {
            if (_isDisposed)
                return;
 
            _semaphore.Release();
            _isDisposed = true;
        }
    }
}
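
Usage then looks something like this; DoWorkAsync is just a hypothetical stand-in for whatever work needs to run under the lock.

private static readonly SemaphoreSlim Semaphore = new SemaphoreSlim(1, 1);
 
public static async Task RunExclusiveAsync(CancellationToken cancelToken)
{
    // Disposing the wrapper releases the semaphore, even if DoWorkAsync throws.
    using (await Semaphore.UseWaitAsync(cancelToken))
    {
        await DoWorkAsync(cancelToken);
    }
}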

Monday, November 30, 2015

.NET Semaphore Slim that Supports Keys

While making a HUGE update to my CacheRepository project, I needed a way to have a dynamic number of semaphores that would lock on a specified cache key. SemaphoreSlim is great, but I needed a wrapper around it that allowed me to have one for each unique cache key being fetched.

The easiest solution would have been a concurrent dictionary of string to semaphore, but at high load that would keep growing, and I did not want to waste memory. Instead I created a class that does keep a dictionary of semaphores, but removes them from the dictionary and stores them in a queue for reuse once nothing is waiting on them.

Enough talking! Below is the code, and as always it comes with unit tests! :)
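
The full code and tests are in the original post. As a rough sketch of the idea described above (my own simplified version, not the actual KeyedSemaphoreSlim, and the LockAsync signature is an assumption), it could look something like this: one semaphore per active key, with idle semaphores pooled for reuse.

public class KeyedSemaphoreSketch
{
    private class Entry
    {
        public SemaphoreSlim Semaphore;
        public int WaiterCount;
    }
 
    private readonly object _lock = new object();
    private readonly Dictionary<string, Entry> _active = new Dictionary<string, Entry>();
    private readonly Queue<SemaphoreSlim> _pool = new Queue<SemaphoreSlim>();
 
    public async Task LockAsync(string key, Func<Task> action)
    {
        Entry entry;
        lock (_lock)
        {
            // Reuse the entry for this key, or pull a pooled semaphore, or create one.
            if (!_active.TryGetValue(key, out entry))
            {
                entry = new Entry
                {
                    Semaphore = _pool.Count > 0 ? _pool.Dequeue() : new SemaphoreSlim(1, 1)
                };
                _active[key] = entry;
            }
            entry.WaiterCount++;
        }
 
        await entry.Semaphore.WaitAsync().ConfigureAwait(false);
        try
        {
            await action().ConfigureAwait(false);
        }
        finally
        {
            entry.Semaphore.Release();
            lock (_lock)
            {
                // Once nothing is waiting on this key, recycle its semaphore.
                if (--entry.WaiterCount == 0)
                {
                    _active.Remove(key);
                    _pool.Enqueue(entry.Semaphore);
                }
            }
        }
    }
}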

Sunday, October 4, 2015

Throttles - Delay vs Semaphore

A while back I talked about how to await an interval with a simple Throttle class that I had made. This is a very easy way to control how often you start an operation, but it does not limit how many operations are in flight at a given time.

Problem: You want to only make 5 requests per second to a remote service.

With the throttle class set to only allow one new call to start every 200 milliseconds, you will only be able to start 5 new requests per second. However, if those calls take longer than two seconds to complete, you will end up with 10 requests in flight at the same time.

Solution: Set your throttle to 200 milliseconds and add a semaphore with a count of 5.

You can solve this problem by combining your throttle with a semaphore. This ensures that you only start a new operation at your scheduled frequency, and also that you never have more than a predetermined number in flight at the same time.

Please note that using only a semaphore would not solve the problem either: in the same example, if the calls complete in under 200 milliseconds, more than 5 requests would start each second.

Below is a set of helper classes (and tests) to help extend the throttle class to support this functionality within a using block.

Interfaces

public interface IUsableSemaphore : IDisposable
{
    Task<IUsableSemaphoreWrapper> WaitAsync();
}
 
public interface IUsableSemaphoreWrapper : IDisposable
{
    TimeSpan Elapsed { get; }
}
 
public interface IThrottle
{
    Task WaitAsync();
}
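
To give a feel for how these pieces fit together, here is a simplified sketch (my own, not the post's actual helper classes) that puts a SemaphoreSlim behind IUsableSemaphore and waits on an IThrottle before taking a slot; the Elapsed property is assumed to measure how long the slot was held.

public class ThrottledSemaphoreSketch : IUsableSemaphore
{
    private readonly IThrottle _throttle;
    private readonly SemaphoreSlim _semaphore;
 
    public ThrottledSemaphoreSketch(IThrottle throttle, int maxConcurrency)
    {
        _throttle = throttle;
        _semaphore = new SemaphoreSlim(maxConcurrency, maxConcurrency);
    }
 
    public async Task<IUsableSemaphoreWrapper> WaitAsync()
    {
        // Wait for both the schedule (throttle) and a free slot (semaphore).
        await _throttle.WaitAsync().ConfigureAwait(false);
        await _semaphore.WaitAsync().ConfigureAwait(false);
        return new Wrapper(_semaphore);
    }
 
    public void Dispose()
    {
        _semaphore.Dispose();
    }
 
    private class Wrapper : IUsableSemaphoreWrapper
    {
        private readonly SemaphoreSlim _semaphore;
        private readonly Stopwatch _stopwatch = Stopwatch.StartNew();
 
        private bool _isDisposed;
 
        public Wrapper(SemaphoreSlim semaphore)
        {
            _semaphore = semaphore;
        }
 
        // How long the caller held the semaphore slot (assumed semantics).
        public TimeSpan Elapsed { get { return _stopwatch.Elapsed; } }
 
        public void Dispose()
        {
            if (_isDisposed)
                return;
 
            _stopwatch.Stop();
            _semaphore.Release();
            _isDisposed = true;
        }
    }
}

Calling code can then wrap each request in a using block around the result of WaitAsync and never exceed either the start rate or the in-flight limit.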
