Saturday, December 31, 2016

2016 Retrospective

.NET

It has been a great year for .NET development! Visual Studio Community is fully featured, .NET Core has arrived, and everything is open source. I am really enjoying working with .NET Core, and I simply cannot wait to get deeper into the Linux world.

Blog

I finally had to scale back from three posts per month to two. Unfortunately, writing quality blog posts takes time, and that was not something I had in great abundance this year. Fortunately, I do think that the majority of this year's posts were very high quality, especially the most recent ones. I have been working a lot on performance optimization, and have really enjoyed profiling and digging deep into code to see exactly what it is doing and why.

Tact.NET

I am very happy to have launched Tact.NET this year! I have always really enjoyed creating frameworks, so rather than continue to write one-off posts on this blog, I decided to put all of my extracurricular work together in one repository. I am really enjoying making Tact, and I have every intention of continuing to grow it.

QQ Cast

Wow, the QQ Cast is back! We took a hiatus for the second half of 2015, but in 2016 we recorded 43 episodes. Next week will actually be our 100th episode, so be sure to check it out!

Happy new year,
Tom

Friday, December 30, 2016

Object Pooling and Memory Streams

The theme of this year, which I will talk about in my 2016 retrospective, has been optimization. It's been a fun journey, and I have really enjoyed getting down and dirty with garbage collection profiling, spin waits, and aggressive inlining.

I want to end this year on a fun note: object pooling.

A great use case for this would be making HTTP requests with serialized objects. When you serialize an object and then place it in an HttpContent object, you are probably creating several buffers (byte arrays) each time. For example, if you are using Newtonsoft to serialize an object and then adding the result to a StringContent object for your request, then you are probably using more memory than you need. But that is getting ahead of ourselves...
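
For illustration, the allocation-heavy pattern looks something like this (just a sketch of the common approach; JsonHttpExample and PostJsonAsync are illustrative names, not code from this blog or from Tact.NET):

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public static class JsonHttpExample
{
    // Every call allocates a new JSON string, and StringContent copies it
    // into a new byte[] - all of which becomes garbage after the request.
    public static Task<HttpResponseMessage> PostJsonAsync<T>(HttpClient client, string url, T dto)
    {
        var json = JsonConvert.SerializeObject(dto);
        var content = new StringContent(json, Encoding.UTF8, "application/json");
        return client.PostAsync(url, content);
    }
}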

Come back next week for a blog post about efficient JSON Content serialization!

For now, let's focus on building an object pool. Really, all that we need is a preallocated array to store unused objects in, and a super efficient, thread-safe way to get and return those objects.
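
To make that concrete, here is a minimal sketch of the idea using a preallocated array and interlocked operations (this is not the actual Tact.NET ObjectPool implementation; SimpleObjectPool is just an illustrative name):

using System;
using System.Threading;

public class SimpleObjectPool<T> where T : class
{
    // Preallocated slots for unused objects.
    private readonly T[] _pool;
    private readonly Func<T> _factory;

    public SimpleObjectPool(int maxSize, Func<T> factory)
    {
        _pool = new T[maxSize];
        _factory = factory;
    }

    public T Acquire()
    {
        // Atomically swap out the first pooled object we find.
        for (var i = 0; i < _pool.Length; i++)
        {
            var item = Interlocked.Exchange(ref _pool[i], null);
            if (item != null)
                return item;
        }

        // The pool is empty, so fall back to creating a new instance.
        return _factory();
    }

    public void Release(T item)
    {
        // Put the object back in the first empty slot; if the pool is full, just drop it.
        for (var i = 0; i < _pool.Length; i++)
        {
            if (Interlocked.CompareExchange(ref _pool[i], item, null) == null)
                return;
        }
    }
}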

How does pooling memory streams help us?

When you create a MemoryStream, it is backed by a byte array. As that byte array fills up, the memory stream resizes it by allocating a new, larger array and then copying your bytes into it. This is inefficient not only because it creates new objects and throws the old ones away, but also because it has to do the legwork of copying the contents each time it resizes.
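
To see that growth in action, here is a small sketch; the capacities shown assume the default growth policy (a 256 byte minimum buffer that doubles as the stream grows), so treat the exact numbers as illustrative:

using System;
using System.IO;

class MemoryStreamGrowthDemo
{
    static void Main()
    {
        using (var stream = new MemoryStream())
        {
            Console.WriteLine(stream.Capacity);  // 0 - no buffer has been allocated yet

            stream.Write(new byte[5], 0, 5);
            Console.WriteLine(stream.Capacity);  // 256 - the minimum initial buffer

            stream.Write(new byte[300], 0, 300);
            Console.WriteLine(stream.Capacity);  // 512 - a new array was allocated and the old bytes copied over
        }
    }
}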

How can we reuse memory streams? Just set the length to zero!

Internally this just resets the stream's length and position, but the underlying buffer is preserved for future use. Thus, by putting memory streams into an object pool, we can drastically increase our efficiency.

Here is a demo of using the Tact.NET ObjectPool to pool MemoryStreams...

[Fact]
public void MemoryStreamPoolDemo()
{
    using (var pool = new ObjectPool<MemoryStream>(100, () => new MemoryStream()))
    {
        var memoryStream1 = pool.Acquire();
 
        memoryStream1.SetLength(0);
        Assert.Equal(0, memoryStream1.Capacity); // a brand new stream has not allocated a buffer yet
 
        memoryStream1.Write(new byte[] {1, 0, 1, 0, 1}, 0, 5);
 
        var array1 = memoryStream1.ToArray();
        Assert.Equal(5, array1.Length);
        Assert.Equal(1, array1.First());
        Assert.Equal(1, array1.Last());
 
        pool.Release(memoryStream1);
 
        var memoryStream2 = pool.Acquire();
        Assert.Same(memoryStream1, memoryStream2); // the pool returned the same instance we released
 
        memoryStream2.SetLength(0);
        Assert.Equal(256, memoryStream2.Capacity); // the 256 byte buffer from the first use was preserved
 
        memoryStream2.Write(new byte[] { 0, 1, 0 }, 0, 3);
 
        var array2 = memoryStream2.ToArray();
        Assert.Equal(3, array2.Length);
        Assert.Equal(0, array2.First());
        Assert.Equal(0, array2.Last());
    }
}

Enjoy,
Tom
