Tuesday, February 28, 2017

WebSocket Support for .NET Core

Full WebSocket support is coming with .NET Standard 2.0, which has now been delayed until Q3. In the meantime, there are still a few options to work with...

If you want to use Microsoft.AspNetCore.WebSockets.Server, I have added a middleware wrapper that feels a lot more like Fleck:

public void Configure(IApplicationBuilder app)
{
    app.UseWebSockets();
    app.UseWebSocketHandler("test", connection =>
    {
        // Register your listeners here
        connection.OnMessage = m =>
        {
            if (m == "hi")
                connection.SendAsync("bye");
        };
    });
}
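
To give a sense of what that wrapper does, here is a rough sketch of how such a middleware extension could be implemented. Only UseWebSocketHandler, OnMessage, and SendAsync come from the usage above; everything else (class names, buffer size, path matching) is an illustrative assumption, not the actual implementation:

using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

public class WebSocketConnection
{
    private readonly WebSocket _socket;

    public WebSocketConnection(WebSocket socket)
    {
        _socket = socket;
    }

    // Callback invoked for every text message received from the client.
    public Action<string> OnMessage { get; set; }

    public Task SendAsync(string message)
    {
        var bytes = Encoding.UTF8.GetBytes(message);
        return _socket.SendAsync(new ArraySegment<byte>(bytes),
            WebSocketMessageType.Text, true, CancellationToken.None);
    }

    // Pump messages until the client closes the socket.
    public async Task ReceiveLoopAsync()
    {
        var buffer = new byte[4096];
        while (_socket.State == WebSocketState.Open)
        {
            var result = await _socket.ReceiveAsync(
                new ArraySegment<byte>(buffer), CancellationToken.None);

            if (result.MessageType == WebSocketMessageType.Close)
                break;

            var message = Encoding.UTF8.GetString(buffer, 0, result.Count);
            OnMessage?.Invoke(message);
        }
    }
}

public static class WebSocketHandlerExtensions
{
    public static IApplicationBuilder UseWebSocketHandler(
        this IApplicationBuilder app, string path, Action<WebSocketConnection> configure)
    {
        return app.Use(async (context, next) =>
        {
            if (context.WebSockets.IsWebSocketRequest &&
                string.Equals(context.Request.Path.Value, "/" + path, StringComparison.OrdinalIgnoreCase))
            {
                // Accept the socket, let the caller register listeners, then pump messages.
                var socket = await context.WebSockets.AcceptWebSocketAsync();
                var connection = new WebSocketConnection(socket);
                configure(connection);
                await connection.ReceiveLoopAsync();
            }
            else
            {
                await next();
            }
        });
    }
}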

Enjoy,
Tom

Sunday, February 26, 2017

Run .NET Core xUnit tests from ReSharper in VS2015

Visual Studio 2017 is only a few days away from release, so this may be a little late, but I finally figured out how to run .NET Core xUnit tests from ReSharper in VS2015! Good news: if you can't upgrade to VS2017 right away, then at least you can still run your unit tests!

Just make sure that the following is included in your project.json file (with the appropriate runtime):

{
  "testRunner": "xunit",
 
  "dependencies": {
    "dotnet-test-xunit": "2.2.0-preview2-build1029",
    "xunit": "2.2.0"
  },
 
  "frameworks": {
    "netcoreapp1.0": {
      "imports": "dnxcore50"
    }
  },
 
  "runtimes": {
    "win10-x64": {}
  }
}
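
With that configuration in place, a plain xUnit test should be discovered and runnable from ReSharper just like before. For example (an illustrative test, not from any particular project):

using Xunit;

public class SampleTests
{
    [Fact]
    public void Addition_Works()
    {
        Assert.Equal(4, 2 + 2);
    }
}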

Enjoy,
Tom

Tuesday, January 31, 2017

.NET Standard Adoption as of January 2017

Updated 2/16 to include Elasticsearch

As should be obvious from my recent blog posts, I have really been enjoying working with .NET Core. Clearly I am not alone, as a significant number of libraries have been ported over to .NET Standard.

Below is a list of libraries that have added support for .NET Standard, meaning that they should be able to run cross-platform on both Windows and Linux.

While I have not yet had the opportunity to try all of the libraries listed below, I have had great luck with the ones that I have tested, and I am simply ecstatic to see this list growing as fast as it is.

Technology        | NuGet Package                            | .NET Standard Support
------------------|------------------------------------------|----------------------
Autofac           | Autofac                                  | Released for 1.1
Cassandra         | DataStax C# Driver for Apache Cassandra  | Released for 1.5
Couchbase         | Couchbase SDK 2.0                        | Beta for 1.5
Elasticsearch     | Elasticsearch.Net                        | Released for 1.3
Kafka             | Confluent.Kafka                          | Preview for 1.3
log4net           | Apache log4net                           | Released for 1.3
MongoDB           | MongoDB.Driver                           | Released for 1.4
NLog              | NLog                                     | Beta for 1.3
RabbitMQ          | RabbitMQ.Client                          | Released for 1.5
RavenDB           | RavenDB Client                           | Released for 1.3
Redis             | StackExchange.Redis                      | Released for 1.5
Sqlite            | Microsoft.EntityFrameworkCore.Sqlite     | Released for 1.3
WebSocket Client  | WebSocket4Net                            | Released for 1.3

How have these libraries been working out for you? Is there a better option than what I have listed? Please leave a comment and let me know!

Enjoy,
Tom

Sunday, January 29, 2017

.NET JsonContent for HttpClient

.NET already comes with a nice collection of HttpContent serializers, but it lacks a JsonContent type. A common solution is to serialize the payload to a JSON string and then insert that into an instance of StringContent. However, this means that you need to remember to set your headers, and it is a little inefficient because it creates multiple strings and buffers for each payload.
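
For reference, the usual pattern looks something like this (a minimal sketch; the helper name and payload parameter are just for illustration):

using System.Net.Http;
using System.Text;
using Newtonsoft.Json;

public static class StringContentExample
{
    // Serialize to a string, then wrap it in StringContent: this allocates the
    // JSON string plus an internal byte[] copy for every single request.
    public static StringContent Create(object payload)
    {
        var json = JsonConvert.SerializeObject(payload);
        return new StringContent(json, Encoding.UTF8, "application/json");
    }
}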

I have created a simple implementation of JsonContent that uses Json.NET and pooled memory streams. The result is between 2% and 10% faster, and it causes ~50% fewer garbage collections.

Check out the implementation in Tact.NET:
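
The full implementation is in the Tact.NET repository. As a rough sketch of the idea only (this is not the Tact.NET source, and for brevity it writes straight to the request stream instead of using pooled memory streams):

using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class JsonContent : HttpContent
{
    private static readonly JsonSerializer Serializer = JsonSerializer.CreateDefault();
    private static readonly Encoding Utf8NoBom = new UTF8Encoding(false);

    private readonly object _value;

    public JsonContent(object value)
    {
        _value = value;

        // The content type is set once here, so callers no longer have to remember it.
        Headers.ContentType = new MediaTypeHeaderValue("application/json") { CharSet = "utf-8" };
    }

    protected override Task SerializeToStreamAsync(Stream stream, TransportContext context)
    {
        // Serialize directly to the target stream instead of building an
        // intermediate JSON string like the StringContent approach does.
        using (var writer = new StreamWriter(stream, Utf8NoBom, 1024, leaveOpen: true))
        using (var jsonWriter = new JsonTextWriter(writer))
        {
            Serializer.Serialize(jsonWriter, _value);
            jsonWriter.Flush();
        }

        return Task.CompletedTask;
    }

    protected override bool TryComputeLength(out long length)
    {
        // The length is unknown until serialization, so let HttpClient chunk the request.
        length = -1;
        return false;
    }
}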

Enjoy,
Tom

Saturday, December 31, 2016

2016 Retrospective

.NET

It has been a great year for .NET development! Visual Studio Community is fully featured, .NET Core has arrived, and everything is open source. Regarding .NET Core, I am really enjoying working with it, and I simply cannot wait to get deeper into the Linux world.

Blog

I finally had to downgrade from three posts per month to only two posts per month. Unfortunately, writing quality blog posts takes time, and that was not something that I had in great abundance this year. Fortunately, I do think that the majority of posts this year were very high quality, especially the most recent ones. I have been working a lot on performance optimization, and I have really been enjoying profiling and digging deep into code to see exactly what it is doing and why.

Tact.NET

I am very happy to have launched Tact.NET this year! I have always really enjoyed creating frameworks, so rather than continue to write one-off posts on this blog, I decided to put all of my extracurricular work together under one repository. I am really enjoying making Tact, and I have every intention of continuing to grow it.

QQ Cast

Wow, the QQ Cast is back! We took a hiatus for the second half of 2015, but in 2016 we recorded 43 podcasts. Next week is actually going to be our 100th episode, so be sure to check it out!

Happy new year,
Tom

Friday, December 30, 2016

Object Pooling and Memory Streams

The theme of this year, which I will talk about in my 2016 retrospective, has been optimization. It's been a fun journey, and I have really enjoyed getting down and dirty with profiling garbage collection, using spin waits, and aggressive inlining.

I want to end this year on a fun note: object pooling.

A great use case for this would be making HTTP requests with serialized objects. When you serialize an object and then place it in an HttpContent object, you are probably creating several buffers (byte arrays) each time. For example, if you are using Newtonsoft to serialize an object and then adding the result to a StringContent object for your request, you are probably using more memory than you need. But that is getting ahead of ourselves...

Come back next week for a blog post about efficient JSON Content serialization!

For now, let's focus on building an object pool. Really, all we need is a preallocated array to store unused objects in, plus a super efficient, thread-safe data structure to get and set (pool) those objects.
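
To make that concrete, here is a minimal sketch of such a pool. It mirrors the Acquire/Release shape used in the demo further down, but the implementation is illustrative only: it leans on ConcurrentBag rather than the preallocated array and lock-free structure that the real Tact.NET ObjectPool uses.

using System;
using System.Collections.Concurrent;

public sealed class SimpleObjectPool<T> : IDisposable where T : class
{
    private readonly ConcurrentBag<T> _items = new ConcurrentBag<T>();
    private readonly Func<T> _factory;
    private readonly int _maxSize;

    public SimpleObjectPool(int maxSize, Func<T> factory)
    {
        _maxSize = maxSize;
        _factory = factory;
    }

    public T Acquire()
    {
        // Reuse a pooled instance when one is available; otherwise create a new one.
        T item;
        return _items.TryTake(out item) ? item : _factory();
    }

    public void Release(T item)
    {
        // Keep the instance for reuse unless the pool is already full.
        if (_items.Count < _maxSize)
            _items.Add(item);
    }

    public void Dispose()
    {
        // Dispose any pooled instances that are themselves disposable (e.g. MemoryStream).
        T item;
        while (_items.TryTake(out item))
            (item as IDisposable)?.Dispose();
    }
}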

How does pooling memory streams help us?

When you create a MemoryStream, it creates a byte array. As that byte array grows, the memory stream resizes it by allocating a new, larger array and then copying your bytes into it. This is inefficient not only because it creates new objects and throws the old ones away, but also because it has to do the legwork of copying the contents each time it resizes.

How can we reuse memory streams? Just set the length to zero!

Internally this just resets the stream's length and position; the underlying buffer is preserved for future use. Thus, by putting memory streams into an object pool, we can drastically increase our efficiency.

Here is a demo of using the Tact.NET ObjectPool to pool MemoryStreams...

[Fact]
public void MemoryStreamPoolDemo()
{
    using (var pool = new ObjectPool<MemoryStream>(100, () => new MemoryStream()))
    {
        var memoryStream1 = pool.Acquire();
 
        memoryStream1.SetLength(0);
        Assert.Equal(0, memoryStream1.Capacity);
 
        memoryStream1.Write(new byte[] {1, 0, 1, 0, 1}, 0, 5);
 
        var array1 = memoryStream1.ToArray();
        Assert.Equal(5, array1.Length);
        Assert.Equal(1, array1.First());
        Assert.Equal(1, array1.Last());
 
        pool.Release(memoryStream1);
 
        var memoryStream2 = pool.Acquire();
        Assert.Same(memoryStream1, memoryStream2);
 
        memoryStream2.SetLength(0);
        Assert.Equal(256, memoryStream2.Capacity);
 
        memoryStream2.Write(new byte[] { 0, 1, 0 }, 0, 3);
 
        var array2 = memoryStream2.ToArray();
        Assert.Equal(3, array2.Length);
        Assert.Equal(0, array2.First());
        Assert.Equal(0, array2.Last());
    }
}

Enjoy,
Tom

Sunday, November 27, 2016

The Performance Cost of Boxing in .NET

I recently had to do some performance optimizations against a sorted dictionary that yielded some interesting results...

Background: I tend to use Tuples a lot, simply because they are easy to use and normally quite efficient. Please remember that Tuples are classes, not structs, a decision made when they were introduced back in .NET 4.0.

Problem: A struct decreased performance!

I had a SortedDictionary that was using a Tuple as a key, so I thought, "hey, I'll just change that Tuple to a struct and reduce the memory usage." Bad news: that made performance WORSE!

Why would using a struct make performance worse? It's actually quite simple and obvious when you think about it: the default comparisons were repeatedly boxing the struct, thus allocating more memory on the heap and triggering more garbage collections.

Solution: Use a struct with an IComparer.

I then created a custom struct and used that; it was much faster, but it was still causing boxing because of the non-generic IComparable interface. So finally I added a generic IComparer and passed that into my dictionary constructor; my dictionary then ran fast and efficiently, causing a total of ZERO garbage collections!

See for yourself:

The Moral of the Story

Try to be aware of what default implementations are doing, and always remember that boxing to object can add up fast. Also, pay attention to the Visual Studio Diagnostic Tools window; it can be very informative!

Here is how many lines of code it took to achieve a 5x performance increase:

private struct MyStruct
{
    public MyStruct(int i, string s) { I = i; S = s; }
    public readonly int I;
    public readonly string S;
}
 
private class MyStructComparer : IComparer<MyStruct>
{
    public int Compare(MyStruct x, MyStruct y)
    {
        var c = x.I.CompareTo(y.I);
        return c != 0 ? c : StringComparer.Ordinal.Compare(x.S, y.S);
    }
}
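
For context, this is how the comparer ends up being used: it is passed to the SortedDictionary constructor so that key comparisons never fall back to the boxing, non-generic path (an illustrative snippet, not the original benchmark):

// Passing the generic comparer into the constructor keeps every comparison
// on the strongly typed, allocation-free path.
var dictionary = new SortedDictionary<MyStruct, string>(new MyStructComparer());
dictionary[new MyStruct(1, "a")] = "first";
dictionary[new MyStruct(2, "b")] = "second";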

Test Program

I have written some detailed comments in the Main function about what each test is doing and how it will affect performance. Let's take a look...
