Friday, December 27, 2013

Understanding Unity Lifetime Managers

I absolutely love inversion of control; it is a best practice that I encourage developers to use in every project that they work on. However, like any great tool, dependency injection can cause serious problems if you do not fully understand it, lifetime management in particular.

As a developer you need to be cognizant of how many instances of your classes the container is constructing. You need to know when a singleton is going to consume a transient or non-thread-safe resource. If you are not careful you can leak memory, share unsafe resources, or just make your garbage collector thrash.

Here is a series of examples showing how to use lifetime managers in my favorite dependency injection container, Microsoft Unity.

  1. ResolveType
  2. Default Lifetime Manager
  3. TransientLifetimeManager
  4. RegisterInstance
  5. ContainerControlledLifetimeManager
  6. ExternallyControlledLifetimeManager
  7. ContainerControlledLifetimeManager with Multiple Keys
  8. RegisterType for Multiple Interfaces
  9. ContainerControlledLifetimeManager with Multiple Interfaces
  10. TransientLifetimeManager with Multiple Interfaces
  11. RegisterInstance Multiple Times
  12. RegisterInstance Multiple Times with External Lifetime Manager
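For a quick primer before those examples, here is a minimal sketch of the two registrations you will see most often (IFoo and Foo are illustrative placeholder types, not from the examples above):

var container = new UnityContainer();

// TransientLifetimeManager is the default: every Resolve builds a new instance.
container.RegisterType<IFoo, Foo>(new TransientLifetimeManager());
var a = container.Resolve<IFoo>();
var b = container.Resolve<IFoo>();
// a and b are different instances.

// ContainerControlledLifetimeManager: one instance is shared
// for the lifetime of the container.
container.RegisterType<IFoo, Foo>("shared", new ContainerControlledLifetimeManager());
var c = container.Resolve<IFoo>("shared");
var d = container.Resolve<IFoo>("shared");
// c and d are the same instance.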

Wednesday, December 18, 2013

Injectable Dataflow Blocks

I really enjoy working with Dataflows, but I always want to resolve my blocks with dependency injection. Thus I have created some abstract wrapper classes around the sealed ActionBlock and TransformBlock classes. This way you can put your logic into the superclass and inject its dependencies via constructor injection. Additionally, the action method is public, making it even easier to test your code!

Update: I refactored to ditch the constructor parameters in favor of a new abstract BlockOptions.
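As a rough sketch of the core idea (type and member names here are illustrative, and note that the refactor mentioned above replaces the constructor parameter with an abstract BlockOptions), assuming the System.Threading.Tasks.Dataflow namespace:

public abstract class InjectableActionBlock<TInput>
{
    private readonly ActionBlock<TInput> _block;

    protected InjectableActionBlock(ExecutionDataflowBlockOptions options)
    {
        // The inner block forwards every message to the subclass's logic.
        _block = new ActionBlock<TInput>(Act, options);
    }

    public ITargetBlock<TInput> Target
    {
        get { return _block; }
    }

    public Task Completion
    {
        get { return _block.Completion; }
    }

    public void Complete()
    {
        _block.Complete();
    }

    // Public so unit tests can invoke the logic directly,
    // without going through the dataflow machinery.
    public abstract void Act(TInput input);
}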

Sunday, December 15, 2013

Throttling Dataflow and the Task Parallel Library

The Task Parallel Library is an amazingly powerful and versatile library. However, knowledge of how dataflow blocks process their data is vital to using them correctly. Trying to link source and target blocks to each other without fully understanding them is like throwing a live grenade into your app domain; at some point it will tear it down!

I recently experienced a bug where linking two blocks together without managing their bounded capacity caused them to queue actions at an unsustainable rate; eventually the dataflow ate all of the available memory on the server. This could have been easily avoided by throttling the source and target blocks.

How do you throttle your source block based on your target block?

Once linked together, a source block will produce messages as fast as its target block can consume them. To prevent a source block from being too greedy, you want to restrict the bounded capacity for both it and its consumer. Even then, you still need to understand that setting a bounded capacity could cause message producers to either block or fail to post.

...I'm sorry, this subject is very complex, but I think code will explain these details best! Below is a detailed set of tests to take you through all the basic scenarios for throttling a dataflow:
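Before diving into the tests, here is a minimal sketch of the core mechanism, assuming the System.Threading.Tasks.Dataflow namespace (the delegates and capacities are illustrative):

// Restrict the bounded capacity of both blocks so that neither
// can greedily buffer an unlimited number of messages.
var transformBlock = new TransformBlock<int, int>(
    i => i * 2,
    new ExecutionDataflowBlockOptions { BoundedCapacity = 10 });

var actionBlock = new ActionBlock<int>(
    i => Console.WriteLine(i),
    new ExecutionDataflowBlockOptions { BoundedCapacity = 10 });

transformBlock.LinkTo(actionBlock);

// Post returns false immediately if the block is at capacity...
bool accepted = transformBlock.Post(1);

// ...whereas SendAsync returns a task that will wait for room.
transformBlock.SendAsync(2).Wait();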

Saturday, December 7, 2013

ConcurrentDictionary.GetOrAdd and Thread Safety

.NET 4.0 added the awesome System.Collections.Concurrent namespace, which includes the very useful ConcurrentDictionary class. While the ConcurrentDictionary is thread safe, it can experience problems with adding values during high concurrency...

ConcurrentDictionary.GetOrAdd may invoke the valueFactory multiple times per key.

This behavior only happens under high load, and even if the valueFactory does get invoked multiple times the dictionary entry will only ever be set once. Normally this is not much of a problem. However, if you are using this dictionary to store large or expensive objects (such as unmanaged resources or database connections), then accidentally instantiating several of them could be a real problem for your application.

Don't worry, there is a very simple solution to avoid this problem: just create Lazy wrappers for your expensive objects. That way it will not matter how many times the valueFactory is called, because only one instance of the resource itself will ever actually be accessed and instantiated.
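Here is a minimal sketch of the pattern (CreateConnection is a hypothetical factory method standing in for your expensive initialization):

var connections = new ConcurrentDictionary<string, Lazy<IDbConnection>>();

// Under contention GetOrAdd may run this valueFactory more than once,
// but only the cheap Lazy wrapper gets duplicated; Lazy<T> guarantees
// that CreateConnection executes at most once for the winning entry.
IDbConnection connection = connections
    .GetOrAdd("primary", key => new Lazy<IDbConnection>(() => CreateConnection(key)))
    .Value;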

Friday, November 29, 2013

QQ-Cast and the QQ-Review

This blog is devoted (at least primarily) to programming and software development. However, believe it or not, I do enjoy things other than C# and JavaScript! I am a co-host on the QQ-Cast, a nerd and video game podcast, and I also occasionally write video game reviews.

www.QQ-Cast.com

Hope you enjoy!

Happy Thanksgiving,
Tom

Sunday, November 24, 2013

Generic Enum Attribute Caching

Attributes are wonderful for decorating your Enums with additional markup and metadata. However, looking up attributes via reflection is not always a fast operation. Additionally, no one likes typing that reflection code over and over again.

Well, not to worry: just use the following extension methods to cache your Enum attribute lookups and increase your application's performance! But how much faster is this? Good question...

Iterations   Average Elapsed Ticks      Difference
             No Cache     With Cache
2            1182.00      3934.00       4x Slower
10           20.10        7.10          3x Faster
100          13.07        1.37          10x Faster
1,000        13.27        1.40          10x Faster
10,000       13.56        1.45          10x Faster
100,000      13.02        1.33          10x Faster
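The core idea behind the extension methods looks something like this minimal sketch (the cache key shape and method name are illustrative, not necessarily the original API), assuming System.Collections.Concurrent, System.Linq, and System.Reflection:

public static class EnumExtensions
{
    private static readonly ConcurrentDictionary<Tuple<Type, string, Type>, Attribute> Cache
        = new ConcurrentDictionary<Tuple<Type, string, Type>, Attribute>();

    public static TAttribute GetAttribute<TAttribute>(this Enum value)
        where TAttribute : Attribute
    {
        // Assumes a single named enum value (not a combined Flags value).
        var key = Tuple.Create(value.GetType(), value.ToString(), typeof(TAttribute));

        return (TAttribute)Cache.GetOrAdd(key, k => k.Item1
            // The reflection cost is only paid on the first lookup per key.
            .GetField(k.Item2)
            .GetCustomAttributes(k.Item3, false)
            .Cast<Attribute>()
            .FirstOrDefault());
    }
}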

Monday, November 18, 2013

XUnit.PhantomQ v.1.2 Released

Want to run client side QUnit tests from Visual Studio or your build server? Now it is easier than ever; just grab the newly updated XUnit.PhantomQ v1.2 from NuGet!

XUnit.PhantomQ will allow you to execute your QUnit tests as XUnit tests. It supports both library and web projects, and features the ability to easily specify test files and their dependencies by real relative path from the root of your project.

XUnit.PhantomQ on NuGet
XUnit.PhantomQ Source on GitHub

Change Log for v1.2

My thanks to James M Greene and the other authors of the PhantomJS Runner; their work served as the model for this version's improved test result information.

  • Significantly improved test result information and error details.
  • Added console.log support.
  • Added test timeout configuration support.
  • Added QUnit module support.
  • Added QUnit result details to QUnitTest.Context.

Saturday, November 16, 2013

jQuery Mobile: Touch Events Only

jQuery Mobile touch events are awesome.

In a recent project I did not need ALL of the features that the jQuery Mobile framework had to offer; I only needed the excellent touch events. While jQuery Mobile does not offer a custom feature build that includes only the touch system, it is actually quite easy to create your own build.

Just download the following files and include them in your project, but be sure to delete the define method wrappers (the first and last lines of each file), as you do not need them with the complete jQuery build.

  1. jquery.mobile.ns.js
  2. jquery.mobile.vmouse.js
  3. jquery.mobile.support.touch.js
  4. touch.js

...that's it, you can now use only jQuery Mobile's touch events!

Saturday, October 26, 2013

Bootstrap 3, LESS, Bundling, and ASP.NET MVC

Until Twitter Bootstrap v3, I would have recommended that you use dotLess to compile and bundle your LESS files. However, it is now a known issue that the current build of dotLess does not support Bootstrap 3, or more specifically that it does not support LESS 1.4; and worse yet, there is no fix in sight.

So, how can you use Bootstrap LESS with MVC?

I recommend using BundleTransformer. It is an amazingly feature rich set of extensions for System.Web.Optimization. The BundleTransformer.Less extension provides easy to use LESS transformations (already up to LESS 1.5) and bundling support that wires up straight into your pre-existing BundleCollection configuration. For more information about everything that BundleTransformer has to offer, check out this article.

Now here is how you setup Bootstrap 3 and BundleTransformer for ASP.NET:

Required NuGet Packages

  1. Twitter.Bootstrap.Less
  2. BundleTransformer.Less
  3. BundleTransformer.MicrosoftAjax
  4. JavaScriptEngineSwitcher.Msie
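With those packages installed, the bundle registration might look something like the following sketch (CssTransformer and NullOrderer come from BundleTransformer.Core, and the file paths are illustrative):

public static void RegisterBundles(BundleCollection bundles)
{
    var bootstrapStyles = new Bundle("~/Content/bootstrap")
        .Include("~/Content/bootstrap/bootstrap.less");

    // BundleTransformer compiles the LESS and then minifies the output.
    bootstrapStyles.Transforms.Add(new CssTransformer());

    // NullOrderer preserves the include order so LESS imports resolve.
    bootstrapStyles.Orderer = new NullOrderer();

    bundles.Add(bootstrapStyles);
}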

Sunday, October 20, 2013

Unit Testing and Dependency Injection, with xUnit InlineData and Unity

Inversion of control is great because it makes your code more testable; but you usually still have to write tests for each implementation of your interfaces. So what if your unit testing framework could just work directly with your container to make testing even easier? Well, xUnit can!

Below we define a custom data source for our xUnit theories by extending the InlineDataAttribute. This allows our data driven unit tests to resolve types via a container, and then inject those resolved objects straight into our unit tests.

Bottom line: This allows us to test more with less code!

The rest of this post is very code heavy, so I strongly recommend that you start out by taking a look at sections 1 and 2 to get an idea of what we are trying to accomplish. :)

  1. Example Interfaces and Classes
  2. Example Unit Tests
  3. IocInlineDataResolver
  4. UnityInlineDataAttribute
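As a taste of the end result, a theory using the attribute might look something like this hypothetical sketch (the exact signature of UnityInlineDataAttribute may differ):

public class CalculatorTests
{
    // The attribute resolves ICalculator from the container and
    // injects the instance straight into the theory.
    [Theory, UnityInlineData(typeof(ICalculator))]
    public void AddsTwoNumbers(ICalculator calculator)
    {
        Assert.Equal(4, calculator.Add(2, 2));
    }
}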

Friday, October 18, 2013

Check Properties of a Dynamic Object in .NET

How can you avoid a RuntimeBinderException when working with dynamics?

In JavaScript, checking if an object implements a property is easy; so why can't it be that easy to check dynamics in C#? Well, it sort of is!* If you are using an ExpandoObject, you need only cast it to an IDictionary<string, object> and check whether it contains the desired key.

* Offer only valid with ExpandoObject. **
** See sample code for participating interfaces.***
*** Visit your local Visual Studio installation for details.
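Here is a minimal sketch of the ExpandoObject case:

dynamic expando = new ExpandoObject();
expando.Name = "Tom";

// ExpandoObject implements IDictionary<string, object>, so a cast
// lets you probe for members without risking a RuntimeBinderException.
var dictionary = (IDictionary<string, object>)expando;

bool hasName = dictionary.ContainsKey("Name"); // true
bool hasAge = dictionary.ContainsKey("Age");   // false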

Sunday, October 13, 2013

Unshelve to a Different Branch in TFS

Love it or hate it, TFS has a lot of features; some are just more discoverable than others.

Team Foundation Server has the ability to unshelve between branches, but it requires Microsoft Team Foundation Server Power Tools to do so. Once you have installed these, simply follow these two steps to move a shelveset from one branch to another:

  1. Navigate to the root of your project.
  2. Fill in and execute the following command:

Unshelve Command:

tfpt unshelve "[ShelveSetName]" /migrate /source:"[SourcePath]" /target:"[TargetPath]"

Example:

cd c:/code/
tfpt unshelve "Demo Shelveset" /migrate /source:"$/DemoProject/branch" /target:"$/DemoProject/trunk"

Enjoy,
Tom

Monday, September 30, 2013

LINQ to SQL DataContext BeginTransaction

LINQ to SQL supports Transactions, but there is no method directly off of the DataContext to initialize one. Fortunately, that functionality is just a simple extension method away!

public static class DataContextExtensions
{
    public static IDbTransaction BeginTransaction(
        this DataContext dataContext, 
        IsolationLevel isolationLevel = IsolationLevel.Unspecified)
    {
        if (dataContext.Connection.State != ConnectionState.Open)
            dataContext.Connection.Open();
 
        return dataContext.Transaction = dataContext.Connection
            .BeginTransaction(isolationLevel);
    }
}
 
public class TransactionTests
{
    [Fact]
    public void Example()
    {
        using (var dataContext = new FutureSimpleDataContext())
        using (dataContext.BeginTransaction())
        {
            // TODO: Stuff!
        }
    }
}

Enjoy!
Tom

Wednesday, September 18, 2013

How to Debug Minified JavaScript

I recently wrote a blog post about how to control minification per request. However, that strategy will not help you if the minification itself is causing a bug.

Fortunately, Chrome has an absolutely amazing set of developer tools that can help you debug any script, even one that has been minified! Just follow these very simple steps:

  1. Navigate to the page in Chrome.
  2. Launch the developer tools (by pressing F12).
  3. Open the JavaScript file in the Sources tab.
  4. Activate the amazing "Pretty print" feature.
  5. Debug those scripts!


Enjoy,
Tom

Sunday, September 15, 2013

TypeScript on your Build Server

Last week I wrote about making your TypeScript files compile on save by updating your project files. It is an easy update to make, but then what happens when you check into source control? You are probably going to get an error because your build server cannot resolve Microsoft.TypeScript.targets.

Two ways to make TypeScript compile on your build server

  1. You can install TypeScript on your build server.

The big problem with this solution is that it means you have to install a specific version of TypeScript on your build server, and thus make all of your projects depend on that single version.

  2. You can check the TypeScript compiler into Source Control.

This may seem like an odd solution, but for right now I feel that it is the best way to go. It allows all of your projects to be independent of each other, and you do not need to install anything new on any of your servers. (This solution has been proposed to the TypeScript team; thanks, josundt!)

How to Add TypeScript to Source Control

This may look like a lot of steps, but do not worry! All of these steps are small, simple, and they will only take you a minute or two. :)

  1. Create a TypeScript folder in the root of your solution folder.
  2. Create a SDKs folder inside of the TypeScript folder.
  3. Create a MSBuild folder inside of the TypeScript folder.
  4. Copy the contents of your TypeScript SDKs install (where the TypeScript compiler, tsc.exe, is located) to the TypeScript\SDKs folder that you have created.
    • By default, that will be located at:
      C:\Program Files (x86)\Microsoft SDKs\TypeScript
  5. Copy the contents of your TypeScript MSBuild folder (where your TypeScript target files are located) to the TypeScript\MSBuild folder that you have created.
    • By default, for Visual Studio 2012, that will be located at:
      C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v11.0\TypeScript
  6. Edit TypeScript\MSBuild\Microsoft.TypeScript.targets to make the TscToolPath point to your local SDKs folder.
    • The original path would be:
      <TscToolPath Condition="'$(TscToolPath)' == ''">$(MSBuildProgramFiles32)\Microsoft SDKs\TypeScript</TscToolPath>
    • The new path should be:
      <TscToolPath Condition="'$(TscToolPath)' == ''">..\TypeScript\SDKs</TscToolPath>
  7. Open your project file for editing.
  8. Update your TypeScript project import to follow a relative path to the local folder.
    • The original path would be:
      <Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)\TypeScript\Microsoft.TypeScript.targets" />
    • The new path should be:
      <Import Project="..\TypeScript\MSBuild\Microsoft.TypeScript.targets" />
  9. You're done! Reload your project and test it out; if that works, check it in.

Enjoy,
Tom

Thursday, September 5, 2013

How to Start Using TypeScript Today!

Here are several tips to get you started using TypeScript (currently on 0.9.1.1) today!

Compile on Save

For fast development you MUST compile your TypeScript files on save. While this is not built into the current pre-1.0 release of TypeScript, it is still very easy to enable. There is a very simple article on CodePlex that provides the exact XML configuration to add to your project file to compile on save.

Compile-On-Save

Here are the simple steps to enable compile on save:

  1. Right click on your project in Solution Explorer and unload it.
  2. Right click on your project file and open it for editing.
  3. Copy and paste the PropertyGroup and Import nodes from the link above.
  4. Save and close the project file.
  5. Right click and reload the project in Solution Explorer.
  6. Open the TypeScript Options
    • Tools -> Options -> Text Editor -> TypeScript -> Project -> General
  7. Check "Automatically compile TYpeScript files which are part of a project"
  8. Close the options menu, you are almost done!
  9. Open your TypeScript (.ts) file and save it...boom, it compiled on save. :)

Make Generated Files Dependent

Whenever you compile a TypeScript file it will generate a JavaScript file for you. However, as of 0.9, that JavaScript file will not be automatically included in your project. First, you should include those generated files in your project. Second, you should modify the project file to mark those generated files as being dependent upon their related TypeScript (.ts) files. This will ensure that no one accidentally modifies those files, and that TFS automatically checks them out for edit before they are regenerated.

<TypeScriptCompile Include="Scripts\jqueryExtensions.ts" />
<Content Include="Scripts\jqueryExtensions.js">
  <DependentUpon>jqueryExtensions.ts</DependentUpon>
</Content>

Here are the simple steps to make your generated files dependent:

  1. Right click on your project in Solution Explorer and unload it.
  2. Open your project file in a text editor.
  3. Add a dependent node under the JavaScript file (see above for an example).
  4. Right click and reload the project in Solution Explorer.

Use a Master Definition File

Your TypeScript files require reference tags to include information about other TypeScript files. One of my favorite features of TypeScript is that the community builds definition files for other frameworks. However, including multiple references in each file is a lot of work, and a bad idea in general.

I strongly suggest keeping one master definition file that references all of your other definition files; then your code files need only reference that one file to inherit information regarding all of your code and dependencies.

Master Definition File

/// <reference path="jquery.d.ts" />
/// <reference path="jqueryui.d.ts" />
 
interface JQuery {
    alertOnClick(): JQuery;
}
 
interface JQueryStatic {
    stringFormat(format: string, ...args: Array<any>): string;
}

Normal Code File

/// <reference path="../Definitions/typeScriptDemo.d.ts" />
 
(function ($: JQueryStatic) {
 
    $.stringFormat = stringFormat;
 
    function stringFormat(format: string, ...args: Array<any>): string {
        return format.replace(/{(\d+)}/g, replaceFormat);
 
        function replaceFormat(match, index) {
            return typeof args[index] != 'undefined'
                ? args[index]
                : match;
        }
    }
 
})(jQuery);

Next week I'll talk about how to get your TypeScript compiling on Team Foundation Server.


Enjoy,
Tom

Saturday, August 24, 2013

XUnit.PhantomQ v1.1

I recently blogged about how to Use XUnit to Run QUnit Tests. The initial v1.0 release of XUnit.PhantomQ did not support error messages, but now in v1.1 it supports the must-have feature of bringing error messages back with failed test results.

XUnit.PhantomQ on NuGet
XUnit.PhantomQ Source on GitHub

Enjoy,
Tom

Tuesday, August 13, 2013

Control Minification per Request with Web Optimizations

The Microsoft ASP.NET Web Optimization Framework is a great bundling and minification solution for your web applications. Simply grab the Microsoft.AspNet.Web.Optimization NuGet package, register your bundles, render them with a single line of code, and your environment will automatically resolve your dependencies based on whether or not the web server is running in debug mode.

But how can you debug minified styles and scripts in production?

Normally that is a difficult proposition, but here is a simple solution: JUST DON'T MINIFY THEM! With the little code snippets below you can add a simple query string parameter to disable minification for specific sessions or requests.

Adding this functionality to your website is extremely easy and requires no additional dependencies. Web Optimizations already has an internal AssetManager class that supports this functionality; we just need to access it via reflection.

Simply apply the following two steps and you will be ready to debug in production:

  1. Create the HtmlHelperExtensions class with the code below.
  2. Add a call to TrySetOptimizationEnabled inside of your ViewStart.

_ViewStart.cshtml

@using System.Web.Optimization
@{
    Layout = "~/Views/Shared/_Layout.cshtml";
    Html.TrySetOptimizationEnabled();
}

HtmlHelperExtensions.cs

public static class HtmlHelperExtensions
{
    public const string Key = "OptimizationEnabled";
 
    public static bool TrySetOptimizationEnabled(this HtmlHelper html)
    {
        var queryString = html.ViewContext.HttpContext.Request.QueryString;
        var session = html.ViewContext.HttpContext.Session;
 
        // Check the query string first, then the session.
        return TryQueryString(queryString, session) || TrySession(session);
    }
 
    private static bool TryQueryString(
        NameValueCollection queryString, 
        HttpSessionStateBase session)
    {
        // Does the query string contain the key?
        if (queryString.AllKeys.Contains(
            Key, 
            StringComparer.InvariantCultureIgnoreCase))
        {
            // Is the value a boolean?
            bool boolValue;
            var stringValue = queryString[Key];
            if (bool.TryParse(stringValue, out boolValue))
            {
                // Set the OptimizationEnabled flag
                // and then store that value in session.
                SetOptimizationEnabled(boolValue);
                session[Key] = boolValue;
                return true;
            }
        }
 
        return false;
    }
 
    private static bool TrySession(HttpSessionStateBase session)
    {
        if (session != null)
        {
            var value = session[Key] as bool?;
            if (value.HasValue)
            {
                // Use the session value to set the OptimizationEnabled flag.
                SetOptimizationEnabled(value.Value);
                return true;
            }
        }
 
        return false;
    }
 
    private static void SetOptimizationEnabled(bool value)
    {
        // Use reflection to set the internal AssetManager.OptimizationEnabled
        // flag for this specific request.
        var instance = ManagerProperty.GetValue(null, null);
        OptimizationEnabledProperty.SetValue(instance, value);
    }
 
    private static readonly PropertyInfo ManagerProperty = typeof(Scripts)
        .GetProperty("Manager", BindingFlags.Static | BindingFlags.NonPublic);
 
    private static readonly PropertyInfo OptimizationEnabledProperty = Assembly
        .GetAssembly(typeof(Scripts))
        .GetType("System.Web.Optimization.AssetManager")
        .GetProperty(
            "OptimizationEnabled",
            BindingFlags.Instance | BindingFlags.NonPublic);
}

Enjoy,
Tom

Wednesday, August 7, 2013

Last in Win Replication for RavenDB

One of my favorite features of RavenDB is how easy it is to customize and extend.

RavenDB offers an extremely easy-to-use, built-in replication bundle. To deal with replication conflicts, the RavenDB.Database NuGet package includes an abstract base class (the AbstractDocumentReplicationConflictResolver) that you can implement with your own conflict resolution rules.

Last In Wins Replication Conflict Resolver

John Bennett wrote a LastInWinsReplicationConflictResolver for RavenDB 1.0, and I have updated it for RavenDB 2.0 and 2.5. As always you can get that code from GitHub!

Download RavenExtensions from GitHub

Once you have built your resolver, you need only drop the assembly into the Plugins folder at the root of your RavenDB server and it will automatically be detected and loaded the next time that your server starts.

public class LastInWinsReplicationConflictResolver
    : AbstractDocumentReplicationConflictResolver
{
    private readonly ILog _log = LogManager.GetCurrentClassLogger();
 
    public override bool TryResolve(
        string id,
        RavenJObject metadata,
        RavenJObject document,
        JsonDocument existingDoc,
        Func<string, JsonDocument> getDocument)
    {
        if (ExistingDocShouldWin(metadata, existingDoc))
        {
            ReplaceValues(metadata, existingDoc.Metadata);
            ReplaceValues(document, existingDoc.DataAsJson);
            _log.Debug(
                "Replication conflict for '{0}' resolved with existing doc",
                id);
        }
        else
        {
            _log.Debug(
                "Replication conflict for '{0}' resolved with inbound doc",
                id);
        }
 
        return true;
    }
 
    private static bool ExistingDocShouldWin(
        RavenJObject newMetadata, 
        JsonDocument existingDoc)
    {
        if (existingDoc == null ||
            ExistingDocHasConflict(existingDoc) ||
            ExistingDocIsOlder(newMetadata, existingDoc))
        {
            return false;
        }
 
        return true;
    }
 
    private static bool ExistingDocHasConflict(JsonDocument existingDoc)
    {
        return existingDoc.Metadata[Constants.RavenReplicationConflict] != null;
    }
 
    private static bool ExistingDocIsOlder(
        RavenJObject newMetadata,
        JsonDocument existingDoc)
    {
        var newLastModified = GetLastModified(newMetadata);
 
        if (!existingDoc.LastModified.HasValue ||
            newLastModified.HasValue &&
            existingDoc.LastModified <= newLastModified)
        {
            return true;
        }
 
        return false;
    }
 
    private static DateTime? GetLastModified(RavenJObject metadata)
    {
        var lastModified = metadata[Constants.LastModified];
 
        return (lastModified == null)
            ? new DateTime?()
            : lastModified.Value<DateTime?>();
    }
 
    private static void ReplaceValues(RavenJObject target, RavenJObject source)
    {
        var targetKeys = target.Keys.ToArray();
        foreach (var key in targetKeys)
        {
            target.Remove(key);
        }
 
        foreach (var key in source.Keys)
        {
            target.Add(key, source[key]);
        }
    }
}

Enjoy,
Tom

Thursday, August 1, 2013

PhantomJS, the Headless Browser for your .NET WebDriver Tests

Did you know that Selenium already supports PhantomJS?

WebDriver is a specification for controlling the behavior of a web browser. PhantomJS is a headless WebKit scriptable with a JavaScript API. Ghost Driver is a WebDriver implementation that uses PhantomJS for its back-end. Selenium is a software testing framework for web applications. Selenium WebDriver is the successor to Selenium RC. The Selenium WebDriver NuGet package is a .NET client for Selenium WebDriver that includes support for PhantomJS via GhostDriver.

NuGet Packages

You need only install two NuGet packages in order to use PhantomJS with WebDriver. You will probably also want whichever unit testing framework you prefer. As always, I suggest xUnit.

  1. Selenium.WebDriver
  2. phantomjs.exe

PhantomJSDriver

After installing those, using the PhantomJSDriver is as easy as any other WebDriver!

const string PhantomDirectory =
    @"..\..\..\packages\phantomjs.exe.1.8.1\tools\phantomjs";
 
[Fact]
public void GoogleTitle()
{
    using (IWebDriver phantomDriver = new PhantomJSDriver(PhantomDirectory))
    {
        phantomDriver.Url = "http://www.google.com/";
        Assert.Contains("Google", phantomDriver.Title);
    }
}

Enjoy,
Tom

Sunday, July 14, 2013

Use XUnit to Run QUnit Tests

I read a great article recently about Unit testing JavaScript in VisualStudio with ReSharper, written by Chris Seroka. As cool as this feature is, it left me with two questions:

  1. What about developers that do not have ReSharper?
  2. How do I run my JavaScript unit tests on a builder server?

It is no secret that I absolutely love xUnit! Thus I decided to extend xUnit theories to be able to run QUnit tests by implementing a new DataAttribute.

Introducing XUnit.PhantomQ

XUnit.PhantomQ is a little NuGet package you can install to get access to the QUnitDataAttribute (see below for an example). This library will allow you to execute your QUnit tests as XUnit tests.

XUnit.PhantomQ supports both library and web projects, and features the ability to easily specify test files and their dependencies by real relative path to the root of your project.

XUnit.PhantomQ on NuGet
XUnit.PhantomQ Source on GitHub

QUnitData Attribute

Here is an example of writing some JavaScript, a file of QUnit tests, and then using an xUnit theory and a QUnitData Attribute to execute all of those tests right inside of Visual Studio.

// Contents of Demo.js
function getFive() {
    return 5;
}
 
// Contents of Tests.js
test('Test Five', function() {
    var actual = getFive();
    equal(actual, 5);
});
test('Test Not Four', function () {
    var actual = getFive();
    notEqual(actual, 4);
});
 
// Contents of QUnitTests.cs
public class QUnitTests
{
    [Theory, QUnitData("Tests.js", "Demo.js")]
    public void ReturnFiveTests(QUnitTest test)
    {
        test.AssertSuccess();
    }
}
 

Integrating XUnit, PhantomJS, and QUnit

So, how does this thing work under the hood? Below is the complete pipeline, step by step, of what happens whenever the tests are executed:

  1. The XUnit test runner identifies your theory tests.
  2. The QUnitDataAttribute is invoked.
  3. A static helper locates PhantomJS.exe and the root folder of your project.
    • It will automatically walk up the folder structure and try to find PhantomJS.exe in your packages folder. However, you can override this and explicitly set the full path by adding an AppSetting to your config file:
      <add key="PhantomQ.PhantomJsExePath" value="C:/PhantomJS.exe" />
    • The same goes for the root of your project, you can override this location with another AppSetting to your config:
      <add key="PhantomQ.WorkingDirectory" value="C:/Code/DemoProject" />
  4. PhantomJS.exe is invoked as a separate process.
  5. The server loads up XUnit.PhantomQ.Server.js
  6. The server now opens XUnit.PhantomQ.html
  7. QUnit is set to autoRun false.
  8. Event handlers are added to QUnit for testDone and done.
  9. All of the dependencies are now added to the page as script tags.
  10. The test file itself is loaded.
  11. QUnit.start is invoked.
  12. The server waits for the test to complete.
  13. Upon completion the server reads the results out of the page.
  14. The tests and their results are serialized to a JSON dictionary.
  15. The result dictionary is written to the standard output.
  16. The resulting JSON string is read in from the process output.
  17. The results are deserialized using Newtonsoft.JSON
  18. The results are loaded into QUnitTest objects.
  19. The QUnitTest array is passed back from the DataAttribute
  20. Each test run finally calls the AssertSuccess and throws on failure.

...and that's (finally) all folks! If you have any questions, comments, or thoughts on how this could be improved, please let me know!


Enjoy,
Tom

Sunday, July 7, 2013

Lucene.Net Analyzer Viewer for RavenDB

To query your data in RavenDB you need to write queries in Lucene.Net.

To know which documents your queries are going to return, you need to know exactly how your query is being parsed by Lucene.Net. Full text analysis is a great baked-in feature of RavenDB, but I have found that the Lucene.Net standard analyzer that parses full text fields can sometimes return surprising results.

This Lucene.Net 3.0 Analyzer Viewer is an update of Andrew Smith's original version for Lucene.Net 2.0. This update allows you to view the results of text analysis for the same version of Lucene.Net that RavenDB is using. This simple tool can be invaluable for debugging full text searches in RavenDB!

Download Raven.Extensions.AnalyzerViewer from GitHub

This tool also comes with my Alphanumeric Analyzer built in.


Enjoy,
Tom

Final Fantasy XIV: A Realm Reborn - Beta Review

Last month I said that this blog would return to its normal .NET ramblings, and I meant it...but then I got into the Final Fantasy XIV: A Realm Reborn Beta. To anyone checking in here for tech articles, do not despair; I will write extra posts this month to make up for yet another gaming distraction!

Also, please remember that this review is strictly my personal opinion about a product that has not even been released yet. Anything can change, even my stubborn opinions. Enjoy!

What is back?

There are two things returning to the franchise in a Realm Reborn that die hard Final Fantasy fans have not seen in a long time.

  1. Fantasy

Final Fantasy fans, this MMO is for you. In a desperate attempt to recapture their past glory, Square Enix has pulled all of the classic Final Fantasy elements out of storage and put them on parade. Best of all, it really works.

The Chocobo song plays when you mount up and ride. Old school combat music kicks off every fight, and the classic victory theme plays after you defeat bosses. There are moogles, airships with balloons, and mages with robes. Heck, there is even Magitek armor from FF VI! This game has become an overdue homage to the long since past golden era of Final Fantasy.

A Realm Reborn succeeds in reminding me not just how much I enjoyed the classic Final Fantasy games, but also how much I have missed them.

  2. Panties

There are an absurd number of mini skirts and panty shots in this game, to the point where even I start to think that it is in bad taste...and believe me, that is saying a lot. The starting outfits make it seem like either all adventurers bought their armor from a Victoria's Secret catalog, or they are escaping a life of servitude in their home country's red light district.

For a long time now I have said that Square Enix can make amazing artwork, but that I dislike the art style they have chosen. Their landscapes, spells, and monsters all look great, but their character models have boobs, underwear, and pectoral muscles popping out everywhere. You could argue that this is just an innocent cultural difference, but I don't see how that would make a battle thong (yes, that is in the game) any more practical.

What is new?

Not much. However, what is new does work very well, and these strengths are what the game is betting on to succeed in an overcrowded MMO market.

  1. A fresh take on the class system.

Your character may change class and level simply by swapping out your primary weapon. This means that you are free to play as many classes as you would like without having to redo your skills or professions. It also means that you may take select abilities from one class and add them to another. Thus your archers may heal, and your tanks may cast fire. It is an absolutely fantastic system that I am sure will provide untold amounts of customization and experimentation for hard core and casual players alike.

  2. Some dynamic new questing mechanisms.

The FATE system adds a fun and spontaneous social aspect to group PVE combat. Levequests, while horribly named, are a great new way to offer more dynamic single player oriented quests. These are both great additions to the game play that go a long way towards helping distract players from the MMO grind.

  3. Good story telling.

I will give Final Fantasy XIV credit, the main quest line is filled with cut scenes that actually make you feel like you are playing a Final Fantasy game. The primary story provides a nice sense of both progression and grounding, offering up a relatively personal touch to an otherwise generic save the world plot.

  4. A much needed UI overhaul.

I really do not have to say much about the new interface other than it is not the old one. Menus no longer lock the game. Questing is easy to follow and includes mini map tracking. Character creation is second to none. You can, for the most part, figure out how to do anything with a few extra exploratory clicks.

What is old?

As I hinted above, a lot of things have become stale in the world of Eor...Eorzea...Eor see why I'm not gonna want to pay for this stuff anymore?

  1. The standard MMO formula.

Do you love fetch quests? Do you love fetch quests that send you running back and forth? Do you love fetch quests that send you running back and forth through the forest? Do you love fetch quests that send you running back and forth through the forest just to deliver notes? Then boy howdy do we have the perfect game for you!

Despite my singing the praises of their new dynamic quest mechanisms, the game on the whole feels static and stale. You move from node to node through your zones, taking quests, killing monsters, collecting items, and watching your experience bar rise ever so slowly.

This is a time tested game design formula that works, and it does work here; but it continuously gives me a feeling of deja vu, as I have already been there and done that.

  2. Static combat mechanics.

While I really like the new customizable class system, the combat abilities feel very static. For the most part you will stand in one place and press buttons. Your characters won't jump around the battlefield, you won't nimbly dodge attacks, you just kind of stand there; there is no action in this adventure.

Let me be clear: the PACE of battle is fine, the global cool downs are generally around two and a half seconds, and I actually LIKE that quite a bit! I am criticizing the MOTION of combat and the lack of fluid action or movement. You don't leap to your enemies, you do not drag them to you, you do not jump over or around incoming attacks. You push buttons and try not to stand in the fire.

Perhaps this is part of the classic Final Fantasy feel that they are aiming for, but again I feel like they could have done more to spice things up.

  3. The behind the scenes technology.

Zoning. The year is 2013, and a brand new, triple-A, massively multiplayer online role playing game still has zoning. You have got to be kidding me.

Nothing says dated quite like loading screens. Worse yet, the zones are not even that big! The capital cities are each split into multiple zones. Worse still, the mini maps are broken into these zones too, forcing you to zoom in and out to find quest objectives. You have got to be kidding me!

There are other technology problems too. Characters live on specific servers instead of dynamically re-sizing based on population. The install client cannot stream game content; you must be fully installed and patched to play. Their log in screen always has a temporary password prompt; what is up with that?

Square Enix is very clearly a game company, not a software company.

Conclusion

It may seem like I have been ragging on A Realm Reborn pretty hard through the course of this review, so let me be very clear regarding my conclusion:

Final Fantasy XIV: A Realm Reborn is a GOOD game, but it is not a great game.

What Square Enix has done to revamp XIV after its failed launch is very impressive, but the few new mechanics that are here do not try to change the tired old MMO formula in any significant way. Square Enix is playing it safe; they have merely added a much needed fresh coat of paint to an otherwise old car. However, if you are a Final Fantasy fan then you will enjoy A Realm Reborn, at least for a while.

All that being said, A Realm Reborn is undoubtedly above average in an oversaturated MMO market. Its greatest strengths are...

  1. It is all new content
  2. It unapologetically harkens back to its classic roots.
    (For which, I will specifically give it a plus one to the final score below).

If I had to give it a number, it would be: 7/10

Game on,
Tom

Friday, July 5, 2013

Microsoft Build 2013 for Web Developers

Build 2013 was a whole lot of fun!

The big reveals at Build 2013 were Windows 8.1 and Visual Studio 2013. Admittedly, Microsoft's big focus was on the Windows 8 store, but that does not mean that they are not delivering great tools for the fastest growing development environment in the world: the web!

Before we discuss web development, let's first take a moment to talk about Windows.

Windows 8.1 actually looks really good! Like all Microsoft products, the first major patch manages to iron out most of the kinks. The new UI updates and the inclusion of the start button make the modern UI and desktop seem much less polarizing. Additionally, I am very optimistic about seeing Windows 8 applications come to PC, tablet, and phone. Microsoft has always had great potential for creating a homogeneous ecosystem of consumer devices, and I am finally getting excited about the product lineup that they are delivering.

New Editors in Visual Studio 2013

The Visual Studio team has completely rewritten the HTML and JavaScript editors from scratch in Visual Studio 2013. This is great news; not only are the new editors faster and more responsive, with much better intellisense, but they also come with a set of fun new features.

  • The overall Razor experience has been greatly improved.
  • Intellisense now includes expandable groups to keep drop downs from getting cluttered.
  • CSS intellisense now displays browser compatibility for each rule.
  • New shift+alt shortcuts allow you to select content from tree structures.

Microsoft has yet to announce how they will charge for Visual Studio 2013. Will it be a stand alone install? Will it be full price? We just do not know yet. Personally, I would love to see Visual Studio move to an incremental update system, but obviously there would be a lot of technical limitations to doing so...but hey, a dev can dream!

HTML Editing Features in Visual Studio 2013 Preview

Browser Link

Visual Studio 2013 comes with a built in feature called Browser Link. Visual Studio opens a socket connection directly to your browsers (yes, plural) via SignalR, which allows your IDE to send commands directly to the browser itself. The most basic use of this is a simple refresh button that allows you to refresh all of your browser windows on command.

The potential of this feature is fun to think about! Right now the Visual Studio team is intentionally keeping the default feature set for Browser Link very simple, but also open source, as they want to see what us developers come up with. Remember that this is a two way channel of communication between the browser and the IDE, so sending error messages back to Visual Studio is just the first thing that comes to my mind.

A cool demo that they showed us at Build was keeping multiple browser windows in sync; this included form elements and page navigation. The cool part was that these browsers included Chrome, Firefox, and IE running on a virtualized Windows Phone. It was a very compelling demo!

Browser Link feature in Visual Studio 2013 Preview

Web API 2.0

Web API was a great addition to Microsoft's web development library, and 2.0 looks like a great incremental update to that framework. Forgive me if it seems that I do not have much to say about Web API 2; it just seems like a very simple but very welcome update. But don't let that fool you, I am stoked!

  • Attribute based routing.
  • Integrated OAuth 2 authentication.
  • OWIN Web hosting (run your APIs outside of IIS)
  • Cross-Origin Resource Sharing

ASP.NET Web API 2

TypeScript 0.9

I am SO excited about TypeScript!

TypeScript is a development language that is a superset of JavaScript. It adds type checking, interfaces, and inheritance to JavaScript. The idea is that you develop in TypeScript and then compile that language down to JavaScript for deployment on the web. The key here is that TypeScript is a superset of JavaScript, so all JavaScript code is already valid TypeScript, making it very easy for you to start your migration from one language to the other.

There are essentially two ways that you could choose to use TypeScript:

  1. Use TypeScript for simple type checking and enhanced refactoring.

This is probably the more ideal use of TypeScript, or at least the simplest. The idea is that you can constrain your function parameters by type and interface, and make use of TypeScript's stricter declarations for easier refactoring. You would also get significantly improved intellisense both for your code and for other frameworks, as a very active open source community has been creating interface wrappers for everything from jQuery to Knockout.

  2. Go full monty, and make your JavaScript completely Object Oriented.

Using TypeScript would allow you to define and extend both classes and interfaces, just as you would in a language like C#. This opens the possibility for polymorphism, dependency injection, mocking, testing, the whole object oriented play book. While this does go against many principles of a very functional language like JavaScript, it also enables a lot of best practices that help simplify large scale application development.

...how far you want to go using TypeScript is up to you, but regardless of which side of the fence you are on, I definitely feel that TypeScript is worth looking into!

www.typescriptlang.org


The future looks bright,
Tom

Sunday, June 30, 2013

Classic RPG Combat - Objectives

This is the final post in a three part series.

My first and second posts rambled on about different combat systems and their resources, respectively. Now we will finally tie them together: how do they work together to make a game that is both challenging and fun?

RPG Combat Objectives

Ask the following questions about every battle that you are in:

  1. In what way do your choices affect combat?
  2. How do your tactics vary between battles?
  3. What distinguishes a boss battle from a regular fight?

I am going to select three games to answer each question. Obviously there are many more games with many more answers, but this blog series has already gotten really long, so please feel free to add to this list by posting in the comments!

1. In what way do your choices affect combat?

Chrono Trigger - Often in Chrono Trigger enemies will have specific elemental weaknesses that trigger debuffs. For example, dinosaurs are weakened by lightning attacks, and ogres with hammers lose their weapons when attacked with fire. This means that you must choose your party so that you have access to the magic types needed to beat select areas.

Final Fantasy 7 - You equip your characters with Materia that gives them magic abilities, and your characters then learn these abilities over time. Because any character can equip any Materia, you get to decide which characters will learn defensive spells versus offensive spells. This means that you can customize and choose who will eventually fill the roles of healer and DPS.

Fire Emblem - You must choose which characters to bring into battle, which characters to level up, and which characters to let die. The choices you make with every move impact more than just the battle itself; they will impact the course of the entire game.

2. How do your tactics vary between battles?

Mario RPG - Combat is simple, but the timed attack system offers variety with every battle. If you choose to pay attention in battle, your timed hits and dodges can completely make or break your chances of succeeding in combat.

Lufia 2 - In Lufia 2 you have two types of ability resources: a standard mana bar that drains with use, and an SP bar that builds up based on damage taken. Because you cannot always rely on both resources being available for every fight, you will not always have access to all of your spells. This forces you to mind your resources and alter tactics based on their ever changing availability.

XCOM: Enemy Unknown - Yes, XCOM is a tactical RPG! In the latest version of XCOM, different missions take you onto different types of terrain. When outdoors, my snipers dominate the field of battle, claiming the vast majority of kills for my squad. However, when forced into indoor missions, my snipers' line of sight is greatly reduced, and I am forced to use other character classes, such as assault and support, to complete the mission.

3. What distinguishes a boss battle from a regular fight?

Golden Sun - Most regular battles in Golden Sun are both quick and easy, whereas bosses often pose a significant challenge. In regular battles enemies are dispatched with your first round of attacks; in stark contrast, boss battles require you to cycle through your Djinn (summons that double as status buffs) in an exercise of both endurance and strategic planning.

Might &amp; Magic: Clash of Heroes - All battles in Clash of Heroes are essentially a puzzle game. Boss fights vary the combat by completely changing the standard objective of the puzzles. Against bosses you often have to hit specific moving targets. In puzzle challenges you have to complete battles with a predetermined set of units in a finite number of moves.

Final Fantasy 6 - FF6 is full of little tricks to make boss battles distinct. At several points in the game you have to command three groups of characters at once on a tactical battlefield, forcing you to create balanced parties that not only fight off multiple groups of enemies but also defeat an end boss. Additionally, there are fights like the Phantom Train, where the use of a Phoenix Down can defeat the boss with a single blow.

That is (finally) all I have to say about that.

Game on,
Tom

Saturday, June 29, 2013

Classic RPG Combat - Resources

This is the second post in a three part series.

In the first part we talked about classifying RPG combat systems. In this part we will discuss the different resources used in combat. In the third post we will tie these two subjects together and discuss what objectives are being accomplished by combining these mechanics.

Identifying RPG Combat Resources

In any game of strategy you have a finite number of resources at your disposal, and how well you use those resources will determine whether or not you emerge victorious. This analysis will speak in very objective terms, treating characters as resources, no different from their spells and equipment.

  • Resource Points
  • Active and Passive
  • Learned vs Equipped

I will not be discussing items; from a strictly combat context their use is no different than an ability with a finite number of uses, and their stats are only relevant to long term progression.

Resource Points

Mana is a very simple, very finite resource: your party starts with full mana, special abilities use up this resource until it runs out, and then you have to use special items or designated locations to restore your mana. The whole point is to force you to conserve your resources, to challenge you to avoid using your most powerful abilities. This of course is only one way to limit your use of special abilities.

World of Warcraft's class system contains a perfect showcase of unique resource management mechanisms.

Priests and Mages have mana, a resource that counts down. The Warrior gains rage by dealing or receiving damage, making rage a resource that builds up. Rogues have both energy and combo points, making them a hybrid of both previous systems: energy is used up similar to mana, and combo points are built up by abilities similar to rage. Death Knights use three unique pairs of runes, and each of their abilities consumes a different combination of runes. Thus runes are consumed similar to mana, but the resource is partitioned into multiple buckets instead of one, creating a wonderfully dynamic count-down resource.

Games like Lufia 2 have a hybrid combination of mana and TP. In Lufia your mana counts down like in any typical turn based combat engine, but your TP builds up as you take damage. Your characters then have abilities that are fueled by one resource or the other. This is similar to Final Fantasy VII, where you can replace the TP with the limit break system.

A complete departure from these is the super simple AP system from Final Fantasy: The 4 Heroes of Light. In it, all characters have only 5 AP, one point of which recharges every turn, and every action consumes a variable number of points. If attacking costs one point, then you can sustain attacking indefinitely. However, if you use a powerful attack that drains all of your AP, then you will be helpless for the next turn or two while your AP regenerates. This system is elegant in its simplicity, yet highly strategic in the short term versus long term cost benefit analysis that it forces you to consider with every action.

Active and Passive Abilities

Everything we have discussed up until now has been about active abilities, where the player explicitly chooses an action for their character to execute. This is pretty straightforward and obvious, so we'll move on.

Passive abilities are taken by characters in response to other actions. A counter attack is a perfect example of this, where a character attacks in retaliation for being attacked. A fun variation of this from Final Fantasy Tactics is Hamedo, where the character being attacked will preemptively counter attack before they are even struck. Another common passive ability is to apply a debuff to something that attacks your character, such as poisoning it. Passive abilities do not only apply to the direct target of an action; they can also be used tangentially. In Final Fantasy VI, Shadow's dog Interceptor can block attacks made against other party members.

Combination attacks can also be considered passive actions. In games like Disgaea and Fire Emblem Awakening, characters standing adjacent to each other in combat can assist in attacking, dealing additional damage without consuming turns or resources. However, not all combination attacks are passive; in Chrono Trigger the characters' combination attacks must be explicitly selected and consume both characters' turns.

Learned vs Equipped

Many abilities are inherent to a specific character, while others may be selected by choice. Systems that offer you choice take two primary forms: the first is the choice to equip specific abilities or items a la carte, and the second is a class system where abilities are inherited in bulk. Of course, some games offer a hybrid of these systems.

Mario RPG is an example of the simple and straightforward character specific system. As Mario levels up he will unlock his own unique new spells, as Bowser levels up he will unlock his own unique new spells, and so on.

Final Fantasy VII has an a la carte skill system. Every character has an attack based on their own unique choice of weapons, but their abilities beyond that are based entirely on their choice of equipped Materia. Any character can equip any Materia, so anyone can become the main healer or damage dealer.

Final Fantasy III and V have class based skill systems. Each character takes on a class that comes equipped with a specific set of skills. Knights attack with weapons, Monks attack with fists, Black Mages attack with magic, White Mages heal, and so on.

Final Fantasy VI has a hybrid system. Each character has a completely unique set of abilities that only they (and Gogo the mimic) may use. However they also learn magic spells from Espers (summons) that they choose to equip, thus allowing you to customize which abilities your characters gain later in the game.

Final Fantasy Tactics has yet another hybrid system. Each character has a class with a distinct set of skills, but the character may also equip a small handful of skills learned from classes that they have leveled up previously. This allows characters to cross class and take the best skills of one class to help offset the weaknesses of another. For example, you may take a Knight that has limited movement, and then equip the Ninja ability of +3 movement to help offset that disadvantage.

To be concluded in Part 3...

Friday, June 28, 2013

Classic RPG Combat - Genres

This is the first post in a three part series.

In this first part we will talk about how to classify some basic combat systems. In the second we will talk about resources and abilities in combat. In the third and final part we will bring these together and talk about what objectives are being accomplished by combining these mechanics.

Classifying RPG Combat by Genre

At the top level there are basically four distinct types of combat systems. Below these four categories you can continue to subdivide the genre further, but for the most part that will only consist of grouping different permutations of specific mechanics; none of which are necessary for this analysis.

  • Turn based
  • Tactical
  • Action
  • Puzzle

Admittedly, modern technology has allowed many RPGs, specifically western ones, to significantly blur the lines between these categories. World of Warcraft's abilities are active turn based, while its motion is action oriented, and its modern raids are extremely tactical.

For the sake of simplicity, let's consider this a classical review that covers only console games from the 8bit through 32bit eras.

Turn based

Dragon Quest and Final Fantasy defined this genre over 25 years ago, and both still adhere to the fundamental design. Your characters line up on one side of the screen, the enemies on the other, and you take turns attacking until one side is defeated. This is by far the most common combat system in the entire RPG genre, and is used still to this day by the majority of JRPGs.

The two most common sub-genres here would be turn based and active time battle combat. This is the difference between having to enter your team's commands all at once, like the 8bit Final Fantasy games, or entering each character's actions as they are ready to take a turn, like the 16bit Final Fantasy games.

Another common pair of sub-genres worth pointing out would be Active versus Passive combat. The simple distinction is whether or not combat pauses while you make action selections from your menus. Some games, such as Chrono Trigger, allow you to choose whether or not you want this enabled.

Side Note: The creative team behind the original Final Fantasy described the game as being their attempt to create a digital version of Dungeons and Dragons. I find this funny as D&D is an American invention and the FF combat system feels distinctly like ancient Greek warfare, where the two sides line up to attack each other. The irony is that America won its independence from the British Empire very specifically by breaking these classic European rules of combat.

Common Complaints: This combat system is often associated with very grindy game play. You will very likely experience a few thousand of these turn based battles throughout your time beating the game, and the vast majority of them will be won by clicking attack over and over again. Additionally, with this particular combat engine completely saturating the traditional RPG market, it has been noted that more recent games often lack any deep mechanical innovation.

Tactical

To be fair, Chess was the game that first defined this combat system. You have a series of units on a limited field of play, each unit constrained in its range of motion and actions, and you fight to defeat the other forces on that battlefield. Like chess, most of the time there are only two opposing forces and the field of play is a grid.

Fire Emblem is considered to be one of the most successful, influential, and defining game series of this genre. Fire Emblem is just a standard grid based tactical game with one major addition: it added the rock paper scissors mechanic to its combat. This extremely simple addition to the game play added a level of depth and strategy to this genre that is still imitated to this day.

Another defining work in this genre was Final Fantasy Tactics. What FF:T brought to the genre was an abundance of customization, making combat driven around unique abilities that were not specific to characters but to classes. This allowed players to heavily customize their armies, and to create a seemingly infinite number of distinct tactical combinations. (Yes, I know that Tactics Ogre came before FF:T and was from the same creative team; it just was not as commercially successful as its successor.)

It is worth noting just how diverse the tactical combat genre is: Civilization, X-COM, Heroes of Might and Magic, all of these games are considered to be tactical combat games. Again, for this analysis I am just focusing on early generation console games.

Common Complaints: The tactical combat system is often a game defining system, as in combat itself will comprise the majority of game play. Battles will usually be very long; in contrast to turn based combat which usually takes only a few minutes per encounter, tactical combat can often take an hour or more to complete.

Action

More commonly referred to as action adventure, an action RPG has a combat system wherein the protagonists take up arms and freely walk around the map to attack their enemies in real time.

For the definitive example of this genre, we need look no further than the Legend of Zelda. Link takes up his sword and shield and walks across the world hacking and slashing his enemies to death. In this game the world and town maps are the same as the combat maps.

Other examples of action adventure include the Secret of Mana (Seiken Densetsu) series. Key differences between SOM and Zelda are that multiple protagonists work together in combat, and that SOM adds a more traditional leveling system.

Games like the Tales series take a unique twist on action combat by separating the world map from the combat engine, so that combat encounters play out as side scrolling battles more akin to fighting games. The Tales series could be described less as action adventure, and more as traditional RPGs with action elements.

Common Complaints: This is the genre that coined the phrase "hack and slash." The majority of game time in this genre is spent walking up to other units and pressing attack over and over, and it grows stale rather quickly. Additionally, hack and slash combat itself is often criticized for being overly simple, requiring very little strategy or skill to win encounters.

Puzzle

I would be remiss not to mention that recently there has been an influx of puzzle RPGs to the gaming market. Games like Puzzle Quest and Dragon Puzzle take the simple bejeweled style puzzle game and place it into an RPG as the combat system.

Far more impressive is Might and Magic: Clash of Heroes. This game is a completely unique and innovative take on puzzle RPG combat. You move color coded units between the columns on your battlefield; lining up units vertically will charge them, while lining them up horizontally will cause them to form a wall. There are multiple races, each with their own unique units, all of which have special skills and level up as you play. Clash of Heroes was awarded game of the year by RPGamer when it was originally released for the Nintendo DS in 2009. You may now get this game on DS, PC, iOS, and soon Android; I highly recommend that you check it out.

Common Complaints: This is a relatively new genre, but the major complaint that I have heard is that it is hard to find the intersection of these two target markets. Puzzle gamers do not enjoy the RPG elements, and RPG enthusiasts want to hit things with swords more than they want to solve puzzles.

To be continued in Part 2...

Friday, June 14, 2013

E3 2013

This is mostly a tech blog, but in honor of E3 I am going to make this a game month. I will return to my regularly scheduled .NET blogging in July!

Microsoft

The internet is ablaze with just how poorly Microsoft handled the reveal of the Xbox One, and I have no interest in beating a dead horse. Yes, their price point is too high. Yes, their DRM seems extreme. Yes, they fucked up. No, they did not do anything to make up for it.

I actually like a lot of the ideas behind the Xbox One. I like the Kinect responding to voice commands, I like the all in one media center, and while I disagree with always being online I cannot lie and say that it directly affects me.

However, the lack of exclusive games dries up any interest that I have in the "Xbone". Battlefield, Final Fantasy, Batman, Kingdom Hearts, and almost all of the other AAA titles will be available on multiple platforms. Project Spark looks really good, but I will play it on PC. The only exclusive title that really appeals to me is the Panzer Dragoon "sequel", Crimson Dragon; unfortunately that is certainly not enough to get me to pre-order.

Having had the privilege to attend E3, I would like to offer the following observation: Microsoft's showroom was like a museum: you could look but not touch.

Sony

What amazed me the most about this E3 was not how badly Microsoft failed, but how much Sony succeeded. For the past few years it looked like Sony could do no right, and now they have turned that around almost overnight!

The PS4 is a staggering $100 less than the Xbox One, it will not be region locked, and the Vita will allow remote play for all PS4 games, similar to the Wii U's GamePad. These are all great features, and they are backed up by a broad lineup of games for the PS4, Vita, and PSN.

All that being said, they are still as devious as ever. Sony is trying really hard to downplay their DRM which, while not as obtrusive as Microsoft's, is still very real. Also, the PSN will no longer be free, but it will still be cheaper than Xbox Live. I want to emphasize that Sony is trying really hard to dodge these facts, only talking about them in private press sessions.

The bottom line: Sony was obviously desperate to win this generation of the console war, but their come-from-behind victory at E3 2013 just goes to show that if you try hard enough you really can accomplish anything. They have done well.

Nintendo

While Microsoft's showroom was a museum, Nintendo's showroom was a candy store; and if Sony piqued my curiosity, then Nintendo captured my attention. If you have not figured it out yet, let me spell this out for you:

I think that Nintendo "won" E3, hands down.

I say this for one reason and one reason only: Nintendo brought the games! On the Wii U there is a new Mario World, a new Mario Kart, a new Donkey Kong, a new Smash Brothers, and an HD Zelda. On the 3DS there is a new Zelda, a new Yoshi's Island, a new Pokemon, a new Smash Brothers, and a new Mario RPG. It was literally Christmas come early; I started and ended my day in the Nintendo showroom, I wanted to play every game on that floor, and I loved every minute of it.

Admittedly I am a fan of all of these franchises, so perhaps I have a slightly biased opinion on the subject. However, everyone must admit that Nintendo brought their A team, and that lineup dwarfed the competition. While Nintendo is undoubtedly "playing it safe", and admittedly there is nothing "new" in their arsenal, I cannot help but be excited by the vast library of games that Nintendo has announced.

Summary

Hardware years are exciting, especially at a trade show like E3, but I feel that they seem to take away from the games themselves. While you could argue that Nintendo sat on the sidelines this year, I say that they brought the games, and that is all that I care about.

Game on,
Tom

Monday, May 27, 2013

How to Test RavenDB Indexes

What if you could spin up an entire database in memory for every unit test? You can!

RavenDB offers an EmbeddableDocumentStore NuGet Package that allows you to create a complete in memory instance of RavenDB. This makes writing integration tests for your custom Indexes extremely easy.
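
For a sense of just how little ceremony this requires, here is a minimal sketch of a throwaway in-memory store; the configuration mirrors the full example later in this post:

// A minimal sketch: a disposable, in-memory RavenDB instance.
using (var documentStore = new EmbeddableDocumentStore
{
    Configuration = { RunInMemory = true }
})
{
    documentStore.Initialize();

    // Store and query documents as usual; everything lives in memory
    // and disappears when the store is disposed.
}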

The Hibernating Rhinos team makes full use of this feature by including a full suite of unit tests in their RavenDB solution. They even encourage people to submit pull requests to their GitHub repository so that they can pull those tests directly into their source. This is a BRILLIANT integration of all these technologies to both encourage testing and provide an extremely stable product.

So then, how do you test your RavenDB indexes? Great question; let's get into the code!

  1. Define your Document
public class Doc
{
    public int InternalId { get; set; }
    public string Text { get; set; }
}
  2. Define your Index
public class Index : AbstractIndexCreationTask<Doc>
{
    public Index()
    {
        Map = docs => from doc in docs
                      select new
                      {
                          doc.InternalId,
                          doc.Text
                      };
        Analyzers.Add(d => d.Text, "Raven.Extensions.AlphanumericAnalyzer");
    }
}
  3. Create your EmbeddableDocumentStore
  4. Insert your Index and Documents

In this example I am creating an abstract base class for my unit tests. The NewDocumentStore method provides an EmbeddableDocumentStore that comes pre-initialized with the default RavenDocumentsByEntityName index, your custom index, and a complete set of documents that have already been inserted. The documents come from an abstract Documents property, which we will see implemented below in step 5.

protected abstract ICollection<TDoc> Documents { get; }
 
protected EmbeddableDocumentStore NewDocumentStore()
{
    var documentStore = new EmbeddableDocumentStore
    {
        Configuration =
        {
            RunInUnreliableYetFastModeThatIsNotSuitableForProduction = true,
            RunInMemory = true
        }
    };
 
    documentStore.Initialize();
 
    // Create Default Index
    var defaultIndex = new RavenDocumentsByEntityName();
    defaultIndex.Execute(documentStore);
 
    // Create Custom Index
    var customIndex = new TIndex();
    customIndex.Execute(documentStore);
 
    // Insert Documents from Abstract Property
    using (var bulkInsert = documentStore.BulkInsert())
        foreach (var document in Documents)
            bulkInsert.Store(document);
 
    return documentStore;
}
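
For clarity, here is a sketch of how these members might fit together in a single abstract base class. The class name and generic constraints are my assumptions, inferred from the new TIndex() call and the abstract Documents property; they are not code from the original post:

// Hypothetical base class declaration; the name and generic
// constraints are assumptions, not part of the original post.
public abstract class RavenIndexTests<TDoc, TIndex>
    where TIndex : AbstractIndexCreationTask, new()
{
    // Each concrete test fixture supplies its own documents.
    protected abstract ICollection<TDoc> Documents { get; }

    protected EmbeddableDocumentStore NewDocumentStore()
    {
        var documentStore = new EmbeddableDocumentStore
        {
            Configuration = { RunInMemory = true }
        };
        documentStore.Initialize();

        // Create the default and custom indexes, then insert documents.
        new RavenDocumentsByEntityName().Execute(documentStore);
        new TIndex().Execute(documentStore);

        using (var bulkInsert = documentStore.BulkInsert())
            foreach (var document in Documents)
                bulkInsert.Store(document);

        return documentStore;
    }
}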
  5. Write your Tests

These tests exercise a custom Alphanumeric analyzer. They take in a series of Lucene queries and assert that the results match the correct InternalIds. The documents themselves are defined by our abstract Documents property from step 4.

NOTE: Do not forget to include the WaitForNonStaleResults method on your queries, as your index may not be done building the first time you run your tests.

[Theory]
[InlineData(@"Text:Hello",              new[] {0})]
[InlineData(@"Text:file_name",          new[] {2})]
[InlineData(@"Text:name*",              new[] {2, 3})]
[InlineData(@"Text:my AND Text:txt",    new[] {2, 3})]
public void Query(string query, int[] expectedIds)
{
    int[] actualIds;
 
    using (var documentStore = NewDocumentStore())
    using (var session = documentStore.OpenSession())
    {
        actualIds = session.Advanced
            .LuceneQuery<Doc>("Index")
            .Where(query)
            .SelectFields<int>("InternalId")
            .WaitForNonStaleResults()
            .ToArray();
    }
 
    Assert.Equal(expectedIds, actualIds);
}
 
protected override ICollection<Doc> Documents
{
    get
    {
        return new[]
            {
                "Hello, world!",
                "Goodnight...moon?",
                "my_file_name_01.txt",
                "my_file_name01.txt"
            }
            .Select((t, i) => new Doc
            {
                InternalId = i,
                Text = t
            })
            .ToArray();
    }
}
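
A concrete test class then only needs to inherit from the base and name its document and index types; the class name here is hypothetical:

// Hypothetical concrete fixture; the Query theory and Documents
// override shown above live inside this class.
public class AlphanumericAnalyzerTests : RavenIndexTests<Doc, Index>
{
    // [Theory] Query(...) and the Documents override go here.
}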

Enjoy,
Tom
