Monday, October 26, 2009

PLINQO Query Extension Updates

The Problem

Unfortunately the previous query extensions, while great at querying by a particular value, did not support simple operations such as querying by a nullable value type (where the column can hold either null or a value).

Example: If you wanted to query by a nullable column, you had to use a lambda expression.

context.Product.Where(p => !p.OwnerId.HasValue || p.OwnerId.Value == 5);

Our Solution

Add an overload for each "By" extension method that accepts a params list of values.

Example: Now you can look for null OR a certain value without writing a Where statement.

context.Product.ByOwnerId(null, 5);

New PLINQO Feature

Here at CodeSmith we love generic solutions. So forget just supporting ByValueOrNull scenarios: now all "By" query extensions have an overload to support OR statements!

Example: Ummm, every single "By" query extension method generated by PLINQO?

context.Task.ByPriority(Priority.High, null, Priority.Medium);
context.Person.ByFirstName("Eric", null, "Shannon", String.Empty);
context.Product.ByName("Scribblenauts", "Bowser's Inside Story").ByRating(Rating.Excellent, Rating.AboveAverage);

Implementation Details

To see how the overloads are implemented, you need only take a look at the template, QueryExtension.Generated.cst. If you are curious to see how all of this comes together under the hood, where we build up the parameterized query string and then parse the lambda, that code is located in DynamicQuery.cs.

Fun Little FYI Facts: Unfortunately, LINQ to SQL does not support doing an IEnumerable.Contains against a nullable value type collection to identify null values. Also, when building a string expression, you can't just say a primitive is equal to null; that will cause a parsing exception. So once we jumped those hurdles, the rest was relatively easy: just check for null params arrays in the overloads, know when to treat them as a null value versus an empty parameter list, and you're all set!

In Conclusion

LINQ rocks. PLINQO just keeps getting better. Check out our new nightly builds, or wait for PLINQO 3.2, coming soon to our Google Code repository near you!

Tuesday, October 20, 2009

PLINQO @ Fort Worth DNUG

The CodeSmith Tools (Shannon and Tom) are continuing their PLINQO presentation tour! Second stop: Fort Worth

If you or any developers that you know will be in the Fort Worth area on October 20th, you should come see us and learn more about PLINQO, the replace and enhance alternative for LINQ to SQL!

Fort Worth DNUG
Topic: PLINQO - Supercharged LINQ To SQL
Speaker: Shannon Davidson & Tom DuPont
Date: October 20th @ 6:00 PM
Where: 610 W. Daggett - Fort Worth, TX 76104

Thursday, October 8, 2009

PLINQO @ North Houston DNUG

The CodeSmith Tools (Shannon and Tom) are taking their PLINQO presentation on the road! First stop: North Houston

If you or any developers that you know will be in the Houston area on October 15th, you should come see us and learn more about PLINQO, the replace and enhance alternative for LINQ to SQL!

North Houston Dot Net User Group
Topic: PLINQO - Supercharged LINQ To SQL
Speaker: Shannon Davidson & Tom DuPont
Date: October 15th @ 6:30 PM
Where: Lone Star College - Montgomery

Monday, August 10, 2009

MVC JSON Model Binder Attribute

Here it is folks: the Ultimate Ultimate* ASP.NET MVC JSON Model Binder!

Moving complex data structures from client to server used to be difficult, but not anymore! Just add the JsonBinder attribute to your action parameters, and this Custom ModelBinder will automatically detect the type and parameter name, and deserialize your complex JSON object to the data structure of your choice.

No configuration required, it works every time, it's PFM!**

* Yes, I said Ultimate twice.
** Pure Friendly Magic


public class JsonBinderAttribute : CustomModelBinderAttribute
{
    public override IModelBinder GetBinder()
    {
        return new JsonModelBinder();
    }

    public class JsonModelBinder : IModelBinder
    {
        public object BindModel(
            ControllerContext controllerContext,
            ModelBindingContext bindingContext)
        {
            try
            {
                // Read the posted JSON string by the parameter name.
                var json = controllerContext.HttpContext.Request[bindingContext.ModelName];
                if (String.IsNullOrEmpty(json))
                    return null;
                // Swap this out with whichever Json deserializer you prefer.
                return Newtonsoft.Json.JsonConvert
                       .DeserializeObject(json, bindingContext.ModelType);
            }
            catch
            {
                // If the value is missing or fails to deserialize, bind nothing.
                return null;
            }
        }
    }
}

Controller Action

public class PersonController : Controller
{
    // Note: The JsonBinder attribute has been added to the person parameter.
    public ActionResult Update([JsonBinder]Person person)
    {
        // Both the person and its internal pet object have been populated!
        ViewData["PetName"] = person.Pet.Name;
        return View();
    }
}

Sample Models

public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
    // Note: This property is not a primitive!
    public DomesticAnimal Pet { get; set; }
}

public class DomesticAnimal
{
    public string Name { get; set; }
    public int Age { get; set; }
    public string Species { get; set; }
}

jQuery Method

function sendToServer() {
    var person = {
        Name: 'Tom',
        Age: 27,
        Pet: {
            Name: 'Taboo',
            Age: 2,
            Species: 'Shiba Inu'
        }
    };
    // {"Name":"Tom","Age":27,"Pet":{"Name":"Taboo","Age":2,"Species":"Shiba Inu"}}
    var personJson = JSON.stringify(person);
    $.ajax({
        url: 'Person/Update',
        type: 'POST',
        data: {
            person: personJson
        }
    });
}
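As a side note on the wire format: JSON.stringify quotes every key, and JSON.parse recovers the same nested structure on the other end, which is exactly what the model binder relies on. A minimal, framework-free sketch (plain JavaScript; the variable names are just illustrative):

```javascript
// Build the same object the jQuery example posts to the server.
var person = {
    Name: 'Tom',
    Age: 27,
    Pet: { Name: 'Taboo', Age: 2, Species: 'Shiba Inu' }
};

// JSON.stringify quotes every key, producing the wire format shown above.
var personJson = JSON.stringify(person);
console.log(personJson);

// JSON.parse round-trips the payload back into the same structure,
// just as the JsonModelBinder deserializes it on the server.
var roundTripped = JSON.parse(personJson);
console.log(roundTripped.Pet.Name); // Taboo
```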


Tuesday, July 28, 2009

Breaking Change in PLINQO 3.2

No, PLINQO 3.2 has not been released yet. However, when it is, there will be a few small but breaking changes introduced with it. As we often recommend that you use PLINQO's latest nightly builds, we wanted to bring this to your attention ASAP.

Breaking Query Extension Changes

We have added the UniqueMethodPrefix property to the Queries.cst template. Now PLINQO will generate two methods for each unique index and primary key. Methods that had previously returned an instance of an entity will now return IQueryable, thus updating to the latest version may result in compiler errors.

  • MethodPrefix Property
    • All methods generated return IQueryable.
    • Defaults to "By"
  • UniqueMethodPrefix Property
    • All methods generated return a single instance of an entity.
    • Defaults to "GetBy"

Don't worry! These method names are still configurable! If you don't like this update, you can change the defaults of your Queries.cst to be whatever you prefer! Also, functionality has only been added, not lost! PLINQO now has methods that return both object and IQueryable types for queries with unique results!


Product p;
using (var db = new PetshopDataContext())
{
    /**** PLINQO 3.1 ****/

    // This "ByKey" method used to return a unique result.
    p = db.Product.ByKey("BD-02");

    // To cache a product by Id, a Where statement had to be used in order to build the IQueryable.
    p = db.Product.Where(x => x.ProductId == "BD-02").FromCache().First();

    /**** PLINQO 3.2 ****/

    // There is no longer a "ByKey" method; it has been replaced with a "GetByKey" method.
    p = db.Product.GetByKey("BD-02");

    // Because there are now By methods for all properties (even ones with unique results),
    // you can now use the extension methods for your cache!
    p = db.Product.ByProductId("BD-02").FromCache().First();
}

Why make these changes?

When we first developed the naming conventions for our extension methods, we did not have as many features that extended IQueryable results. At the time, it made sense that getting by a unique set should only return a single entity rather than an IQueryable with a single result.

Times have changed; PLINQO has expanded and gotten better. PLINQO now offers more features, and frankly, we felt that the API needed to keep up. This update offers the following advantages...

  • Unique result query extensions now work with batch queries.
  • Unique result query extensions now work with the query result cache.
  • Query Extensions are now more consistent; all "By" methods now return the same type.


Tuesday, July 21, 2009

Why CodeSmith Projects?

To me, asking "why would I want to use a CodeSmith Project?" is a lot like asking "why would I want to use a Visual Studio Solution?" Well, for starters, it...

  • Saves you time.
  • Takes less effort.
  • Exposes more functionality.
  • Drastically simplifies automation.
  • IS EASY!

CodeSmith Projects let you control the generation of all your CodeSmith Templates, just like Visual Studio Solutions let you control the compilation of your Visual Studio Projects.


A .csp (CodeSmith Project) file is a simple concept. Like a VS solution or project, it just contains a list of files (in this case, CodeSmith Templates) and a set of configuration data needed to take action against those files (in this case, to execute those templates). This allows you to run multiple templates from one convenient location, and to persist the settings for those templates between executions.

A CodeSmith Project is easy to create: just right-click in CodeSmith Explorer and create a new project, or right-click in Visual Studio, add a file, and select CodeSmith Project. Heck, if those aren't easy enough for you, you can always create a new file with the .csp suffix; then, when you right-click it in Windows Explorer, you can manage or execute it directly from the context menu!

Managing your CodeSmith Project is equally easy. Regardless of where you manage outputs from, you are going to get the same friendly Manage Outputs dialog. It is going to allow you to add, remove, or copy your template entries. You can disable or enable templates for generation, alter their order of execution, or even choose to execute a specific template on the spot. When managing specific template properties the standard CodeSmith Property Grid is used, giving you access to all of the properties and UI pickers from CodeSmith Studio.

Visual Studio Integration

As I mentioned above, you can add a .csp directly into any Visual Studio project. This gives you several advantages...

  • Automated integration of your template output directly into your Visual Studio Project.
    • Don't worry about adding or updating your project files, the CodeSmith Project will take care of it for you!
    • Note: This includes recognition by any plug-ins you may have in Visual Studio, such as source control!
  • Ability to automatically generate on build.
    • Keeps all developers and every build in sync with the latest template output!
    • Generating an ORM? Never be out of sync with the database schema again!

Moral of the Story

Develop smarter, not harder; use CodeSmith Projects to automate your code generation. For more, check out my CodeSmith Projects Video Tutorial.

Monday, June 29, 2009

How PLINQO Improves Performance

Performance Q&A

One of the first questions people always ask about PLINQO is "do we have any performance statistics?" Well, to quote a wiser man than myself: "People can come up with statistics to prove anything. 14% of people know that." ~ Homer J. Simpson

PLINQO's primary plan for performance improvement is simple: reduce the number of trips that you need to make to the database. A round trip to the database and back is one of the most costly things an application can do, but it's also one of the most common things an application must do. PLINQO offers several easy to use features that can dramatically reduce database transactions, but it's up to the developer to use them.

Bottom Line: PLINQO can and will outperform standard LINQ to SQL. By how much? That is entirely up to you!

Batch Updates and Deletes

Updates with LINQ to SQL

To update records with LINQ to SQL, you must retrieve objects from the database, make your updates, and then commit the changes back to the database. This requires TWO round trips to the database.

1) Build the query, query the database, wait for a response, store the data in memory.
2) Make your updates, commit the changes to the database, wait for the response.
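The two steps above can be sketched like this (a sketch only, using the Petshop sample context from our other examples; the table and status values are illustrative):

```csharp
using (var db = new PetshopDataContext())
{
    // Trip 1: query the database and load the matching rows into memory.
    var tasks = db.Task.Where(t => t.StatusId == 1).ToList();

    // Make your updates in memory.
    foreach (var task in tasks)
        task.StatusId = 2;

    // Trip 2: commit the changes back to the database.
    db.SubmitChanges();
}
```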

Batch Updates with PLINQO

PLINQO offers a batch update method on its table objects. To update as many records as you want, merely create a query, send it to the database, and get back your result. Unlimited rows, ONE trip.

1) Build the query, update the database, get the response.

// Update all Tasks with a StatusId of 1 to have a StatusId of 2.
context.Task.Update(t1 => t1.StatusId == 1, t2 => new Task() { StatusId = 2 });

Batch Deletes

To delete a record in LINQ to SQL you must first retrieve that record. That is TWO complete round trips to the database to delete one little record! PLINQO allows for batch deletes in the same manner as batch updates, as many records as you want, without loading records into memory, and in just ONE trip.

// Delete all tasks where StatusId is 2.
context.Task.Delete(t => t.StatusId == 2);

Stored Procedures

While the LINQ to SQL designer does support stored procedures, it only supports returning a single result set. PLINQO supports stored procedures with multiple result sets, and provides a simple way to handle those results. Again, this is yet another way PLINQO helps get you more data with fewer trips to the database.

// Create Procedure [dbo].[GetUsersWithRoles]
// As
// Select * From [User]
// Select * From UserRole
// GO
var results = context.GetUsersWithRoles();
List<User> users = results.GetResult<User>().ToList();
List<UserRole> roles = results.GetResult<UserRole>().ToList();

Batch Queries

In LINQ to SQL every query is a trip to the database. It doesn't matter if you have five queries in succession that require no logic in between, you must still make each and every query separately. The PLINQO DataContext offers an ExecuteQuery overload that will execute as many queries as you want in a single transaction. This is an extremely simple feature to use, and it can drastically improve performance in every day development scenarios.

var q1 = from u in context.User select u;
var q2 = from t in context.Task select t;
IMultipleResults results = context.ExecuteQuery(q1, q2);
List<User> users = results.GetResult<User>().ToList();
List<Task> tasks = results.GetResult<Task>().ToList();

Monday, June 22, 2009


Yo dawg! I heard you like cache, so we put a caching mechanism in yo server side cache, so you can cache while you cache!

...but seriously, PLINQO now includes a built in caching mechanism! :)

  • All IQueryable result sets can now be dynamically cached/retrieved right from the query.
  • The cache is accessible via an IQueryable extension method (FromCache), thus the cached objects are not DataContext specific.
  • The cache duration is set at the time of the query; it can be stored for a specific time span or use a sliding expiration.


using (var context = new PetshopDataContext())
{
    // Cache a result set. (A query is made to the DB.)
    var birds = context.Product.GetByCategoryId("BIRDS").FromCache().ToList();
    // Get a single entity from that cache. (No query is made to the DB.)
    var firstBird = context.Product.GetByCategoryId("BIRDS").FromCache().FirstOrDefault();

    // Specify the number of seconds to cache. (A query is made to the DB.)
    var penguin = context.Product.GetByName("Penguin").FromCache(60).FirstOrDefault();
    // Get the same result set back as a list. (No query is made to the DB.)
    var penguins = context.Product.GetByName("Penguin").FromCache(60).ToList();
}


This feature is not yet available in an official PLINQO release; to use the cache, you will have to download the latest PLINQO Nightly Build.

To access the FromCache extension method you must...

  1. Include a reference to the following assemblies...
    1. CodeSmith.Data
    2. System.Data.Linq
    3. System.Data.Services
    4. System.Web
  2. Include a using/import statement for the CodeSmith.Data.Linq namespace.

Monday, April 20, 2009

Using CodeSmith.CodeFileParser

The CodeFileParser

In CodeSmith v5.1 we are introducing a new feature: the CodeFileParser!

The Inspiration

We are always wondering, how can we make CodeSmith better? We noticed that more and more cars have the 'flex fuel' logo on them these days; so we asked ourselves, "How can we learn from that? How can we make CodeSmith's fuel more flexible?" Well, CodeSmith is fueled by metadata, so how can we make metadata more flexible? ...and then it came to us: add more metadata!

The Feature

The CodeFileParser will make it easier for CodeSmith templates to use code files as their metadata source. It is a simple class that takes in a file path, or even a content string, parses the class, and returns an easy to walk/search DOM object. So think about that for a moment; this means that you can generate code, *dramatic pause*, from code.

I'll just let that sink in...
...take your time...
...pretty sweet, huh?

The Implementation

The Class

CodeSmith.Engine now contains the CodeFileParser class. It is capable of parsing both C# and VB code. As mentioned above, it is capable of taking in a file path or a content string, and it will take care of reading the file (or string) and parsing the contents for you. Under the hood the CodeFileParser uses the public NRefactory libraries created by the great team over at SharpDevelop.

// There are overloads that don't require basePath or parseMethodBodies.
public CodeFileParser(string fileName, string basePath, bool parseMethodBodies)
// There are overloads that don't require parseMethodBodies.
public CodeFileParser(string source, SupportedLanguage language, bool parseMethodBodies)

The Selection Methods

Most of the methods in NRefactory return position information in the form of Location objects, which, while very descriptive, are not the easiest thing to use when trying to take substrings or selections from the existing code. Because this can be very important when using the object DOM to assist with code generation, we have added several methods to assist with getting substrings and selections; these methods take in Location objects and return strings.

public string GetSectionFromStart(Location end)
public string GetSectionToEnd(Location start)
public string GetSection(Location start, Location end)

The CodeDomCompilationUnit

To quickly and easily walk the DOM, the CodeFileParser exposes a (lazy loaded) property that returns a System.CodeDom.CodeCompileUnit object. This is a standard .NET object that contains a complete code graph; this object is the quickest and easiest way to traverse your metadata. For more information about the CodeCompileUnit, please check out the MSDN article.

The Visitor

When more advanced or customized information is required, the CodeFileParser exposes the CompilationUnit object, which is capable of taking in a visitor object to traverse the DOM and bring back specific data. This is an NRefactory feature, and it only requires that your visitor object implement the AbstractAstVisitor class.

The Example

The We're Already Using It!

We are already using the CodeFileParser in CodeSmith and our PLINQO templates! In CodeSmith we have implemented the CodeFileParser in our InsertClassMergeStrategy; it allows us to parse the existing code file and determine where we need to insert our new content. In PLINQO we use the CodeFileParser to assist with our MetaData class merge; it allows us to make a map of all the properties in that class and then preserve their attributes during regeneration.

The Template Code

<%@ CodeTemplate Language="C#" TargetLanguage="Text" Debug="False" CompilerVersion="v3.5" %>
<%@ Property Category="2.Class" Name="TheFile" Type="CodeFileParser" Optional="False" %>
<%@ Assembly Name="CodeSmith.CodeParser" %>
<%@ Import Namespace="System.CodeDom" %>

<%  foreach(CodeNamespace n in TheFile.CodeDomCompilationUnit.Namespaces) { %>
Namespace: <%= n.Name %>
<%      foreach(CodeTypeDeclaration t in n.Types) { %>
    Type: <%= t.Name %>
<%          foreach(CodeTypeMember m in t.Members) { %>
        Member: <%= m.Name %>
<%          } %>
<%      } %>
<%  } %>

Small footnote, thanks to Seinfeld for inspiring my section title names in this blog post.

Monday, March 9, 2009

CodeSmith on Windows 7

CodeSmith on Windows 7

First and foremost: CodeSmith 5.0 is (unofficially) 100% compatible with Windows 7, 64 bit edition!

I say "unofficially" because obviously I am not allowed to make such a statement without extensive testing, certification, and/or blah blah blah... but hey, "it works on my machine!"

But seriously, we have done some internal testing here at CodeSmith Tools, and so far we have not experienced any issues with running CodeSmith Professional 5.0 on Windows 7 (x64). So for all you brave pioneers out there: that should be one less arrow to worry about!

Windows 7 Beta

I have been running the Windows 7 Beta on my personal laptop (an ASUS G50VT) since the beginning of January. So far, I have to give it TWO THUMBS WAY UP!

I am not proud to say it, but I pretty much skipped over Vista entirely...
*insert sob story about getting frustrated with Vista here* To Microsoft's credit, they were not afraid to try something new; and while SP1 did fix Vista, it was a little too late for me.

After reading almost nothing but very positive reviews (and feeling guilty about still living in 2001), I decided to grab a copy of Windows 7 and give it a shot. Windows 7 was a reasonable 3 gig download. The OS installed on the first try and took less than 30 minutes. Even on my picky gaming laptop, drivers were easy to install. Almost every application I have wanted to use installed on the first try, and the ones that didn't were almost all fixed by simply setting compatibility mode to Vista.

Aside from being both fast and stable, Windows 7 also has plenty of other good features. Everything about the new taskbar is phenomenal: the pinning and the preview pane combine with Aero to form a series of intuitive features that come together in a dynamic and sleek new interface. The OS provides native Virtual Hard Drive support; you can create, mount, and even boot from VHDs! Also, the new backup system takes advantage of this by allowing Windows to back up a system image straight to a VHD.

Development on Windows 7

Having had such a good experience at home, I decided to try out Windows 7 at work. First try, no errors, no hang ups, no complications, I installed all the following software...

  1. Visual Studio 2008 w/ SP1
  2. Microsoft SQL Server 2008
  3. CodeSmith 5.0
  4. TestDriven.Net
  5. NUnit
  6. Asp.Net MVC RC2
  7. Aptana Studio
  8. Firefox 3.0 w/ Firebug 1.3
  9. SlySoft Virtual CloneDrive

...and so far, everything has worked flawlessly.

So in conclusion, when I say CodeSmith works on Windows 7, I am also saying that it builds on Windows 7.

Friday, February 20, 2009

Defining Enum Tables

Back in January I posted about "Defining Many To Many" tables; and now (to take a note from the Colbert Report), here is Part 2 in my infinite part series...

Better Know an ORM Programmatic Definition: Defining Enum Tables!

The idea here is that we want to generate Enums from database tables; for each table we specify as an Enum table, we want to create an Enum and populate its values from the rows in that table. As always, the goal is to make this solution as generic as possible. We want it to work on pretty much any database we throw at it, we want it to check for the usual pitfalls, and of course we want what we generate to be as useful as possible! Let's begin with an example...

Example Table (Input)

Table Name: StarTrek
Columns: Id (int), Name (string), Captain (string)
Row 1: "Original", "James Tiberius Kirk"
Row 2: "Animated", "James Tiberius Kirk"
Row 3: "TNG", "Jean-Luc Picard"
Row 4: "DS9", "Benjamin Sisko"
Row 5: "Voyager", "Kathryn Janeway"
Row 6: "Enterprise", "Scott Bakula? Seriously?"

Example Enum (Output)

public enum StarTrek : int
{
    /// <summary>James Tiberius Kirk</summary>
    Original = 1,
    // ...
    /// <summary>Scott Bakula? Seriously?</summary>
    Enterprise = 6
}

So, what is the logic?

1) Explicitly select the table.

While associations can be determined by examining keys and other qualities, Enum tables just don't have enough unique qualities to be identified in that manner; thus we will want to explicitly choose our Enum tables. For this task I recommend a Regex for Enum table names; however, you could always use a good old-fashioned table list. Now that we have identified that the table SHOULD be an Enum, we need to determine if it CAN be an Enum table...

2) The table must have a primary key.

Enum values can be assigned a numeric value, so to allow for association mapping and logical comparisons it's a good idea that our generated Enums are assigned meaningful values at generation time. NOTE: While assigning a numeric value to Enum values can serve many different purposes, the following was specifically chosen because it allows Enums to act as database associations for business entity objects.

2.A) The table must have a primary key composed of a single column. (It's hard to have a composite key that evaluates to a single numerical value.)

2.B) The primary key column must be of a number type. (This is so that the Enum values can be assigned to the key value.)

3) The table must have a column to specify the values.

Well, if the table is the Enum itself, where are the values going to come from? You have to choose which column the value is going to come out of! Again I recommend using a Regex to find this column by name, but if that fails (or if you are feeling lazy) you could default to taking the first column of a string type.

4) The table must have at least one row.

Firstly, this is because there's not a lot of use for an empty Enum; but also, some languages (such as VB) don't support it.


When generating the enums, it might come in handy to generate the description for each value as well (as we did in the example above); so the code below includes an extra function for finding that description with (surprise, surprise) a Regex.

And finally, here is what your code might look like...

public static Regex EnumTableNameRegex = new Regex("(E|e)num$", RegexOptions.Compiled);
public static Regex EnumValueColumnRegex = new Regex("((V|v)alue)|((N|n)ame)|((T|t)ype(C|c)ode)", RegexOptions.Compiled);
public static Regex EnumDescriptionColumnRegex = new Regex("(D|d)esc", RegexOptions.Compiled);

public bool IsEnum(TableSchema table)
{
    return EnumTableNameRegex.IsMatch(table.Name)                  // 1) Matches the enum regex.
        && table.PrimaryKey != null                                // 2) Has a Primary Key...
        && table.PrimaryKey.MemberColumns.Count == 1               //    a) ...that is a single column...
        && IsNumeric(table.PrimaryKey.MemberColumns[0].SystemType) //    b) ...of a number type.
        && !string.IsNullOrEmpty(GetEnumValueColumn(table))        // 3) Contains a column for the value.
        && table.GetTableData().Rows.Count > 0;                    // 4) Must have at least one row.
}

private bool IsNumeric(Type t)
{
    return t == typeof(byte)
        || t == typeof(sbyte)
        || t == typeof(short)
        || t == typeof(ushort)
        || t == typeof(int)
        || t == typeof(uint)
        || t == typeof(long)
        || t == typeof(ulong);
}

public string GetEnumValueColumn(TableSchema table)
{
    string result = GetEnumColumn(table, EnumValueColumnRegex);

    // If no Regex match is found, use the first column of type string.
    if (string.IsNullOrEmpty(result))
        foreach (ColumnSchema column in table.Columns)
            if (column.SystemType == typeof(string))
            {
                result = column.Name;
                break;
            }

    return result;
}

private string GetEnumColumn(TableSchema table, Regex regex)
{
    string result = string.Empty;

    foreach (ColumnSchema column in table.Columns)
        if (regex.IsMatch(column.Name))
            result = column.Name;

    return result;
}

public string GetEnumDescriptionColumn(TableSchema table)
{
    return GetEnumColumn(table, EnumDescriptionColumnRegex);
}

Monday, January 19, 2009

5.1 Preview: Insert Class Merge Strategy

Coming in v5.1, the Insert Class Merge Strategy will be CodeSmith's third native Merge Strategy. The premise is simple: Insert your template output into a previously existing class in the output file.

At first this may sound very similar to the Insert Region Merge Strategy, and indeed it did start out that way; however, this Merge Strategy has many additional settings and features that separate it from its fellow Merge Strategies and make it a very robust and powerful tool.

I think the best way to describe this Merge Strategy is through example; but before we can do that, we must first go over its configuration options...

Configuration Options

Language (String, Required)
Only supports C# and VB.

ClassName (String, Required)
Name of the class to insert into.

PreserveClassAttributes (Boolean, defaults to False)
Whether or not the merge should preserve the existing class's attributes.
By default, the merge tries to replace the entire existing class, which includes the attributes on top of the class; this option keeps the attributes from the original class.

OnlyInsertMatchingClass (Boolean, defaults to False)
Insert the whole template output, or just the matching class.

MergeImports (Boolean, defaults to False)
Merge the import/using statements of the existing file and generated output.

NotFoundAction (Enum, defaults to None)
What to do if the class is not found in the existing file. There are three options...
None: Don't merge anything, just leave the existing file as is.
InsertAtBottom: Append the output of the template to the bottom of the existing file.
InsertInParent: Insert the output of the template at the bottom of a specified parent section (specified by the NotFoundParent property).

NotFoundParent (String, no default)
If you specified InsertInParent for the NotFoundAction configuration, you must specify a name for the parent region.
This can be the name of a Region or a Class.

Example Configuration...

Language: C#
ClassName: "Pet"
PreserveClassAttributes: True
OnlyInsertMatchingClass: True
MergeImports: True

...Existing File...

using System;
using System.ComponentModel.DataAnnotations;

namespace Petshop
{
    public class Pet
    {
        public int Age { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
    }
}

...Generated Output...

using System;
using System.Text;

namespace Petshop
{
    public class Pet
    {
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public string FullName
        {
            get { return String.Format("{0} {1}", FirstName, LastName); }
        }
    }
}

...Insert Class Merge Strategy Result!

using System;
using System.ComponentModel.DataAnnotations;
using System.Text;

namespace Petshop
{
    public class Pet
    {
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public string FullName
        {
            get { return String.Format("{0} {1}", FirstName, LastName); }
        }
    }
}

Thursday, January 8, 2009

Defining Many To Many

There are a lot of simple tasks that we humans can do with little effort, but when put into logic, can often become quite difficult to define. A great example of this is Many To Many Table Associations.

So, how do you programmatically take a table and tell if it is a many-to-many association? My train of thought started off simple: the table has two foreign key columns. Then the flaws started rolling in...

  1. Regular tables could have two foreign keys.
  2. It can't just be two columns, there could be time stamps or version numbers.
  3. It can't just be two key columns, because there could be composite keys.
  4. It may or may not have a primary key.
  5. The primary key could be a composite key on the foreign keys.

It seems that I had taken my human brain (specifically its mad pattern recognition skillz) for granted! :) So after a brief discussion in the office (and on our forums), we came up with the following logic:

  1. Table must have Two ForeignKeys.
  2. All columns must be either...
    1. Member of the Primary Key.
    2. Member of a Foreign Key.
    3. A DateTime stamp (CreateDate, EditDate, etc).
    4. Match our Row Version regular expression.

Of course, there could always be other things out there that we didn't think of. In this world there are many technologies, with many conventions, used by many programmers, all unique in their own way. So, unfortunately, there is no truly simple answer, nor is there a perfect solution... However, that is why we here at CodeSmith always try to be as generic and flexible as possible in our designs! It's also why we love to use things like Extended Properties, and how we get our last criterion:

0) Bypass logic if table contains Extended Property for ManyToMany

So, finally, here is what the code might look like...

Note: This is (a slightly modified version of) what is in our NHibernate templates.

public static bool IsManyToMany(TableSchema table)
{
    // 0) Bypass logic if table contains Extended Property for ManyToMany.
    if (table.ExtendedProperties.Contains("CS_ManyToMany"))
        return true;

    // 1) Table must have two ForeignKeys.
    // 2) All columns must be either...
    //    a) Member of the Primary Key.
    //    b) Member of a Foreign Key.
    //    c) A DateTime stamp (CreateDate, EditDate, etc).
    //    d) Name matches the Version Regex.

    if (table.ForeignKeys.Count != 2)
        return false;

    bool result = true;
    Regex versionRegex = new Regex("(V|v)ersion");

    foreach (ColumnSchema column in table.Columns)
        if (!( column.IsForeignKeyMember
            || column.IsPrimaryKeyMember
            || column.SystemType.Equals(typeof(DateTime))
            || versionRegex.IsMatch(column.Name)))
            result = false;

    return result;
}
