
How does a default parameter work?
One of the things that I love about C# is how many of its features are just conveniently designed compiler tricks. This means that, just like any other magic trick, once you know how the trick is performed you immediately realize that there is nothing magical about it at all!
So, let's talk about default parameters. They are actually just constant values that the compiler bakes into the call site whenever you invoke a method that has them. Let's look at an example...
public class DefaultParamTests1
{
    [Fact]
    public void WhatYouWrite()
    {
        var actual = Double();
        Assert.Equal(2, actual);
    }

    private static int Double(int i = 1)
    {
        return i * 2;
    }
}
public class DefaultParamTests2
{
    [Fact]
    public void WhatItCompilesTo()
    {
        var actual = Double(1);
        Assert.Equal(2, actual);
    }

    private static int Double(int i)
    {
        return i * 2;
    }
}
What happens when interfaces and methods don't match?
So, now that you know how the trick is performed, what happens if you use a different default value for a parameter defined by an interface and a class?
The answer is simple: the default is resolved at compile time from the static type of the variable. If your object is typed as the class, the class's value is used; if it is typed as the interface, the interface's value is used instead. Let's take a look at another example...
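Here is a sketch of what such a test might look like, following the same xUnit style as above; the IDoubler/Doubler names are my own invention for illustration:

public interface IDoubler
{
    // The interface declares a default of 1...
    int Double(int i = 1);
}

public class Doubler : IDoubler
{
    // ...but the class declares a default of 5.
    public int Double(int i = 5)
    {
        return i * 2;
    }
}

public class DefaultParamTests3
{
    [Fact]
    public void ClassTypeUsesClassDefault()
    {
        Doubler doubler = new Doubler();
        // The compiler sees a Doubler, so it bakes in 5.
        Assert.Equal(10, doubler.Double());
    }

    [Fact]
    public void InterfaceTypeUsesInterfaceDefault()
    {
        IDoubler doubler = new Doubler();
        // Same object, but the compiler sees an IDoubler, so it bakes in 1.
        Assert.Equal(2, doubler.Double());
    }
}

Notice that it is the same runtime object in both tests; only the compile-time type of the variable changes, and that alone determines which constant gets compiled into the call.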