Monday, June 30, 2014

Implementing notify property changed with SheepAspect

Introduction

I spent some time looking for AOP options for .NET. All references point to PostSharp as the coolest option - even when you have to pay to use it - due to its features, starting with the way it works – at build time and statically – plus a lot of built-in advice and extension points.

There are also several runtime and dynamic options such as Castle, Unity, etc., but I prefer the build-time, static approach. Indeed, I use Fody - as an alternative to PostSharp - even though I have to write custom plugins directly in IL.

But none of these AOP-like libraries and tools for .NET let you work with all the AOP concepts exactly as defined, such as pointcut, join point, and advice.

But recently I found a project on CodePlex named SheepAspect, whose introductory statement includes the words “…was inspired in AspectJ”. So, I just installed the package from NuGet and started to write a notify property changed proof of concept in C# with a friend of mine (Leandro).

Introducing NotifyPropertyChanged aspect

The very first step is to write a NotifyPropertyChangedAspect class. It’s important to get familiar with SAQL in order to query the right properties from the right types, and with the usage of the SheepAspect attributes. For this particular example the pointcut includes “all public property setters of types that implement System.ComponentModel.INotifyPropertyChanged” and looks like this:
[Aspect]
public class NotifyPropertyChangedAspect
{
    [SelectTypes("ImplementsType:'System.ComponentModel.INotifyPropertyChanged'")]
    public void NotifiedPropertyChangedTypes()
    {
    }

    [SelectPropertyMethods("Public & Setter & InType:AssignableToType:@NotifiedPropertyChangedTypes")]
    public void PublicPropertiesOfTypesThatImplementsINotifyPropertyChangedInterfacePointCut()
    {
    }
}

Now, I just needed to add the notify property changed behavior as around advice, as follows:

[Around("PublicPropertiesOfTypesThatImplementsINotifyPropertyChangedInterfacePointCut")]
public void AdviceForPublicPropertiesOfTypesThatImplementsINotifyPropertyChangedInterface(PropertySetJointPoint jp)
{
    object value = jp.Property.GetValue(jp.This);
    if (!object.Equals(value, jp.Value))
    {
        jp.Proceed();
        jp.This.RaiseNotifyPropertyChanged(jp.Property);
    }
}

As you may notice, to implement this I also introduced a couple of extension methods: TryGetPropertyChangedField and, of course, RaiseNotifyPropertyChanged itself:

public static bool TryGetPropertyChangedField(this Type type, out FieldInfo propertyChangedEvent)
{
    propertyChangedEvent = null;
    while (propertyChangedEvent == null && type != null && type != typeof(object))
    {
        propertyChangedEvent = type.GetField("PropertyChanged", BindingFlags.Instance | BindingFlags.NonPublic);
        if (propertyChangedEvent == null)
        {
            type = type.BaseType;
        }
        else if (!typeof(MulticastDelegate).IsAssignableFrom(propertyChangedEvent.FieldType))
        {
            propertyChangedEvent = null;
        }
    }

    return propertyChangedEvent != null;
}

/*...*/

public static void RaiseNotifyPropertyChanged(this object instance, PropertyInfo property)
{
    FieldInfo propertyChangedEvent;
    if (instance.GetType().TryGetPropertyChangedField(out propertyChangedEvent))
    {
        var propertyChangedEventMulticastDelegate = (MulticastDelegate)propertyChangedEvent.GetValue(instance);
        var invocationList = propertyChangedEventMulticastDelegate.GetInvocationList();
                
        foreach (var handler in invocationList)
        {
            MethodInfo methodInfo = handler.GetMethodInfo();
            methodInfo.Invoke(handler.Target, new[] { instance, new PropertyChangedEventArgs(property.Name) });
        }
    } 
}

Now, if a class that implements INotifyPropertyChanged (or inherits from a class that implements it) is added, just like this one:

public class Person : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    public string FirstName { get; set; }

    public string LastName { get; set; }

    public string FullName
    {
        get
        {
            return string.Format(CultureInfo.InvariantCulture, "{0} {1}", this.FirstName, this.LastName).Trim();
        }
    }
}

and a program like this one is written:

this.person.PropertyChanged += (sender, args) =>
                {
                    object value = sender.GetType().GetProperty(args.PropertyName).GetValue(sender);
                    Console.WriteLine("Property Changed => '{0}' = '{1}'", args.PropertyName, value);
                };

this.person.FirstName = "Igr Alexánder";
this.person.LastName = "Fernández Saúco";

then the output will be:

Property Changed => 'FirstName' = 'Igr Alexánder'
Property Changed => 'LastName' = 'Fernández Saúco'

Hmm! But what about computed read-only properties like FullName?

Notifying property changes of computed read-only properties.

In order to notify changes of computed read-only properties, an inspection of the IL code is required. The native .NET reflection API is limited here: from a PropertyInfo it is only possible to get the raw IL byte array of the getter's method body, and nothing more. Therefore, I switched to the Mono.Cecil reflection API to be able to complement the existing RaiseNotifyPropertyChanged extension method with the following extension methods:

public static bool ExistPropertyDependencyBetween(this Type type, PropertyInfo dependentProperty, PropertyInfo propertyInfo)
{
    AssemblyDefinition assemblyDefinition = new DefaultAssemblyResolver().Resolve(type.Assembly.FullName);
    TypeDefinition typeDefinition = assemblyDefinition.MainModule.GetType(type.FullName);
    PropertyDefinition dependentPropertyDefinition = typeDefinition.Properties.FirstOrDefault(definition => definition.Name == dependentProperty.Name);
    
    bool found = false;
    if (dependentPropertyDefinition != null)
    {
        MethodDefinition definition = dependentPropertyDefinition.GetMethod;
        if (definition.HasBody)
        {
            ILProcessor processor = definition.Body.GetILProcessor();

            int idx = 0;
            while (!found && idx < processor.Body.Instructions.Count)
            {
                Instruction instruction = processor.Body.Instructions[idx];
                
                MethodDefinition methodDefinition;
                if (instruction.OpCode == OpCodes.Call
                    && (methodDefinition = instruction.Operand as MethodDefinition) != null
                    && methodDefinition.DeclaringType.IsAssignableFrom(typeDefinition)
                    && methodDefinition.Name == string.Format(CultureInfo.InvariantCulture, "get_{0}", propertyInfo.Name))
                {
                    found = true;
                }
                else
                {
                    idx++;
                }
            }
        }
    }

    return found;
}

/*...*/

public static IEnumerable<PropertyInfo> GetDependentPropertiesFrom(this Type type, PropertyInfo property)
{
    List<PropertyInfo> dependentPropertyInfos = type.GetProperties(BindingFlags.Instance | BindingFlags.Public)
        .Where(dependentProperty => property != dependentProperty
                                    && dependentProperty.CanRead
                                    && type.ExistPropertyDependencyBetween(dependentProperty, property))
        .ToList();
    for (int i = 0; i < dependentPropertyInfos.Count; i++)
    {
        foreach (PropertyInfo info in type.GetDependentPropertiesFrom(dependentPropertyInfos[i]))
        {
            if (!dependentPropertyInfos.Contains(info))
            {
                dependentPropertyInfos.Add(info);
            }
        }
    }

    return dependentPropertyInfos;
}

These new methods are used by an updated version of the RaiseNotifyPropertyChanged extension method, which now also raises notifications for the dependent computed properties, just like this:

public static void RaiseNotifyPropertyChanged(this object instance, PropertyInfo property)
{
    FieldInfo propertyChangedEvent;
    if (instance.GetType().TryGetPropertyChangedField(out propertyChangedEvent))
    {
        var propertyChangedEventMulticastDelegate = (MulticastDelegate)propertyChangedEvent.GetValue(instance);
        var invocationList = propertyChangedEventMulticastDelegate.GetInvocationList();
                
        foreach (var handler in invocationList)
        {
            MethodInfo methodInfo = handler.GetMethodInfo();
            methodInfo.Invoke(handler.Target, new[] { instance, new PropertyChangedEventArgs(property.Name) });
        }

        foreach (PropertyInfo propertyInfo in instance.GetType().GetDependentPropertiesFrom(property))
        {
            foreach (var handler in invocationList)
            {
                MethodInfo methodInfo = handler.GetMethodInfo();
                methodInfo.Invoke(handler.Target, new[] { instance, new PropertyChangedEventArgs(propertyInfo.Name) });
            }
        }
    }
}

So, now the program output is:

Property Changed => 'FirstName' = 'Igr Alexánder'
Property Changed => 'FullName' = ' Igr Alexánder' 
Property Changed => 'LastName' = 'Fernández Saúco'
Property Changed => 'FullName' = ' Igr Alexánder Fernández Saúco'

Conclusion

At this point, you may be worried about performance and the large number of reflection API calls. I'm pretty sure that this issue could be handled with the right caching approach ;).
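
For instance, here is a minimal sketch of what such a caching approach could look like. The DependentPropertyCache helper and its GetDependentPropertiesCached method are hypothetical (they are not part of the implementation above); they simply memoize the expensive Mono.Cecil-based lookup per type and property name:

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

public static class DependentPropertyCache
{
    // Caches the result of the Mono.Cecil-based dependency analysis per (type, property name).
    private static readonly ConcurrentDictionary<Tuple<Type, string>, PropertyInfo[]> Cache =
        new ConcurrentDictionary<Tuple<Type, string>, PropertyInfo[]>();

    public static IEnumerable<PropertyInfo> GetDependentPropertiesCached(this Type type, PropertyInfo property)
    {
        // The IL of the property getters never changes at runtime,
        // so the dependency graph can be computed once and reused.
        return Cache.GetOrAdd(
            Tuple.Create(type, property.Name),
            _ => type.GetDependentPropertiesFrom(property).ToArray());
    }
}

RaiseNotifyPropertyChanged could then call GetDependentPropertiesCached instead of GetDependentPropertiesFrom.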

I don't know why I never heard about SheepAspect before. Probably no one trusts something called “Sheep”, but believe me, SheepAspect rocks!

Wait a second! Right now I'm reading about something called mixins. Probably I can get a more declarative approach to implement this, just like the Fody or PostSharp home page examples. But I'm not sure right now, so you will have to wait for my next post to find out ;).

Friday, February 21, 2014

Developing a ReSharper Plugin – The backward compatibility approach

Introduction

We have been developing a ReSharper plugin for the Catel framework, also known as CatelR#, for a while, following an interesting approach in order to support new R# versions while keeping backward compatibility.

If you want to know how we made it, just take a look.

Visual Studio solution setup

1)    Create a project per supported R# version, which means that the output of each project targets the specific version of R# and references the specific version of the SDK.



2)    Keep in mind that there could be several breaking changes between R# SDK versions, but nothing that can't be handled with pre-processor directives.

#if R70 || R71 || R80
        protected override void Process(CSharpGeneratorContext context)

#elif R61
        public override void Process(CSharpGeneratorContext context)
#endif
        {
            CSharpElementFactory factory = CSharpElementFactory.GetInstance(context.Root.GetPsiModule());
#if R80
            IDeclaredType viewModelToModelAttributeClrType = TypeFactory.CreateTypeByCLRName(CatelMVVM.ViewModelToModelAttribute, context.PsiModule, UniversalModuleReferenceContext.Instance);
#else
            IDeclaredType viewModelToModelAttributeClrType = TypeFactory.CreateTypeByCLRName(CatelMVVM.ViewModelToModelAttribute, context.PsiModule);
#endif
            /*...*/
#if R80
                        var fixedArguments = new List<AttributeValue> { new AttributeValue(ClrConstantValueFactory.CreateStringValue(model.ShortName, context.PsiModule, UniversalModuleReferenceContext.Instance)) };
#else
                        var fixedArguments = new List<AttributeValue> { new AttributeValue(ClrConstantValueFactory.CreateStringValue(model.ShortName, context.PsiModule)) };
#endif
                        if (propertyName != modelProperty.ShortName)
                        {
#if R80
                            fixedArguments.Add(new AttributeValue(ClrConstantValueFactory.CreateStringValue(modelProperty.ShortName, context.PsiModule,  UniversalModuleReferenceContext.Instance)));
#else
                            fixedArguments.Add(new AttributeValue(ClrConstantValueFactory.CreateStringValue(modelProperty.ShortName, context.PsiModule)));
#endif
                        }
            /*...*/
                    }
                }
            }
        }

3)    Keep all these project sources synchronized. This can be very easy thanks to Caitlyn.




4)   Finally, redirect the projects' build outputs in order to deal easily with the packaging of the deployment units.

Building the deployment units

The R# plugin build process is indeed a heterogeneous one, like any build process. Even though this build could be handled via MSBuild tasks, we actually recommend the usage of tools with an intuitive GUI, such as FinalBuilder or VisualBuild, in order to quickly create and debug such build "scripts".


1) Since R# version 8.0, the NuGet-based extension manager is available. Therefore one of the build outputs could be a NuGet package to distribute your plugin through the ReSharper extension gallery.

2) But the NuGet-based extension manager is not available for all R# versions. Therefore a second build output could be a classic deployment unit built on top of any of the existing installer systems, for instance InnoSetup or NSIS.

The following are code snippets from the install and uninstall sections of the CatelR# setup. Notice how we deal with the build output to support all R# versions.


# ...
# Installer section
# ...

Push "v6.1"
Push "v7.0"
Push "v7.1"
Push "v8.0"
Push "v8.1"
${Do}
  Pop $0
  ReadRegStr $1 HKLM "Software\JetBrains\ReSharper\$0" InstallDir
  ${If} $1 != ''
    DetailPrint "Installing Catel.ReSharper for JetBrains ReSharper $0"
    SetOutPath "$1\Plugins\$(^Name)"
    ${If} $0 == 'v6.1'
      File /r "..\..\output\Debug\v6.1\*.dll"
    ${ElseIf} $0 == 'v7.0'
      File /r "..\..\output\Debug\v7.0\*.dll"
    ${ElseIf} $0 == 'v7.1'
      File /r "..\..\output\Debug\v7.1\*.dll"
    ${ElseIf} $0 == 'v8.0'
      File /r "..\..\output\Debug\v8.0\*.dll"
    ${ElseIf} $0 == 'v8.1'
      File /r "..\..\output\Debug\v8.1\*.dll"
    ${EndIf}
    Push true
    Pop $3
    WriteRegStr HKLM "${REGKEY}" "$0" 1
  ${EndIf}
${LoopUntil} $0 == "v6.1"

# ... 

# ... 
# Uninstaller section
# ... 

Push "v6.1"
Push "v7.0"
Push "v7.1"
Push "v8.0"
Push "v8.1"
${Do}
  Pop $0
  ReadRegStr $1 HKLM "${REGKEY}" "$0"
  ${If} $1 == '1'
    ReadRegStr $2 HKLM "Software\JetBrains\ReSharper\$0" InstallDir
    ${If} $2 != ''
      RMDir /r /REBOOTOK "$2\Plugins\$(^Name)" 
      DeleteRegValue HKLM "${REGKEY}" "$0"
    ${EndIf}
  ${EndIf}
${LoopUntil} $0 == "v6.1"  

# ...


Conclusions

Just a few minutes ago, I read the notification about the release of R#8.2 EAP. It’s a good moment to revalidate this approach. Let’s see,… creating a new project with post-fix 82, …installing the SDK package, ..., ...time out, sorry…slow connection…,…,…,..updating the build script, …, …, … reviewing for breaking changes, good news, there are no breaking changes,… updating setup script…., committing source modifications, … running the build script and it’s done.

Now you can update the extension from the extension gallery or download the full installer of CatelR# with support for R#8.2 EAP ;)

Tuesday, October 22, 2013

Catel extends Prism modularity options through NuGet

Introduction


Prism provides a lot of modularity options by default. The fact is that most of the stuff you need to build a modularized application is available in Prism. Prism contains several modularity options, including ways to retrieve modules from a directory, set them up through the configuration file, and even options to create modularized Silverlight applications.

On the other hand, Catel provides support for module catalog composition in order to enable more than one modularity option in a single application.

Nowadays, initiatives around NuGet come from everywhere, such as Shimmer with the same goal as ClickOnce (but it works), OctopusDeploy for release promotion and application deployment, or even the recently released extension manager of ReSharper 8.0.

So, the Catel team comes with a new one, and the fact is that we wondered why no one thought about this before:

What if you were able to distribute your application's modules through NuGet?

If you want to know how, take a look.


Enabling the NuGet-based modularity option


With the forthcoming 3.8 version of Catel you will be able to install and load modules from a NuGet repository. The only thing you have to do is:
  1. Write a module as you usually do with Prism (or even better, in combination with Catel).
  2. Create a NuGet package for the module and publish it into your favorite gallery.
  3. Use and configure the NuGetBasedModuleCatalog in the application’s bootstrapper.
The first two steps are well documented. For details, see “creating and publishing a package” and the “declaring modules from NuGet” Catel documentation.


So, now we can focus on the third one.

Use and configure the NuGetBasedModuleCatalog in the application’s bootstrapper


In order to simplify the following explanation, we published a package to the official NuGet Gallery named Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA. Such a package contains an assembly with a single class like this one:
 
public class NuGetBasedModuleA : ModuleBase
{
    public NuGetBasedModuleA(IServiceLocator container):base("NuGetBasedModuleA", null, container)
    {
    }

    protected override void OnInitialized()
    {
      var messageService = this.Container.ResolveType<IMessageService>();
      messageService.Show("NuGetBasedModuleA Loaded!!");
    }
}

Let’s start writing an application that will use this publicly packaged module. Indeed, the main point of interest here is the usage and configuration of the NuGetBasedModuleCatalog. Therefore, just write the bootstrapper as follows:

public class Boot : BootstrapperBase<MainWindow, NuGetBasedModuleCatalog>
{
  protected override void ConfigureModuleCatalog()
  {
    base.ConfigureModuleCatalog();
    this.ModuleCatalog.AddModule(new ModuleInfo 
          { 
               ModuleName = "NuGetBasedModuleA", 
               ModuleType = "Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA.NuGetBasedModuleA, Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA", 
               Ref = "Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA, 1.0.0-BETA", 
               InitializationMode = InitializationMode.OnDemand
          });
  }
}

Notice how the Ref property of ModuleInfo is used to specify the package id and, optionally, the version number. If the version number is not specified, the application will always try to download the latest version, keeping the modules up to date.
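
For example, a minimal sketch of such a "latest version" registration, reusing the same module as above; the only assumed change is that the Ref value omits the version number:

this.ModuleCatalog.AddModule(new ModuleInfo
      {
           ModuleName = "NuGetBasedModuleA",
           ModuleType = "Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA.NuGetBasedModuleA, Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA",
           // No version after the package id: the catalog will try to resolve the latest published version.
           Ref = "Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA",
           InitializationMode = InitializationMode.OnDemand
      });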

Finally, to make this demo work, just put the ModuleManagerView control in the XAML of the main window, run the application, and click the check box.

Conclusions


Yes, we did it again. Catel keeps working alongside Prism. Now we have extended Prism’s modularity options through NuGet in order to provide a powerful mechanism for application module distribution.

If you want to try it, just get the latest beta package of Catel.Extensions.Prism and discover for yourself all the scenarios that you can handle with this Catel-exclusive feature.

Saturday, September 28, 2013

Keeping CatelR# updated with ReSharper 8.0



CatelR# is now compatible with the latest version of ReSharper (R#).

We also keep compatibility with earlier R# versions, including 6.0, 7.0 and 7.1. The full installer of the latest beta of CatelR#, which also includes the R# 8.0 assemblies, is available to download here.

R# 8.0 includes a NuGet-based Extension Manager, and the Catel Team has started to publish the CatelR# extension in the Extension Gallery.

Yes, CatelR# is alive. We keep working on more features and are also waiting for your ideas. Got one? Let us know!

So, make sure to install the latest version of R# and you will not miss any of the forthcoming CatelR# cool features.

Friday, September 27, 2013

How to Configure HTTP Access to Analysis Services on Internet Information Services from PowerShell

Introduction


"You can enable HTTP access to Analysis Services by configuring MSMDPUMP.dll, an ISAPI extension that runs in Internet Information Services (IIS) and pumps data to and from client applications and an Analysis Services server. This approach provides an alternative means for connecting to Analysis Services when your BI solution calls for the following capabilities:
  • Client access is over Internet or extranet connections, with restrictions on which ports can be enabled.
  • Client connections are from non-trusted domains in the same network. {…}"
The text above is how the “Configure HTTP Access to Analysis Services on Internet Information Services (IIS)” guide starts. Such steps are typical when setting up the development environment or production server of a BI solution that moves data over HTTP directly from an instance of SQL Server Analysis Services.

Repeating these steps over and over again encouraged me to write a PowerShell script to automate most of the steps of this guide.

The Code

So, without further introduction, here is the source:

1) Import or load the WebAdministration Module or PSSnapin (depends on IIS version)
$webAdminModule = Get-Module -ListAvailable | Where-Object { $_.Name -eq "WebAdministration" }
if ($webAdminModule -ne $null) 
{
    Write-Host "Importing WebApplication Module..."
    Import-Module -ModuleInfo $webAdminModule
}
else
{
    Write-Host "Loading WebApplication PSSnapin..."
    Add-PSSnapin WebAdministration
}

2) Create and setup application pool
Write-Host "Creating application pool OLAP..."
New-WebAppPool -Name OLAP -ErrorAction SilentlyContinue
Set-ItemProperty IIS:\AppPools\OLAP -Name ManagedRuntimeVersion -Value v2.0.50727
Set-ItemProperty IIS:\AppPools\OLAP -Name ManagedPipelineMode -Value 1

3) Set application pool credentials
Write-Host "Enter application pool credentials"
$userName = Read-Host "UserName" 
$securedPassword = Read-Host "Password" -AsSecureString
$password = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($securedPassword));
Set-ItemProperty IIS:\AppPools\OLAP -Name ProcessModel -Value @{ userName="$userName"; password="$password"; IdentityType=3 }

4) Create the web application
Write-Host "Creating OLAP Application in Default Web Site..."
New-Item 'IIS:\Sites\Default Web Site\OLAP' -Type Application -PhysicalPath "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\isapi"
Set-ItemProperty "IIS:\Sites\Default Web Site\OLAP" -Name ApplicationPool -Value OLAP
Set-WebConfiguration system.web/authentication 'IIS:\Sites\Default Web Site\OLAP' -value @{ mode = 'None' }

5) Add the handler mapping (it can't be done with the available cmdlets, so)
Write-Host "Adding the handler mapping..."
[System.Reflection.Assembly]::LoadFrom("C:\windows\system32\inetsrv\Microsoft.Web.Administration.dll")
$isapiPath = "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\isapi\msmdpump.dll"
$isapiConfiguration = Get-WebConfiguration "/system.webServer/security/isapiCgiRestriction/add[@path='$isapiPath']/@allowed"  
if (-not($isapiConfiguration -eq $null -or $isapiConfiguration.value))
{  
    Write-Host "Enabling ISAPI Module - $isapiPath"
    Set-WebConfiguration "/system.webServer/security/isapiCgiRestriction/add[@path='$isapiPath']/@allowed" -value "True" -PSPath:IIS:\  
}

$serverManager = New-Object Microsoft.Web.Administration.ServerManager
if ($isapiConfiguration -eq $null)
{
    Write-Host "Adding and enabling ISAPI Module - $isapiPath"
    $appHostConfig = $serverManager.GetApplicationHostConfiguration();
    $isapiCgiRestrictionSection = $appHostConfig.GetSection("system.webServer/security/isapiCgiRestriction");
    $isapiCgiRestrictionCollection =  $isapiCgiRestrictionSection.GetCollection();
    
    $cgiRestrictionElement = $isapiCgiRestrictionCollection.CreateElement("add");
    $cgiRestrictionElement.SetAttributeValue("path", "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\isapi\msmdpump.dll");
    $cgiRestrictionElement.SetAttributeValue("allowed", "true");
    $cgiRestrictionElement.SetAttributeValue("groupId", "");

    $isapiCgiRestrictionCollection.AddAt(0, $cgiRestrictionElement);
}

$webConfig = $serverManager.GetWebConfiguration("Default Web Site", "OLAP");
$handlersSection = $webConfig.GetSection("system.webServer/handlers");
$handlersCollection = $handlersSection.GetCollection();

$handlerElement = $handlersCollection.CreateElement("add");
$handlerElement.SetAttributeValue("name", "OLAP");
$handlerElement.SetAttributeValue("path", "msmdpump.dll");
$handlerElement.SetAttributeValue("verb", "*");
$handlerElement.SetAttributeValue("modules", "IsapiModule");
$handlerElement.SetAttributeValue("scriptProcessor", "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\isapi\msmdpump.dll");

$handlersCollection.AddAt(0, $handlerElement);

$serverManager.CommitChanges();

The main differences with the original non-automated guide are: 
  • It doesn’t create a copy of the files from ‘C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\isapi’ into ‘c:\inetpub\wwwroot\olap’; it just publishes the original directory directly as the application. Our deployments (production or workstation) typically involve a single instance of SQL Server Analysis Services.
  • It doesn’t modify the msmdpump.ini file because the default configuration is enough.
  • It doesn't grant data access permissions; it assumes that the application pool’s credentials already have the right data access permissions.

Conclusions

I agree with you: this script could be more parameterized. But with this simple “coded” blog post I just want to send you a message.

When you have to face automation tasks in your daily work, follow these tips:
  • Just do one small step at a time.
  • Don’t add more complexity to the solution than you need.
  • Don’t overwhelm yourself; do it for fun.
  • The code will be better tomorrow.
  • One day, as a result of your small improvements, the solution will look just like you always wanted.

But if you have something to automate, just start NOW ;-).

Wednesday, August 14, 2013

If you can’t defeat them, just “join” them – Part I

Introduction

I have spent a lot of time using Subversion as the version control system for my development team. The fact is I love Subversion and strongly think that there is no centralized version control system (CVCS) like it.

On the other hand, my organization promotes the usage of Microsoft Team Foundation Server (TFS) for Application Life-Cycle Management (ALM), and I also think TFS is a great integrated environment. However, some of its services are at a disadvantage, mainly in terms of usability, against some third-party software.

But I’m not here to talk about TFS vs. Subversion, even though the user experience of VisualSVN is better than Team Explorer's (even in VS2012).

I just want to share how I complied with the directive to shut down my team's local Subversion server and move all my sources to the centralized TFS, while keeping all the cool features of my “non-integrated” development environment.

What features will I miss if I move to the TFS version control system?

I received a lot of criticism because I don’t use the integration features of Visual Studio Team System. Actually, I have a non-fully-integrated environment for ALM (TFS and Subversion), but I always keep traces between sources and tasks. I know that TFS has built-in support for this and also has a check-in policy system to “force” some team disciplines.

On the other hand, I needed to move my team into some self-disciplined development practices to control the existing “chaos”: uncommented commits, unplanned or poorly reported work, and more. In such a scenario, I was “forced” to ensure the Subversion and TFS integration and also “route” my team in the right direction.

I thought that I would be able to handle this situation starting from the Subversion side with relative ease, and I was right. I implemented a hook to enforce the usage of these practices and also solve some integration issues. The continuous integration server (TeamCity) also helps me track tasks against sources, listing all related tasks from TFS. The fact is I also modified the existing TFS integration plugin to work as I expected, but that is part of another story.

Therefore, with a simple Subversion hook I get a centralized, server-side “check-in” policy system to:

Avoid empty commit messages: Every single commit must contain a description.

Avoid unplanned work: Every single commit message must contain the id of the work item of type “Sprint Backlog Item” in “In Progress” state.

Automatically track the task history: As every single commit message contains a work item id, all commit messages are attached to the history of that work item.

Recognize relevant commit comments or messages: A “relevant” commit comment is one that indicates a feature completion. It should be related to an addition, modification, removal or bug fix, but only when those goals are actually “Done, Done”. The usage of this pattern also automatically switches the state of the related work items to “Done”. These messages can also be used to generate the release notes document for a specific version.

Migrating from SVN to TFS version control system

My support team started with some existing migration tools but the outcome was incomplete (runtime errors included). On the other hand, I actually thought about moving from SVN to Git, because the next version of TFS comes with built-in support for Git repositories, but apparently “they” can’t wait, and yes, I have to move my sources to TFS.

Therefore, with the Git idea in my mind, I came across a couple of Git extensions and started to type the following commands:

> git svn clone http://svn.mydivsion:3690/repository
> git tfs init http://tfs.mycompany:8080/tfs/DefaultCollection/MyDivision $/MyDivision 
> git tfs fetch 
> git rebase remotes/tfs/default master 
> git tfs rcheckin --authors="authors.txt"

Everything works like a charm and the full migration can be done. Now, after years of discussions, I lose and “they” win. Now, I have to start over again.

The fact is that I think TFS was built on top of one of the best Business Intelligence (BI) platforms ever made and can be configured to produce a lot of multidimensional metrics that are very useful for ALM. But this migration to the TFS version control system is a step backwards, at least for me.


Hook the TFS commits from the server side

As I wrote before, TFS has its own check-in policy system. Actually, it is a “client-side evaluation” check-in policy system, and I don’t like that. Such policies can also be overridden by the developers. The TFS check-in policy system looks more like a warning system.

But TFS comes with a server-side event handling system, which can be extended by deploying an assembly into the web service plugins directory of the application tier of the current TFS installation.

I started to re-write all my useful check-in policies, referencing the right TFS assemblies.

For instance, the “Avoid empty commit messages” policy looks like this:

public class AvoidEmptyCommitMessagePolicy : ISubscriber
{
    public string Name
    {
        get { return " AvoidEmptyCommitMessagePolicy"; }
    }

    public SubscriberPriority Priority
    {    
        get { return SubscriberPriority.Normal; }
    }
    
    public EventNotificationStatus ProcessEvent(TeamFoundationRequestContext requestContext, NotificationType notificationType, object notificationEventArgs, out int statusCode, out string statusMessage, out ExceptionPropertyCollection properties)
    {
        statusCode = 0;
        properties = null;
        statusMessage = string.Empty;
        var eventNotificationStatus = EventNotificationStatus.ActionPermitted;
        if (notificationType == NotificationType.DecisionPoint)
        {
            var checkinNotification = notificationEventArgs as CheckinNotification;
            if (checkinNotification != null && string.IsNullOrEmpty(checkinNotification.Comment))
            {
                statusMessage = "Your commit has been blocked because you didn't give any log message.\n Please write a log message describing the purpose of your changes and \n then try committing again.";
                eventNotificationStatus = EventNotificationStatus.ActionDenied;
            }
        }

        return eventNotificationStatus;
    }
}

The “Avoid unplanned work” looks like this:

public class AvoidUnplannedWorkPolicy : ISubscriber
{
        /*...*/
        if (checkinNotification != null)
        {
                CheckinNotificationInfo notificationInfo = checkinNotification.NotificationInfo;
                CheckinNotificationWorkItemInfo[] workItemInfos = notificationInfo.WorkItemInfo;
                if (notificationType == NotificationType.DecisionPoint)
                {
                    if (workItemInfos == null || workItemInfos.Length == 0)
                    {
                        statusMessage = "Your commit has been blocked because you didn't give any work item linked to this changeset.";
                        eventNotificationStatus = EventNotificationStatus.ActionDenied;
                    }
                    
                    var teamFoundationLocationService = requestContext.GetService<TeamFoundationLocationService>();
                    var uri = new Uri(teamFoundationLocationService.ServerAccessMapping.AccessPoint + "/" + requestContext.ServiceHost.Name);
                    TfsTeamProjectCollection tfsTeamProjectCollection =  TfsTeamProjectCollectionFactory.GetTeamProjectCollection(uri);
                    var workItemStore = tfsTeamProjectCollection.GetService<WorkItemStore>();
                    if (!workItemInfos.Any(workItem => workItem.Type == "Sprint Backlog Item" && workItem.State == "In Progress"))
                    {
                        statusMessage = "Your commit has been blocked because you didn't give any \"Sprint Backlog Item\" in state \"In Progress\" linked to this changeset.";
                        eventNotificationStatus = EventNotificationStatus.ActionDenied;
                    }
                    }
                }
       }
        /*...*/
}

and so on.

The AvoidUnplannedWorkPolicy is very tied to the Scrum for Team System process template. It could be made more generic, with cross-process-template support, but it’s just an example.

The fact is I’m now thinking about a lot of policies to improve my team's practices, but if I tell you, I'll have to kill you ;-)

Conclusions

It seems like I can live with this transition (to the TFS version control system), but it isn’t all happiness. Now I have to wait for an approval process to deploy my server-side policies on the main TFS server. But if “they” don’t like such an intrusion, then probably I will also be back in the “chaos”.

In the meantime I just wrote this blog post.

PS: Yes, I'm back in the game again, but now without Subversion, with my ankle ligaments torn as the outcome of a basketball game, and with my team Villa Clara as the new champion of the Cuban National Baseball League ;-)

Thursday, January 17, 2013

Cache storage explained

Introduction 

Caching is about improving application performance. The most expensive performance costs of applications are related to data retrieval, typically when the data has to be moved across the network or loaded from disk. But some data changes slowly (a.k.a. nonvolatile data) and doesn't need to be re-read with the same frequency as volatile data.

So, to improve your application's performance and handle this “nonvolatile” data with a pretty clean approach, Catel comes with a CacheStorage<TKey, TValue> class. Notice that the first generic parameter is the type of the key and the second is the type of the value that will be stored, just like a Dictionary<TKey, TValue>, but CacheStorage isn't just a dictionary.

This class allows you to retrieve data and store it into the cache with a single statement and also helps you to handle the expiration policy if you need it.

Let’s take a look into these features:

Initializing a cache storage

To initialize a cache storage field in your class, use the following code:

/*...*/
private readonly CacheStorage<string, Person> _personCache = new CacheStorage<string, Person>(allowNullValues: true);

Retrieve data and store it into the cache with a single statement

To retrieve data and store it into the cache with a single statement, use the following code:

/*...*/
var person = _personCache.GetFromCacheOrFetch(Id, () => service.FindPersonById(Id));

When this statement is executed more than once with the same key, the value will be retrieved from the cache storage instead of from the service call (notice the usage of a lambda expression). The service call will be executed only the first time, or again if the item is removed from the cache, either manually or automatically due to the expiration policy.
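
A minimal sketch of that behavior, using the same cache field and service call as above:

// First call: the lambda runs, the Person is fetched from the service and stored in the cache.
var person = _personCache.GetFromCacheOrFetch(Id, () => service.FindPersonById(Id));

// Second call with the same key: the lambda is not executed, the cached Person is returned.
var samePerson = _personCache.GetFromCacheOrFetch(Id, () => service.FindPersonById(Id));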

Using cache expiration policies

The cache expiration policies add a removal behavior to cache storage items. A policy signals that an item has expired so that the cache storage removes the item automatically.

A default cache expiration policy can be specified in the cache storage constructor:

/*...*/
private readonly CacheStorage<string, Person> _personCache = new CacheStorage<string, Person>(() => ExpirationPolicy.Duration(TimeSpan.FromMinutes(5)), true);

You can specify a specific expiration policy for an item when it's stored:

/*...*/
var person = _personCache.GetFromCacheOrFetch(id, () => service.GetPersonById(id), ExpirationPolicy.Duration(TimeSpan.FromMinutes(10)));

If no expiration policy is specified when an item is stored, the default policy specified at cache storage initialization will be used.
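
In other words (a minimal sketch combining the two snippets above):

// The cache was created with a 5-minute default duration policy (see the constructor above).
// No policy is passed here, so the item falls back to that 5-minute default.
var person = _personCache.GetFromCacheOrFetch(id, () => service.GetPersonById(id));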

Built-in expiration policies

Catel comes with built-in expiration policies. They are listed below:

AbsoluteExpirationPolicy (time-based): the cache item will expire on the absolute expiration DateTime.

    ExpirationPolicy.Absolute(new DateTime(2012, 12, 21))

DurationExpirationPolicy (time-based): the cache item will expire using the duration TimeSpan to calculate the absolute expiration from DateTime.Now.

    ExpirationPolicy.Duration(TimeSpan.FromMinutes(5))

SlidingExpirationPolicy (time-based): the cache item will expire using the duration TimeSpan to calculate the absolute expiration from DateTime.Now, but every time the item is requested it is extended again by the specified TimeSpan.

    ExpirationPolicy.Sliding(TimeSpan.FromMinutes(5))

CustomExpirationPolicy (custom): the cache item will expire using the expire function and will execute the reset action if it is specified. The example shows how to create a sliding expiration policy with a custom expiration policy.

    var startDateTime = DateTime.Now;
    var duration = TimeSpan.FromMinutes(5);
    ExpirationPolicy.Custom(
        () => DateTime.Now > startDateTime.Add(duration),
        () => startDateTime = DateTime.Now)

CompositeExpirationPolicy (custom): combines several expiration policies into a single one. It can be configured to expire when any or all policies expire.

    new CompositeExpirationPolicy()
        .Add(ExpirationPolicy.Sliding(TimeSpan.FromMinutes(5)))
        .Add(ExpirationPolicy.Custom(() => ...))

Implementing your own expiration cache policy

If the CustomExpirationPolicy is not enough, you can implement your own expiration policy that makes the cache item expire, triggered by a custom event. You can also add some code to reset the expiration policy if the item is read from the cache before it expires (just like SlidingExpirationPolicy does).

To implement an expiration cache policy use the following code template:

public class MyExpirationPolicy : ExpirationCachePolicy
{
   public MyExpirationPolicy():base(true)
   {
   }

   public override bool IsExpired
   {
      get
      {
         // Add your custom expiration code to detect if the item expires
      }
   }

   public override void OnReset()
   {
      // Add your custom code to reset the policy if the item is read.
   }
}

Notice that the base constructor has a parameter indicating whether the policy can be reset. Therefore, if you pass false to the base constructor, the OnReset method will never be called.
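
For example, a minimal sketch of a non-resettable variant, following the same template as above (the class name is just illustrative):

public class MyNonResettableExpirationPolicy : ExpirationCachePolicy
{
   // Passing false tells the cache storage that this policy cannot be reset,
   // so OnReset will never be called when the item is read.
   public MyNonResettableExpirationPolicy() : base(false)
   {
   }

   public override bool IsExpired
   {
      get
      {
         // Custom expiration condition goes here.
         return false;
      }
   }

   public override void OnReset()
   {
      // Never called, because the base constructor was passed false.
   }
}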

Conclusion

If you are using caching in your current projects, you are probably using a different caching strategy. Now that you have learned about the caching possibilities in Catel, you should definitely try it out and be amazed at the possibilities :)
