Monday, August 11, 2014

What about Catel.Fody and change notifications for computed read-only properties?


In my last post, I covered the implementation of the INotifyPropertyChanged interface using SheepAspect as the AOP library. At the end, I also implemented an approach to notify changes of computed read-only properties. That approach has a downside: the dependent-property discovery process must be done at run time.

Such a journey reminded us that Catel.Fody didn’t have support for notifying property changes of computed read-only properties. How could such a thing ever be possible? Obviously, “the shoemaker's son always goes barefoot” ;). But don’t worry: the feature is here, moving the dependent-property discovery process to build time, thanks to Fody.

As you probably know by now, Catel.Fody will rewrite all properties on DataObjectBase and ViewModelBase derived classes. So, if a property is written like this:

public string FirstName { get; set; }

it will be weaved into:

public string FirstName
{
    get { return GetValue<string>(FirstNameProperty); }
    set { SetValue(FirstNameProperty, value); }
}

public static readonly PropertyData FirstNameProperty = RegisterProperty("FirstName", typeof(string));


But now we have added a new feature to Catel.Fody. If a computed read-only property like this one exists:

public string FullName
{
    get { return string.Format("{0} {1}", FirstName, LastName).Trim(); }
}

then the OnPropertyChanged method will also be weaved into:

protected override void OnPropertyChanged(AdvancedPropertyChangedEventArgs e)
{
    base.OnPropertyChanged(e);
    if (e.PropertyName.Equals("FirstName"))
    {
        base.RaisePropertyChanged("FullName");
    }
    if (e.PropertyName.Equals("LastName"))
    {
        base.RaisePropertyChanged("FullName");
    }
}


This feature is already available in the latest beta package of Catel.Fody.

Try it yourself and let us know.

Monday, June 30, 2014

Implementing notify property changed with SheepAspect

Introduction

I spent some time looking at AOP options for .NET. All references point to PostSharp as the coolest option, even though you have to pay to use it, due to its features, starting with the way it works (at build time and statically), plus a lot of built-in “advice” and extension points.

There are also several run-time, dynamic options such as Castle, Unity, etc., but I prefer the build-time, static approach. Indeed, I use Fody as an alternative to PostSharp, even though I have to write custom plugins directly in IL.

But none of these AOP-like libraries and tools for .NET let you handle all the classic AOP concepts exactly, such as pointcuts, join points, and advice.

But recently I found a project on CodePlex named SheepAspect, whose introductory statement includes the following words: “…was inspired in AspectJ”. So, I just installed the package from NuGet and started to write a notify-property-changed proof of concept in C# with a friend of mine (Leandro).

Introducing NotifyPropertyChanged aspect

The very first step is to write a NotifyPropertyChangedAspect class. It’s important to get familiar with SAQL in order to query the right properties from the right types, and with the usage of the SheepAspect attributes. For this particular example, the pointcut includes “all public property setters of types that implement System.ComponentModel.INotifyPropertyChanged” and looks like this:
[Aspect]
public class NotifyPropertyChangedAspect
{
    [SelectTypes("ImplementsType:'System.ComponentModel.INotifyPropertyChanged'")]
    public void NotifiedPropertyChangedTypes()
    {
    }

    [SelectPropertyMethods("Public & Setter & InType:AssignableToType:@NotifiedPropertyChangedTypes")]
    public void PublicPropertiesOfTypesThatImplementsINotifyPropertyChangedInterfacePointCut()
    {
    }
}

Now, I just needed to add the notify-property-changed behavior as around advice, as follows:

[Around("PublicPropertiesOfTypesThatImplementsINotifyPropertyChangedInterfacePointCut")]
public void AdviceForPublicPropertiesOfTypesThatImplementsINotifyPropertyChangedInterface(PropertySetJointPoint jp)
{
    object value = jp.Property.GetValue(jp.This);
    if (!object.Equals(value, jp.Value))
    {
        jp.Proceed();
        jp.This.RaiseNotifyPropertyChanged(jp.Property);
    }
}

As you may notice, to implement this I also introduced a couple of extension methods: TryGetPropertyChangedField and, of course, RaiseNotifyPropertyChanged itself:

public static bool TryGetPropertyChangedField(this Type type, out FieldInfo propertyChangedEvent)
{
    propertyChangedEvent = null;
    while (propertyChangedEvent == null && type != null && type != typeof(object))
    {
        propertyChangedEvent = type.GetField("PropertyChanged", BindingFlags.Instance | BindingFlags.NonPublic);
        if (propertyChangedEvent == null)
        {
            type = type.BaseType;
        }
        else if (!typeof(MulticastDelegate).IsAssignableFrom(propertyChangedEvent.FieldType))
        {
            propertyChangedEvent = null;
        }
    }

    return propertyChangedEvent != null;
}

/*...*/

public static void RaiseNotifyPropertyChanged(this object instance, PropertyInfo property)
{
    FieldInfo propertyChangedEvent;
    if (instance.GetType().TryGetPropertyChangedField(out propertyChangedEvent))
    {
        var propertyChangedEventMulticastDelegate = (MulticastDelegate)propertyChangedEvent.GetValue(instance);
        var invocationList = propertyChangedEventMulticastDelegate.GetInvocationList();
                
        foreach (var handler in invocationList)
        {
            MethodInfo methodInfo = handler.GetMethodInfo();
            methodInfo.Invoke(handler.Target, new[] { instance, new PropertyChangedEventArgs(property.Name) });
        }
    } 
}

Now, if we add a class that implements INotifyPropertyChanged (or inherits from a class that does), just like this one:

public class Person : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    public string FirstName { get; set; }

    public string LastName { get; set; }

    public string FullName
    {
        get
        {
            return string.Format(CultureInfo.InvariantCulture, "{0} {1}", this.FirstName, this.LastName).Trim();
        }
    }
}

and a program like this one is written:

this.person.PropertyChanged += (sender, args) =>
{
    object value = sender.GetType().GetProperty(args.PropertyName).GetValue(sender);
    Console.WriteLine("Property Changed => '{0}' = '{1}'", args.PropertyName, value);
};

this.person.FirstName = "Igr Alexánder";
this.person.LastName = "Fernández Saúco";

then the output will be:

Property Changed => 'FirstName' = 'Igr Alexánder'
Property Changed => 'LastName' = 'Fernández Saúco'

Uhmm! But what about computed read-only properties like FullName?

Notifying property changes of computed read-only properties

In order to notify changes of computed read-only properties, an inspection of the IL code is required. The native .NET reflection API is limited here: from a PropertyInfo it is only possible to get the raw IL byte array of the get method body, and nothing more. Therefore, I just switched to the Mono.Cecil reflection API to be able to complement the existing RaiseNotifyPropertyChanged extension method with the following extension methods:

public static bool ExistPropertyDependencyBetween(this Type type, PropertyInfo dependentProperty, PropertyInfo propertyInfo)
{
    AssemblyDefinition assemblyDefinition = new DefaultAssemblyResolver().Resolve(type.Assembly.FullName);
    TypeDefinition typeDefinition = assemblyDefinition.MainModule.GetType(type.FullName);
    PropertyDefinition dependentPropertyDefinition = typeDefinition.Properties.FirstOrDefault(definition => definition.Name == dependentProperty.Name);
    
    bool found = false;
    if (dependentPropertyDefinition != null)
    {
        MethodDefinition definition = dependentPropertyDefinition.GetMethod;
        if (definition.HasBody)
        {
            ILProcessor processor = definition.Body.GetILProcessor();

            int idx = 0;
            while (!found && idx < processor.Body.Instructions.Count)
            {
                Instruction instruction = processor.Body.Instructions[idx];
                
                MethodDefinition methodDefinition;
                if (instruction.OpCode == OpCodes.Call
                    && (methodDefinition = instruction.Operand as MethodDefinition) != null
                    && methodDefinition.DeclaringType.IsAssignableFrom(typeDefinition)
                    && methodDefinition.Name == string.Format(CultureInfo.InvariantCulture, "get_{0}", propertyInfo.Name))
                {
                    found = true;
                }
                else
                {
                    idx++;
                }
            }
        }
    }

    return found;
}

/*...*/

public static IEnumerable<PropertyInfo> GetDependentPropertiesFrom(this Type type, PropertyInfo property)
{
    List<PropertyInfo> dependentPropertyInfos = type.GetProperties(BindingFlags.Instance | BindingFlags.Public)
        .Where(dependentProperty => property != dependentProperty
                                    && dependentProperty.CanRead
                                    && type.ExistPropertyDependencyBetween(dependentProperty, property))
        .ToList();
    for (int i = 0; i < dependentPropertyInfos.Count; i++)
    {
        foreach (PropertyInfo info in type.GetDependentPropertiesFrom(dependentPropertyInfos[i]))
        {
            if (!dependentPropertyInfos.Contains(info))
            {
                dependentPropertyInfos.Add(info);
            }
        }
    }

    return dependentPropertyInfos;
}

With these extension methods in place, the existing RaiseNotifyPropertyChanged can be updated just like this:

public static void RaiseNotifyPropertyChanged(this object instance, PropertyInfo property)
{
    FieldInfo propertyChangedEvent;
    if (instance.GetType().TryGetPropertyChangedField(out propertyChangedEvent))
    {
        var propertyChangedEventMulticastDelegate = (MulticastDelegate)propertyChangedEvent.GetValue(instance);
        var invocationList = propertyChangedEventMulticastDelegate.GetInvocationList();
                
        foreach (var handler in invocationList)
        {
            MethodInfo methodInfo = handler.GetMethodInfo();
            methodInfo.Invoke(handler.Target, new[] { instance, new PropertyChangedEventArgs(property.Name) });
        }

        foreach (PropertyInfo propertyInfo in instance.GetType().GetDependentPropertiesFrom(property))
        {
            foreach (var handler in invocationList)
            {
                MethodInfo methodInfo = handler.GetMethodInfo();
                methodInfo.Invoke(handler.Target, new[] { instance, new PropertyChangedEventArgs(propertyInfo.Name) });
            }
        }
    }
}

So, now the program output is:

Property Changed => 'FirstName' = 'Igr Alexánder'
Property Changed => 'FullName' = ' Igr Alexánder' 
Property Changed => 'LastName' = 'Fernández Saúco'
Property Changed => 'FullName' = ' Igr Alexánder Fernández Saúco'

Conclusion

At this point, you are probably worried about the performance of so many reflection API calls. I'm pretty sure this issue could be handled with the right caching approach ;).
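
For instance, here is a minimal sketch of such a caching approach (my own illustration, not part of the original code). Since the property dependencies are fixed in the compiled assembly, they can be resolved once per type and property and then reused on every notification:

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

public static class DependentPropertiesCache
{
    // One entry per (type, property name) pair; computed on first use and reused afterwards.
    private static readonly ConcurrentDictionary<Tuple<Type, string>, PropertyInfo[]> Cache =
        new ConcurrentDictionary<Tuple<Type, string>, PropertyInfo[]>();

    public static IEnumerable<PropertyInfo> GetDependentPropertiesFromCached(this Type type, PropertyInfo property)
    {
        // GetDependentPropertiesFrom is the (expensive) Mono.Cecil based method shown above.
        return Cache.GetOrAdd(
            Tuple.Create(type, property.Name),
            key => type.GetDependentPropertiesFrom(property).ToArray());
    }
}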

I don’t know why I never heard about SheepAspect before. Probably no one trusts something called “Sheep”, but believe me, SheepAspect rocks!

Wait a second! Right now I'm reading about something called mixins. Perhaps I can get a more declarative approach to implement this, just like the Fody or PostSharp home page examples. But I'm not sure right now, so you will have to wait for my next post to find out ;).

Friday, February 21, 2014

Developing a ReSharper Plugin – The backward compatibility approach

Introduction

We have been developing a ReSharper plugin for the Catel framework, also known as CatelR#, for a while, following an interesting approach in order to support new R# versions while keeping backward compatibility.

If you want to know how we made it, just take a look.

Visual Studio solution setup

1)    Create a project per supported R# version, which means that the output of each project targets a specific version of R# and references the specific version of the SDK.



2)    Keep in mind that there could be several breaking changes between R# SDK versions, but nothing that couldn’t be handled with pre-processor directives.

#if R70 || R71 || R80
        protected override void Process(CSharpGeneratorContext context)

#elif R61
        public override void Process(CSharpGeneratorContext context)
#endif
        {
            CSharpElementFactory factory = CSharpElementFactory.GetInstance(context.Root.GetPsiModule());
#if R80
            IDeclaredType viewModelToModelAttributeClrType = TypeFactory.CreateTypeByCLRName(CatelMVVM.ViewModelToModelAttribute, context.PsiModule, UniversalModuleReferenceContext.Instance);
#else
            IDeclaredType viewModelToModelAttributeClrType = TypeFactory.CreateTypeByCLRName(CatelMVVM.ViewModelToModelAttribute, context.PsiModule);
#endif
            /*...*/
#if R80
                        var fixedArguments = new List<AttributeValue> { new AttributeValue(ClrConstantValueFactory.CreateStringValue(model.ShortName, context.PsiModule, UniversalModuleReferenceContext.Instance)) };
#else
                        var fixedArguments = new List<AttributeValue> { new AttributeValue(ClrConstantValueFactory.CreateStringValue(model.ShortName, context.PsiModule)) };
#endif
                        if (propertyName != modelProperty.ShortName)
                        {
#if R80
                            fixedArguments.Add(new AttributeValue(ClrConstantValueFactory.CreateStringValue(modelProperty.ShortName, context.PsiModule,  UniversalModuleReferenceContext.Instance)));
#else
                            fixedArguments.Add(new AttributeValue(ClrConstantValueFactory.CreateStringValue(modelProperty.ShortName, context.PsiModule)));
#endif
                        }
            /*...*/
                    }
                }
            }
        }

3)    Keep the sources of all these projects synchronized. This can be very easy thanks to Caitlyn.




4)   Finally, redirect the build output of each project to make packaging the deployment units easier.

Building the deployment units

The R# plugin build process is indeed a heterogeneous one, like any build process. Even though this build could be handled via MSBuild tasks, we actually recommend the usage of tools with an intuitive GUI, such as FinalBuilder or VisualBuild, in order to quickly create and debug such build “scripts”.


1) Since R# version 8.0, the NuGet-based extension manager is available. Therefore, one of the build outputs could be a NuGet package to distribute your plugin through the ReSharper extension gallery.

2) But the NuGet-based extension manager is not available for all R# versions. Therefore, a second build output could also be a classic deployment unit built on top of any of the existing installer systems, for instance Inno Setup or NSIS.

The following are code snippets from the install and uninstall sections of the CatelR# setup (NSIS). Notice how we deal with the build output to support all R# versions.


# ...
# Installer section
# ...

Push "v6.1"
Push "v7.0"
Push "v7.1"
Push "v8.0"
Push "v8.1"
${Do}
  Pop $0
  ReadRegStr $1 HKLM "Software\JetBrains\ReSharper\$0" InstallDir
  ${If} $1 != ''
    DetailPrint "Installing Catel.ReSharper for JetBrains ReSharper $0"
    SetOutPath "$1\Plugins\$(^Name)"
    ${If} $0 == 'v6.1'
      File /r "..\..\output\Debug\v6.1\*.dll"
    ${ElseIf} $0 == 'v7.0'
      File /r "..\..\output\Debug\v7.0\*.dll"
    ${ElseIf} $0 == 'v7.1'
      File /r "..\..\output\Debug\v7.1\*.dll"
    ${ElseIf} $0 == 'v8.0'
      File /r "..\..\output\Debug\v8.0\*.dll"
    ${ElseIf} $0 == 'v8.1'
      File /r "..\..\output\Debug\v8.1\*.dll"
    ${EndIf}
    Push true
    Pop $3
    WriteRegStr HKLM "${REGKEY}" "$0" 1
  ${EndIf}
${LoopUntil} $0 == "v6.1"

# ... 

# ... 
# Uninstaller section
# ... 

Push "v6.1"
Push "v7.0"
Push "v7.1"
Push "v8.0"
Push "v8.1"
${Do}
  Pop $0
  ReadRegStr $1 HKLM "${REGKEY}" "$0"
  ${If} $1 == '1'
    ReadRegStr $2 HKLM "Software\JetBrains\ReSharper\$0" InstallDir
    ${If} $2 != ''
      RMDir /r /REBOOTOK "$2\Plugins\$(^Name)" 
      DeleteRegValue HKLM "${REGKEY}" "$0"
    ${EndIf}
  ${EndIf}
${LoopUntil} $0 == "v6.1"  

# ...


Conclusions

Just a few minutes ago, I read the notification about the release of the R# 8.2 EAP. It’s a good moment to revalidate this approach. Let’s see… creating a new project with postfix 82, …installing the SDK package, …, …time out, sorry… slow connection…, …, …, …updating the build script, …, …, …reviewing for breaking changes, good news, there are no breaking changes, …updating the setup script…, committing source modifications, …running the build script, and it’s done.

Now you can update the extension from the extension gallery or download the full installer of CatelR# with support for R#8.2 EAP ;)

Tuesday, October 22, 2013

Catel extends Prism modularity options through NuGet

Introduction


Prism provides a lot of modularity options by default. The fact is that most of the stuff you need to build a modularized application is available in Prism. Prism contains several modularity options, including ways to retrieve modules from a directory, set them up through the configuration file, and even options to create modularized Silverlight applications.

On the other hand, Catel provides support for module catalog composition in order to enable more than one modularity option in a single application.

Nowadays, initiatives around NuGet come from everywhere, such as Shimmer, with the same goal as ClickOnce (but it works); OctopusDeploy, for release promotion and application deployment; or even the recently released extension manager of ReSharper 8.0.

So, the Catel team came up with a new one, and the fact is that we wondered why no one thought about this before:

What if you would be able to distribute your application's modules through NuGet?

If you want to know how, take a look.


Enabling NuGet based modularity option


With the forthcoming 3.8 version of Catel you will be able to install and load modules from a NuGet repository. The only thing you have to do is:
  1. Write a module as you already know how to with Prism (or better, in combination with Catel).
  2. Create a NuGet package for the module and publish it into your favorite gallery.
  3. Use and configure the NuGetBasedModuleCatalog in the application’s bootstrapper.
The first two steps are well documented. For details, see the “Creating and publishing a package” and “Declaring modules from NuGet” topics in the NuGet and Catel documentation.


So, now we can focus on the third one.

Use and configure the NuGetBasedModuleCatalog in the application’s bootstrapper


In order to simplify the following explanation, we published a package into the official NuGet Gallery named Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA. This package contains an assembly with a single class like this one:
 
public class NuGetBasedModuleA : ModuleBase
{
    public NuGetBasedModuleA(IServiceLocator container)
        : base("NuGetBasedModuleA", null, container)
    {
    }

    protected override void OnInitialized()
    {
        var messageService = this.Container.ResolveType<IMessageService>();
        messageService.Show("NuGetBasedModuleA Loaded!!");
    }
}

Let’s start writing an application that will use this public packaged module. Indeed, the main point of interest here is the usage and configuration of the NuGetBasedModuleCatalog. Therefore, just write the bootstrapper as follows:

public class Boot : BootstrapperBase<MainWindow, NuGetBasedModuleCatalog>
{
  protected override void ConfigureModuleCatalog()
  {
    base.ConfigureModuleCatalog();
    this.ModuleCatalog.AddModule(new ModuleInfo 
          { 
               ModuleName = "NuGetBasedModuleA", 
               ModuleType = "Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA.NuGetBasedModuleA, Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA", 
               Ref = "Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA, 1.0.0-BETA", 
               InitializationMode = InitializationMode.OnDemand
          });
  }
}

Notice how the Ref property of ModuleInfo is used to specify the package id and, optionally, the version number. If the version number is not specified, the application will always try to download the latest version, keeping the modules up to date.
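
For instance, this is how the same registration would look without the version number in Ref (my own variation of the snippet above), so the latest published package of the module is always used:

this.ModuleCatalog.AddModule(new ModuleInfo 
      { 
           ModuleName = "NuGetBasedModuleA", 
           ModuleType = "Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA.NuGetBasedModuleA, Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA", 
           // Only the package id, no version: the latest version is downloaded.
           Ref = "Catel.Examples.WPF.Prism.Modules.NuGetBasedModuleA", 
           InitializationMode = InitializationMode.OnDemand
      });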

Finally, to make this demo work, just put the ModuleManagerView control in the XAML of the main window, run the application, and click the check box.

Conclusions


Yes, we did it again. Catel keeps working alongside Prism. Now we have extended Prism’s modularity options through NuGet in order to provide a powerful mechanism for application module distribution.

If you want to try it, just get the latest beta package of Catel.Extensions.Prism and discover for yourself all the scenarios that you can handle with this Catel-exclusive feature.

Saturday, September 28, 2013

Keeping CatelR# updated with ReSharper 8.0



CatelR# is now compatible with the latest version of ReSharper (R#).

We also keep compatibility with the most recent R# versions, including 6.1, 7.0 and 7.1. The full installer of the latest beta of CatelR#, which also includes the R# 8.0 assemblies, is available to download here.

R# 8.0 includes a NuGet-based extension manager, and the Catel team has started to publish the CatelR# extension in the Extension Gallery.

Yes, CatelR# is alive. We keep working on more features and are also waiting for your ideas. Got one? Let us know!

So, make sure you install the latest version of R# and you will not miss any of the forthcoming cool CatelR# features.

Friday, September 27, 2013

How to Configure HTTP Access to Analysis Services on Internet Information Services from PowerShell

Introduction


"You can enable HTTP access to Analysis Services by configuring MSMDPUMP.dll, an ISAPI extension that runs in Internet Information Services (IIS) and pumps data to and from client applications and an Analysis Services server. This approach provides an alternative means for connecting to Analysis Services when your BI solution calls for the following capabilities:
  • Client access is over Internet or extranet connections, with restrictions on which ports can be enabled.
  • Client connections are from non-trusted domains in the same network. {…}"
The text above is how the “Configure HTTP Access to Analysis Services on Internet Information Services (IIS)” guide starts. Such steps are typical when setting up the development environment or the production server of a BI solution that moves data over HTTP directly from an instance of SQL Server Analysis Services.

Repeating these steps over and over again encouraged me to write a script that automates most of the steps of this guide with PowerShell.

The Code

So, without further introduction, here is the source:

1) Import or load the WebAdministration Module or PSSnapin (depends on IIS version)
$webAdminModule = Get-Module -ListAvailable | Where-Object { $_.Name -eq "WebAdministration" }
if ($webAdminModule -ne $null) 
{
    Write-Host "Importing WebApplication Module..."
    Import-Module -ModuleInfo $webAdminModule
}
else
{
    Write-Host "Loading WebApplication PSSnapin..."
    Add-PSSnapin WebAdministration
}

2) Create and setup application pool
Write-Host "Creating application pool OLAP..."
New-WebAppPool -Name OLAP -ErrorAction SilentlyContinue
Set-ItemProperty IIS:\AppPools\OLAP -Name ManagedRuntimeVersion -Value v2.0.50727
Set-ItemProperty IIS:\AppPools\OLAP -Name ManagedPipelineMode -Value 1

3) Set application pool credentials
Write-Host "Enter application pool credentials"
$userName = Read-Host "UserName" 
$securedPassword = Read-Host "Password" -AsSecureString
$password = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($securedPassword));
Set-ItemProperty IIS:\AppPools\OLAP -Name ProcessModel -Value @{ userName="$userName"; password="$password"; IdentityType=3 }

4) Create the web application
Write-Host "Creating OLAP Application in Default Web Site..."
New-Item 'IIS:\Sites\Default Web Site\OLAP' -Type Application -PhysicalPath "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\isapi"
Set-ItemProperty "IIS:\Sites\Default Web Site\OLAP" -Name ApplicationPool -Value OLAP
Set-WebConfiguration system.web/authentication 'IIS:\Sites\Default Web Site\OLAP' -value @{ mode = 'None' }

5) Add the handler mapping (it can't be done with the available cmdlets, so)
Write-Host "Adding the handler mapping..."
[System.Reflection.Assembly]::LoadFrom("C:\windows\system32\inetsrv\Microsoft.Web.Administration.dll")
$isapiPath = "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\isapi\msmdpump.dll"
$isapiConfiguration = Get-WebConfiguration "/system.webServer/security/isapiCgiRestriction/add[@path='$isapiPath']/@allowed"  
if (-not($isapiConfiguration -eq $null -or $isapiConfiguration.value))
{  
    Write-Host "Enabling ISAPI Module - $isapiPath"
    Set-WebConfiguration "/system.webServer/security/isapiCgiRestriction/add[@path='$isapiPath']/@allowed" -value "True" -PSPath:IIS:\  
}

$serverManager = New-Object Microsoft.Web.Administration.ServerManager
if ($isapiConfiguration -eq $null)
{
    Write-Host "Adding and enabling ISAPI Module - $isapiPath"
    $appHostConfig = $serverManager.GetApplicationHostConfiguration();
    $isapiCgiRestrictionSection = $appHostConfig.GetSection("system.webServer/security/isapiCgiRestriction");
    $isapiCgiRestrictionCollection =  $isapiCgiRestrictionSection.GetCollection();
    
    $cgiRestrictionElement = $isapiCgiRestrictionCollection.CreateElement("add");
    $cgiRestrictionElement.SetAttributeValue("path", "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\isapi\msmdpump.dll");
    $cgiRestrictionElement.SetAttributeValue("allowed", "true");
    $cgiRestrictionElement.SetAttributeValue("groupId", "");

    $isapiCgiRestrictionCollection.AddAt(0, $cgiRestrictionElement);
}

$webConfig = $serverManager.GetWebConfiguration("Default Web Site", "OLAP");
$handlersSection = $webConfig.GetSection("system.webServer/handlers");
$handlersCollection = $handlersSection.GetCollection();

$handlerElement = $handlersCollection.CreateElement("add");
$handlerElement.SetAttributeValue("name", "OLAP");
$handlerElement.SetAttributeValue("path", "msmdpump.dll");
$handlerElement.SetAttributeValue("verb", "*");
$handlerElement.SetAttributeValue("modules", "IsapiModule");
$handlerElement.SetAttributeValue("scriptProcessor", "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\isapi\msmdpump.dll");

$handlersCollection.AddAt(0, $handlerElement);

$serverManager.CommitChanges();

The main differences from the original non-automated guide are: 
  • It doesn’t copy the files from ‘C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\bin\isapi’ to ‘c:\inetpub\wwwroot\olap’; it just publishes the original directory directly as the application. Our deployments (production or workstation) typically involve a single instance of SQL Server Analysis Services.
  • It doesn’t modify the msmdpump.ini file, because the default configuration is enough.
  • It doesn't grant data access permissions; it assumes that the application pool’s credentials already have the right data access permissions.
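
As a quick smoke test of the resulting endpoint (my own addition, with a hypothetical server name and catalog), a client can connect through the pump by using the msmdpump.dll URL as the data source, for example with the ADOMD.NET client:

using System;
using Microsoft.AnalysisServices.AdomdClient;

class HttpAccessSmokeTest
{
    static void Main()
    {
        // 'myserver' and the catalog name are placeholders for this sketch.
        var connectionString = "Data Source=http://myserver/OLAP/msmdpump.dll;Catalog=Adventure Works DW;";
        using (var connection = new AdomdConnection(connectionString))
        {
            connection.Open();
            Console.WriteLine("Connected over HTTP to Analysis Services {0}", connection.ServerVersion);
        }
    }
}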

Conclusions

I agree with you: this script could be more parameterized. But with this simple “coded” blog post I just want to send you a message.

When you have to face automation tasks in your daily work, follow these tips:
  • Just do one small step at a time.
  • Don’t add more complexity to the solution than you need.
  • Don’t overwhelm yourself; do it for fun. 
  • The code will be better tomorrow. 
  • One day, as a result of your small improvements, the solution will look just like you always wanted. 

But if you have something to automate, just start NOW ;-). 

Wednesday, August 14, 2013

If you can’t defeat them, just “join” them – Part I

Introduction

I spent a lot of time using Subversion as the version control system for my development team. The fact is, I love Subversion and strongly think that there is no centralized version control system (CVCS) like it.

On the other hand, my organization promotes the usage of Microsoft Team Foundation Server (TFS) for Application Life-Cycle Management (ALM), and I also think TFS is a great integrated environment. However, some of its services are at a disadvantage, mainly in terms of usability, against some third-party software.

But I’m not here to talk about TFS vs. Subversion, even though the user experience of VisualSVN is better than Team Explorer’s (even in VS2012).

I just want to share how I complied with the directive to shut down my team’s local Subversion server and move all my sources to the centralized TFS, while keeping all the cool features of my “non-integrated” development environment.

What features will I miss if I move to the TFS version control system?

I received a lot of criticism because I don’t use the integration features of Visual Studio Team System. Actually, I have a non-fully-integrated environment for ALM (TFS and Subversion), but one that always keeps traces between sources and tasks. I know that TFS has built-in support for this and also has a check-in policy system to “force” some team disciplines.

On the other hand, I needed to move my team into some self-discipline development practices to control the existing “chaos”: uncommented commits, unplanned or poorly reported work, and more. In such a scenario, I was “forced” to ensure the Subversion and TFS integration and also “route” my team in the right direction.

I thought that I would be able to handle this situation starting from the Subversion side with relative ease, and I was right. I implemented a hook to enforce the usage of these practices and also solve some integration issues. The continuous integration server (TeamCity) also helps me track tasks against sources, listing all related tasks from TFS. The fact is, I also modified the existing TFS integration plugin to work as I expected, but that is part of another story.

Therefore, with a simple Subversion hook, I get a centralized, server-side “check-in” policy system to:

Avoid empty commit messages: Every single commit must contain a description.

Avoid unplanned work: Every single commit message must contain the id of a work item of type “Sprint Backlog Item” in the “In Progress” state.

Automatically track the task history: As every single commit message contains a work item id, all commit messages are attached to the history of the work item. 

Accept some relevant commit comments or messages: A “relevant” commit comment is one that indicates a feature completion. It should be related to an addition, modification, removal or bug fix, but just when those goals are actually “Done, Done”. The usage of this pattern also automatically turns the state of the related work items to “Done”. These messages can also be used to generate the release notes document for a specific version.
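
To give an idea of how such a hook can work, here is a minimal sketch of my own (not the actual hook from this post): Subversion invokes pre-commit with the repository path and a transaction id, the hook reads the pending log message with svnlook, and a non-zero exit code rejects the commit.

using System;
using System.Diagnostics;
using System.Text.RegularExpressions;

class PreCommitHook
{
    // args[0] = repository path, args[1] = transaction id (both passed by Subversion).
    static int Main(string[] args)
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = "svnlook",
            Arguments = string.Format("log -t {0} \"{1}\"", args[1], args[0]),
            RedirectStandardOutput = true,
            UseShellExecute = false
        };

        string message;
        using (var svnlook = Process.Start(startInfo))
        {
            message = svnlook.StandardOutput.ReadToEnd().Trim();
            svnlook.WaitForExit();
        }

        // "Avoid empty commit messages" policy.
        if (string.IsNullOrEmpty(message))
        {
            Console.Error.WriteLine("Your commit has been blocked because you didn't give any log message.");
            return 1;
        }

        // "Avoid unplanned work" policy: require a work item id in the message.
        // The '#1234' pattern is an assumption for this sketch; the real hook queries TFS.
        if (!Regex.IsMatch(message, @"#\d+"))
        {
            Console.Error.WriteLine("Your commit has been blocked because it doesn't reference a work item id.");
            return 1;
        }

        return 0;
    }
}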

Migrating from SVN to TFS version control system

My support team started with some existing migration tools, but the outcome was incomplete (runtime errors included). On the other hand, I actually thought about moving from SVN to Git, because the next version of TFS comes with built-in support for Git repositories, but apparently “they” couldn’t wait, and yes, I had to move my sources to TFS.

Therefore, with the Git idea in my mind, I came across a couple of Git extensions (git-svn and git-tfs) and started to type the following commands:

> git svn clone http://svn.mydivsion:3690/repository
> git tfs init http://tfs.mycompany:8080/tfs/DefaultCollection/MyDivision $/MyDivision 
> git tfs fetch 
> git rebase remotes/tfs/default master 
> git tfs rcheckin --authors="authors.txt"

Everything worked like a charm. The full migration could be done. Now, after years of discussions, I lost and “they” won. Now I have to start over again.

The fact is that I think TFS was built on top of one of the best Business Intelligence (BI) platforms ever made and can be configured to produce a lot of multidimensional metrics that are very useful for ALM. But this migration to the TFS version control system is a step backwards, at least for me.


Hook the TFS commits from the server side

As I wrote before, TFS has its own check-in policy system. Actually, it is a “client-side evaluation” check-in policy system, and I don’t like that: such policies can be overridden by the developers, so the TFS check-in policy system looks more like a warning system.

But TFS comes with a server-side event handling system that can be plugged into by deploying an assembly into the web service plugins directory of the application tier of the current TFS installation.

I started to rewrite all my useful check-in policies, referencing the right TFS assemblies.

For instance, the “Avoid empty commit messages” policy looks like this:

public class AvoidEmptyCommitMessagePolicy : ISubscriber
{
    public string Name
    {
        get { return " AvoidEmptyCommitMessagePolicy"; }
    }

    public SubscriberPriority Priority
    {    
        get { return SubscriberPriority.Normal; }
    }
    
    public EventNotificationStatus ProcessEvent(TeamFoundationRequestContext requestContext, NotificationType notificationType, object notificationEventArgs, out int statusCode, out string statusMessage, out ExceptionPropertyCollection properties)
    {
        statusCode = 0;
        properties = null;
        statusMessage = string.Empty;
        var eventNotificationStatus = EventNotificationStatus.ActionPermitted;
        if (notificationType == NotificationType.DecisionPoint)
        {
            var checkinNotification = notificationEventArgs as CheckinNotification;
            if (checkinNotification != null && string.IsNullOrEmpty(checkinNotification.Comment))
            {
                statusMessage = "Your commit has been blocked because you didn't give any log message.\n Please write a log message describing the purpose of your changes and \n then try committing again.";
                eventNotificationStatus = EventNotificationStatus.ActionDenied;
            }
        }

        return eventNotificationStatus;
    }
}

The “Avoid unplanned work” policy looks like this:

public class AvoidUnplannedWorkPolicy : ISubscriber
{
        /*...*/
        if (checkinNotification != null)
        {
                CheckinNotificationInfo notificationInfo = checkinNotification.NotificationInfo;
                CheckinNotificationWorkItemInfo[] workItemInfos = notificationInfo.WorkItemInfo;
                if (notificationType == NotificationType.DecisionPoint)
                {
                    if (workItemInfos == null || workItemInfos.Length == 0)
                    {
                        statusMessage = "Your commit has been blocked because you didn't give any work item linked to this changeset.";
                        // Return immediately: the code below must not run without linked work items.
                        return EventNotificationStatus.ActionDenied;
                    }
                    
                    var teamFoundationLocationService = requestContext.GetService<TeamFoundationLocationService>();
                    var uri = new Uri(teamFoundationLocationService.ServerAccessMapping.AccessPoint + "/" + requestContext.ServiceHost.Name);
                    TfsTeamProjectCollection tfsTeamProjectCollection =  TfsTeamProjectCollectionFactory.GetTeamProjectCollection(uri);
                    var workItemStore = tfsTeamProjectCollection.GetService<WorkItemStore>();
                    if (workItemInfos.Any(workItem => workItem.Type != "Sprint Backlog Item" || workItem.State != "In Progress"))
                    {
                        statusMessage = "Your commit has been blocked because you didn't give any \"Sprint Backlog Item\" in state \"In Progress\" linked to this changeset.";
                        eventNotificationStatus = EventNotificationStatus.ActionDenied;
                    }
                    }
                }
       }
        /*...*/
}

and so on.

The AvoidUnplannedWorkPolicy is closely tied to the Scrum for Team System process template. It could be made more generic, with cross-process-template support, but it’s just an example.

The fact is, I’m now thinking about a lot of policies to improve my team’s practices, but if I tell you, I’d have to kill you ;-)

Conclusions

It seems like I can live with such a transition (to the TFS version control system), but it isn’t all happiness. Now I have to wait for an approval process to deploy my server-side policies on the main TFS server. But if “they” don’t like such an intrusion, then I will probably be back in the “chaos”.

In the meantime I just wrote this blog post.

PS: Yes, I'm back in the game again, but now without Subversion, with my ankle ligaments broken as the outcome of a basketball game, and with my team Villa Clara as the new champion of the Cuban National Baseball League ;-)
