24 February 2017

Prevent specific users from modifying area and iteration in a Work Item Type

TFS does not support adding READONLY or FROZEN rules to the System.IterationPath and System.AreaPath fields.

A workaround is to create a custom field named, for example, "You cannot change this field". Give it the Integer type.
In the "Rules" tab, add one WHENCHANGED rule for each field you want to protect.
In the first one, select System.AreaId as the Field Condition, and navigate to the Rules tab.
Add two rules:
  • COPY
    • For: the TFS group containing the users not authorized to make the change
    • From: value
    • Value: leave empty
  • REQUIRED
    • For: the TFS group containing the users not authorized to make the change

Do the same for the second WHENCHANGED rule, this time for System.IterationId.
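For reference, here is a hedged sketch of what the resulting rules might look like in the exported work item type definition XML (the field refname and group name are assumptions; export your own type with witadmin to see the exact values):

```xml
<FIELD name="You cannot change this field" refname="Custom.YouCannotChangeThisField" type="Integer">
  <WHENCHANGED field="System.AreaId">
    <COPY from="value" value="" for="[Project]\Unauthorized Users" />
    <REQUIRED for="[Project]\Unauthorized Users" />
  </WHENCHANGED>
  <WHENCHANGED field="System.IterationId">
    <COPY from="value" value="" for="[Project]\Unauthorized Users" />
    <REQUIRED for="[Project]\Unauthorized Users" />
  </WHENCHANGED>
</FIELD>
```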

Basically, the following sequence triggers when an unauthorized user tries to change a protected field (System.AreaId and System.IterationId are linked respectively to System.AreaPath and System.IterationPath):
  • The field value is changed to blank
  • The field is set to REQUIRED
This triggers a validation error when the user modifies the Area or Iteration field. The error message is not ideal, because it says:
     Field 'You cannot change this field' cannot be empty. 

At least it gets the job done :)

21 July 2016

VNet to VNet connection in Azure from different subscriptions

If you ever need to connect virtual networks across different Azure subscriptions, you might come across the following (nice) article: https://azure.microsoft.com/en-us/documentation/articles/vpn-gateway-vnet-vnet-rm-ps/
I just wanted to give a little more information based on my experience with the actions involved.

If you try to follow the steps to connect the virtual networks using pre-existing gateways, it will not work. You have to create the gateways using the provided instructions.
I suppose this is because gateways created from the Portal have the "Basic" SKU (you can check it using the Resource Explorer).

The PowerShell instructions include a specific parameter named SKU, to which we have to pass the "Standard" value. At the time of writing it is not possible to set this parameter in the Portal, hence the need to use PowerShell to create the gateways.
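As a hedged sketch (resource names, location and the cmdlet set from the AzureRM module of that era are assumptions), the gateway creation would look like this, the important part being the -GatewaySku parameter:

```powershell
# Sketch only: names, resource group and location are placeholders.
$vnet   = Get-AzureRmVirtualNetwork -Name "VNet1" -ResourceGroupName "RG1"
$subnet = Get-AzureRmVirtualNetworkSubnetConfig -Name "GatewaySubnet" -VirtualNetwork $vnet
$pip    = New-AzureRmPublicIpAddress -Name "VNet1GwIp" -ResourceGroupName "RG1" `
            -Location "West Europe" -AllocationMethod Dynamic
$ipconf = New-AzureRmVirtualNetworkGatewayIpConfig -Name "gwipconf" `
            -SubnetId $subnet.Id -PublicIpAddressId $pip.Id
# The key point: the Portal creates "Basic" SKU gateways; the cross-subscription
# scenario needs "Standard", which only PowerShell lets you specify.
New-AzureRmVirtualNetworkGateway -Name "VNet1Gw" -ResourceGroupName "RG1" `
    -Location "West Europe" -IpConfigurations $ipconf `
    -GatewayType Vpn -VpnType RouteBased -GatewaySku Standard
```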

22 October 2015

Find all TFS work items for which the original estimate has been changed

You might find yourself having to track down bad practices (like modifying the original estimate of a task).
You could set up an alert to be notified as soon as it happens, or you can monitor changes through the TFS database.

Today, I'll explain how to monitor using a query on the TFS database. We will use the collection database. The idea is to use the views related to work items and their history.

The first step is to union the data sets (the current work item states and the past work item states).
Then we select the records having an original estimate higher than in the previous revision. THAT is the tricky part. I used the LAG analytic function to get a reference to the "previous" record (that is, the same id but the previous revision).

Here is the complete statement.

    WITH AllData AS
    (
     SELECT [System.Id], [System.Rev], [System.TeamProject],
            [Microsoft.VSTS.Scheduling.OriginalEstimate]
       FROM [Tfs_DefaultCollection].[dbo].[WorkItemsAreUsed]
      UNION ALL
     SELECT [System.Id], [System.Rev], [System.TeamProject],
            [Microsoft.VSTS.Scheduling.OriginalEstimate]
       FROM [Tfs_DefaultCollection].[dbo].[WorkItemsWereUsed]
    ), AllDataWithLag AS
    (
     SELECT *
          , [Microsoft.VSTS.Scheduling.OriginalEstimate] -
            LAG([Microsoft.VSTS.Scheduling.OriginalEstimate], 1)
            OVER(PARTITION BY [System.Id] ORDER BY [System.Rev]) AS EstimateDifference
       FROM AllData
    )
    SELECT *
      FROM AllDataWithLag
     WHERE [System.TeamProject] = '[Replace with project name]'
       AND EstimateDifference > 0
     ORDER BY [System.Id], [System.Rev]

19 October 2015

Microsoft Visio and Azure

If you find yourself having to create diagrams for Cloud architectures in Visio, there are nice stencils from Microsoft with all the Azure services!

Here is the download link

03 August 2015

Migrate TFS "Repro Steps" field to the description

I recently decided, for a specific team project, to use the Description field instead of the Repro Steps field in the Bug work item. The main reason is to be able to create reports for both Product Backlog Items and Bugs, while being able to display the full description.

The first step was to migrate the content of the Repro Steps field into the Description field in order to retain data for existing items.
Since it is not possible to do a field-to-field bulk update from the Web interface, I had to create a small console application to copy the data.
The main problem I encountered was about inline images. When you copy and paste a screenshot into an HTML field, it creates an img tag with Base64 data in the src attribute.
For an unknown reason, when the data in the src exceeds a certain limit (I don't know the actual value), the src attribute is blanked during the save. Images simply appear blank in the target field.

After some searching, I came across this post from René van Osnabrugge. I tweaked the code to reach the final solution.
Basically, the code:
  1. uses a Regex to find inline images in the Repro Steps field,
  2. extracts the binary data from the src attribute,
  3. writes the image data to the local disk,
  4. attaches the image to the work item,
  5. replaces the inline image with an img tag linked to the TFS attachment API,
  6. saves everything into the Description field.
I decided to keep the attachments in case I'd like to extract or share the images easily.

Here is the complete code sample.

 using System;
 using System.IO;
 using System.Linq;
 using System.Text.RegularExpressions;
 using Microsoft.TeamFoundation.Client;
 using Microsoft.TeamFoundation.WorkItemTracking.Client;

 var tfsCollectionUri = new Uri("tfs collection url");
 var tfsCollection = new TfsTeamProjectCollection(tfsCollectionUri);
 var workItemStore = tfsCollection.GetService<WorkItemStore>();
 var items = workItemStore.Query("SELECT [System.Id], [System.Description] FROM WorkItems WHERE [System.TeamProject] = '@@@YourProject@@@' AND [System.WorkItemType] = 'Bug' AND [System.State] <> 'Removed'").Cast<WorkItem>();
 foreach (var item in items)
 {
  var reproSteps = Convert.ToString(item.Fields["Microsoft.VSTS.TCM.ReproSteps"].Value);
  // Get inline images from the repro steps
  var imageMatches = Regex.Matches(reproSteps, "<img src=\"data:image/png;base64,[^\\.]*\" alt=\"\">")
                     .Cast<Match>()
                     .Select(m => m.Value)
                     .ToList();
  foreach (var match in imageMatches)
  {
   // Extract the Base64 payload from the img tag
   var base64data = match.Replace("<img src=\"data:image/png;base64,", "")
              .Replace("\" alt=\"\">", "");
   // Write the image to disk
   var imageData = Convert.FromBase64String(base64data);
   var imageName = "image-" + item.Id + "-" + DateTime.Now.Ticks + ".png";
   File.WriteAllBytes(imageName, imageData);
   // Attach the file to the work item
   int attachmentIndex = item.Attachments.Add(new Attachment(imageName, "Created from repro steps inline image"));
   // Get the attachment ID
   var attachmentGuid = item.Attachments[attachmentIndex].FileGuid.ToString();
   // Replace the inline image with the attached version
   reproSteps = reproSteps.Replace(match, String.Format("<img src=\"{0}/WorkItemTracking/v1.0/AttachFileHandler.ashx?FileNameGuid={1}&FileName={2}\"/>", tfsCollection.Uri.ToString(), attachmentGuid, imageName));
  }
  item.Description = reproSteps;
  var errors = item.Validate();
  if (errors.Count == 0)
  {
   item.Save();
  }
 }

29 January 2015

Useful fields in the TFS Database

If you ever need to retrieve the parameters (and their respective values) of a test case (for reporting purposes, for example), the database table to read is Tfs_<CollectionName>.dbo.WorkItemLongTexts.
It contains the HTML/XML field values for each work item.
The only missing part is the ID of the actual field you want to retrieve:
  • FldId 10018 contains the definition of the parameters
  • FldId 10029 contains the values of the parameters
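The two field IDs above can be joined on the same table. Here is a hedged sketch of such a query (the column names are from memory of the schema; verify them against your own collection database):

```sql
-- Sketch: join the parameter definitions (FldId 10018) with the parameter
-- values (FldId 10029) for each test case work item revision.
SELECT def.ID    AS WorkItemId,
       def.Words AS ParameterDefinitions,
       val.Words AS ParameterValues
  FROM [dbo].[WorkItemLongTexts] def
  JOIN [dbo].[WorkItemLongTexts] val
    ON val.ID = def.ID AND val.Rev = def.Rev
 WHERE def.FldId = 10018
   AND val.FldId = 10029
```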

27 January 2015

Deploy an ASP.NET MVC Web Application to Azure Website using Release Management for TFS

There are a lot of articles about the integration of an Azure VM into Release Management for TFS. But what if you simply want to deploy an ASP.NET MVC application without the need to provision a dedicated machine?
It turns out to be quite simple. One of the most efficient ways to do it is to create the initial deployment from Visual Studio and then pick up the pieces to build a RM template.

My Web Application is quite simple: a database and an Entity Framework Code First model, all sitting inside the ASP.NET MVC application. Like all deployments, the main process is:
- generate the Web Deploy package with placeholders for configuration (connection string, parameters, etc.) using TFS Build (or whatever build system you have);
- deploy to the staging machine (using an agent-based or agent-less machine);
- run the MS Deploy CMD file with the appropriate parameters to trigger the deployment to Azure.

The reason I use a staging machine as the basis for the CMD is that it allows me to restrict the firewall configuration to one single machine from which all deployments to Azure will happen.

If you need guidance on the generation of the Web Deploy package, you can refer to the following article as a starting point.
The main difference in my approach is that the component I create to deploy to Azure is no longer based on IRMSDeploy.exe. It is a simple "Windows Command Line Processor" based component.

Details of the component to publish to Azure

The complete command line arguments are:

/C ""__WebApplicationName___Package\__WebApplicationName__.Deploy.cmd" /Y "/m:__AzureDeployUrl__?Site=__SiteName__" -allowUntrusted /u:"__Username__" /p:"__Password__" /a:Basic"

The __WebApplicationName__ parameter can be customized depending on your own publish parameters. The other parameters represent values taken from the publish profile, which you need to retrieve from the Azure portal (by clicking "Download the publish profile").
Azure Portal to retrieve the PublishSettings file

The table below describes how to retrieve the parameter values.

Parameter        Value
SiteName         Look for the "msdeploySite" tag in the PublishSettings file
AzureDeployUrl   Usually https://<sitename>.scm.azurewebsites.net/msdeploy.axd
Username         Look for the "userName" tag in the PublishSettings file
Password         Look for the "userPWD" tag in the PublishSettings file
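Put together, the rendered command line executed by the component would look something like this (all values here are made up for illustration; the "$mywebapp" user name format is typical of Azure website publish profiles):

```cmd
/C ""MyWebApp_Package\MyWebApp.Deploy.cmd" /Y "/m:https://mywebapp.scm.azurewebsites.net:443/msdeploy.axd?Site=mywebapp" -allowUntrusted /u:"$mywebapp" /p:"<password from userPWD>" /a:Basic"
```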

Once you have all the information, you just need to update the release template with your component. In the screenshot below, there are some additional parameters. They are only relevant for the application I am publishing; you can safely ignore EnvironmentName and Database-ConnectionString (though similar parameters are very likely to be part of your own deployment component).
Details of the release template using the custom component

When this is done, you can test the release by triggering a build (or a release directly) and see if it works.

Feel free to drop a comment if you need help!

Note: you might encounter the following error in the deployment log:
More Information: Creating a new application is not supported by this server environment.

This is most likely due to an incorrect value in the publishing parameters (DeployIisAppPath in that case). Open your publish profile in Visual Studio.
Publish profiles in Visual Studio
In the XML file, make sure the value for DeployIisAppPath will render exactly as the value of "msdeploySite" taken from the Azure PublishSettings file.
Value for the DeployIisAppPath

In our case, the value will be replaced by the RM engine as defined in the component created earlier. This is effectively how we link the Azure Website with the content of the Web Deploy package.
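For illustration, the relevant fragment of the .pubxml publish profile would look something like this (the placeholder token is an assumption; use whatever token your RM component actually replaces):

```xml
<PropertyGroup>
  <!-- Must render exactly as the msdeploySite value from the Azure PublishSettings file -->
  <DeployIisAppPath>__SiteName__</DeployIisAppPath>
</PropertyGroup>
```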