Things Worth Watching #5 – Pacts to the Rescue (Beth Skurrie)

Beth Skurrie's "Pacts to the Rescue" presentation at Sydney API Days 2015 is worth watching for an overview of Consumer-Driven Contracts and the use of Pact to implement them. One thing I was missing, and that I hadn't seen until watching this, was the Pact Broker – a centralized service to which pacts are published and from which service providers consume them. A .NET implementation of the broker would also be nice, just to simplify integration, modification and ramp-up. I don't know of any such implementation, but at least there is Pact-net, provided by SEEK Jobs, for using pacts in a .NET codebase.



Battling TFS Builds and Powershell

My last few days have been about getting DNX builds to work in TFS. I've got a couple of projects/solutions of different types ongoing, spanning web apps, web APIs and class libraries. I consider a good build system a fundamental component of the development process, aiding continuous integration and enabling continuous deployment.

TFS vNext builds in TFS 2015 present a new setup and, as with any new tech, problems arise and solutions aren't always easy to find. Existing tasks don't always cater to my needs, so I've been creating a bunch of simple PowerShell scripts to fill in the gaps. And since I'm an absolute PowerShell noob, plenty of problems appeared.

So, what follows are some problem-solution scenarios I found along the way.

TFS build + Powershell : Persist vars between steps

The Context:
I've opted to give GitVersion a try. One of my solutions is composed of class libraries that are reused across multiple other solutions (base classes, interfaces, abstractions). My distribution mechanism of choice is an internal NuGet server, so versioning the packages is key. I prefer to follow SemVer, since it makes explicit what kind of change occurred and also guarantees an incremental version number on each change.

Previously (pre-DNX), I built a bunch of command-line tools for my team to control the version numbers. They required devs to run a command pre-commit to increment the number in the AssemblyInfo.cs files (or, in our case, a single shared file). Increments were driven by the dev, who would choose to increment the major, minor or patch component. .NET assemblies include a fourth component – revision – which we would increment when no changes were made but we simply wanted a new build/package (because something went wrong). NuGet updates in projects/solutions require incremental numbers, so this was useful to us.

Though not difficult, the manual increment step is still a manual step. GitVersion automates this, guaranteeing at least a patch increment on each new build. Optionally, the git commit message can drive minor and major increments (by including "feature" or "breaking" in the message). Alternatively, the GitVersion.yaml file has a next-version property that can be used for that too.

The problem:
In my build, I have a step (before dnu restore and build) to get the new version number and add it to an environment variable. The idea was to persist the variable's value throughout the various build steps (but not across builds or permanently). Unfortunately, an $env:varname-style assignment in the PowerShell script does not persist the variable: calling Get-Variable in the next steps does not show it in the list of available variables. This may be possible in the future, and should be possible in VSO (using specific variables), but not yet in TFS 2015. I opened an issue that has a short discussion associated.

The workaround:
To solve this, I took the "easy" but somewhat inelegant route of serializing the object that GitVersion creates to a file, and reading/deserializing the file contents back into an object wherever I need it. For this to work, I had to put the GitVersion.yaml next to the .git folder (otherwise GitVersion.exe would not produce the correct output). Since the PS scripts run from the solution folder, I serialized the object to a gitVersionOutput.json file, which shouldn't interfere with anything else (and will only exist on the build server).

$GitVersionOutput | Out-File gitVersionOutput.json

Once I’ve gotten the new version number for the project / solution, I can run through each project.json file in the solution and substitute the version property’s value.
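Putting the two halves together, here is a minimal sketch of the write step and a later read step. The JSON literal is a stand-in for GitVersion.exe's actual output (the real script captures the tool's stdout instead):

```powershell
# Build step A: GitVersion.exe prints its result as JSON to stdout, so the
# captured text can be written to disk as-is. The literal below is a stand-in
# for: $GitVersionOutput = & GitVersion.exe
$GitVersionOutput = '{ "Major": 1, "Minor": 2, "Patch": 3, "SemVer": "1.2.3" }'
$GitVersionOutput | Out-File gitVersionOutput.json

# Build step B (a separate script, later in the build): rehydrate the object.
$version = Get-Content -Raw gitVersionOutput.json | ConvertFrom-Json
Write-Output "New version is $($version.SemVer)"
```

Since the file lives in the build's working directory, each build starts from a clean slate without any cross-build leakage.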

Powershell : ConvertTo-Json might not produce the result you want

The Context:
As mentioned in the previous step, once I've got the new SemVer version number, I can substitute the value in each of the project.json files in the solution. This allows the build to use that value for each of the generated packages. Package and assembly version numbers will automatically be incremented, NuGet packages will be correctly versioned, and the Git repo can then be tagged. Binaries, packages and repo will be synced, matching specific point-in-time inputs to outputs.

The problem:
What seemed like a simple step/PowerShell script turned out to be a load of work. The idea was to read each project.json with a combination of the Get-Content and ConvertFrom-Json cmdlets, change the property's value, and rewrite the file with the ConvertTo-Json and Out-File cmdlets. Unfortunately, it's not that straightforward. ConvertTo-Json won't serialize correctly – in deep trees, some subtrees (like the dependencies subtrees) get converted to strings instead of trees.

In other words, running

Get-Content -Raw project.json | ConvertFrom-Json | ConvertTo-Json | Out-File project.json

doesn’t produce the correct output.

What was read as (shortened for brevity to the relevant parts)

"frameworks": {
  "dotnet": {
    "dependencies": {
      "System.ComponentModel": "4.0.0-beta-23109",
      "System.Data.Common": "4.0.0-beta-23109",
      "System.Runtime": "4.0.20-beta-23109",
      "System.Runtime.InteropServices": "4.0.20-beta-23109",
      "System.Runtime.Serialization.Primitives": "4.0.10-beta-23109"

Gets written as

"frameworks": {
  "dotnet": {
    "dependencies": "@{System.ComponentModel=4.0.0-beta-23109; System.Data.Common=4.0.0-beta-23109; System.Runtime=4.0.20-beta-23109; System.Runtime.InteropServices=4.0.20-beta-23109; System.Runtime.Serialization.Primitives=4.0.10-beta-23109}"

Notice that dependencies is no longer an object with properties but a single string.

The solution:
Two things should be considered. To read the JSON, Get-Content should have the -Raw switch so no other piped step (such as Out-String) is needed; Get-Content -Raw project.json | ConvertFrom-Json works correctly.
The other thing, when serializing, is to include the -Depth parameter before writing the file – ConvertTo-Json -Depth 999 | Out-File project.json – to avoid the deeper parts of the tree being written as single string values.
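A minimal end-to-end sketch of the fix follows. It uses a sample file name (project.sample.json) and an inline JSON literal so it doesn't touch a real project file; in the build, the same read–modify–write runs against each real project.json:

```powershell
$newVersion = "1.2.3"   # in the real build this comes from GitVersion

# Sample project.json standing in for the real file on disk.
@'
{
  "version": "1.0.0-*",
  "frameworks": {
    "dotnet": {
      "dependencies": { "System.Runtime": "4.0.20-beta-23109" }
    }
  }
}
'@ | Out-File project.sample.json

$projectJson = Get-Content -Raw project.sample.json | ConvertFrom-Json
$projectJson.version = $newVersion
# A generous -Depth keeps deep subtrees (like "dependencies") as objects,
# instead of flattening them to "@{...}" strings.
$projectJson | ConvertTo-Json -Depth 20 | Out-File project.sample.json
```

Reading the file back afterwards shows the version updated and the dependencies subtree still intact.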

Deployment – IIS might be looking for x64 apps because of the app pool

The context:
This one drove a colleague of mine, who had never used IIS, mad. The context was simple: deploying (x-copy) an app to an IIS-based server. Our build doesn't have this step set up yet (but hopefully it will, soon, with OctopusDeploy), and it was also somewhat one-shot-ish, since we were trying to deploy a branch's snapshot with some mocks for validation. Essentially we had an x-copy to the server, and a reference to an x86 version of a DNX beta.

The problem:
After small changes to IIS (basic stuff), one last error was keeping the site/app from displaying. IIS was looking for dnx-clr-win-x64.1.0.0-beta5, not the x86 variant. We did not have a reference to x64 anywhere.

The solution:
This was solved with an application pool setting: the pool in use has a flag that allows 32-bit apps to run, and changing its value to true corrected the problem. Unfortunately, this one was pretty hard to find. To get there:

  1. In IIS, select the application pool being used by the site/app
  2. Choose Advanced Settings
  3. Change "Enable 32-Bit Applications" in the General properties group to true

Powershell – Hidden vars don’t go out to the command line

The context:
As previously mentioned, GitVersion is being used to calculate a new SemVer version for the artifacts being generated. The last step, after building, testing and pushing packages, is to tag the repo with the version number, on the commit that generated it. This is generally done using a git push.

The problem:
The origin argument in the command is a URL to the repo, and authentication info is necessary. In the case of TFS repos, a username/password combination is required. At the command line, git push generally requests the user and password values in separate interactive prompts. To avoid these interactions (since in the build they aren't possible or convenient), a single command should be used with all the required info. In this case, the URL for origin can contain the user and password, using the following format:

git push https://:@

Passwords are sensitive things and we don't want them hard-coded in the script. We can use build variables (defined in the Variables tab of the build job). TFS supports "hidden" variables that let you include sensitive info while hiding it from the UI and logs. Build variables are generally available as environment variables throughout the build, but while you can access normal variables through ${env:variableName} references, hidden vars won't get substituted correctly in the script, failing the authentication.

The Solution:
Build variables can generally be referenced through the ${env:varname} syntax, but hidden vars can't. They can, however, be passed as arguments to a script. In that case their value is used internally as expected and still hidden (masked as ****) in the logs. Script arguments are declared at the start of the script and defined in the Arguments textbox of the step definition; there, the reference takes the form $(varName).

So in my case, I have 3 vars defined in the build – tagUser, tagPass and tagRepoUrl – where tagPass is hidden. At the very beginning of my PowerShell script I accept 3 arguments (by ordinal reference), which can be accessed locally as $tagUser, $tagPass and $tagRepoUrl:

[string]$tagUser = $args[0]
[string]$tagPass = $args[1]
[string]$tagRepoUrl = $args[2]


In the arguments textbox I reference the build vars in order: $(tagUser) $(tagPass) $(tagRepoUrl)
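Putting it together, here is a sketch of the tagging script. The argument handling mirrors what the post describes; the fallback example values and the commented tag command are made up for illustration:

```powershell
# The three build variables arrive as positional arguments - the only way the
# hidden tagPass value reaches the script with its real value.
[string]$tagUser    = if ($args.Count -gt 0) { $args[0] } else { "builduser" }
[string]$tagPass    = if ($args.Count -gt 1) { $args[1] } else { "s3cret" }
[string]$tagRepoUrl = if ($args.Count -gt 2) { $args[2] } else { "tfs.example.com/DefaultCollection/_git/MyRepo" }

# Embed the credentials in the origin URL so git never prompts interactively.
$pushUrl = "https://${tagUser}:${tagPass}@${tagRepoUrl}"

# In the real build step this would then run something like:
#   git tag $newTag; git push $pushUrl $newTag
Write-Output $pushUrl
```

Since $tagPass only ever appears as a script argument, TFS masks it in the build logs while the script still sees the real value.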



There's also a GitHub issue for this one.

I can probably get the repo URL through some other mechanism (maybe GitVersion, or the .git folder files, or something), which I may explore to get rid of the tagRepoUrl variable altogether. It seems redundant; I just haven't invested the time yet.

Build vNext projects in TFS (on-premise)

Since I've been working with vNext these last few weeks at Celfinet, having started a brand-new solution for some services, one of my main concerns was to get a build set up ASAP. The fewer the dependencies, projects and whatnot, the easier and faster it is to set up the build. Faster here mainly means code sync (from source control, in this case Git) and build feedback. The less work that needs to be done, the faster you see the whole job pass or fail, which in the setup phase can be daunting if you have to wait long.

We generally use one of two build systems – TFS or Jenkins. The new build system (also known as Build.VNext, or Build.Preview in the menu) was recently made available to the team in our installation. I'm not a fan of the previous build system: the XAML workflow concept for defining jobs isn't a bad idea, but it just isn't a great experience, and it's really hard to get into and use. I do like Jenkins' flexibility – your build is basically just a sequence of steps of anything – but the UI and navigation could really use some work. When I saw the presentations for the new TFS system, I was hooked (even without trying it!). The interface is super clean and has all the flexibility we got from Jenkins. There is still plenty that can be done to make the new TFS build system spectacular, but it is clearly going in a great direction.

Sidenote: I'm quite curious as to why the ASP.NET team hasn't adopted the TFS/VSO build system. Preference seems to have gone to Travis/AppVeyor-based builds (as shown on the GitHub project pages).

Anyway, not everything was smooth when setting up the build for my projects. Not that it isn't easy, but it's still a preview and documentation is scarce, so I naturally hit some obstacles.

1 – DNVM, DNU updates (prerequisite)

The first thing to do in the build – because we'll need to build using dnu – is to install/update DNVM and install/upgrade DNU to whatever our solution requires. The process is an automated version of my previous post. The PowerShell script used is similar to a publicly shared one. It:

  • updates / installs dnvm
  • updates dnu to whatever is in the solution’s global.json, and sets the appropriate environment var to use it.
  • restores packages for the solution’s projects.

The only difference is that the referenced script only restored dependencies for projects within the \src folder. My modified version restores packages for all the projects (or rather, all the project.json files) in the solution.

Therefore, the script was added to the solution folder (and source controlled) and set as the first step in the build process.

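Since the script itself didn't make it into the post, here is a hedged reconstruction of what it does. The function name and parameter are mine, and the dnvminstall.ps1 URL stays elided as in the original:

```powershell
# Hedged reconstruction of the first build step. Shown as a function so the
# steps read top to bottom; the actual build step simply executes this logic.
function Install-DnxAndRestore {
    param([string]$DnvmInstallUrl)  # the official dnvminstall.ps1 URL

    # 1. Install or update dnvm itself.
    iex ((New-Object Net.WebClient).DownloadString($DnvmInstallUrl))

    # 2. Activate the runtime version pinned in the solution's global.json.
    $sdkVersion = (Get-Content -Raw global.json | ConvertFrom-Json).sdk.version
    dnvm install $sdkVersion
    dnvm use $sdkVersion

    # 3. Restore packages for every project.json in the solution, not just \src.
    Get-ChildItem -Recurse -Filter project.json | ForEach-Object {
        Push-Location $_.Directory
        dnu restore
        Pop-Location
    }
}
```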

2 – MSBuild it

The second step was a regular MSBuild step, using the MSBuild task and pointing it at the solution file. That is more than enough, and it correctly captures all the projects and the dependency graph within it.

3 – Run xUnit tests

This was a painful one for me. I spent a lot of time trying to use the available VSTest task, with no luck. It would run, sometimes with success (depending on whatever settings I used – I tried so many that it's now hard to tell). Donovan Brown had already written a post on this, which I tried, but it just wouldn't work for me – the path to the adapters he mentions didn't exist in my installation. So after a lot of struggle, I wrote a PowerShell script to recursively find all the test projects under the solution's \test folder and run dnx . test on each of them. This of course assumes that all the test projects are under \test and that each has a test command in its project.json. It's a bit forced, but for a first effort it should be OK; refactoring will be required later, when more info is available! So after the MSBuild step I added another PowerShell step to run the xUnit tests.

Like the previous one, the script was placed in source control in the solution directory.
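The script boils down to something like the following sketch. The function name is mine, and it assumes, as noted above, that test projects live under \test and declare a test command:

```powershell
# Hedged sketch of the xUnit build step: find every project under \test whose
# project.json declares a "test" command, and run "dnx . test" in its folder.
function Invoke-DnxTests {
    param([string]$TestRoot = "test")

    Get-ChildItem -Path $TestRoot -Recurse -Filter project.json | ForEach-Object {
        $projectJson = Get-Content -Raw $_.FullName | ConvertFrom-Json
        # Only projects that actually declare a test command are runnable.
        if ($projectJson.commands.test) {
            Push-Location $_.Directory
            dnx . test
            $testsFailed = ($LASTEXITCODE -ne 0)
            Pop-Location
            if ($testsFailed) { throw "Tests failed in $($_.Directory.Name)" }
        }
    }
}
```

Throwing on a non-zero exit code makes the PowerShell step, and therefore the build, fail when any test project fails.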

Next steps

Build and test were the most important parts of what I wanted in this first approach to the build. The next step will be generating the appropriate artifacts (NuGet packages) and deploying to an environment with OctopusDeploy. Hopefully it'll be pretty straightforward 😀

Upgrading vNext MVC6 Apps From Beta4 to Beta5

It's been a while, but my last few sprints have been focused on working with ASP.NET 5 MVC 6, namely evaluating what is going on in order to prototype an app front-end. Working with beta versions is definitely challenging – there are plenty of moving parts and a lack of documentation. Fortunately, "community" is a term we can increasingly apply to what is going on with Microsoft's initiative. I've been pretty much googling stuff, deducing concepts, and hogging the Jabber #AspNetvNext chat room, where, fortunately, I'm getting the answers I need.

I started developing the prototype in Beta4, and decided to migrate to Beta5 before starting to add even more parts to the project. Unfortunately, upgrading isn’t the easiest thing in the world, but it is possible.

1. Upgrade your DNVM and DNU

Your runtime environment should be one that can support Beta5 correctly. At the time of writing, the latest is beta6-10383. To upgrade, you can run, at the command line:

@powershell -NoProfile -ExecutionPolicy unrestricted -Command "&{$Branch='dev';iex ((new-object net.webclient).DownloadString(''))}"

This will update dnvm itself, if you are still on Beta4. As of the beta5/beta6 versions, a self-update command is available.

With dnvm updated, you should upgrade the dnu/runtime versions.

dnvm upgrade

should pull and install the most recent stable runtime. You can also indicate a specific version:

dnvm install 1.0.0-beta5-11904

or

dnvm install 1.0.0-beta5-11904 -r coreclr

if you want to choose/specify the CLR type. You can also pull the most recent beta build available using

dnvm upgrade -Unstable

2. Update your global.json

In your solution’s global.json, update the sdk version value:

"sdk": {
  "version": "1.0.0-beta5-11904"
}

3. Update packages in the projects

At this point, you can update the package refs in your projects to Beta5 versions. One thing to consider, though, is that a lot changed from Beta4 to Beta5, especially in the namespaces of some classes. For instance, pretty much every package set now has an *.Abstractions package with base classes and interfaces, separated from their implementations. I would recommend unloading as many projects as makes sense and updating gradually, traversing the dependency chain, simply to reduce the amount of focus required to analyse the output errors.

The main NuGet feed contains a large set of packages, but if you want to try the latest and greatest, you can pull from the MyGet feed. To do so, at the same level as your global.json and .sln file, add a .nuget/ folder with a NuGet.config file (similar to previous solution types), and add the following:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Add this repository to the list of available repositories -->
    <add key="AspNet MyGet" value="" />
    <add key="" value="" />
  </packageSources>
</configuration>

I prefer having this in the solution, committed to source control, as the entire team can share the definition in the solution context AND you won't have to manually configure VS to use the repository (which would make it available to every solution you open, which you may or may not want).

3.1 – Abstractions

A lot of things are changing, so some care is required, and code will require updates. The most common changes that affected me were namespace changes, as mentioned above. Pretty much all of the repos now have an Abstractions package associated with them. That's actually a pretty good thing, in my opinion: if you are not tied to a specific implementation of an abstract class or interface, the only reference you need in class libs is the abstractions package. That makes the use of DI more straightforward in the application project, which is where the actual implementation is referenced. Everywhere else, you can program against the abstraction and mock as you wish in tests.

Some common abstraction refs I required were (abstractions package, its implementation package, and the types I needed):


  • Microsoft.Framework.Caching.Abstractions (implementation: Microsoft.Framework.Caching) – IExpirationTrigger
  • Microsoft.Framework.FileProviders.Abstractions (implementation: Microsoft.Framework.FileProviders) – IFileInfo, IDirectoryContents
  • Microsoft.Framework.DependencyInjection.Abstractions (implementation: Microsoft.Framework.DependencyInjection) – IServiceCollection
  • Microsoft.Framework.Runtime.Abstractions (implementation: Microsoft.Framework.Runtime) – ILibraryManager
  • Microsoft.AspNet.Http.Abstractions (implementation: Microsoft.AspNet.Http) – IApplicationBuilder

3.2 – Configuration

Configuration classes have changed as well. There is a preference towards using the ConfigurationModel classes/packages, which tie into the OptionsModel, instead of the previous Configuration (though ConfigurationBuilder is in Configuration). Configuration can be loaded into IOptions objects in DI by using the following code:

var applicationEnvironment = services.BuildServiceProvider().GetRequiredService<IApplicationEnvironment>();
var configurationPath = Path.Combine(applicationEnvironment.ApplicationBasePath, "config.json");

var configBuilder = new ConfigurationBuilder().AddJsonFile(configurationPath);
var configuration = configBuilder.Build();

The path to the config file should now be an absolute path, as mentioned above.

3.3 – _GlobalImport.cshtml

If you use _GlobalImport.cshtml for your views, note that the file name convention has changed: the runtime now searches for _ViewImports.cshtml (plural). The file should therefore be renamed.

3.4 – Configure()

For some unexplained reason, I was having trouble using Use* extension methods on IApplicationBuilder, such as UseStaticFiles and UseCookieAuthentication. For some odd reason, the runtime wouldn't load the UseMiddleware<>() method those extensions use, failing with:

Method not found: 'Microsoft.AspNet.Http.Authentication.SignInContext Microsoft.AspNet.Authentication.AuthenticationHandler.get_SignInContext()'.



For some reason some packages were not playing nice. The workarounds I used:

For UseStaticFiles(), I just used what the extension actually does:

app.UseMiddleware<StaticFileMiddleware>(new StaticFileOptions());

For UseCookieAuthentication(), using Beta6 packages for Microsoft.AspNet.Authentication and Microsoft.AspNet.Authentication.Cookies solved the problem.

Visual Studio Solution and Package Versioning – Part 1

In my previous post, "SemVer, Team Development and Visual Studio Solutions", I wrote down some of my thoughts about solution and application versioning. It's been nearly two months since then, and after following the strategy I outlined, I can now discuss my experience with it. My team and I have felt some of the pains associated with it. Just as, when I wrote it, I was looking for a solution to my problem at the time, I continue to search for fixes to the other problems that have arisen.

I understand that post is kind of TL;DR, but it pretty much states what my problem was (and still is) and a possible path. I still think what was presented is valid and works. The problems I have encountered are essentially about tooling and concurrency. NuGet can be a REAL pain in many ways if you don't understand how it works.

What I want to do now is go over the pain points I have felt and try to find solutions for them. So many ideas are going through my mind that, hopefully, by writing this, I can get my head straightened out around what I really need, and help others do the same.

The “solution” I used…

The changes I applied, based on the previously mentioned post, were used across 3 VS solutions with 10, 60 and 90 projects each (to give a sense of the dimension of the problem). The most generic solution contained shared libraries and had the smallest number of projects; the others comprise different application components.

Constraints were applied to simplify some aspects of the process:

  • Any project with shareable library outputs was packaged into a NuGet package. Package generation was aided by OctoPack, applied through NuGet.
  • Two NuGet repositories were used: a local repository on the developer's machine, so a dev could at any moment create a package and consume it without affecting other devs' efforts during development; and a second feed for CI-built and published packages, which are generally considered official.
  • NuGet configuration was contained in the solution, through a nuget.config file in the .nuget solution folder. The LocalRepo folder was a relative-path sibling directory to the solution folder. This way, pulling from the repo was enough to get dev machines connected to the repos.
  • SemVer was used as far as .NET assemblies permit, using only the first 3 parts (major, minor and patch). The revision number was generally 0, except when we needed to regenerate a package. I seriously wanted to use SemVer because it makes so much sense, but there are problems with that, which I hope to highlight shortly.
  • To simplify package management and dependency control within a solution, ALL projects shared the same version number, stored in each project's AssemblyVersion.cs file and updated using Zero29.
  • Because we used OctoPack, package version numbers were identical to the assembly versions and consistent throughout the solution.
  • EVERY packaged project had a nuspec file to enforce file inclusion, dependency requirements and metadata (except for the version number). Dependencies can be VERY problematic, and proper care must be taken to keep them up to date consistently.
  • A "tools" solution folder was added to each solution, with a "BuildAndPack" folder containing a series of .bat files to simplify steps such as version increments, nuspec file management, and inter-solution package updates (which I'll get to shortly).

Devs became responsible for driving the versioning process. Any time a change was made in a solution, the dev would increment the version number, build and pack locally. With these local packages, he could update the downstream projects with the changed dependency packages and validate the integration. This allowed him to test his changes and their effects in the downstream projects (which in many ways drive changes in upstream projects).

Still, we tried our best to simplify and automate each step, to minimize the number of decisions and manual steps required. There was also a strong attempt to keep the solution self-contained, so any member would get all the necessary tooling components and scripts on a pull from source control.


The solution in place isn't yet tuned to what we as a team require. It's a start, but it needs fixes for the many pain points that are still felt, which I relate mainly to tooling and change concurrency.

I mention tooling, because NuGet has a set of default behaviors that are somewhat problematic. For instance, consider:

  • a project A that consumes a package B and C.
  • B defines C as a dependency, and says in its nuspec that it works with any version of C from 1.0 to 2.
  • project A currently consumes v1.0 of B and C.
  • A new version of B and C exists at v1.1.

If you update package B in project A, you would probably expect C to get updated too. Unfortunately, it isn't: A already has a version of C, and NuGet sticks with the lowest version of C that satisfies the constraint. You end up with B at 1.1 and C at 1.0. Two ways to get C up to date are to update C explicitly, or to change B's nuspec to demand C at 1.1 as a minimum. NuGet doesn't support the latter out of the box, so I had to develop a tool that iterates the nuspecs in a solution and updates minimum package versions, especially for dependencies within the solution, since all packages generated from a solution share the same version.
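The tool I mention boils down to something like this sketch (the function and parameter names are mine, not the real tool's):

```powershell
# Hedged sketch of the nuspec-fixing tool: for every .nuspec in the solution,
# raise the minimum version of dependencies that this solution itself builds,
# so that updating B drags C along with it.
function Update-NuspecDependencyMinimums {
    param(
        [string[]]$SolutionPackageIds,  # package ids produced by this solution
        [string]$NewVersion             # the solution-wide version number
    )

    Get-ChildItem -Recurse -Filter *.nuspec | ForEach-Object {
        [xml]$nuspec = Get-Content -Raw $_.FullName
        # local-name() keeps the query namespace-agnostic.
        foreach ($dep in $nuspec.SelectNodes("//*[local-name()='dependency']")) {
            if ($SolutionPackageIds -contains $dep.id) {
                $dep.SetAttribute("version", $NewVersion)  # new minimum version
            }
        }
        $nuspec.Save($_.FullName)
    }
}
```

External dependencies are left untouched; only package ids built by the solution get their minimum raised to the new solution-wide version.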

Concurrency-wise, when 2 devs update the version of the assemblies within a solution, every AssemblyInfo.cs file is changed on the same lines, and the concurrent changes generate merge conflicts in source control. If changes are infrequent, resolution is more or less quick and easy; if they are frequent, devs get stuck in merge resolution. The same happens to .csproj files, since the path to each package contains the version number.

Series of Problems and Solutions

I want to tackle these problems in a series of posts, finding workarounds or solutions for them. As they get written I’ll add the links to the list:

  • Version increments and minimizing merge conflicts
  • Keeping intra-solution package dependencies aligned
  • Keeping inter-solution package dependencies aligned
  • Keeping things aligned in a NuGet package update

I’ll also try to keep the tool set used in a GitHub repo @ and eventually as NuGet packages (once I figure out how the public feed works).