When to create a NuGet stable release?

I was having a chat with another member of my team this morning about NuGet package versioning – the question was the following:

When should we create a stable release version of a NuGet package? 

I found the answer in Building pre-release packages: a stable release is one that’s considered reliable enough to be used in production. It’s just as simple as that. Also, keep in mind that the latest stable release is also the one that will be installed as a package update or during package restore:

[Screenshot: nuget-stable1]

For more details please go to Building pre-release packages (package versioning, pre-release versions, reinstalling and updating packages, etc).
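As a quick illustration (the package name and version numbers below are made up), a pre-release version carries a suffix while the stable release does not – only the latter is picked up by default on update or restore:

# Hypothetical example, assuming nuget.exe is on the path.
nuget pack MyLibrary.nuspec -Version 1.2.0-beta1   # pre-release: only installed when pre-release packages are explicitly allowed
nuget pack MyLibrary.nuspec -Version 1.2.0         # stable: installed by default on package update/restore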

Disabling ‘member is obsolete’ warnings on Visual Studio Team Services

The scenario – I am working on new functionality in a solution that has many members marked as Obsolete (some are not being used at the moment and others will be removed in the future). When the solution is compiled, warnings are generated as follows:
[Screenshot: 01-vstudio-warnings]

And this is how things are supposed to work – other developers working in the same solution will know straight away that these members should not be used. It’s perfectly fine to display these warnings locally, but honestly I don’t think it makes sense to display them on the build server.

MSBuild has a property named NoWarn that can be used to suppress compiler warnings. In my case, I want to suppress warnings CS0612 (‘member’ is obsolete) and CS0618 (‘member’ is obsolete: ‘text’).

In VSTS, add the following to the MSBuild arguments of your Visual Studio Build task:

/p:NoWarn="612,618"

[Screenshot: 02-build-task]

That’s it! No more ‘member’ is obsolete warnings will be displayed when running a new build. Remember to add the same argument to any other task that uses MSBuild (for example, I have another task that generates an ASP.NET deployment package and it was producing the same warnings).
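If you want to check the effect locally before changing the build definition, the same property can be passed to a local MSBuild run – a sketch only, where MySolution.sln is a placeholder and msbuild.exe is assumed to be on the path:

# Suppresses CS0612 in a local build; the build task above suppresses both 612 and 618.
msbuild MySolution.sln /p:Configuration=Release /p:NoWarn=612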

Happy coding!

Testing Service Fabric deployment packages on VSTS

The scenario – you have a Service Fabric build configured on Visual Studio Team Services (VSTS) as follows:

[Screenshot: 01-sf-build-configuration]

As you can see from the screenshot, there is a task to generate the Service Fabric deployment package. The task completed without errors, but don’t assume that everything is OK with the package – something might still go wrong when you try to deploy it to a SF cluster.

In order to avoid surprises when deploying the application, you can test the package after it is generated using the Test-ServiceFabricApplicationPackage PowerShell cmdlet.

Add a new PowerShell task after the one that generates the package and configure it as follows:
[Screenshot: 01-test-sf-package-task]

The command takes the path to the SF package folder as a parameter. I usually set the SF project as the working folder.
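For reference, this is a minimal sketch of what the script step runs, assuming the working folder is the SF project and the package was generated into the default pkg\Release folder:

# Validate the package structure and manifests locally; the cmdlet returns $true when the package is valid.
$packagePath = "pkg\Release"   # assumption: default output folder of the package task
$isValid = Test-ServiceFabricApplicationPackage -ApplicationPackagePath $packagePath

if (-not $isValid)
{
    Write-Error "The Service Fabric package at '$packagePath' failed validation."
    exit 1
}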

After queuing a new build, you can see the results – in particular the task that tests the SF package:

[Screenshot: 03-build-results]

That’s it! With this solution you will know immediately if something is wrong with the package, saving you the frustration of a failed deployment. That doesn’t mean deployments will never fail, but you should be able to catch most errors in the deployment package as soon as a new build is triggered 🙂

On using code from the internet

It’s quite common for us software developers to search for solutions to our problems on sites like stackoverflow.com, blogs, etc. – but what do you do when you find a code sample that suits your needs?

In my case, I always add a reference to the site:

[Screenshot: stackoverflow-reference]

Why bother? In my opinion the reference often contains other useful information, such as how to use the code or other answers that might apply in a different context. Also, it’s a nice way to give some credit to the author, don’t you agree?

Improving the performance of Service Fabric builds and deployments on VSTS

Consider the following definition – this is my typical build for a Service Fabric (SF) application on Visual Studio Team Services (VSTS):

[Screenshot: 001-typical-build-definition]

In short, after restoring the NuGet packages, building the solution and running the unit tests, I generate the Service Fabric deployment package and test it. The artifacts of the build are not only the deployment package but also the publish profiles for each environment, as follows:

[Screenshot: 002-typical-build-artifacts]

Unlike an ASP.NET deployment package, the Service Fabric deployment package is not zipped, as you can see in the screenshot above. Publishing the artifacts in this case takes around 11 seconds:

[Screenshot: 003-typical-build-timeline]

What does “publishing the artifacts” mean exactly? It means that the files are uploaded to the server. Obviously, the more files you have and the bigger they are, the longer it takes to upload them (and, later on, to download them when you need to deploy the application).

To improve the performance, I decided to zip the deployment package instead. I added another task as follows:

[Screenshot: 004-zip-task]
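If you prefer a script over a dedicated archive task, the same result can be achieved with a short PowerShell step – a sketch only, since the exact paths depend on where the package task drops its output:

# Zip the generated Service Fabric package so a single file is published as an artifact.
# Requires PowerShell 5+ (Compress-Archive); the folder names below are examples.
$packageFolder = Join-Path $env:BUILD_SOURCESDIRECTORY "MySfApp\pkg\Release"
$destination   = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY "MySfApp.zip"

Compress-Archive -Path (Join-Path $packageFolder "*") -DestinationPath $destination -Force

At release time a corresponding unzip step (for example Expand-Archive) restores the folder structure before the deployment task runs.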

And this is the deployment package, zipped:

[Screenshot: 005-zip-build-artifacts]

Zipping and uploading the deployment package took around 8 seconds:

[Screenshot: 006-zip-build-timeline]

You might think that’s not a big improvement, but the savings depend on the size of the deployment package. In this case the generated folder is only 26 MB, but I’ve heard of deployment packages of almost 200 MB!

Also, keep in mind that when you do a deployment you’ll need to download the artifacts from the server. In the first case, the uncompressed deployment package took on average 10 seconds to download:

[Screenshot: 007-typical-release]

As opposed to less than 3 seconds in the second case (zipped deployment package):

[Screenshot: 008-zip-release]

That’s it! In this example I have shown a Service Fabric build, but I’d recommend this approach whenever you upload multiple files as artifacts. Even though there are extra tasks in the build (zipping the package) and in the release (unzipping the package), both the build and the release got faster.

My suggestion is to create two versions of the build and of the release, one with and one without compression, and compare the results. Give it a try, you might be surprised!

Happy coding 🙂

Running a basic smoke test after deploying a website using PowerShell

The problem:

You have automated the deployment of an ASP.NET website to Azure (App Service) using Visual Studio Team Services (VSTS). The deployment went fine (no errors), but when you try to access the website you get a runtime error such as the following:

[Screenshot: 01-runtime-error]

You must be thinking that I should run some smoke tests after the deployment to detect problems like this one, and you’re absolutely right. But, believe it or not, there are still a lot of companies that have no unit tests or just a few – let alone integration/smoke tests! I’ve been there 🙂

In that case, there are quick and easy ways to run some sort of smoke test. For example, you can configure a PowerShell task as follows:
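A minimal sketch of such a script, assuming the site URL is passed in from the $(HomePage) variable (for example as a script argument):

param([string] $HomePage = "https://myapp.example.com")   # placeholder URL

try
{
    # Invoke-WebRequest throws an error for 4xx/5xx responses
    $response = Invoke-WebRequest -Uri $HomePage -UseBasicParsing
    Write-Output "Smoke test passed: '$HomePage' returned status code $($response.StatusCode)"
}
catch
{
    Write-Error "Smoke test failed for '$HomePage': $_"
    exit 1   # a non-zero exit code fails the PowerShell task and therefore the deployment
}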

In short, Invoke-WebRequest sends an HTTP request to the web page defined in the $(HomePage) variable. If the page returns a 500 error, the PowerShell task – and consequently the deployment – will fail:

[Screenshot: 03-deployment-result]

Deployment log:

[Screenshot: 04-deployment-log]

That’s it! As a final note, the task configured above contains an inline script, which should be avoided (I did it for demonstration purposes only). All source code, scripts, configuration, etc. should be under source control.

Use environment variables to speed up your .NET Core build on VSTS

I’m using Visual Studio Team Services (hosted agent) to automate the builds and deployments of .NET Core solutions.

You have probably noticed that .NET Core builds take much longer than traditional .NET builds. For example, when you run dotnet restore you might see something like this in the log:

2016-09-15T11:15:39.1510337Z A command is running to initially populate your local package cache, to improve restore speed and enable offline access. This command will take up to a minute to complete and will only happen once.
2016-09-15T11:15:44.7529135Z Decompressing 0%... Decompressing 1% ... (text removed for brevity) Decompressing 100% 5523 ms
2016-09-15T11:16:05.7899968Z Expanding 0%.... Expanding 1%... (text removed for brevity) Expanding 100% 20548 ms
2016-09-15T11:16:29.7176084Z log  : Restoring packages for C:\a\1\s\Development\Source\MyProject\UI\project.json...

As you can see, populating the package cache took almost 1 minute! As suggested in Stop wasting time during .NET Core builds, adding the following environment variables to your build definition can reduce the build time:

[Screenshot: .NET Core environment variables]
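In case the values are hard to read in the screenshot, these are the variables and the values typically used – in VSTS they are added as build definition variables, but they can also be set in a script for a quick local test:

$env:DOTNET_SKIP_FIRST_TIME_EXPERIENCE = "true"   # skip the initial population of the local package cache
$env:NUGET_XMLDOC_MODE = "skip"                   # don't extract the XML documentation of restored packages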

So basically DOTNET_SKIP_FIRST_TIME_EXPERIENCE prevents the initial population of the package cache on the build machine, and NUGET_XMLDOC_MODE prevents the XML documentation of the packages from being extracted. Unfortunately I couldn’t find much documentation about these variables, but check the blog post above for more details.

Visual Studio: Unable to start debugging on the web server

This happened to me today – I was getting the same error whenever I tried to debug an ASP.NET application using Visual Studio:

Unable to start debugging on the web server. Could not start ASP.NET debugging. More information may be available by starting the project without debugging.

[Screenshot: debug1]

My initial reaction was to check if there was something wrong in IIS, and I was right: the application pool used by the application I wanted to debug was stopped!

[Screenshot: debug2]

At that moment I realised that I had changed my Windows password 2 or 3 hours before trying to debug the application. Given that the application pool was running under my credentials, all I had to do to fix the issue was right-click the application pool, go to Advanced Settings > Identity and update my password 🙂

 

MSBuild – Access to the path is denied

The problem:

I was configuring a new build on the Bamboo CI server for an ASP.NET application. The solution built just fine locally, but consistently failed on the build server. This was the error:

(BeforeBuild target) -> 
  C:\Bamboo\src\MC-BUILD-JOB1\MyProject.Web\MyProject.Web.csproj(1851,5):
 error : Could not write Destination file: 
Access to the path 'C:\Bamboo\src\MC-BUILD-JOB1\MyProject.Web\Config\AppSettings.config' is denied.

The problem was in the following line:

<TransformXml Source="Config\AppSettings.Base.config" 
              Transform="Config\AppSettings.$(Configuration).config" 
              Destination="Config\AppSettings.config" />

So basically the TransformXml task was failing because the file Config\AppSettings.config was checked out as read-only on the build server.

Fortunately there is an easy workaround. The trick is to apply the XML transformation to a temp file and then use the Copy task with the OverwriteReadOnlyFiles attribute set to "True" to overwrite the file Config\AppSettings.config:

<TransformXml Source="Config\AppSettings.Base.config" 
              Transform="Config\AppSettings.$(Configuration).config" 
              Destination="Config\AppSettings_temp.config" />
<Copy SourceFiles="Config\AppSettings_temp.config" 
      DestinationFiles="Config\AppSettings.config" 
      OverwriteReadOnlyFiles="True" />
<Delete Files="Config\AppSettings_temp.config" />

PowerShell scripts running on Bamboo don’t return the correct exit code

As part of a deployment project on Bamboo CI, I was running a PowerShell script to deploy an ASP.NET application to a Cloud Service on Azure.

Even though there was an error executing the script, Bamboo was setting the status of the deployment to Success. Why? Because the exit code returned by the PowerShell script was always 0 (zero means successful execution).

After some research I found a way to return the correct exit code in case of failure. I added the following lines to the top of my PowerShell script:

trap
{
    # Log the error and exit with a non-zero code so Bamboo marks the deployment as failed
    Write-Output $_
    exit 1
}

The trap statement includes a list of statements to run when a terminating error occurs – in this case, every time such an error occurs the error message is displayed and the script returns an exit code indicating failure. I am returning 1, but any value different from 0 (zero) will do the trick 🙂