Using NuGet 2.0 with TeamCity

We use the NuGet package restore functionality as part of our build. None of the packages are under source control; they are downloaded (from an internal subset of the public feed) before compilation.

Unfortunately, older versions of NuGet have a shortcoming that results in far more requests being made than are strictly necessary. This was fixed in version 2.0, but that version won’t be supported in TeamCity until 7.1 is released (currently EAP only).

For now, you can choose “custom” as your NuGet version:

[Screenshot: custom NuGet build step]

and provide the path to NuGet.exe.
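
The restore step boils down to running NuGet.exe against each project’s packages.config, so you can sanity-check your custom executable by hand. Something like this (paths here are purely illustrative):

# Roughly what package restore runs per project (illustrative paths)
.\tools\NuGet.exe install .\src\MyProject\packages.config -OutputDirectory .\src\packages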

We saw a drastic improvement in build times (from 15-20 mins, down to under 5).

TeamCity publishes broken NuGet packages

We use TeamCity (7.0.3) both as a CI server and as a private NuGet feed. I have a lot of love for TC, but I currently have one major concern:

If my tests fail, TC still publishes the NuGet packages.

Not cool.

I raised this as an issue with JetBrains, and was told it was a feature request; as a workaround, I could split my builds in two and separate the publish step from the rest of the build.

As far as I’m concerned, this is a showstopper bug. It introduces broken behaviour out of the box, which you won’t find out about until you realise a broken package is in use downstream. If JetBrains expects people to use TeamCity as a NuGet feed, this needs to be fixed.

(There’s also no (easy) way to remove packages from the feed, but I’m less concerned about that :) )

Working with binary dependencies

We have a reasonably complex build pipeline, using TeamCity & NuGet. This is generally a Good Thing, but there are occasions when it becomes tempting to go back to having one big solution.

The main problem is the length of the feedback loop: you check some code in, wait for a build, and some tests, and some more tests. Then it triggers another build, and some tests, and some more tests.

And eventually the change arrives at the place you need it. Assuming you didn’t make any dumb mistakes, there are no network issues, etc etc.

This can sap productivity, especially once you start perusing the internets :)

The alternative is to copy the DLLs from one source tree to another. An arduous process, and easy to get wrong. So script it:

function ripple([string] $project, [string] $source, [string] $target) {
  # Find the highest-versioned package folder for the project in the target solution
  $targetNugget = gci "$target\packages" -r -i "$project.*" |
    Where { $_.PSIsContainer } | Sort-Object Name -Descending | Select-Object -First 1
  # Overwrite the package's lib\net40 binaries with the freshly built ones
  gci "$source\$project\bin\*" -r -i "$project.*" |
    foreach { cp -v $_.FullName "$($targetNugget.FullName)\lib\net40" }
}

Usage:

$packages = "Project1", "Project2"
foreach ($p in $packages) { ripple $p "C:\code\Solution1\src" "C:\code\Solution2\src" }

This will copy the build artifacts for Project1 (i.e. bin\*\Project1.*) in Solution1 to the highest-versioned Project1 NuGet package in Solution2 (e.g. packages\Project1.3.1.0.456).

(In case it’s not obvious, the name is an homage to the tool being developed for the same purpose by the FubuMVC team)

Updating a NuGet package fails saying it’s not installed

If you try to update a NuGet package, and get an error saying:

Update-Package : 'package' was not installed in any project. Update failed.

(but it’s definitely already in use), it may be because your packages folder and your packages.config have got out of sync.

In my case, I’d got into the situation by previously updating the package and then reverting the changes. Because we’re using the package restore functionality, this meant the package repo contained a newer version of the package than the one packages.config referenced. Which didn’t go down well!
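
If you want to see where things have drifted, a quick sketch like this will list package folders that packages.config no longer references (paths are illustrative, and it assumes the standard Id.Version folder naming):

# List package folders on disk that packages.config doesn't reference (illustrative paths)
$referenced = ([xml](gc .\src\MyProject\packages.config)).packages.package | foreach { "$($_.id).$($_.version)" }
gci .\src\packages | Where { $_.PSIsContainer -and $referenced -notcontains $_.Name } | Select Name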

The solution is to delete the offending newer package, or just nuke the whole contents of the packages folder (except repositories.config!) and re-build. You’ll need to close VS first, as it locks the files.
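
If you go for the scorched-earth option, something along these lines (run from the folder containing packages, with VS closed) clears everything except repositories.config:

# Nuke the packages folder contents, keeping repositories.config
gci .\packages | Where { $_.Name -ne 'repositories.config' } | Remove-Item -Recurse -Force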

Hopefully a future version of NuGet will provide a more informative message :)