I have been out to a number of sites recently where there are C++ developers. We often get talking about package management and general best practices for shared libraries. The common refrain is ‘I wish we had something like Nuget for C++’.
We have an internal Nuget Server we use to manage our software packages. As part of our upgrade to TFS2012 this needed to be moved to a new server VM and I took the chance to upgrade it from 1.7 to 2.1.
Now, we had a long-standing problem: we could publish to the server by copying files to its underlying Packages folder (a UNC share), but could never publish using the Nuget command line, e.g.
Nuget push mypackage.nupkg -s http://mynugetserver
I had never had the time to get around to sorting this out until now.
The reported error if I used the URL above was
Failed to process request. ‘Access denied for package ‘Mypackage.’.
The remote server returned an error: (403) Forbidden..
If I changed the URL to
Nuget push mypackage.nupkg -s http://mynugetserver/nuget
The error became
Failed to process request. ‘Request Entity Too Large’.
The remote server returned an error: (413) Request Entity Too Large..
Important: This second error was a red herring; you don’t need the /nuget on the end of the URL.
The solution was actually simple, and in the documentation, though it took me a while to find. I had not specified an APIKey in the web.config on my server. Obvious really: my access was blocked because I did not have the shared key. The 413 errors just caused me to waste loads of time looking at WCF packet sizes, because I had convinced myself I needed to use the same URL as you enter in Visual Studio > Tools > Options > Package Management > Add Source, which you don’t.
The fix was to edit my web.config file to add the key (alternatively, I could have switched off the requirement):
<!-- Determines if an Api Key is required to push/delete packages from the server. -->
<add key="requireApiKey" value="true" />

<!-- Set the value here to allow people to push/delete packages from the server.
     NOTE: This is a shared key (password) for all users. -->
<add key="apiKey" value="myapikey" />

<!-- Change the path to the packages folder. Default is ~/Packages.
     This can be a virtual or physical path. -->
<add key="packagesPath" value="" />
I could then publish using
Nuget push mypackage.nupkg myapikey -s http://mynugetserver
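As an aside, rather than passing the key on every push you can store it per source with the Nuget setApiKey command. A quick sketch, assuming the same server URL as above:

```shell
# Store the shared API key for this source (saved in the per-user NuGet config)
nuget setApiKey myapikey -Source http://mynugetserver

# Subsequent pushes to that source no longer need the key on the command line
nuget push mypackage.nupkg -s http://mynugetserver
```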
In my last post I discussed the process I needed to go through to get Typemock Isolator running under TFS 2012. In this process I used the Auto Deploy feature of Isolator. However, this raised the question of how to manage the references within projects. You cannot just assume the Typemock assemblies are in the GAC; they are not on the build box when using auto deploy. You could get all projects to reference the auto deployment location in source control. However, if you use build process templates across projects, it may be that you do not want production code referencing build tools in the build process area directly.
For most issues of this nature we now use Nuget. At Black Marble we make use of the public Nuget repository for tools such as XUnit, SpecFlow etc. but we also have an internal Nuget repository for our own cross project code libraries. This includes licensing modules, utility and data loggers etc.
It struck me after writing the last post that the best way to manage my Typemock references was with a Nuget package, obviously not a public one (that would be for Typemock to produce). So I created one to place on our internal Nuget server that just contained the two DLLs I needed to reference (I could include more, but we usually only need the core and Arrange Act Assert assemblies).
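The package itself is trivial. A minimal .nuspec along these lines would do it; the id, version, and file paths here are my own illustrative assumptions, not anything Typemock ships:

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <!-- All values below are placeholders for an internal package -->
    <id>Internal.TypemockIsolator</id>
    <version>7.0.0</version>
    <authors>Black Marble</authors>
    <description>Internal references for Typemock Isolator (core and Arrange Act Assert assemblies).</description>
  </metadata>
  <files>
    <!-- Paths are illustrative; point these at the assemblies from your licensed Isolator installation -->
    <file src="lib\TypeMock.dll" target="lib\net40" />
    <file src="lib\TypeMock.ArrangeActAssert.dll" target="lib\net40" />
  </files>
</package>
```

Running nuget pack against a file like this produces the .nupkg to push to the internal server.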
IT IS IMPORTANT TO NOTE that using a Nuget package here in no way alters the Typemock licensing. Your developers still each need a license, and they still need to install Typemock Isolator to be able to run the tests, and your build box needs to use auto deployment. All using Nuget means is that you are now managing references for Typemock in the same way as for any other Nuget managed set of assemblies. You are internally consistent, which I like.
So in theory as new versions of Typemock are released I can update my internal Nuget package allowing projects to use the version they require. It will be interesting to see how well this works in practice.
For those of you who don’t know, Nuget is a package manager that provides a developer with a way to manage assembly references in a project for assemblies that are not within their solution. It is most commonly used to manage external commonly used assemblies such as NHibernate or jQuery, but you can also use it to manage your own internal shared libraries.
The issue the questioner had was that they had added references via Nuget to a project.
Their project then contained a packages.config file that listed the Nuget dependencies. This was in the project root with the <project>.csproj file.
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Iesi.Collections" version="184.108.40.20600" />
  <package id="NHibernate" version="220.127.116.1100" />
</packages>
This packages.config is part of the Visual Studio project and so when the project was put under source control so was it.
However, when they created a TFS build to build this solution all seems OK until the build ran, when they got a build error along the lines
Form1.cs (16): The type or namespace name ‘NHibernate’ could not be found (are you missing a using directive or an assembly reference?)
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (1490): Could not resolve this reference. Could not locate the assembly "Iesi.Collections". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (1490): Could not resolve this reference. Could not locate the assembly "NHibernate". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.
Basically the solution builds locally but not on the build box, the assemblies referenced by Nuget are missing. A quick look at the directory structure show why. Nuget stores the assemblies it references in the solution folder, so you end up with
Packages – the root of the local cache of assemblies created by Nuget
If you look in the <project>.csproj file you will see a hint path pointing back up into this folder structure, so that the project builds locally.
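Such a reference looks something like this in the .csproj; this is a sketch, as the exact version folder and framework subfolder will match whatever Nuget actually installed:

```xml
<Reference Include="NHibernate">
  <!-- Relative hint path back up into the solution-level packages folder;
       the version and lib subfolder shown are illustrative -->
  <HintPath>..\packages\NHibernate.3.2.0.4000\lib\Net35\NHibernate.dll</HintPath>
</Reference>
```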
The problem is that this folder structure is not known to the solution (just to Nuget), so this means when you add the solution to source control this structure is not added, hence the files are not there for the build box to use.
To fix this issue there are two options:
- Add the folder to source control manually
- Make the build process aware of Nuget and allow it to get the files it needs as required.
For now let’s just use the first option, which I like, as in general I do want to build my projects against known versions of standard assemblies, so putting the assemblies under source control is not an issue for me. It allows me to easily go back to a specific build if I have to.
(A quick search with your search engine of choice will help with the second option; basically, using the nuget.exe command line is the core of the solution.)
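For reference, the core of that second option is a per-project pre-build step that asks nuget.exe to download whatever the packages.config lists. A sketch, assuming nuget.exe is available on the build box and the step runs from the project directory:

```shell
# Download every package listed in packages.config into the
# solution-level packages folder before compilation
nuget.exe install packages.config -OutputDirectory ..\packages
```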
To add the files to source control, I went into Visual Studio > Team Explorer > Source Control and navigated to the correct folder. I then pressed the add files button and added the whole Packages folder. This is where I think my questioner might have gone wrong: when you add the whole folder structure, the default is to exclude .DLLs (and .EXEs).
If you don’t specifically add these files you will still get the missing references on the build, but could easily be thinking ‘but I just added them!’. It’s an easy mistake to make; I know, I did it.
Once ALL the correct files are under source control the build works as expected.