I am leaving this post here for historical reasons. A lot has changed over the years. Please remember the context in which this was originally written.

NuGet is the new (well, relatively new) hotness. Everyone loves NuGet. I am even guilty of extolling some of its virtues. I am not going to talk about the things which NuGet does well as they have been documented over and over again, and I don’t really have anything to add to those lists of awesome features. I can’t, however, endorse the direction the owners of NuGet have taken the product.

Microsoft, along with most of the industry, went from using build domain-specific languages (DSLs) to drowning in XML. MSBuild, Ant, NAnt, XBuild, etc. are all anathemas to our profession. Don’t get me wrong, they served their purpose, but we have to learn from those mistakes and move toward build systems based on coded build scripts, using tools like Rake (with Albacore), Jake, and psake.

Leveraging existing programming languages, we can utilize their capabilities in creating DSLs to run our builds. Funny as this is, we have now come full circle, but with a better understanding of what a build should be.

Why bring this up? It applies directly to NuGet’s exhaustive use of XML. On top of that, we have NuGet’s incredibly limited command line outside of Visual Studio, and design choices that disempower users with unneeded hand holding. There is an intense focus on forcing users to manage their dependencies inside of Visual Studio, with a number of features, such as the PowerShell tool scripts, only working when you have the solution loaded.


Ruby has had a package management system, gems, and a dependency bundling tool, Bundler, for years. Gem specs are Ruby code veiled in a thin DSL; you can see examples in the rspec gemspec and the Albacore gemspec. They are simple, they leverage the language, and they are incredibly powerful.
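To make the comparison concrete, here is a minimal gemspec; every name and value below is illustrative rather than a real gem:

```ruby
# A gemspec is plain Ruby evaluated by the gem tooling; no schema, no namespaces.
# All names and values here are made up for illustration.
spec = Gem::Specification.new do |s|
  s.name    = 'example'
  s.version = '1.0.0'
  s.authors = ['Jane Developer']        # hypothetical author
  s.summary = 'A tiny example gem'
  s.files   = Dir['lib/**/*.rb']        # file sets via globs, not XML lists
end
```

Three lines of boilerplate XML become one `Gem::Specification.new` call, and everything inside it is just code.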

Bundler scripts are Ruby code veiled in a thin DSL which defines the external dependencies in one place (in addition, Bundler generates a Gemfile.lock, which ensures other developers use exactly the same packages that you do, based on all of your packages’ dependency versioning requirements). Again, we can look to the elegant simplicity of the Albacore Gemfile for an example.
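For comparison, a minimal Gemfile looks like this (the gems and constraints shown are illustrative, not Albacore’s actual file):

```ruby
# Gemfile -- one file, in one place, declaring every external dependency
source 'https://rubygems.org'

gem 'rake'
gem 'albacore', '~> 0.2'   # pessimistic constraint: any 0.2.x release
```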

The Ruby community has fully embraced its language and package management system, and made them first-class citizens. When you clone a repo, you execute ‘bundle install’ and all of your dependencies are pulled down. You do this once. From then on you can check for updates, verify that packages are up to date, and perform all other dependency lifecycle tasks. You can also manage your gem projects with the Jeweler gem; it has some very enticing features (if you looked at the Albacore gemspec, you likely saw that it is generated by Jeweler).

What about NuGet?

NuGet defines its specs with XML. And, to be properly great and powerful XML, it is schema validated. Ninety-three percent of the first three lines of a package is copy-and-paste noise. This is a waste of time and hides the intent of the spec.

<?xml version="1.0"?>
<package xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <metadata xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">

NuGet defines its dependencies with XML and manages its local package configuration with a repositories.config file and many, many packages.config files, one stored in each project folder. If you use NuGet the way it is designed to work, you have no way of seeing what dependencies your application actually has without loading your solution into Visual Studio and making sure that all of these config files are checked in across your codebase.


Let’s start by taking a look at a simple spec definition.

<?xml version="1.0"?>
<package xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <metadata xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
    <authors>Ninject Project Contributors</authors>
    <summary>IoC container for .NET</summary>
    <tags>Ninject ioc di</tags>
  </metadata>
</package>

Pulling directly from the Giles rakefile, which uses Albacore to generate the XML spec:

desc "Create the nuspec"
nuspec :createSpec => :prepPackage do |nuspec|
    nuspec.id = "Giles"
    nuspec.version = @GilesVersion
    nuspec.authors = "Jeff Schumacher (@codereflection)"
    nuspec.owners = "Jeff Schumacher (@codereflection)"
    nuspec.description = "Giles - continuous test runner for .NET applications."
    nuspec.summary = "Currently supports Machine.Specifications (mspec), NUnit, xUnit.net, and NSpec"
    nuspec.language = "en-US"
    nuspec.projectUrl = "http://testergiles.herokuapp.com/"
    nuspec.title = "Giles, Rupert Giles, at your service!"
    nuspec.tags = "testrunner test unittest giles"
    nuspec.output_file = "Giles.nuspec"
    nuspec.working_directory = "deploy/package"
    nuspec.licenseUrl = "https://github.com/codereflection/Giles/blob/master/License.txt"
end

desc "Create the nuspec package"
nugetpack :createPackage do |nugetpack|
    nugetpack.nuspec = "deploy/package/Giles.nuspec"
    nugetpack.base_folder = "deploy/package"
    nugetpack.output = "deploy"
end

We can replicate this with a simple PowerShell script where we give the user a simple object to which properties are attached.

$spec.id = 'Ninject'
$spec.version = ''
$spec.authors = @('Ninject Project Contributors')
$spec.requireLicenseAcceptance = $true
$spec.licenseUrl = 'https://github.com/ninject/ninject/raw/master/LICENSE.txt'
$spec.summary = 'IoC container for .NET'
$spec.language = 'en-US'
$spec.tags = @('Ninject', 'ioc', 'di')
$spec.iconUrl = 'https://github.com/ninject/ninject/raw/master/logos/Ninject-Logo32.png'
$spec.projectUrl = 'http://www.ninject.org'

There are additional properties such as dependencies, frameworkAssemblies, and files. With a scripted definition, these nodes are very simple, and we can actually use file sets and collections to specify values instead of wildcards and long lists.

In the Beginning

Perhaps the craziest part is that NuGet started as Nubular (Nu), which was implemented via Ruby gems. Let that sink in for a moment: the package management system for .NET was once Ruby based. Here is an original package definition for Autofac:

version = File.read(File.expand_path("../VERSION", __FILE__)).strip

Gem::Specification.new do |spec|
  spec.platform          = Gem::Platform::RUBY
  spec.name              = 'autofac'
  spec.version           = version
  spec.files             = Dir['lib/**/*'] + Dir['docs/**/*']
  spec.summary           = 'Autofac - An addictive .NET IoC container'
  spec.description       = 'Autofac is an IoC container for Microsoft .NET. It manages the dependencies between classes so that applications stay easy to change as they grow in size and complexity. This is achieved by treating regular .NET classes as components.'
  spec.authors           = ['Nicholas Blumhardt', 'Rinat Abdulin']
  spec.email             = 'emailme@bilsimser.com'
  spec.homepage          = 'http://code.google.com/p/autofac/'
  spec.rubyforge_project = 'autofac'
end

If you go to the Nu website, you’ll see this at the top:

February 2011 - The collaborative effort of the Nubular team and Microsoft has produced NuGet.

I wasn’t involved, but it seems to me that collaborative is a synonym for neutering. They took a thriving ecosystem with a working solution and then stepped back years technologically. Microsoft couldn’t embrace Ruby, especially given the way it abandoned IronRuby. NuGet became the TFS of package management: highly integrated and doing nothing well.

Package Restore

I traveled to Portland and paired up with another developer to try to cut some code. We pulled down a repo, and then proceeded to trip all over ourselves…err, NuGet…for quite a while. The project we were dealing with, unsurprisingly, relied on NuGet package restore.

We were surprised at every step dealing with packages hooking into the csproj targets, build steps, and tooling incompatibilities. In the end, we changed frameworks and were finally able to rip out NuGet’s parasitic attachment to our project.

NuGet package restore is a feature in which NuGet embeds itself into your project files:

  <!-- your project contents-->
  <Import Project="$(SolutionDir)\.nuget\nuget.targets" />

It also creates a .nuget folder at your solution root which contains NuGet.config, NuGet.exe, and NuGet.targets.

When you pull down a project/solution that uses package restore and then try to build it, you get an error (well, most likely many copies of the same error). NuGet, through its targets hook, will call:

"$(SolutionDir).nuget\nuget.exe" install "C:\<PathTo>\packages.config" -source "" 
	    -RequireConsent -o "$(SolutionDir)\packages"

Where <PathTo> changes for every packages.config littered throughout your codebase.

C:\Dev\SomeProject\.nuget\NuGet.targets(80,9): error : Package restore is disabled by default.
To give consent, open the Visual Studio Options dialog, click on Package Manager node and 
check 'Allow NuGet to download missing packages during build.' You can also give consent by
setting the environment variable 'EnableNuGetPackageRestore' to 'true'. [C:\Dev\SomeProject\src\SomeProject\SomeProject.csproj]
C:\Dev\SomeProject\.nuget\NuGet.targets(80,9): error : One or more errors occurred. [C:\Dev\SomeProject\src\SomeProject\SomeProject.csproj]

So now I need to either open Visual Studio and find the setting they list, or create an environment variable and set its value to true. Either way, package restore has to be configured on each developer’s machine, plus the build server, to deal with this requirement.

There is something more insidious hiding inside your build due to package restore. When you run your build, it is not a one-time install of your application’s dependencies. NuGet will run a package install for every single package in every project that has a packages.config file. Let’s look at the log output from building a solution with package restore:

> msbuild SomeProject.sln
  "C:\dev\SomeProject\.nuget\nuget.exe" install "C:\dev\SomeProject\src\SomeProject\packages.config
  " -source ""  -RequireConsent -solutionDir "C:\dev\SomeProject\ "
  All packages listed in packages.config are already installed.

The build runs this nuget.exe install call every single time. Every build you have is now slower, and will be slower still for each project you add that makes this redundant call (NuGet will detect that the packages are already installed, but it still slows your build and fills your logs with garbage).

NuGet is also dealing with privacy issues related to package restore. Had the NuGet team stuck with Nu’s original implementation, there would never have been a need for package restore; everything would have been handled by Bundler. Ruby’s gem system isn’t the greatest in the world, and a growing number of people are talking about trying something else, but it is still years ahead of NuGet.


NuGet on the command line is a shadow of its Package Manager Console self. NuGet.exe:

  • Can’t update a package, let alone all, without being given a packages.config file. (I hear this is being fixed)
  • Can’t remove a package, at all.
  • Does not run init.ps1, install.ps1, or uninstall.ps1, leaving your packages broken when used outside of Visual Studio. You don’t have a DTE instance, so even if the scripts were executed, they wouldn’t work. Package installation has been designed to force a user into Visual Studio to install a dependency.
  • With the config parameter, gets or sets NuGet config values - except that you don’t know what keys exist, so you can’t discover the config values.
  • When using packages.config, gets the exact version in the file, no matter what.
  • Doesn’t get dependencies at all when using packages.config; they are expected to be in the file already.

You can’t specify a compatible set of versions in your packages.config files should one or more of your dependencies require a different, but still compatible version of a shared dependency. I can hear you objecting with “you can use allowedVersions”:

<package id="log4net" version="1.2.10" allowedVersions="[1.2.10]" />

This, however, doesn’t work; it is broken by design. Using allowedVersions means you’d have to track down every packages.config file and manually edit it to add this attribute. Even worse, NuGet looks at each packages.config individually, so even with allowedVersions it can’t see version compatibility issues across them.

NuGet appends the version of the library to the folder name it creates under the Packages folder. This means that any update requires you to change every one of your project files to point at the new path. This may be a weak complaint, but we should be given an easy way to override this behavior without hand-crafting the calls to nuget.exe or Install-Package.

One item that may be the coup de grâce of NuGet’s feature failings relates to how package sources are specified. How do you know where a particular dependency is sourced? We can see an example with xUnit. If we open the xUnit NuGet.targets file:

<ItemGroup Condition=" '$(PackageSources)' == '' ">
  <PackageSource Include="https://nuget.org/api/v2/" />
  <PackageSource Include="http://www.myget.org/F/b4ff5f68eccf4f6bbfed74f055f88d8f/" />
</ItemGroup>

When you look at a dependency, you have no idea where that dependency is coming from. NuGet will have to check each feed to find where a particular package resides. What happens if the package is in both feeds? What if you want the version from your own feed to be used? This is just another example of making the tool work like magic and finding yet another place to configure NuGet.

Now, to figure out what dependencies I have, I have to look through three different types of XML files, and even once I have done that, I still may not know which feed those dependencies actually come from.

The End?

Will I stop using NuGet? No. Will I be happy using it? No. Will I stop complaining? Hell no. Pretending that everything is great will never push us to do better. I’ll do everything I can to remove NuGet’s way of doing things and put in my own.

Finishing Questions

  • Do we host a new gems website and write an importer from NuGet to convert packages to gems?
  • Will we write something gemlike?
  • Can we analyze gems’ and NuGet’s failings to design a better system?
  • Can we create tooling around a cross platform DSL other than XML?
  • The use of PowerShell and reliance on Visual Studio really hurts mono development on Linux using nuget. What could we use instead? Can we push Microsoft to make PowerShell OSS?
  • Can Nemerle be leveraged to create a DSL that will work for mono as well?
  • Do we rewrite NuGet the correct way? Correct being a first-class, OS-independent CLI and no XML. The CLI is the most important thing in most camps. It should be the case here too, but it obviously isn’t since Microsoft took over NuGet. There is a whole camp that thinks the GUI is the most important thing; then there is Microsoft, which tries to make all free things tie into paid products somehow. They are wrong. I understand their perspective, but it is the wrong choice: when developing free software, choose to do it right. By creating deep integration with Visual Studio and abandoning the command line, NuGet forces us to use Visual Studio, thus destroying any competition.