By Howard van Rooijen, Co-Founder
An experiment to automatically detect API breaking changes in .NET assemblies and suggest a Semantic Version number

As I mentioned in a previous post (and also covered by Matthew) at the start of every year we clean up our core intellectual property (~100 projects), and re-evaluate our tool-chain. NuGet Packages are now the de facto standard (for better or worse) in the .NET ecosystem, and have become one of the biggest causes of development friction for us in the last 18 months.

There is a distinct lack of up-to-date, clear and concise guidance about the currently supported best practices (versioning, local development, dealing with feature branches, creating packages on a continuous integration server); the most popular blog posts are several years out of date, and it's unclear which features and approaches are now deprecated. One of the main pain points we've experienced has been managing the versioning of packages and ensuring that breaking changes result in major version number increments.

Last summer I attended a TeamCity Customer Advisory Board meeting and when asked what my number 1 new feature would be, my response was to do something to improve how to manage any build pipeline that generates a NuGet package.

The first scenario that has caused us problems is that feature branches automatically build and publish NuGet packages; if you use TeamCity branch specifications (take the existing build configuration and build it for any branch that matches a pattern), you cannot conditionally disable a build step based on the current branch. An example is the step that publishes NuGet packages, which contaminates your package repository with feature branch builds. The way to counter this is to have separate build configurations for publishing NuGet packages, but that requires human intervention to publish packages and means managing a larger number of build configurations. This is not ideal.

The second scenario, which feels very labour intensive, is understanding breaking changes and how they affect version numbers. I would like a more automated process for establishing whether a set of changes introduces a breaking change, and thus requires a major version number increment, and for that new version number to flow through to the package creation step.

I can't do very much to change the first scenario, but I decided to see if I could perform an experiment which I could share with the TeamCity developers to convey the sort of NuGet process I'd like to see to solve the second scenario. As per usual I started out trying to flesh out scenarios using SpecFlow. My idealised process is as follows:

  • TeamCity detects a code change and kicks off a build
  • Once the compilation step has completed, we synchronise the build artefacts from the previously published NuGet package
  • We run the API change detection tool
  • If a breaking change is detected, the tool emits the next valid Semantic Version number via a TeamCity Service Message, which updates the version number for this build (see the sketch after this list)
  • The tool uses ILMerge to update the assembly so that it carries the new recommended version number.
  • The package is published with the updated version number.
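
To make the service message step concrete, here is a minimal sketch of how the tool might hand the suggested version back to TeamCity. The `##teamcity[buildNumber ...]` message is TeamCity's documented mechanism for setting the build number from a build step; the class and method names below are my own, not part of the experiment's public surface.

```csharp
using System;

public static class TeamCityReporter
{
    // Writes a TeamCity service message to standard output. TeamCity scans the
    // build log for messages of this form and updates the running build's
    // build number accordingly.
    public static void SetBuildNumber(string semanticVersion)
    {
        Console.WriteLine(string.Format("##teamcity[buildNumber '{0}']", semanticVersion));
    }
}
```

When run as a build step, calling `TeamCityReporter.SetBuildNumber("2.0.0")` would make `2.0.0` the build number that flows into the NuGet packaging step.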

I started by searching for prior art and discovered the APIChange project by Alois Kraus (and, more specifically, the branch by @GrahamTheCoder, who has updated the project to use the latest version of Mono.Cecil); this library performs the IL introspection required, and I added the notion of rules that could be evaluated on top of it.

A breaking change is flagged if a public type is modified or removed (thus breaking compatibility with a consuming assembly); additive changes are not considered breaking.
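To give a feel for the shape of such a rule, here is a minimal sketch using Mono.Cecil directly. It covers only the "removed public type" half of the rule (detecting modified types requires comparing members as well, which APIChange handles); the class and method names are illustrative rather than the project's actual API.

```csharp
using System.Collections.Generic;
using System.Linq;
using Mono.Cecil;

public static class BreakingChangeDetector
{
    // Compares the public types of the previously published assembly with the
    // new build. A public type that has disappeared breaks any consumer
    // compiled against it; types that only appear in the new assembly are
    // additive and therefore not breaking.
    public static bool HasBreakingChanges(string publishedAssemblyPath, string newAssemblyPath)
    {
        var publishedTypes = GetPublicTypeNames(publishedAssemblyPath);
        var newTypes = GetPublicTypeNames(newAssemblyPath);

        return publishedTypes.Except(newTypes).Any();
    }

    private static HashSet<string> GetPublicTypeNames(string assemblyPath)
    {
        var assembly = AssemblyDefinition.ReadAssembly(assemblyPath);

        return new HashSet<string>(
            assembly.MainModule.Types
                .Where(t => t.IsPublic)
                .Select(t => t.FullName));
    }
}
```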

Next, I wrote a simple rule evaluator to increment the Semantic Version (using SemVer) when a breaking change is detected, and then created a command line tool that can be called from TeamCity and uses ILMerge to update the assembly with the new version number (that is actually an interesting bit of code: it required generating a dynamic assembly and took a while to figure out).
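The version rule itself is simple enough to sketch: bump the major number (and reset minor and patch) when a breaking change is detected, otherwise bump the minor number for an additive change. The real tool uses a SemVer implementation; the hand-rolled type below is just to illustrate the decision.

```csharp
public class SemanticVersion
{
    public SemanticVersion(int major, int minor, int patch)
    {
        Major = major;
        Minor = minor;
        Patch = patch;
    }

    public int Major { get; private set; }
    public int Minor { get; private set; }
    public int Patch { get; private set; }

    public override string ToString()
    {
        return string.Format("{0}.{1}.{2}", Major, Minor, Patch);
    }
}

public static class VersionRuleEvaluator
{
    // Breaking change: increment the major version and reset minor and patch.
    // Additive change: increment the minor version and reset patch.
    public static SemanticVersion NextVersion(SemanticVersion current, bool hasBreakingChanges)
    {
        return hasBreakingChanges
            ? new SemanticVersion(current.Major + 1, 0, 0)
            : new SemanticVersion(current.Major, current.Minor + 1, 0);
    }
}
```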


All the moving parts are present and working, and I'm interested to see if the community has any ideas.


If you want to poke around in the code base, the best place to start is the executable specifications.

I've just published the experiment to GitHub, and I'd love to hear any thoughts or feedback.

Howard van Rooijen, Co-Founder

Howard spent 10 years as a technology consultant helping some of the UK's best known organisations work smarter, before founding endjin in 2010. He's a Microsoft ScaleUp Mentor, and a Microsoft MVP for Azure and Developer Technologies, and helps small teams achieve big things using data, AI and Microsoft Azure.