Dependency Management with Dependabot

No matter the language or package manager, dependency management in most projects suffers from the same problems:

- Incoming security risks associated with packages in your project rarely get evaluated.
- The latest dependencies are added when a project starts, then seldom updated afterward.
- When upgrades do occur, they are typically a big-bang upgrade of everything, causing more effort and testing than anyone hoped.
- Whether it's npm, NuGet, PyPI, Maven, Docker, etc., it can feel like a full-time job to manage packages the way we would all like to.

The problem is exacerbated by your company's methodology for maintaining projects, the granularity of your repository architecture (i.e., do I have 100 different Git repos whose packages must be maintained independently?), and whether you live in a product-based or project-based world.

There must be a better way!


Our first attempt at solving the dependency management conundrum came two years ago, when we set out to automate some of the dependency upgrades in a .NET Core GitHub project. That search led us to a fantastic tool called "NuKeeper", which has a ton of features that are second to none for handling the intricacies of dependency management.

NuKeeper is a command-line utility that creates Pull Requests for proposed package updates based on NuGet semantic versioning. As such, the tool is .NET-specific, but it has fantastic support across .NET Framework and .NET Core (and several Git providers).

We found its ability to configure includes and excludes based on wildcards incredibly helpful. You can also configure the granularity of Pull Requests you want: for example, ten updates could be managed as a single PR or as ten separate PRs. One of its best features is the ability to require a minimum package age before updating; we've found this indispensable for letting packages be battle-tested by other consumers before we take them.
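As a sketch of the kind of invocation this enables (flags may vary by NuKeeper version; the repository URL and include pattern here are hypothetical):

```
# Raise PRs against a repository, taking at most 10 package updates
# consolidated into a single PR, restricted to Microsoft.* packages,
# and skipping any package version younger than 3 weeks
nukeeper repo https://github.com/example-org/example-repo \
  --age 3w \
  --include "^Microsoft\." \
  --maxpackageupdates 10 \
  --consolidate
```

The `--age` flag is the "battle-tested" safeguard mentioned above: it prevents a brand-new release from being proposed until other consumers have had time to surface problems with it.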

The drawback is that it is a command-line tool (or a .NET global tool), which makes it rather difficult to automate at an organizational level. Sure, there are ways to do it: you can execute the global tool during a CI build. In an attempt to build something more agnostic, so we didn't have to wire it into every build pipeline, we put together a script, a YAML manifest, and a single nightly build that looped through and executed against each configured repository. This allowed teams to add new repositories with a simple PR and no additional work.

NuKeeper and this YAML configuration worked well for a short time, but also brought complications. The infrastructure needed to configure internal private NuGet feeds was sloppy, and odd exceptions became problematic over time for projects that already had Pull Requests open. The setup eventually broke down from an enterprise configuration and execution perspective, and usage faded.


During our NuKeeper analysis, we also briefly looked into Dependabot. At the time it was a paid product (unless you were working on a public or open-source project), so we did not get far with it, but it looked very promising: a SaaS-style approach that also supported a wide variety of package managers. In May 2019, Dependabot announced it was joining GitHub and that its public preview was now free for private and organizational usage. This changed things quite a bit.

Over the past few months, we have been able to fully deprecate our usage of NuKeeper in favor of the Dependabot preview. This unlocked the same features we were using for .NET across all our package managers and language ecosystems. Thus far, feedback has been very favorable:

  • It has consistent and reliable Pull Requests into GitHub
  • It properly manages the full Pull Request lifecycle, including rebasing or closing the PR when necessary
  • Dependabot can be interacted with via command comments directly in the Pull Request
  • After enabling for your organization, it can be activated on a per repository basis
  • You can easily configure private feed credentials centrally for all repositories
  • Getting started with Dependabot is as simple as adding the GitHub app for Dependabot and then adding a config.yml to your repository
```yaml
version: 1
update_configs:
  - package_manager: "dotnet:nuget"
    directory: "/"
    update_schedule: "live"
  - package_manager: "docker"
    directory: "/"
    update_schedule: "daily"
```

Dependabot PR

It is also worth noting that GitHub has had its "Security Vulnerability" scanning feature available in repositories for some time now. This feature is now integrated with Dependabot, and if you wish, you can opt to receive Pull Requests from Dependabot only for packages with identified security vulnerabilities. It is exciting to see the intersection of GitHub features in this way. As an aside, we expect that in the future we'll see some very interesting compositions between Microsoft, GitHub Actions, Dependabot, and npm within GitHub.

UPDATE: Dependabot is now fully integrated into the GitHub experience. If you're using Dependabot-preview as a separate GitHub app, it continues to work for now. If you enable native Dependabot from your GitHub settings, it will open an automated Pull Request converting your setup to the new configuration file stored in your .github folder. However, beware that native Dependabot does not yet support private registries for package configuration. This is a big reason why we use Dependabot, and we intend to remain on Dependabot-preview until later this year, when the GitHub roadmap indicates private registry support and configuration will become available.
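For reference, the native configuration lives in `.github/dependabot.yml` and uses a version 2 schema. A sketch of what the conversion of the earlier preview config might look like (ecosystem names and intervals should be checked against GitHub's documentation for your stack):

```yaml
# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "nuget"
    directory: "/"
    schedule:
      interval: "daily"
  - package-ecosystem: "docker"
    directory: "/"
    schedule:
      interval: "daily"
```

Note that the v2 schema uses a nested `schedule` block with `interval` values like `daily`, `weekly`, or `monthly` rather than the preview's flat `update_schedule` key.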

The Good with the Bad

Dependabot certainly scratches the itch, but there are always caveats. Consider the following thoughts and suggestions for getting started efficiently:

Pull Request Flooding: Upon activating, you may be flooded with initial upgrade PRs. Before activating, consider doing a one-time upgrade of all dependencies in your repository to bring it up to the latest versions, making the transition to Dependabot much easier. Keep in mind Dependabot creates a single PR per dependency, with a maximum of 10 open PRs per repository (a config setting), so as you merge and close, it may feel never-ending as new PRs open automatically.

Config File Prioritization: Using config.yml is not a requirement, but you should prioritize the config file for Dependabot configuration: it provides clear transparency into the setup without heading to the UI. Additionally, review the more complex configuration options if there are certain versions or version ranges to avoid. You can also comment on a PR to give Dependabot commands to ignore certain versions; this works for patches, but major- and minor-version ignores issued that way easily lose visibility over the long term.
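The comment commands are issued directly on the Pull Request. A few representative Dependabot-preview commands (consult the Dependabot documentation for the full list):

```
@dependabot rebase
@dependabot merge
@dependabot ignore this minor version
@dependabot ignore this dependency
```

The ignore commands illustrate the visibility problem above: the rule they create lives in Dependabot's state rather than in your repository, which is why encoding long-lived version exclusions in the config file is preferable.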

Comprehensive Test Suite & PR Status Checks: The real value in Dependabot is accepting small dependency upgrades incrementally, rather than handling big-bang upgrades manually down the road. Automatically validating those Pull Requests against a test suite lets you approve and merge small incoming updates with confidence as they happen. If you don't know how a dependency update will affect the stability of your codebase, the tendency is to leave it and let updates build up; at that point you have lost much of the benefit of the tooling, so don't let this happen.

Notifications: For you and your team to stay informed of incoming dependency upgrades, ensure that PR notifications are being sent somewhere, whether via email, Slack, etc. Because the PR was automated, there is no team member who originally submitted it to drive awareness.

Warning… It Is Not Smart: This is not some magic tool that can read the minds of dependency authors. If a package owner does not properly apply semantic versioning to a package, automatic upgrades become difficult. Additionally, Dependabot does not understand the intricacies of each individual framework and its inherent version constraints. For example, if you are using a package for framework-x 1.0 that must use dependent packages at 1.x, Dependabot is not aware that upgrading those dependent packages to 2.x is a completely ridiculous thing to do. You'll want to tune your config settings to handle scenarios like this when required.
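In the preview config.yml, a scenario like the framework-x example can be expressed as an ignore rule. A sketch, where the package name and version requirement are hypothetical:

```yaml
version: 1
update_configs:
  - package_manager: "dotnet:nuget"
    directory: "/"
    update_schedule: "daily"
    ignored_updates:
      # framework-x 1.0 requires its companion packages to stay on 1.x,
      # so never propose upgrades to 2.0 or above (hypothetical package name)
      - match:
          dependency_name: "FrameworkX.Extensions"
          version_requirement: ">= 2.0.0"
```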

We do find ourselves still wanting some of the configuration features available in NuKeeper. The ability to configure the minimum age of packages was incredible. Now that it is under GitHub, I'm hoping we'll see some of these more advanced features make their way into Dependabot.

For the Enterprise

The Dependabot GitHub app does allow you to “enable for your organization” as a whole. This allows your developers in the organization to simply add that config.yml and begin receiving Pull Requests with no other configuration needed.

There are, of course, the many advantages of incrementally staying up to date with the latest package dependencies that we have already mentioned. But another internal advantage, and a large driver in pushing Dependabot, is the integration with internal or private package feeds. This gives your organization the ability to push updates out when internal package dependencies change as well. We share a number of libraries across teams, products, and projects, and when we make a bug fix, a security patch, or even just add a new feature, it is essential that the library is pushed out to all consumers automatically, notifying them of exactly what has changed (Dependabot automatically includes the list of Git commits in the release in each Pull Request). This allows our teams to move fast with less friction on shared libraries when we need them.


Travis Gosselin @travisgosselin