Liquibase Learnings and CI/CD

Liquibase is one of those technologies that I knew the name of at SPS Commerce well before I ever had cause to use it in my day-to-day work. For the unfamiliar, Liquibase is an application that interprets changelog files and converts them into the SQL needed to make a change. We use it at SPS Commerce to deploy and roll back changes to our databases. It was a bit of a mystery before I joined the database team, but for my team it is an invaluable tool. For some teams, though, the processes around using it are not as approachable. "Trust us" became a de facto Database Engineering mantra of sorts for the "human ops" around Liquibase changes.
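To make that concrete, here is a minimal sketch of what one of those changelog files can look like. Liquibase supports several formats (XML, YAML, JSON, SQL); the YAML below, with its hypothetical table and author names, is just an illustration rather than one of our actual changelogs:

```yaml
# changelog.yaml - minimal, hypothetical Liquibase changelog
databaseChangeLog:
  - changeSet:
      id: add-orders-table
      author: jsmith
      changes:
        - createTable:
            tableName: orders
            columns:
              - column:
                  name: id
                  type: bigint
                  constraints:
                    primaryKey: true
                    nullable: false
              - column:
                  name: status
                  type: varchar(32)
      # explicit rollback so the change can be undone the same way it was applied
      rollback:
        - dropTable:
            tableName: orders
```

Running liquibase update against that file generates and applies the SQL to create the table, and rollback commands such as rollbackCount use the rollback block to undo it.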

Our existing solution, leveraging Jenkins, works. It will apply and roll back database changes, but it doesn't cover all of our needs. That is not necessarily a bad thing, but it does mean more time spent ticketing rather than getting our code out the door and into the database. Once our technology organization settled on Azure DevOps as a continuous integration solution, a departure from Jenkins and the other technologies we had leveraged in the past, we had all the pieces we needed to freshen up our database change process.

As with many teams, we got to the cloud on the back of Jenkins. The aforementioned "human ops" involve running Jenkins jobs multiple times and reporting the outcome in external tickets. The prospect of building the new process on our existing CI/CD solutions, namely Drone and Jenkins, was not appealing. While it could be done, it was clear that these were not solutions we could rely on long-term, and they would involve a lot of code we would have to write and support.

The database team had chipped away at the problem, whether we knew it or not at the time, with work on packaging copies of common database technologies as Docker images. A by-product of that work was the ability to faithfully reproduce database schemas locally with docker-compose. This unlocked a brand-new approach to how we could utilize Liquibase at SPS Commerce.
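As a hedged sketch of that pattern (the image tags, credentials, and paths below are placeholders rather than our real configuration, and it assumes a reasonably recent Liquibase image that reads its connection settings from environment variables), a compose file could look something like this:

```yaml
# docker-compose.yaml - hypothetical throwaway database plus a Liquibase runner
version: "3.8"
services:
  db:
    # stand-in for a local copy of one of our database technologies
    image: postgres:13
    environment:
      POSTGRES_DB: appdb
      POSTGRES_USER: app
      POSTGRES_PASSWORD: local-only
    ports:
      - "5432:5432"

  liquibase:
    # pin a specific version in practice; connection settings are passed via
    # Liquibase's LIQUIBASE_COMMAND_* environment variables
    image: liquibase/liquibase
    depends_on:
      - db
    volumes:
      # mount the changelog from the repository into the container
      - ./changelog:/liquibase/changelog
    environment:
      LIQUIBASE_COMMAND_URL: jdbc:postgresql://db:5432/appdb
      LIQUIBASE_COMMAND_USERNAME: app
      LIQUIBASE_COMMAND_PASSWORD: local-only
      LIQUIBASE_COMMAND_CHANGELOG_FILE: changelog/changelog.yaml
    command: update
```

With something like that in place, docker-compose up -d db starts a disposable database (in practice you may need a healthcheck or a short wait while it starts accepting connections), docker-compose run --rm liquibase applies the changelog against it, and docker-compose down -v throws the whole thing away.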

Developers and end users would no longer be obligated to use an actual RDS instance or server to test their Liquibase changes. A throwaway environment would always be at their fingertips, with minimal to no configuration on their machines. This meant no more failed Jenkins jobs, or at the very least far fewer of them, and a low-stakes environment where developers could work free of the fear that they would negatively impact a live environment. All that remained was to develop the solution that would bring Azure DevOps and our containerized database pattern together into something that internal teams would have confidence in and understand more intuitively.

In its current beta state, our new pipeline reduces manual intervention by approximately 50%, and, even more importantly, it does more than just work. The manual intervention it still requires is no longer creating tickets; it is just a few clicks to approve a change. It creates change tickets where required. It gates access to apply changes to a given database and does a great job of visualizing whether there are active changes ahead of your own that you need to consider. Beyond that, it has built-in mechanisms that verify changes apply and roll back as we expect them to in a Docker container, rather than testing them against a live host.
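For illustration only, and with every name below a placeholder rather than a piece of our actual pipeline, the shape of that flow in an Azure DevOps YAML pipeline might look roughly like this: a validation stage that applies and rolls back the changelog in a container (reusing the compose file sketched earlier), followed by a deployment stage gated by approvals configured on an Azure DevOps environment:

```yaml
# azure-pipelines.yml - simplified, hypothetical sketch of the flow described above
trigger:
  - main

stages:
  - stage: Validate
    jobs:
      - job: ApplyAndRollbackInContainer
        pool:
          vmImage: ubuntu-latest
        steps:
          - script: docker-compose up -d db
            displayName: Start throwaway database container
          - script: docker-compose run --rm liquibase update
            displayName: Apply the changelog in the container
          - script: docker-compose run --rm liquibase rollbackCount 1
            displayName: Prove the changelog rolls back cleanly

  - stage: Deploy
    dependsOn: Validate
    jobs:
      - deployment: ApplyToLiveDatabase
        pool:
          vmImage: ubuntu-latest
        # approvals and checks live on the environment, so letting a change
        # through is just a few clicks for a reviewer
        environment: example-database
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo "run liquibase update against the live host here"
                  displayName: Apply the changelog to the live database
```

The real pipeline does more than this small sketch can show, including creating change tickets and visualizing other changes in flight ahead of yours.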

We are excited to begin rolling this out and migrating jobs to the new process in early 2021. It will help us make these kinds of database changes faster and with more confidence than we can today. I'm also hoping to share more on this topic as we wrap up final features and documentation – keep your eyes out for more!

Joe Smith

SPS Technology @spstech