The CI/CD and DevOps Blog

Configuring Multi-Stage CI

In this blog, we demonstrate how to use the Shippable platform to perform Multi-Stage CI on your repositories. The key benefit of Multi-Stage CI is that it splits a time-consuming CI process into smaller stages, so issues in code quality or tests are detected as early as possible and the feedback loop on every check-in is shortened. This often entails refactoring or designing your application as smaller components and testing each component in isolation before running more expensive integration tests of that component against the rest of the system.

What is multi-stage CI?

In our multi-stage CI scenario, we split the CI of a Node.js app into several stages.

  • Stage 1: Stage 1 runs on every PR and lints the source code in the repository to find style errors. To learn more about the benefits of linting your JavaScript code, look at this article. The idea behind Stage 1 is to perform a quick code quality check on every PR and shorten the feedback loop for any errors in coding style and bugs found during static analysis. This allows developers to quickly find and fix issues in their PRs.
  • Stage 2: Stage 2 runs on successful completion of Stage 1. In Stage 2, we run a small subset of tests to quickly validate the PR.
  • Stage 3: Stage 3 runs on the merged commit to the repository. Here we run a broader set of core unit tests that take longer to run than Stage 2. A minimal configuration sketch covering these stages follows this list.
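To make these stages concrete, here is a minimal sketch of what such a configuration could look like in a shippable.yml. This is not the exact configuration from the sample: the npm scripts (lint, test:smoke, test:unit) are hypothetical placeholders, and it assumes the standard IS_PULL_REQUEST build environment variable to tell PR builds apart from merged commits.

  # Minimal sketch; npm script names are placeholders, not taken from the sample repo
  language: node_js

  node_js:
    - 8

  build:
    ci:
      - npm install
      # Stage 1: lint the source on every PR to catch style errors early
      - if [ "$IS_PULL_REQUEST" == "true" ]; then npm run lint; fi
      # Stage 2: run a small subset of tests to quickly validate the PR
      - if [ "$IS_PULL_REQUEST" == "true" ]; then npm run test:smoke; fi
      # Stage 3: run the broader, slower unit test suite on merged commits
      - if [ "$IS_PULL_REQUEST" != "true" ]; then npm run test:unit; fi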

Configuring CI For A Postgres Database

Shippable makes it easy to set up database migrations and test them continuously. In this blog, we will go over the steps to execute and test migrations on a PostgreSQL database using Shippable CI.

Our sample uses Node.js and the node-pg-migrate module to set up migrations on a PostgreSQL database. Shippable integrates with PostgreSQL and allows you to automatically launch a PostgreSQL instance with a single line in the yml configuration. We will test migrations on this PostgreSQL instance.
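For reference, a rough sketch of such a configuration is shown below. It assumes the default postgres user with no password and uses an illustrative database name; the exact node-pg-migrate invocation may differ from what the sample repository uses.

  # Rough sketch; database name and migration command are illustrative
  language: node_js

  node_js:
    - 8

  # the single line that launches a PostgreSQL instance for the build
  services:
    - postgres

  build:
    ci:
      - npm install
      # create a database for the migration tests
      - psql -c 'create database test;' -U postgres
      # apply the migrations, then run tests against the migrated schema
      - DATABASE_URL=postgres://postgres@localhost:5432/test ./node_modules/.bin/node-pg-migrate up
      - npm test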


Sample project

The code for this example is on GitHub: devops-recipes/ci-migrate-postgresdb

You can fork the repository to try out this sample yourself, or just follow the instructions to configure your own use case.

Are You Stuck in the New DevOps Matrix From Hell?

If you google "matrix from hell", you'll see many articles about how Docker solves the matrix from hell. So what is the matrix from hell? Put simply, it is the challenge of packaging any application, regardless of language/frameworks/dependencies, so that it can run on any cloud, regardless of operating systems/hardware/infrastructure.

The original matrix from hell: applications were tightly coupled with underlying hardware


Docker solved the matrix from hell by decoupling the application from the underlying operating system and hardware. It did this by packaging all dependencies inside Docker containers, including the OS. This makes Docker containers "portable", i.e. they can run on any cloud or machine without the dreaded "it works on my machine" problems. This is the single biggest reason Docker is considered the hottest new technology of the last decade.

With DevOps principles gaining center stage over the last few years, Ops teams have started automating their tasks like provisioning infrastructure, managing config, triggering production deployments, etc. IT automation tools like Ansible and Terraform help tremendously with these use cases since they allow you to represent your infrastructure as code, which can be versioned and committed to source control. Most of these tools are configured with a YAML- or JSON-based language that describes the activity you're trying to achieve.
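As a simple illustration (the playbook below is generic and not tied to Shippable), an Ansible playbook declares the desired state of a group of machines in a few lines of YAML, and that file can live in source control next to your application code:

  # Illustrative Ansible playbook; the host group and package are assumptions
  - hosts: webservers
    become: yes
    tasks:
      - name: Install nginx
        apt:
          name: nginx
          state: present
      - name: Ensure nginx is running
        service:
          name: nginx
          state: started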

Spinning up Custom Nodes Programmatically

Most customers use nodes provided by Shippable to run their CI/CD workflows. However, you also have another option, Custom Nodes, which lets you attach your own build nodes, inside your VPC and/or behind your firewall, to run your CI/CD workflows.

Custom nodes provide many key advantages:

  • Security: Your build machines can be inside your VPC and/or behind your firewall, which gives you the ability to configure access, IAM, etc. We even have a way of configuring these machines so that you do not have to grant Shippable SSH access! This means your code never leaves your firewall and no external entity can access your machines.
  • Faster build times: You can leave your build machines running all the time, which eliminates the 2-3 minutes per build occasionally added when new machines are spun up on Shippable's hosted infrastructure.
  • Docker caching: If your build workflows use Docker, for example pulling Docker images from a registry or building Docker images, your build machines will already have those images cached, which speeds up your builds.

Using NPM Private Modules In Your CI Workflow

In addition to publicly available packages, npm also allows users to publish and manage packages in a private namespace. If you want your npm install command in the CI workflow to install your private dependencies, a couple of setup steps are required.

Unless you follow these steps, your npm install command will throw an error such as:

  npm ERR! 404 Not found : @manishas/my-private-module   


Step 1: Get your npm auth token

First, you need to find your npm token, which is stored in the .npmrc file in your home directory. The file will contain a line which looks like this:

  //registry.npmjs.org/:_authToken=GFYR$E8D-89DG-GH54-HJDR-TYDGOSYT785T  

Copy the value of the token, which is everything after authToken=.
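Once you have the token, a common pattern (shown below as a sketch, not as the exact steps in this guide) is to store it as an encrypted environment variable, here assumed to be named NPM_TOKEN, and write it into an .npmrc file before npm install runs:

  # Sketch only; assumes the token is injected as the NPM_TOKEN env variable
  language: node_js

  build:
    ci:
      # write the token into .npmrc so npm can authenticate with the registry
      - echo "//registry.npmjs.org/:_authToken=${NPM_TOKEN}" > ~/.npmrc
      # private dependencies such as @manishas/my-private-module now resolve
      - npm install
      - npm test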