How can you make your builds complete faster, so that you can build more often and get earlier feedback?
One way is to cache your node modules. I'll explain how to do this in Azure DevOps.

Creating a build pipeline in Azure DevOps for a Node.js-based application is straightforward; you only need a few lines of code. Later on you may add tasks such as linting and running tests, and the pipeline will take longer to complete with each added task. Let's start speeding things up using caching.

You can find the repository for node modules caching in Azure DevOps here.

Suppose the build YAML file looks as follows:

trigger:
- master
pool:
  name: Hosted Ubuntu 1604

steps:
- bash: npm install
  displayName: NPM Install Dependencies
- bash: npm run build
  displayName: Build Angular application

The first step installs all dependencies; the second step needs those dependencies to run the build. In this example I used an Angular application, but any application that depends on NPM packages will benefit from this caching solution. The node_modules folder in this example is 257 MB.

The results are as follows:
Bare minimum no caching

A build time of under a minute is not bad at all, but we can do better.

Adding caching

I used the caching solution from Microsoft DevLabs.
There is also Microsoft's own Azure DevOps caching solution, which is currently in beta.
I chose the Microsoft DevLabs task because it is faster and offers additional configuration options.
After adding this task from the marketplace, an artifact feed needs to be set up as follows:

Azure artifact feed - create

In Azure DevOps, when editing the build definition, the tasks show up in the right pane:

cache tasks

You can find more on these tasks here.

save and restore task settings

You only need a cache alias when your build pipeline restores and saves multiple caches and uses optimistic caching. For node modules you can tick the platform independent checkbox; the build agent will then use the same cache regardless of the agent's OS.
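As a sketch of the multiple-caches case: each cache task gets its own alias (this assumes the task exposes an `alias` input matching the setting shown above; the second cache and both feed names are hypothetical examples):

```yaml
steps:
- task: RestoreAndSaveCache@1
  inputs:
    keyfile: '**/package-lock.json, !**/node_modules/**/package-lock.json'
    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
    vstsFeed: 'my-cache-feed'      # hypothetical feed name
    platformIndependent: true      # share the cache across agent OSes
    alias: 'NodeModules'           # distinguishes this cache from the one below
- task: RestoreAndSaveCache@1
  inputs:
    keyfile: '**/yarn.lock'        # hypothetical second cache
    targetfolder: '**/.yarn-cache'
    vstsFeed: 'my-cache-feed'
    alias: 'YarnCache'
```

With a single cache, as in the rest of this post, you can leave the alias out entirely.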

The generated YAML looks as follows:

trigger:
- master
pool:
  name: Hosted Ubuntu 1604

steps:
- task: RestoreAndSaveCache@1
  inputs:
    keyfile: '**/package-lock.json, !**/node_modules/**/package-lock.json, !**/.*/**/package-lock.json'
    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
    vstsFeed: '106881c5-1e85-4e16-a467-be2702084631'
    platformIndependent: true
- bash: npm install
  displayName: NPM Install Dependencies
- bash: npm run build
  displayName: Build Angular application

After committing the YAML file and waiting for the build to complete, let’s look at the results:

bare minimum - caching

This run takes longer, because the cache needs to be saved. Azure Artifacts shows the cache feed:

Caching feed

Let’s run the pipeline again and check the results:

bare minimum - retrieve from cache

It still takes somewhat longer, but now Azure Artifacts provides the cache. The "NPM Install Dependencies" task went from 39 to 17 seconds.

Let’s add optimistic caching to the YAML file:

condition: ne(variables['CacheRestored'], 'true')

optimistic caching

This means the "NPM Install Dependencies" task will be skipped when the node modules are restored from the cache.
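Putting it together, the condition goes on the install step, which the cache task drives through the CacheRestored variable (a sketch based on the generated YAML above):

```yaml
trigger:
- master
pool:
  name: Hosted Ubuntu 1604

steps:
- task: RestoreAndSaveCache@1
  inputs:
    keyfile: '**/package-lock.json, !**/node_modules/**/package-lock.json, !**/.*/**/package-lock.json'
    targetfolder: '**/node_modules, !**/node_modules/**/node_modules'
    vstsFeed: '106881c5-1e85-4e16-a467-be2702084631'
    platformIndependent: true
- bash: npm install
  displayName: NPM Install Dependencies
  # Skip installing when the cache task restored node_modules
  condition: ne(variables['CacheRestored'], 'true')
- bash: npm run build
  displayName: Build Angular application
```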
Let’s look at the result:

Bare minimum - optimistic caching

We just decreased the build time by 10 seconds.
Depending on the number of tasks in your build pipeline, this decrease can vary. At my client, I managed to improve the build time from 5m 29s to 3m 22s.

Conclusion

Node modules caching in Azure DevOps will reduce your pipeline run duration when your solution relies on node packages. You can reduce it further by adding optimistic caching. When you introduce parallel jobs, node module caching can multiply the effect, since every job can restore the same cache. In my opinion, you should set up node module caching right when you create a pipeline; the performance benefits will outweigh the time needed to implement it.

Let me know what you think

Any comments are more than welcome! You can add those in the comments section below or drop me a message through LinkedIn.