GitLab is my go-to place when I want to create and share a repo, set up CI/CD, and create some examples for my Cypress adventures. Whether I need to create a demo, a Cypress or CI/CD proof of concept, or just some space where I can share a repo with the world, GitLab is where you can find me hiking a digital trail. Note to self: write a blog about why I love GitLab so much.
As one of the founders of the Dutch Cypress Meetup, it surely cannot come as a big surprise that I am a big fan of Cypress. That has everything to do with the hassle-free setup of the testing framework. It just works great out of the box.
One of Cypress's great powers is the official Cypress Dashboard, with which you can record, parallelise and load-balance your tests, among many other features. Again, a note to self: write about some learnings.
So this blog is itself a note to self: share with my future self and the world some boilerplate for setting up a multi-project pipeline in GitLab that ultimately drives my Cypress Integration (or E2E) tests.
Because readers can have different perspectives on what E2E or Integration tests mean, for the sake of clarity, let me first explain what I mean by an E2E test within this blog.
Let's say I have a backend service repository and a frontend service repository. Both services are within my own span of control: I work on both, and they are tightly coupled.
In this scenario, I want to create some tests that cover scenarios involving both the frontend and the backend, with no mocking or stubbing. My integration tests, by contrast, are the tests that do involve stubbing: when I test the frontend, I stub the backend, so that when I run the integration tests, my backend does not need to run, and the stub acts as the backend, returning some fake data.
In the case of an E2E test, I want to make sure that when I update the backend, my E2E tests run, so I get feedback about the integration of the services. I want the confidence that when I deploy my backend service, the core features of my system still function as expected, 'E2E'.
So now the question is raised: where do I store my E2E tests? I surely do not want a third repo that only stores my E2E test suite, because that would detach my tests from any functionality. My solution: since the E2E tests are mainly frontend driven, I want to store them within the frontend repository, where I can even reuse some common functionality of the frontend repo (something like i18n or other util functions). Within the frontend repository it will be.
Next challenge: if I change some source code in the backend repository, how will I trigger the E2E tests in the frontend repository? The answer is not blowing in the wind, but is given by GitLab: multi-project pipelines.
A multi-project pipeline is a pipeline that triggers another pipeline defined in a different repository. This is the trigger job in my backend-service gitlab-ci.yml:
```yaml
test-service:
  stage: test
  trigger:
    project: "joelgrimberg/frontend-service"
    strategy: depend
  variables:
    PIPELINE_TYPE: "multi-project-pipeline"
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      when: always
    - if: $CI_COMMIT_BRANCH == "main" # run in main-branch pipeline
      when: on_success
    - when: manual
      allow_failure: true
```
Because some people like it a bit verbose and like to read the story behind a YAML file, here is the explanation:
This YAML defines a test-service job that runs within the test stage. The job is triggered by a merge request (this is based on the pipeline source) or as a manual action. When this job runs, it triggers the (default) frontend pipeline, in which the E2E tests are defined.
strategy: depend is a very useful setting, because it makes sure the parent pipeline fails if the triggered pipeline defined in the 'other repository' fails. In our case: the backend pipeline triggers the E2E test in the frontend repo, and if that test fails, the backend pipeline fails.
I now want to add some magic to the frontend repository pipeline, because I want to exclude some jobs from it. When triggered this way, I only want to run the E2E tests from the frontend repository, not the entire 'main' pipeline: I do not need the frontend's build and integration tests to run. I just want to test the new backend service against the current (main) frontend.
The E2E test job in my frontend pipeline configuration:
```yaml
e2e tests:
  stage: run-e2e-test
  resource_group: e2e-test
  script:
    - export NODE_ENV=production # export so npm (and Cypress) see it
    - npm run cypress:e2e
  rules:
    - if: $PIPELINE_TYPE == "multi-project-pipeline"
      when: always
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      when: always
    - if: $CI_COMMIT_BRANCH == "main" # run in main-branch pipeline
      when: on_success
    - if: $CI_COMMIT_BRANCH != "main" # do not run in feature-branch pipeline
      when: never
    - when: manual
      allow_failure: false
```
Alright, let's break it down a bit.
The rules used in this YAML file are GitLab's new way of defining 'when' to run a job.
GitLab checks all rules within a job definition from top to bottom. If a rule evaluates to true, it stops checking the remaining rules, because there is no need for further rule checking within that job, and it adds the job to the pipeline with the matched rule's attributes.
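To illustrate the top-to-bottom evaluation, here is a minimal, hypothetical job (the job name and script are made up for this sketch):

```yaml
# Hypothetical job: rules are evaluated top to bottom, first match wins.
deploy-docs:
  script:
    - echo "deploying docs"
  rules:
    # In a merge-request pipeline this rule matches first; evaluation stops here
    # and the job is added as a manual action.
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      when: manual
    # Only reached when the rule above did not match.
    - if: $CI_COMMIT_BRANCH == "main"
      when: on_success
    # Fallback: no earlier rule matched, so the job is not added at all.
    - when: never
```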
The resource_group makes sure that only one job of this kind runs at a time. So when two backend-service pipelines trigger an E2E test at the same time, the second waits for the first to complete. This makes sure no collisions occur.
So in my example, when the PIPELINE_TYPE variable is present with the value 'multi-project-pipeline', the job is added to the pipeline. If you look again at my backend-service gitlab-ci.yml, you will see that I set that variable in the triggering (upstream) pipeline. Therefore this job is added to the pipeline config and runs in the downstream pipeline. The only thing left to do is make sure that the other jobs in the triggered pipeline (the frontend pipeline config) do not run when the frontend pipeline is triggered by the backend pipeline:
```yaml
# "pipeline" is the CI_PIPELINE_SOURCE value GitLab sets for
# cross-project (multi-project) triggers; "parent_pipeline" is only
# set for parent/child pipelines within the same project.
- if: $CI_PIPELINE_SOURCE == "pipeline"
  when: never
```
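Put together, a frontend job excluded from cross-project runs could look like this. This is a sketch: the job name build-frontend and its script are assumptions, not part of my actual pipeline, and it relies on GitLab setting CI_PIPELINE_SOURCE to "pipeline" for multi-project triggers:

```yaml
# Sketch: a frontend build job that is skipped when the pipeline was
# triggered cross-project by the backend repository.
build-frontend:
  stage: build
  script:
    - npm ci
    - npm run build
  rules:
    # Triggered by the backend (multi-project) pipeline: do not add this job.
    - if: $CI_PIPELINE_SOURCE == "pipeline"
      when: never
    # In every other pipeline, run as usual.
    - when: on_success
```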
You can find the complete setup here: