Using Golang for your Serverless projects

19 Dec, 2023

In one of my previous blogs I wrote about why I switched to compiled languages for my lambda functions. But using Golang for your lambda functions does add some challenges. In this blog I would like to share the challenges that I have encountered and how to mitigate them.

Spoiler, I use a Makefile

I always use a Makefile in my projects; if you are interested in why, I wrote a blog on that as well. I mention this before we dive into the challenges to keep the focus on the solutions rather than the tooling.

Every function is its own module?

When you develop a CLI tool, your project structure is quite simple: a go.mod file in the project root directory and at least a main.go file that contains the business logic. When you create a serverless project, this changes, because each lambda function needs to be its own module. As a result, you create a folder for each function, and each folder contains its own go.mod and main.go files.
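Such a project could look like the layout below (the function names are placeholders for illustration):

```
.
├── Makefile
├── functions
│   ├── orders
│   │   ├── go.mod
│   │   └── main.go
│   └── payments
│       ├── go.mod
│       └── main.go
└── go.work
```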

Managing dependencies

The first challenge that I noticed is dependency management, which is now scattered across your lambda functions. This adds maintenance overhead to your project. It also causes issues in your IDE: since the IDE expects a single module rather than multiple, it gets confused.

You can solve this by setting up a multi-module workspace. In essence, you create a go.work file in the root directory of your project and point it to the folders that contain the modules. Your IDE should now understand all your modules and handle the dependencies for you.
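A minimal go.work sketch, assuming two hypothetical function folders:

```
go 1.21

use (
	./functions/orders
	./functions/payments
)
```

You do not need to write this by hand: go work init creates the file, and go work use -r ./functions recursively adds every folder under functions that contains a go.mod file.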

To ensure that your dependencies are up-to-date and you only include the ones that you actually use, you can run go get -u and go mod tidy. But you need to do this for each individual module. For this reason I use the following Makefile target:

.PHONY: tidy
tidy: ## Run go get -u and go mod tidy for all modules
   $(info [+] Running go get -u and go mod tidy)
   find . -name go.mod -execdir go get -u \;
   find . -name go.mod -execdir go mod tidy \;

This looks for each folder that contains a go.mod file and executes those two commands in that folder for you. That makes it a simple command that you can run often.

Running tests

I am a fan of TDD (test driven development), so obviously I wrote tests for my lambda functions. An example can be found in the “Stubbing AWS Service calls in Golang” blog I wrote. But with this multi-module workspace setup it’s hard to run all tests.

If you run go test ./... it will only run the tests in the root of your project, not in the modules used by your lambda functions. An option would be to navigate into each lambda folder and execute the command there. But since that requires additional steps, you are less likely to do it than to run a single simple command.

For this reason I use the following Makefile target:

.PHONY: test
test: ## Run go test for all modules
	$(info [+] Running unit tests)
	find . -name go.mod -execdir go test ./... -coverprofile=coverage.out -covermode count \;

Again, just as we did with dependency management, we look for each go.mod file and execute the go test command in that folder. This also generates a coverage.out file per module. But since each file only contains the coverage of its own module, we need to combine them:

	find . -name coverage.out -type f -exec tail -n +2 {} \; > reports/coverage.temp
   echo "mode: count" | cat - reports/coverage.temp > reports/coverage.out
   rm reports/coverage.temp

We now look for each coverage.out file and take its content minus the first line (the mode header). We place all those lines in a single file and then add a single mode line back at the top.
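To make the merge concrete, here is a self-contained sketch using two fabricated coverage profiles (the module paths and directory names are made up for illustration):

```shell
# Create two fake per-module coverage profiles, each with its own mode header.
mkdir -p /tmp/covdemo/a /tmp/covdemo/b reports_demo
printf 'mode: count\nexample.com/a/main.go:3.1,5.2 1 2\n' > /tmp/covdemo/a/coverage.out
printf 'mode: count\nexample.com/b/main.go:3.1,5.2 1 0\n' > /tmp/covdemo/b/coverage.out

# Strip each file's header line and concatenate the remaining coverage lines.
find /tmp/covdemo -name coverage.out -type f -exec tail -n +2 {} \; > reports_demo/coverage.temp

# Prepend a single mode header so the merged file is a valid coverage profile.
echo "mode: count" | cat - reports_demo/coverage.temp > reports_demo/coverage.out
rm reports_demo/coverage.temp

cat reports_demo/coverage.out
```

The merged file starts with one "mode: count" line followed by the coverage lines of both modules, which is exactly the shape go tool cover expects.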

Now, if you want to visualize your coverage in, for example, GitLab, you will need a Cobertura XML file:

reports/coverage.xml: reports/coverage.out
	$(info Collecting Code Coverage)
	go run github.com/boumenot/gocover-cobertura@latest < reports/coverage.out > reports/coverage.xml

We simply use the gocover-cobertura module to convert the file to the needed XML format, and you are good to go.
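For reference, a minimal GitLab CI job that publishes the generated file as a coverage report could look like the sketch below (the job name and the make targets are assumptions based on the Makefile in this post):

```yaml
test:
  script:
    - make test
    - make reports/coverage.xml
  artifacts:
    reports:
      coverage_report:
        coverage_format: cobertura
        path: reports/coverage.xml
```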


It’s possible to have a multi-module setup and still use a single simple command to perform actions like running your tests. However, you do need to maintain some scripting in, for example, a Makefile. A full working example can be found on GitHub.


Joris Conijn
Joris has been working with the AWS cloud since 2009 and focussing on building event driven architectures. While working with the cloud from (almost) the start he has seen most of the services being launched. Joris strongly believes in automation and infrastructure as code and is open to learn new things and experiment with them, because that is the way to learn and grow. In his spare time he enjoys running and runs a small micro brewery from his home.
