At the most recent AWS re:Invent, Amazon announced support for custom runtimes on AWS Lambda.

AWS Lambda supports quite a few languages out of the box. Node.js is the fastest, but not always the most readable. You can edit Python from the AWS Console, while for Java, C# and Go you'll have to upload binaries.

The odd thing, in my opinion, is that there are no functional languages in the list of supported languages[1], even though the service name suggests something in the area of functional programming. The working of a function itself is also pretty straightforward: an input event gets processed and an output event is returned (emitted, if you like).

Therefore it seemed a logical step to implement a runtime for a functional programming language. My language of choice is Elixir, a very readable functional programming language that runs on the BEAM, the Erlang VM.

Building a runtime

The process of building a runtime is explained pretty well in the AWS documentation. In my case I gained a bit of experience by implementing the Bash-based runtime example, which gives a good basis for any custom runtime. A runtime is started by a script called "bootstrap", and already having a Bash-based start script allows you to test a bit while you set up the runtime.
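To give an idea of what such a bootstrap does, here is a minimal sketch along the lines of AWS's tutorial. It polls the Lambda Runtime API for the next event and posts back a response; the echo response and the `handle_events` function name are just placeholders for where your real runtime would hand off to the language of your choice.

```shell
#!/bin/sh
# Minimal bootstrap sketch for a custom Lambda runtime.
# AWS_LAMBDA_RUNTIME_API is set by the Lambda service inside the sandbox.
set -eu

handle_events() {
  while true; do
    HEADERS="$(mktemp)"
    # Fetch the next invocation event; response headers carry the request id.
    EVENT_DATA=$(curl -sS -LD "$HEADERS" \
      "http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation/next")
    REQUEST_ID=$(grep -Fi Lambda-Runtime-Aws-Request-Id "$HEADERS" \
      | tr -d '[:space:]' | cut -d: -f2)
    # Placeholder: a real runtime would invoke the handler here.
    RESPONSE="Echo: $EVENT_DATA"
    # Post the function result back for this specific request id.
    curl -sS -X POST -d "$RESPONSE" \
      "http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation/${REQUEST_ID}/response"
    rm -f "$HEADERS"
  done
}

# Only start the loop when actually running inside Lambda.
if [ -n "${AWS_LAMBDA_RUNTIME_API:-}" ]; then
  handle_events
fi
```

An Elixir runtime replaces the placeholder with a call into the BEAM, but the surrounding event loop stays this simple.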

The runtime itself should be bundled as a zip file. An easy way to build such a zip file, especially when there are binaries involved, is with the lambda-base Docker image from the LambCI project.

To keep the zip file from getting too big, I had to strip it down considerably: the combined layers, including the runtime, should be no bigger than 65 MB. Many tools, such as Observer (monitoring), Mnesia (database), and terminal- and GUI-related code, can all be left out. I was able to bring the size down to a decent 23 MB.
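The stripping itself comes down to deleting the unneeded OTP application directories from the unpacked install before zipping. The sketch below demonstrates the idea on a mock directory layout; the application names are real OTP applications, but the version numbers and the `OTP_DIR` path are made up for the demo, and the exact list you can drop depends on what your functions use.

```shell
#!/bin/sh
set -eu

# Demo on a mock OTP layout; in real use, point OTP_DIR at the unpacked
# Erlang install destined for the runtime layer.
OTP_DIR="$(mktemp -d)"
mkdir -p "$OTP_DIR/lib/observer-2.8" \
         "$OTP_DIR/lib/wx-1.8" \
         "$OTP_DIR/lib/stdlib-3.7"

# Drop applications a Lambda function will never need:
# Observer (monitoring), wx (GUI), Mnesia (database), debugger, etc.
rm -rf "$OTP_DIR"/lib/observer-* \
       "$OTP_DIR"/lib/wx-* \
       "$OTP_DIR"/lib/mnesia-* \
       "$OTP_DIR"/lib/debugger-*

ls "$OTP_DIR/lib"   # only stdlib-3.7 remains
```

Stripping debug symbols from the emulator binaries with `strip` shaves off a few more megabytes on top of this.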

The de facto way to distribute an Erlang application is by means of an OTP release, which bundles the code and the BEAM in a single package. For Lambda I wanted this to be leaner: you just deploy your compiled code and that should be it. This makes deployments faster, since there are fewer bytes to move around, and the runtime itself can be kept in the runtime layer.

Benchmarks

We all want it to be fast. I have not done a full-blown performance test, but for a hello-world function I deployed, the response times were quite okay: roughly 20 ms at most, and often only a couple of milliseconds.

The cold start takes about 1.3 seconds, according to AWS X-Ray, and is quite constant. I want to see if I can bring this down. At this point Erlang's legacy kind of gets in the way of its use as a Lambda language: the Erlang/OTP ecosystem is built around applications that never go down, like telephone switches, while on Lambda we have the certainty that the process will never be long-lived. Cold start times are comparable with Java, though.

Final thoughts

The Lambda model is straightforward, and it's good to see that the use of custom runtimes does not involve a (serious) performance hit. With the tools described above it's quite easy to add support for a language not present on AWS Lambda today. You'll have to do without the web editor, which I did not consider a big loss, since I tend to put my code in git anyway.

Have a look at the Elixir Lambda repository and give it a go. I've added CloudFormation templates and a Makefile for convenience. Let me know what you think!

[1] Well, you could execute F# code with the .NET runtime.