Use Jenkins to Run a Function as a Service Platform
Most shops use Jenkins as a continuous integration/delivery solution - its unequivocal primary role. Hidden in plain sight, however, Jenkins also comes with all the tools necessary, right out of the box, to supply your infrastructure team with a function as a service platform. In fact, Jenkins was doing FaaS long before it was the hot new thing.
We’re Not Landing on the Moon, Are We?
I don’t care what anything was designed to do; I care about what it can do. I found it quite weird at first to use Jenkins for use cases other than development, but for an infrastructure team expected to provide services, it was a very attractive way to deliver them quickly - in this case, a website management portal. Jenkins is the foundation upon which we built layers of self-service interfaces for managing organizational websites. It provided various functions, including creating/deleting websites, adding/removing aliases, managing privileges, and some trivial database actions.
Best part of all this: because Jenkins exposes a REST API, users never need to be granted access to Jenkins itself - all they see is a lightweight homegrown web UI that makes backend calls to the various Jenkins jobs doing the heavy lifting.
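As a rough sketch of that shape, the portal backend can trigger a job with a single HTTP call. The Jenkins address, job name, and parameter below are hypothetical placeholders, not the actual portal code:

import requests

JENKINS_URL = "http://jenkins.example.org"  # hypothetical Jenkins address

def create_website(site_name):
    # Ask Jenkins to run a (hypothetical) create_website job on the user's behalf
    resp = requests.post(
        JENKINS_URL + "/job/create_website/buildWithParameters",
        data={"token": "4UtH70K3N", "SITE_NAME": site_name},
    )
    resp.raise_for_status()  # Jenkins queues the build and answers 201 Created

The user only ever talks to the portal; the portal holds the trigger token and talks to Jenkins.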
Guide to Get Started
To use Jenkins to create a function as a service, you need:
- a function tracked in a repository
- a Jenkins job
- an HTTP client
To get started, create a new Freestyle project. Your code is stored in a repository somewhere, so set that up under Source Code Management. The linchpin of the entire setup is in the Build Triggers section - check the box for Trigger builds remotely (e.g., from scripts) and set an auth token (example: 4UtH70K3N).
Simply setting an auth token is not enough to consider this a secure product. If you are planning to run this in production, you should use a proper auth scheme (e.g. AD, LDAP, OAuth2). A sample setup might look like a PHP web app authenticating to LDAP and being hosted behind an Nginx proxy with TLS communication top-to-bottom.
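Our front end happened to be PHP, but the bind-to-LDAP idea looks roughly the same in any language. Here is a Python sketch of it; the ldap3 library, host, and DN layout are assumptions for illustration, not part of the Jenkins setup:

from ldap3 import Server, Connection

def ldap_login(username, password):
    # Returns True only if the supplied credentials bind successfully against LDAP
    server = Server("ldaps://ldap.example.org")  # hypothetical LDAP host, TLS from the start
    user_dn = "uid={},ou=people,dc=example,dc=org".format(username)  # hypothetical DN layout
    conn = Connection(server, user=user_dn, password=password)
    return conn.bind()

Only after a successful bind does the portal go on to call Jenkins on the user's behalf.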
To execute your function, select Execute shell from the Add build step menu in the Build section, and add whatever command runs your function to the shell area:
python run.py
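What run.py actually does is entirely up to you; as a hypothetical stand-in, it could be as small as:

# run.py - hypothetical function body; swap in whatever your service actually does
def main():
    print("function invoked by Jenkins")
    # real work goes here: create a site, flush a cache, rotate a credential, ...

if __name__ == "__main__":
    main()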
Essentially, every time a Jenkins build is triggered, Jenkins pulls down the code from your repository and runs whatever you’ve defined in the Execute shell step. To test your callable REST API, fire up an HTTP request:
curl -XPOST http://<jenkins server address>/job/<some_function_job>/build \
--data token=4UtH70K3N
which will kick off a “build” of your Jenkins job. Pretty trivial, but effective. A common next step is to make your job parameterized - to do so, check the box for This project is parameterized and add parameters whose names ($FOO and $BAR in this example) can be passed into your function code:
python run.py $FOO $BAR
except now, your HTTP request must include those parameters:
curl -XPOST http://<jenkins server address>/job/<some_function_job>/buildWithParameters \
--data token=4UtH70K3N \
--data FOO=hello \
--data BAR=world
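To close the loop, a hypothetical parameterized run.py simply reads those values as positional arguments:

# run.py - hypothetical parameterized version; FOO and BAR arrive as argv[1] and argv[2]
import sys

def main(foo, bar):
    print("called with FOO={} and BAR={}".format(foo, bar))
    # real work goes here

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])

Depending on how your Jenkins authorization is configured, you may also need to authenticate the HTTP request itself (for example with curl’s --user <username>:<api token>) rather than rely on the job token alone.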
So it turns out you don’t have to up and move your codebase to Google Cloud Functions or AWS Lambda just to satisfy your “serverless” itch. Neat? Totally. Easy? You bet. Nimble, heavy duty, and scalable? Maybe not so much, but for small solution delivery, Jenkins absolutely gets the job done. It’s all about knowing your growth and constraints. Check out the Jenkins documentation for specifics on how to interact with the Remote access API. And for some more details on how I used Jenkins for service automation, check out the conference talk on Jenkins for Smarter Operations (or just the slide deck), specifically the Central Web Services and Web Portal sections - thanks Christian and Curry for the wonderful presentation.