link{a}list.io
Web and mobile application data made easy
November 28, 2018
Build Refactoring
Modernising the build infrastructure for linkalist
Like many side-projects, getting a proper build system together has never been a priority for me. Generally, pulling the latest from Bitbucket and a bit of scripting has done the job, but that approach has obvious limitations as you try to move the system towards a production-ready state. So, a while ago, I decided it was worth moving from this fairly loose approach to one that involved building and testing a deployable artefact.
This post will go through the broad brush-strokes of setting up a build process that produces a tested, deployable artefact for a LAMP project using Bitbucket Pipelines. I'll cover the details of actually deploying this artefact in a future post.

Unit Testing

From the start with linkalist, I've always had a reasonable set of unit tests in place, as I'm a strong believer in TDD, even for personal projects where you might be expected to know the entire system. Unfortunately, I've found that as the system grows, you can quite easily lose awareness of code you wrote years ago, so it is always good to have a degree of unit testing in place.

As linkalist is built on top of the CodeIgniter framework, I've always found the Toast unit-testing library to work quite well, although I've made quite a few modifications to it over the years. It produces output close enough to the JUnit format that most third-party tools can interface with it, and it also slots quite nicely into various Atlassian tools.
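To give a flavour of what these tests look like, here is an illustrative sketch of a Toast-style test controller. The base class, assertion names and the `make_slug` helper are placeholders; the exact shape depends on your (possibly modified) copy of Toast.

```php
<?php
// Illustrative only: a Toast-style CodeIgniter test controller.
// Test methods prefixed with test_ are discovered automatically;
// assertion method names may differ in your version of Toast.
class Link_test extends Toast
{
    function __construct()
    {
        parent::__construct('Link_test');
    }

    function test_slug_generation()
    {
        $this->load->helper('slug'); // hypothetical helper
        $this->_assert_equals(make_slug('Build Refactoring'),
                              'build-refactoring');
    }
}
```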

Build Framework

As I'm already using Bitbucket for source control, JIRA for backlog management, and various other Atlassian tools, the path of least resistance was a choice between Bamboo and Bitbucket Pipelines. Since the former really requires a dedicated server (or at least a dedicated AWS instance), Pipelines seemed like the way to go.
Pipelines is reasonably close to free for small projects, although the free tier has a build-time limit of 50 minutes per month, which gives me about 25 builds. Going beyond this costs $10 for another 1,000 build minutes, which won't break the bank.
So, having chosen Pipelines, the next step was to get started on a build image. You can use any public Docker image for this or integrate your own as required. Since I prefer to limit what is installed on a server, I chose to construct my own Docker image.
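For reference, the skeleton of a Pipelines configuration along these lines looks roughly like this; the image name, script and artefact path below are placeholders, not the actual linkalist setup.

```yaml
# bitbucket-pipelines.yml — minimal sketch with placeholder names.
image: mydockerhubuser/linkalist-build:latest  # your custom build image

pipelines:
  default:
    - step:
        script:
          - ./scripts/run-build.sh   # hypothetical build-and-test script
        artifacts:
          - build/**                 # the deployable artefact to keep
```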

Docker Image

Because I'm a cheapskate and felt no need to hide my build image (private images cost money), the Docker image I used is publicly available on Docker Hub. You can plug this image directly into your own pipeline, but I wouldn't recommend doing so: your project will have its own requirements, and this image will only be even vaguely suitable if your language of choice is PHP.
There are some important differences between the Docker image and the real hosting environment. The most notable of these is that we install mysql-server and some associated tools. In addition, we set up git, ftp and a few other tools that the build process requires but the production web server does not.
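A sketch of that kind of Dockerfile is below. The base image, version tag and package names are assumptions for illustration; in particular, newer Debian-based images ship `default-mysql-server` rather than `mysql-server`, so adjust to your base image.

```dockerfile
# Sketch of a PHP build image with the extra build-time tools
# described above; names and versions are illustrative.
FROM php:7.2-apache

# Tools needed by the build, but not by the production web server.
RUN apt-get update \
 && apt-get install -y mysql-server git ftp zip unzip \
 && rm -rf /var/lib/apt/lists/*

# PHP extensions a typical CodeIgniter/MySQL application needs.
RUN docker-php-ext-install mysqli pdo_mysql
```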
There is plenty of information out there on how to create a Docker image that meets your requirements, and far too much detail for me to go into here. As our Dockerfile is publicly available, you are certainly welcome to use it as an example for getting started.
We have to install MySQL on the Docker image because the easiest place to host the database needed for unit testing is within the pipeline itself. Hence, as part of the build process, we need to install a test version of our application database.
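That provisioning step, run inside the pipeline container before the tests, might look something like the following; the database name and schema path are hypothetical.

```shell
# Start the in-container MySQL server, create a throwaway test
# database, and load the application schema into it.
service mysql start
mysql -u root -e "CREATE DATABASE IF NOT EXISTS linkalist_test"
mysql -u root linkalist_test < sql/test-schema.sql
```

The database lives and dies with the build container, so each build starts from a known-clean schema.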

Deploying the Build

At this point we have got ourselves a build artefact. Our existing deployment mechanism is fairly manual, so for the moment we have just altered the existing process so that we can swap the artefact in. However, we are now working on a more automated deployment mechanism that will be covered in a future post.

Conclusions

This has been a fairly brief introduction to setting up an automated build using Bitbucket Pipelines and a custom Docker image. It was never intended to be a comprehensive guide, and the various services have all the detailed documentation you'll need, even if the bigger picture is sometimes missing.
The outcome of this work is that we now have a properly tested build artefact that is sanity-checked on the Bitbucket Pipelines infrastructure. This bypasses the need to host our own build server and is effectively free until the point where we need to build multiple times per day.