Make It So
In recent years there’s been something of a resurgence in using a
Makefile for project orchestration. If you’re not familiar,
Make is used to define targets which perform assigned actions. Its intended purpose is to build assets, although it is also possible to treat it as a general-purpose task runner.
Make is well-established, widely available and largely portable. It provides an efficient, self-documenting single point of entry which avoids multiple scripts littering the repository. With a bit of forethought, you can craft some semantic sugar that gives an immediate contextual hint as to what each target will do in easy-to-understand natural language.
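A minimal sketch of the self-documenting idea (the `## ` comment convention and target names here are assumptions, not a prescribed standard) is a `help` target that lists every annotated target:

```make
# Sketch: print each target annotated with a trailing "## " comment.
.DEFAULT_GOAL := help

.PHONY: help test
help: ## Show this help
	@awk 'BEGIN {FS = ":.*## "} /^[a-zA-Z_-]+:.*## / {printf "%-14s %s\n", $$1, $$2}' $(MAKEFILE_LIST)

test: ## Run the test suite
	@echo "running tests"  # placeholder for the project's real test command
```

Running a bare `make` then prints the annotated target list, so newcomers can discover what the project offers without reading the Makefile itself.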
Orchestrating in this way means that as well as being runnable on a developer’s local machine, there’s no vendor lock-in to any CI or ties to third-party plugins. It doesn’t matter if the project is using Jenkins, GitHub Actions or Travis; we can defer control to Make.
Another way to leverage these targets is with git hooks: implement preventative and corrective actions during local development for faster feedback loops, or hook in on the server side of the version control system to facilitate GitOps deployments.
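One way to wire this up (a sketch; the `.githooks` directory and target names are assumptions) is a bootstrap target that points git at hooks kept under version control, each of which simply delegates back to Make:

```make
# Sketch: use a hooks directory tracked in the repository (git 2.9+),
# so every hook can delegate straight back to Make targets.
.PHONY: install-hooks
install-hooks:
	git config core.hooksPath .githooks
	chmod +x .githooks/*
```

A `.githooks/pre-commit` script might then contain little more than `make lint scan`, keeping the hook itself trivial and the real logic in one place.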
In addition to IDE-level tooling, using the fastest tools earliest in the git hook lifecycle provides useful feedback in a timely manner.
I’ve outlined some examples of potential targets and use-cases below:
- Quick linting and code style enforcement.
- Scan for security anti-patterns and secrets – like passwords, tokens or API keys – that may have been accidentally added to the codebase.
- Check that the commit message meets any mandated requirements.
- Run auditing or compliance tasks such as checking for accessibility issues, licence violations, and outdated or vulnerable dependencies.
- Since developers are free to disable their local hooks, run those same commands again server-side to double-check and fail early should anything not be right.
- Run functional tests: unit, contract, mocked integration, regression, and mutation as applicable.
- Quality gates: get a go / no-go decision on whether to proceed further.
- Create an OCI container image or generate a static website archive.
- Provision resources as defined by declarative infrastructure as code.
- Once all the preceding steps have been completed, the artifacts can be pushed with confidence. Run transactional end-to-end tests along with non-functional tests such as for security and performance.
- Just as an automated blue / green deployment gives an opportunity to gradually deploy and roll back if any errors are detected, a separate progressive release strategy provides a safety net should anything go awry at the final step.
- Generate and publish API specifications.
- Circulate release notes or make an announcement on a messaging platform.
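The stages above might map onto a chain of targets along these lines (every name and recipe here is an illustrative assumption; each body would delegate to the project’s actual tooling):

```make
# Illustrative pipeline stages – each recipe is a placeholder.
.PHONY: lint scan audit test package deploy e2e release
lint:
	@echo "code style and static analysis"
scan:
	@echo "secrets and security anti-patterns"
audit:
	@echo "accessibility, licence and dependency checks"
test: lint scan audit
	@echo "unit, contract and mocked integration tests"
package: test
	@echo "build the OCI image or static site archive"
deploy: package
	@echo "provision infrastructure and push artifacts"
e2e: deploy
	@echo "end-to-end and non-functional tests"
release: e2e
	@echo "progressive release, API docs, announcements"
```

Because each stage declares its prerequisites, `make release` runs the whole chain in order, while `make lint` alone still gives the quick local feedback described earlier.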
Note that there is no direct implication of the underlying technologies used with these abstractions; it’s completely language-independent and tooling-agnostic. The responsibility for implementing the work lies with the projects using this framework.
That said, quite a bit can be inferred from file-based conventions – not necessarily directly tied to our new project templates. Taking inspiration from Heroku’s concept of Buildpacks – which were designed to handle deployments at scale – it’s possible to construct a matrix of trigger files and corresponding tools.
This defines a set of predefined tasks which can be run in parallel at each stage in the process, where any one of them could stop the pipeline should it find an issue.
These are just minimal examples; in reality there could be layers or combinations of tools applied at each stage. More fine-grained control of what to do – and when – can be achieved by adding targets to the appropriate lists based on the presence of individual configuration files.
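As a sketch of that trigger-file matrix (the file names, target names and tools mentioned in the comments are assumptions for illustration), Make’s `wildcard` function can build up a target list conditionally:

```make
# Sketch: accumulate lint targets based on which trigger files exist.
LINT_TARGETS :=

ifneq ($(wildcard package.json),)
LINT_TARGETS += lint-node    # e.g. eslint, npm audit
endif

ifneq ($(wildcard go.mod),)
LINT_TARGETS += lint-go      # e.g. go vet, golangci-lint
endif

ifneq ($(wildcard Dockerfile),)
LINT_TARGETS += lint-docker  # e.g. hadolint
endif

.PHONY: lint $(LINT_TARGETS)
lint: $(LINT_TARGETS)
```

Invoking `make --jobs lint` would then run whichever checks apply to the repository in parallel, with any one failure stopping the pipeline.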
The point is that the developers now don’t have to think about a lot of this stuff; it’s taken care of for them and they can focus on what they want to do – namely writing quality code – while the paved roads and secure defaults provide the quickest, smoothest paths to production.
Alignment on a small selection of languages gives continuity and allows planning for recruitment and maintenance. Doing this too for frameworks and libraries as well as approach and tooling – with caveats of freedom of OS and IDE – can guarantee consistency and adherence to best-practice.
#dev #make #work