In this post, we'll try to give you a clear picture of Wercker pipelines—what they are, how they operate in Wercker, and how you can make the best use of them.
First, a quick refresher course on Wercker itself:
Wercker automates the process of building and deploying your application in a Docker container. A Wercker build can be triggered by changes to your source code in a repository, making it an ideal tool for continuous integration and continuous delivery. Wercker is very flexible, and can be run from a command-line interface.
With Wercker, you can commit code changes to a git repository, and automatically produce a container with a fresh build of your application (including your changes) for review, testing, or deployment. Wercker takes care of all the steps between the commit and the container.
What Wercker Pipelines Aren't
So, where do pipelines fit in? In Wercker, pipelines are the scripting structures which control the develop-build-deploy process. We'll go into more detail on what pipelines do and how they work, but before we do that, let's clear up a couple of basic points regarding pipelines:
As you can see already, in Wercker, "pipeline" has a more specialized meaning than the highly generic "chain of processes" definition found throughout the computer world. A Wercker pipeline is best viewed as a script containing a sequence of steps, with a definite structure and syntax.
A Wercker pipeline is also not a separate file. The file that controls a Wercker project is named wercker.yml; it consists largely of the project's pipelines, along with some information about the environment, and a few related commands. As the extension suggests, wercker.yml is written in the YAML language; this means that Wercker pipelines follow YAML syntax rules and conventions.
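At a skeletal level, then, a wercker.yml file is just a box declaration followed by a set of named pipelines. A minimal sketch (the pipeline contents here are illustrative placeholders, not taken from a real project):

```yaml
box: ubuntu            # base container for the whole project
build:                 # a pipeline named "build"
  steps:
    - script:          # an inline-script step (placeholder)
        name: say hello
        code: echo "hello"
deploy:                # a second pipeline (placeholder)
  steps:
    - script:
        name: say goodbye
        code: echo "goodbye"
```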
What Wercker Pipelines Are
What is a pipeline in Wercker? Like wercker.yml, pipelines can include a small number of environment and control commands, but the core of a pipeline, and of a Wercker project, consists of a sequence of actions, or "steps." A step can be a file (either a Bash script or a compiled executable), or it can be inline script code.
Pipelines as Control Structures
These steps are what actually control the development, build, and deployment processes. The basic step syntax consists of the name of the step, followed by parameters plus values (in the format parameter: value, with each parameter-value pair on a separate line) as required. For an external script or binary, the parameters will vary, depending on the script or application. For an inline script, there are two parameters: a descriptive name for the script, and the Bash code which constitutes the body of the script.
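Schematically, a step inside a pipeline's steps: list follows this shape (the step and parameter names below are placeholders):

```yaml
steps:
  - some-step-name:          # name of the step
      first-parameter: value # parameter-value pairs, one per line
      second-parameter: another value
```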
What Wercker Pipelines Do
Wercker works through wercker.yml in order: it runs each requested pipeline, and within each pipeline it executes the steps in the sequence in which they appear in the file.
Set the Container
The yml file begins with a box: declaration, which sets the container (from the Docker repository) to be used as a base.
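For a Go project, for example, the file might open with a single line naming the Go 1.7 image (the image name here assumes the standard golang image on Docker Hub):

```yaml
box: golang:1.7
```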
This declaration sets the initial container to the Go version 1.7 container, which means that the Wercker project will use the interpreters and other language tools in that container by default.
This default will remain in place, unless a pipeline explicitly sets the box to another container:
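For instance, a deploy pipeline can declare its own box, overriding the project-level default within its scope (the step shown is a placeholder):

```yaml
deploy:
  box: ubuntu            # overrides the project-level box for this pipeline
  steps:
    - script:
        name: run deploy script   # placeholder step
        code: ./deploy.sh         # hypothetical deployment script
```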
Here, the deploy pipeline sets the container to the most recent version of Ubuntu (the default, when no version number is included). Within the scope of the deploy pipeline, Ubuntu's resources will be available.
Execute the Steps
Within a pipeline, Wercker executes the steps in sequence. The beginning of a typical pipeline (with no box: declaration in this case) might look like this:
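A sketch of such an opening, based on Wercker's npm-install step (this is a reconstruction of the kind of example the description calls for, not a quote from a specific project):

```yaml
build:
  steps:
    - npm-install:         # external installation step
        package: jshint    # the package to install
        strict-ssl: false  # disable strict SSL for the install
```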
It starts with the pipeline's name (build), then immediately goes into the steps. The step in this case executes an external installation script, with parameters that tell it to install the jshint package, with strict SSL set to false.
An inline script is executed like any other step, with script as the step type:

```yaml
- script:
    name: clear out the build binaries
    code: rm -rf $GOPATH/pkg
```
The name of the script is "clear out the build binaries" and the code: parameter's value ("rm -rf $GOPATH/pkg") is the actual Bash script code that will be executed. A script can contain multiple lines, as in this example:
```yaml
- script:
    name: generate SHAs
    code: |
      for f in $(find . -type f -name sentcli); do
        cd $WERCKER_OUTPUT_DIR/$(dirname "$f")
        sha256sum sentcli > SHA256SUMS
      done
```
Note that the value for the code: parameter starts with the | (pipe) character. This is standard YAML syntax, indicating that a multi-line string (with newlines preserved) follows.
Putting Pipelines to Use
A Wercker pipeline can be for a specific type of release (stable, beta, canary, etc.), for a specific platform, or for a much more specialized purpose. Consider, for example, the run-tests pipeline from the Wercker CLI project (at https://github.com/wercker):
```yaml
volumes: $CODE_PATH /var/run/docker.sock
```
It runs the tests contained in the external test-all.sh script, in the location represented by $CODE_PATH. The test script itself could contain anything from an exhaustive series of tests to nominal code. All that the pipeline needs to do is set the environment, then run the script.
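Assembled from that description, the full pipeline might look roughly like this (the box id, the nesting of the volumes line, and the step layout are assumptions, not a verbatim quote):

```yaml
run-tests:
  box:
    id: golang:1.7                          # assumed base image
    volumes: $CODE_PATH /var/run/docker.sock # mount the code path and Docker socket
  steps:
    - script:
        name: run the tests
        code: |
          cd $CODE_PATH
          ./test-all.sh                     # external test script
```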
Managing Wercker Pipelines
Wercker's online platform includes facilities for managing pipeline workflow, using the Workflows tab. This is a very powerful feature, providing a much greater level of flexibility and complexity than wercker.yml by itself. You can create pipelines within the Workflows environment; these Workflow pipelines typically serve as aliases for the pipelines defined in wercker.yml.
In Workflows, pipelines can be chained. You can set the trigger mechanism to another pipeline, or to a git push. You can also set the git branch which will run a pipeline. Workflows also allow you to set pipeline permissions to any of four levels, from public to admin-only.
Dynamic Organization for Production Deployment
Above all, Workflows allow you to organize pipelines into complex and dynamic structures which are fully capable of handling production-level build and deployment needs. These Workflow structures include branches and checkpoints. You can create parallel Workflows to handle deployments for different environments, and place pipelines end-to-end to trigger sequential stages of deployment.
A Workflow, then, serves as a kind of meta-pipeline, organizing individual pipelines into a complete management system for continuous integration and continuous delivery, and making it possible to proceed automatically from a git-push trigger to build, test, review, and deployment. It is this complexity and sophistication which helps to make Wercker an enterprise-quality production tool.
About the Author
Michael Churchman started as a scriptwriter, editor, and producer during the anything-goes early years of the game industry. He spent much of the ‘90s in the high-pressure bundled software industry, where the move from waterfall to faster release was well under way, and near-continuous release cycles and automated deployment were already de facto standards. During that time he developed a semi-automated system for managing localization in over fifteen languages. For the past ten years, he has been involved in the analysis of software development processes and related engineering management issues.
We’re hiring! Check out the careers page for open positions in Amsterdam, London and San Francisco.
As usual, if you want to stay in the loop, follow us on Twitter @wercker or hop on our public Slack channel. If it’s your first time using Wercker, be sure to tweet out your #greenbuilds, and we’ll send you some swag!