Building on the humongous effort so far to improve our development pipeline, as described by Gordon in one of our previous posts, we’ve recently introduced some more tweaks to further improve our Continuous Integration.

Because our jobs evolved beyond what a single pipeline or matrix could handle, one of the changes introduced earlier was to split the process into multiple jobs. This also allowed us to structure the jobs in a way that makes it easier to retry only failed configurations, etc. But what about the complexity of managing those jobs?

In the Jenkins world, the Job DSL comes to the rescue! You can use it to write a generator that creates your entire CI pipeline, consisting of multiple jobs. Nice and simple.
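To give a feel for what that looks like, here’s a minimal, hypothetical generator script; the module names, repository URL and Gradle task are made-up placeholders rather than our actual setup:

// Hypothetical seed script: generate one verification job per module.
['core', 'server', 'client'].each { module ->
  job("verify-${module}") {
    scm {
      git('ssh://gerrit.example.com:29418/our-project')
    }
    steps {
      // Run only this module's checks so failed configurations can be retried individually.
      gradle(":${module}:check")
    }
  }
}
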
Another thing we wanted to address was the disconnect between the source/definition of our CI jobs, which lived in Jenkins, and the source of the application itself, which lived in Git.

With many CI systems you keep the definitions and details of your jobs inside a VCS along with your software sources. So with GitLab CI you have your .gitlab-ci.yml, with Travis you have a .travis.yml, and with Drone you have your .drone.yml at the root of your repo. This makes it easy for developers to work with, as the CI job definitions are right there where you need them, and they evolve with the code. For example, if you maintain older release branches you don’t have to worry about breaking their builds when you add a fancy new code quality tool to your automated checks in master.

That’s cool, but what about Jenkins? Well, with Pipelines you put your Jenkinsfile at the root of the repo and you’re sorted. That is a future improvement for us; in the meantime, we’re using the Job DSL. With this, it’s still possible to keep the DSL .groovy scripts in VCS, which is nice, but you still need a “generator” job in Jenkins that uses those scripts, and you need to trigger this job every time you commit a change to those scripts.
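Just for illustration (we’re not there yet), a minimal declarative Jenkinsfile kept at the root of the repo could look something like this, with the stage and build command as placeholders:

pipeline {
  agent any
  stages {
    stage('Verify') {
      steps {
        // Placeholder build step; a real pipeline would cover our full matrix of checks.
        sh './gradlew check'
      }
    }
  }
}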

If you could somehow automate that step, that would be swell, wouldn’t it? Well, if you use Gerrit and the Gerrit Trigger plugin for Jenkins, you may be in luck. That is exactly what this round of improvements to our CI at WANdisco focused on.

On to the details then.

We first created our Job DSL scripts in our git repo, in the src/jenkins/jobs/ directory. Most of the generated jobs use the Gerrit Trigger plugin, so e.g. verification jobs get triggered whenever a patch or draft is published to Gerrit and can run all the checks before a change is allowed to be submitted into the master branch. Only our deployment jobs currently don’t use the Gerrit Trigger, as they run after a change is merged and we batch those to run once a day instead of after every merge.
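As a rough sketch (not our exact configuration), a Gerrit-triggered verification job expressed with the Job DSL’s built-in gerrit trigger context might look like this; the project and branch names are placeholders:

triggers {
  gerrit {
    events {
      // Run verification whenever a patch set or a draft is published to Gerrit.
      patchsetCreated()
      draftPublished()
    }
    // Placeholder project/branch; limits the trigger to changes targeting master.
    project('plain:our-project', ['plain:master'])
  }
}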

Creating/moving the DSL scripts into the repo also let us unify the jobs and make sure common parts like build rotation, artifact handling, etc. are re-used. We ensured this using a builder pattern, roughly inspired by Example 7 from the Job DSL Gradle Examples, with Groovy closures at strategic places to allow customisations.

The structure of our scripts looks roughly like:

import groovy.transform.builder.Builder
import groovy.transform.builder.SimpleStrategy

import javaposse.jobdsl.dsl.DslFactory
import javaposse.jobdsl.dsl.Job

@Builder(builderStrategy = SimpleStrategy, prefix = '')
class VerificationJobBuilder {
  String path
  String jobName
  Closure matrixAxes
  Closure buildSteps
  // ...

  Job build(DslFactory dslFactory) {
    def jobSlug = jobName.replaceAll('[ .]', '-').toLowerCase()

    dslFactory.matrixJob("${path}/${jobSlug}") {
      displayName(jobName)

      properties {
        buildDiscarder {
          strategy {
            logRotator {
              // ...
            }
          }
        }
      }

      parameters {
        // ...
      }

      scm {
        git {
          // ...
        }
      }

      triggers {
        gerritTrigger {
          // ...
        }
      }

      axes this.matrixAxes

      wrappers {
        // ...
      }

      steps this.buildSteps

      publishers {
        // ...
      }
    }
  }
}

new VerificationJobBuilder()
  .path('dir')
  .jobName('foo')
  .matrixAxes({
    text('AXIS1', ['values'])
    text('AXIS2', ['values'])
    jdk('JDK8')
  })
  .buildSteps({
    gradle {
      // ...
    }
  })
  .build(this)  // the DSL script itself acts as the DslFactory

// ...

As you can see, we pass two closures to the builder in this example. The builder encapsulates the common structure of a job, but makes sure to execute the closures in the appropriate places to customise it.

Working on the DSL was also an opportunity to tweak certain details, like enabling repository browser integration:

scm {
  git {
    // ...

    browser {
      gitblit("https://${this.gerritServer}/plugins/gitblit/", 'nsfs')
    }
  }
}

which enables Jenkins to show links to individual commits in GitBlit:


[Screenshot: commit link in Jenkins]

A minor detail, but very handy 😉

In our “generator” job we enabled the Gerrit Trigger too, and set the events and the project file filter so that the job is only triggered when a change is merged and includes a modification to one of the Job DSL scripts (i.e. files matching **/jenkins/jobs/**).


[Screenshot: Gerrit Trigger settings]

This way every change to the Job DSL merged into master triggers the generator run in Jenkins, which updates all existing jobs. Pretty much the only time we need to run the generator manually is when cutting a new release branch (as there’s no commit in Gerrit associated with that).
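
For the curious, if the generator job itself were also defined in the DSL, a rough sketch might look like the following; the job name is hypothetical and the Gerrit Trigger configuration described above is elided:

job('ci-job-generator') {
  // Gerrit Trigger configuration (merged changes touching **/jenkins/jobs/**) omitted here.
  steps {
    dsl {
      // Process all Job DSL scripts kept in the repo.
      external('src/jenkins/jobs/*.groovy')
      // Disable (rather than delete) jobs that are no longer generated.
      removeAction('DISABLE')
    }
  }
}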

That sums up this round of improvements; as usual, there’s more in the pipeline 😉
