
To get the fastest feedback possible, we occasionally want Jenkins jobs to run in parallel. Jenkins can start multiple downstream jobs (or 'fork' the pipeline) when a job finishes. However, Jenkins doesn't seem to have any way of making a downstream job start only if all branches of that fork succeed (or 'joining' the fork back together).

Jenkins has a "Build after other projects are built" button, but I interpret that as "start this job when any upstream job finishes" (not "start this job when all upstream jobs succeed").

Here is a visualization of what I'm talking about. Does anyone know if a plugin exists to do what I'm after?

[Build pipeline diagram]


Edit:

When I originally posted this question in 2012, Jason's answer (the Join and Promoted Build plugins) was the best, and the solution I went with.

However, dnozay's answer (The Build Flow plugin) was made popular a year or so after this question, which is a much better answer. For what it's worth, if people ask me this question today, I now recommend that instead.

  • Quoting Andrew's Answer.... "Jenkins recently announced first class support for workflow." jenkins-ci.org/content/workflow-plugin-10
    – Clintm
    Commented Jan 12, 2015 at 22:34
  • You can change the selected answer, if you think the newer answer is more appropriate now.
    – naught101
    Commented Apr 29, 2016 at 4:29

7 Answers


Pipeline plugin

You can use the Pipeline Plugin (formerly workflow-plugin).

It comes with many examples, and you can follow this tutorial.

e.g.

// build
stage 'build'
// ... build steps, e.g. checkout and compile

// deploy
stage 'deploy'
// ... deployment steps

// run tests in parallel
stage 'test'
parallel 'functional': {
  // ... functional test steps
}, 'performance': {
  // ... performance test steps
}

// promote artifacts
stage 'promote'
// ... promotion steps
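For readers on current Jenkins versions: the same fork/join shape can also be written in Declarative Pipeline syntax, where a failing parallel branch fails its stage and skips the downstream stages. A minimal sketch (the stage names mirror the example above; the echo steps are placeholders for your real steps):

```groovy
pipeline {
    agent any
    stages {
        stage('build')  { steps { echo 'building...' } }
        stage('deploy') { steps { echo 'deploying...' } }
        // the fork: both branches run concurrently
        stage('test') {
            parallel {
                stage('functional')  { steps { echo 'functional tests...' } }
                stage('performance') { steps { echo 'performance tests...' } }
            }
        }
        // the join: only reached when both parallel branches succeed
        stage('promote') { steps { echo 'promoting artifacts...' } }
    }
}
```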

Build flow plugin

You can also use the Build Flow Plugin. It is simply awesome - but it is deprecated (development frozen).

Setting up the jobs

Create jobs for:

  • build
  • deploy
  • performance tests
  • functional tests
  • promotion

Setting up the upstream

  1. In the upstream job (here, build), create a unique artifact, e.g.:

    echo "${BUILD_TAG}" > build.tag
    
  2. Archive the build.tag artifact.

  3. Record fingerprints to track file usage; if any job copies the same build.tag file and records fingerprints, you will be able to track the parent.
  4. Configure the build to be promoted when the promotion job succeeds.

Setting up the downstream jobs

  1. I use two parameters, PARENT_JOB_NAME and PARENT_BUILD_NUMBER
  2. Copy the artifacts from upstream build job using the Copy Artifact Plugin

    • Project name = ${PARENT_JOB_NAME}
    • Which build = ${PARENT_BUILD_NUMBER}
    • Artifacts to copy = build.tag
  3. Record fingerprints; that's crucial.

Setting up the downstream promotion job

Do the same as the above, to establish upstream-downstream relationship. It does not need any build step. You can perform additional post-build actions like "hey QA, it's your turn".

Create a build flow job

// start with the build
parent = build("build")
parent_job_name = parent.environment["JOB_NAME"]
parent_build_number = parent.environment["BUILD_NUMBER"]

// then deploy
build("deploy")

// then your qualifying tests
parallel (
    { build("functional tests",
          PARENT_BUILD_NUMBER: parent_build_number,
          PARENT_JOB_NAME: parent_job_name) },
    { build("performance tests",
          PARENT_BUILD_NUMBER: parent_build_number,
          PARENT_JOB_NAME: parent_job_name) }
)

// if nothing failed till now...
build("promotion",
    PARENT_BUILD_NUMBER: parent_build_number,
    PARENT_JOB_NAME: parent_job_name)

// knock yourself out...
build("more expensive QA tests",
    PARENT_BUILD_NUMBER: parent_build_number,
    PARENT_JOB_NAME: parent_job_name)

Good luck.

  • I prefer this answer myself - it is very straightforward and more flexible IMHO.
    – metaforge
    Commented May 11, 2015 at 21:10

There are two solutions that I have used for this scenario in the past:

  1. Use the Join Plugin on your "deploy" job and specify "promote" as the targeted job. You would have to specify "Functional Tests" and "Performance Tests" as the joined jobs and start them in some fashion, post-build. The Parameterized Trigger Plugin is good for this.

  2. Use the Promoted Builds Plugin on your "deploy" job, specify a promotion that works when downstream jobs are completed, and specify the Functional and Performance test jobs. As part of the promotion action, trigger the "promote" job. You still have to start the two test jobs from "deploy".

There is a CRITICAL aspect to both of these solutions: fingerprints must be correctly used. Here is what I found:

  1. The "build" job must ORIGINATE a new fingerprinted file. In other words, it has to fingerprint some file that Jenkins thinks was originated by the initial job. Double check the "See Fingerprints" link of the job to verify this.
  2. All downstream linked jobs (in this case, "deploy", "Functional Tests" and "Performance tests") need to obtain and fingerprint this same file. The Copy Artifacts plugin is great for this sort of thing.
  3. Keep in mind that some plugins allow you to change the order of fingerprinting and downstream job starting; in this case, the fingerprinting MUST occur before a downstream job fingerprints the same file to ensure the ORIGIN of the fingerprint is properly set.
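As a sketch of point 1 above, the originating "build" job can generate the fingerprint file in a shell build step. BUILD_TAG is a standard Jenkins environment variable; the fallback value here is only for illustration outside Jenkins:

```shell
# Create a file whose content is unique per build; Jenkins fingerprints
# files by MD5 checksum, so unique content gives each build a unique
# fingerprint and makes this job the fingerprint's ORIGIN.
echo "${BUILD_TAG:-jenkins-myjob-1}" > build.tag

# The job's post-build configuration then archives build.tag and records
# its fingerprint; downstream jobs copy the same file (Copy Artifact
# plugin) and fingerprint it too, linking them back to this build.
cat build.tag
```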
  • I tried both the Join and Promoted Builds plugins, but I think they suffer from a limitation that I didn't consider when I first typed up my question. Basically, they only seem to work when each 'branch' of the split is exactly one job deep. The plugins both work as expected in my image above, but our real pipeline looks more like this: dl.dropbox.com/u/74726/pipeline.png - Notice how one 'branch' of the split is only one job deep (code coverage) but the other branch is TWO jobs deep (deploy and then test). Neither one will launch the 'promote' job when coverage, deploy, and test all pass!
    – Jay Spang
    Commented Feb 7, 2012 at 18:42
  • I've used this solution when the paths are multiple jobs deep. The CRITICAL key was the fingerprinted files. I'll update my answer to reflect that. Commented Feb 8, 2012 at 18:11
  • Thanks. Fingerprints were indeed the answer!
    – Jay Spang
    Commented Feb 9, 2012 at 0:48
  • I face the problem that the branch is only one job deep. I don't know how to make use of the fingerprint file. Where can I find more specific information?
    – aleung
    Commented Sep 4, 2012 at 7:11
  • I don't see the Join Trigger option under Post-Build Actions, even after installing the Join plugin; I installed the plugin by uploading the .hpi file manually. Commented Jun 28, 2017 at 5:38

The Multijob plugin works beautifully for that scenario. It also comes in handy if you want a single "parent" job to kick off multiple "child" jobs but still be able to execute each of the children manually, by themselves. This works by creating "phases", to which you add 1 to n jobs. The build only continues when the entire phase is done, so if a phase has multiple jobs, they all must complete before the rest are executed. Naturally, it is configurable whether the build continues if there is a failure within the phase.


Jenkins recently announced first class support for workflow.


I believe the Workflow Plugin is now called the Pipeline Plugin and is the (current) preferred solution to the original question, inspired by the Build Flow Plugin. There is also a Getting Started Tutorial in GitHub.


The answers by Jason and dnozay are good enough, but in case someone is looking for an easy way, just use the JobFanIn plugin.

  • Hi Yogi, is there a way to pass parameters to the merge job? I am using the Parameterized Trigger plugin. Commented Jul 18, 2016 at 20:53
  • That needs to be added as an enhancement to the plugin. Currently this plugin does not do that.
    – Yogesh
    Commented Jul 19, 2016 at 4:55

This diamond dependency build pipeline can be configured with the DepBuilder plugin. DepBuilder uses its own domain-specific language, which in this case would look like:

_BUILD {
    // define the maximum duration of the build (4 hours)
    maxDuration: 04:00 
}

// define the build order of the existing Jenkins jobs
Build -> Deploy
Deploy -> "Functional Tests" -> Promote
Deploy -> "Performance Tests" -> Promote

[Screenshot of the DepBuilder plugin UI]

After building the project, the build visualization will be shown on the project dashboard page:

[Build pipeline visualization]

If any of the upstream jobs didn't succeed, the build will be automatically aborted. The abort behavior can be tweaked on a per-job basis; for more info, see the DepBuilder documentation.

[Screenshot of the failed build]
