I have a declarative pipeline that is structured as follows:
stages {
    stage('Generate Artifacts') {
        parallel {
            stage('Artifact A') {
                stages {
                    stage('A-setup')    { steps { build job: 'Build Artifact A: Setup' } }
                    stage('A-finalize') { steps { build job: 'Build Artifact A: Finalize' } }
                }
            }
            stage('Artifact B') { steps { build job: 'Build Artifact B' } }
            stage('Artifact C') { steps { build job: 'Build Artifact C' } }
            stage('Artifact D') { steps { build job: 'Build Artifact D' } }
        }
    }
    stage('Create Package') {
        steps {
            // Create package using artifacts from A-finalize, Artifact B, Artifact C, and Artifact D
            build job: 'Package Artifacts'
        }
    }
}
This works fine, but I’m concerned about synchronization when collecting artifacts from the parallel jobs.
For example, the jobs in the Artifact A stage can take a long time to complete, while the job in the Artifact B stage is a very short job that can also be triggered manually or by SCM changes, so there's a chance it runs again before the entire pipeline finishes.
So what I don’t want to happen is a scenario where:
- I kick off the pipeline
- The Artifact B stage runs and finishes Build #5 as part of the pipeline run
- Everything waits for half an hour while the Artifact A stage finishes
- In the meantime, the Build Artifact B job runs again and finishes Build #6 and Build #7
- When the Artifact A stage finally finishes and the Create Package stage kicks off, it picks up the artifacts from Artifact B Build #7 instead of the artifacts from Build #5, which was the one triggered by the pipeline
What I want to do is tell the Create Package job, "Get the artifacts from the job builds triggered by this specific pipeline run," so there's no chance of the Create Package job picking up the wrong artifact build as described above. What is the best way to accomplish this?
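For what it's worth, here is a rough sketch of the direction I've been considering, though I'm not sure it's the cleanest approach (the ARTIFACT_B_BUILD_NUMBER parameter name is just a placeholder I made up): capture the RunWrapper that the build step returns, stash its build number, and pass it down to the Package Artifacts job as a parameter.

stage('Artifact B') {
    steps {
        script {
            // build() returns a RunWrapper for the completed downstream build,
            // so I can record exactly which build number this pipeline run produced
            def bBuild = build job: 'Build Artifact B'
            env.ARTIFACT_B_BUILD_NUMBER = bBuild.number.toString()
            // (Artifact C / D and A-finalize would be handled the same way)
        }
    }
}

stage('Create Package') {
    steps {
        // hand the captured build number to the packaging job so it copies
        // artifacts from that specific build rather than the latest one
        build job: 'Package Artifacts',
              parameters: [string(name: 'ARTIFACT_B_BUILD_NUMBER', value: env.ARTIFACT_B_BUILD_NUMBER)]
    }
}

The idea is that inside Package Artifacts I could then use the Copy Artifact plugin with a specific-build selector, something like copyArtifacts projectName: 'Build Artifact B', selector: specific(params.ARTIFACT_B_BUILD_NUMBER). Is this the right pattern, or is there a better built-in way to tie the downstream builds to the pipeline run that triggered them?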