What methods exist to select a build job?

We have a pipeline build job which serves about 30 repos, each with their own Jenkinsfile.

The Jenkinsfiles have between 2 and 12 named stages, but it turns out that, with few exceptions, a stage with a given name is identical to the stage of the same name in the other Jenkinsfiles.

I am thinking of having a common pipeline where all the stages are conditional. Each repo would have a small configuration file (Jenkins.config) that lists which stages to execute.
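A minimal sketch of the idea, assuming Jenkins.config is a plain text file with one stage name per line (the file format and commands are my own invention):

```groovy
// Common Jenkinsfile sketch: each stage only runs if it is
// listed in the repo's Jenkins.config (one stage name per line).
def wantedStages = []

pipeline {
    agent any
    stages {
        stage('Read config') {
            steps {
                script {
                    wantedStages = readFile('Jenkins.config').readLines()
                }
            }
        }
        stage('Compile-Library') {
            when { expression { 'Compile-Library' in wantedStages } }
            steps {
                sh 'make library'   // placeholder build command
            }
        }
        // ...one conditional stage like this per shared stage name...
    }
}
```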

By creating a few more stages, splitting the almost-identical stages into separately named ones, this problem disappears.

It will take time to test each and every repo, so the old way with per-repo Jenkinsfiles needs to coexist with the common pipeline script.

The original way to select a build job uses a regex filter on the repo name. The repos that move over to the new job will still match the same filter.

Is there a way to prioritize how Jenkins matches?
I would like to configure Jenkins to search in a preferred order.
An alternative would be to check for the presence of a file: if the repo has a Jenkinsfile, run the old job; if it does not, run the new job.

So what options do I have to make Jenkins select the preferred build job?

Just thought of a possible solution myself.
It might be possible with a new pipeline job that is started by the original regex. The old job is no longer started by it.
The new job checks for the presence of my "Jenkins.config" file, and if found it will continue and run one or more of the conditional stages based on the contents of "Jenkins.config".
If "Jenkins.config" is not found, this should trigger the original job.
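As a sketch, that dispatch logic could use `fileExists` and the `build` step, assuming the checkout has already happened and 'old-pipeline-job' stands in for the real job name:

```groovy
// Supervisor Jenkinsfile sketch (the job name is a placeholder).
pipeline {
    agent any
    stages {
        stage('Dispatch') {
            steps {
                script {
                    if (fileExists('Jenkins.config')) {
                        echo 'Jenkins.config found: run the conditional common stages'
                        // ...conditional stages go here...
                    } else {
                        // Hand over to the original job.
                        build job: 'old-pipeline-job'
                    }
                }
            }
        }
    }
}
```

The `build` step here is also the usual way to trigger one job from another.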

Any examples on how to trigger a job from another job?

I’m not sure what this means. I have never seen a job that serves so many repos. Why have 1 job and 30 repos, when you can have 30 jobs each pointed at a single repo?

Why does having the same name matter?

This seems a bit complex to me.

You select your builds with a regex :thinking:

Today we have one pipeline job which reads the Jenkinsfile from the repo.
Each repo has its own Jenkinsfile.
This means we have to maintain 30 Jenkinsfiles.
It would be better to maintain one flexible, configurable common Jenkinsfile.
The Jenkinsfile is a Smörgåsbord: each repo picks what it wants from the Smörgåsbord, but only in a specific order.

So three repos might have

1 'Compile-Library'
2 'Export Library'
3 'Documentation-Create-PDF'

while another might have

1 'Import Headers'
2 'Compile-Application'
3 'Create-Debian-Package'
4 'Documentation-Create-PDF'

A common pipeline might have

1 'Compile-Library'
2 'Export Library'
3 'Import Headers'
4 'Compile-Application'
5 'Documentation-Create-PDF'
6 'Create-Debian-Package'

The libraries will now execute

1 Run 'Compile-Library'
2 Run 'Export Library'
3 Skip 'Import Headers'
4 Skip 'Compile-Application'
5 Run 'Documentation-Create-PDF'
6 Skip 'Create-Debian-Package'

and the Application will run

1 Skip 'Compile-Library'
2 Skip 'Export Library'
3 Run 'Import Headers'
4 Run 'Compile-Application'
5 Run 'Documentation-Create-PDF'
6 Run 'Create-Debian-Package'

All libraries create the artifacts '*.a' and '*.lib', but one job should not export '*.lib'.
Those stages are really not the same, even if they are all called 'Export Library'.
So I can create two versions of 'Export Library':

  • 'Export Library *.lib *.a'
  • 'Export Library *.lib'

so my pipeline becomes

1 'Compile-Library'
2 'Export Library *.lib *.a'
3 'Export Library *.lib'
4 'Import Headers'
5 'Compile-Application'
6 'Documentation-Create-PDF'
7 'Create-Debian-Package'

So some libraries will run [1,2,6] and others will run [1,3,6]
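A library picking [1,2,6] could then express that in its Jenkins.config; the file format below (one stage name per line) is only an assumption:

```
Compile-Library
Export Library *.lib *.a
Documentation-Create-PDF
```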

If the builds are selected by regex, what happens if a regex matches TWO build jobs?
Is there a way to set a priority so that if it matches build job 'A', then build job 'B' will not be run?
Basically, a way to configure the order in which the regexes are searched.

If that is not possible, the solution with a supervisor build job that runs first, checks various things, and then selects another job to run should work.

We will run the build job in Docker.
Is there a way to start another job which uses the same Docker container
and the repos that were checked out by the supervisor job?

Thanks for providing so much detail. I have a better understanding of what is going on. I still don’t know enough to be certain but I think I can make some recommendations.

Are the steps called by any single repo static or dynamic? It seems like a repo runs the same commands over and over.

If all you need is a place to store common pipeline logic then I really think you should be using a shared library.

In your shared library you could put your common steps as functions.
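For instance, a global step in the library's vars/ directory might look like this (the file name, function name, and build command are hypothetical):

```groovy
// vars/compileLibrary.groovy in the shared library.
// Defines a custom step callable as compileLibrary() from any Jenkinsfile.
def call(Map args = [:]) {
    // The common 'Compile-Library' logic lives here once,
    // instead of being copied into 30 Jenkinsfiles.
    sh "make library ${args.flags ?: ''}"   // placeholder build command
}
```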


Then you could either call these functions in each repo’s Jenkinsfile or maybe wrap these in another function in the shared library.
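A repo's Jenkinsfile could then shrink to a few calls into the library (the library name and step names below are placeholders):

```groovy
@Library('my-shared-library') _   // placeholder library name

pipeline {
    agent any
    stages {
        stage('Compile-Library') {
            steps { compileLibrary() }
        }
        stage('Export Library') {
            // exportLibrary is another hypothetical step from vars/
            steps { exportLibrary(artifacts: ['*.a', '*.lib']) }
        }
    }
}
```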


I can't be 100 percent sure, but it seems to me that using a shared library for common logic and then creating a Pipeline job for each repo is probably a better way to solve this. I usually create 2 or 3 jobs per repo: a development job for testing and linting each PR, a deployment job that deploys to lower environments, and a release job that creates a tag, deploys to production, or both, depending on the app.

This would have many benefits: the build logs would be much easier to read since each would cover only one repo, and it would be much easier to trigger builds when the code changes.

I could be wrong, but if you are using pipelines, you almost never need to call one job from another and interact closely with it.

Like I said, I can't be 100% sure about my recommendations, but I think you can probably make some changes that would reduce the builds' complexity and provide more features at the same time.


This may not be quite applicable to you, but something we’ve been using to help use a common Jenkinsfile across many jobs is the Remote Jenkinsfile Provider Plugin. We have many jobs which use the same build process in different repositories, so we use a common Jenkinsfile stored in another repository in each job. We have one job per repo though, just with the Jenkinsfile shared between them.

I suspect that the shared libraries that @shadycuz described are more relevant to you, but wanted to add this here just in case it helps at all.

Thank you.
I guess this works best when all jobs have identical Jenkinsfiles.
It is also useful if you want all the Jenkinsfiles in one Git repo instead of distributed across the projects.

I can think of two reasons to trigger another job.
You might want a complex way of deciding where to start a job.
Let's say each developer has a virtual machine running on their own computer with a Jenkins slave.
When they push to Gerrit/Jenkins, the build job starts on their own machine, which is guaranteed to be free, except for one or two repos that are so heavy they should be sent to a powerful server.

So the first job would just check parameters and make decisions, but will then trigger the real job.

I could also think of a test job which is triggered after the build job.
The test job downloads the build result to a target, which runs extensive tests that take a long time but need very little performance from the Jenkins slave. If you can run the build job on one virtual machine with a lot of resources and the test job on another virtual machine with much less, you could be more efficient than having a single pipeline do both.
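A sketch of that hand-off, assuming the build job archives its output with `archiveArtifacts` and the test job fetches it with the Copy Artifact plugin (job names and paths are placeholders):

```groovy
// At the end of the build job's pipeline:
archiveArtifacts artifacts: 'build/output/**'
build job: 'extensive-test-job', wait: false   // fire and forget

// In the test job's pipeline, on the less powerful agent:
copyArtifacts projectName: 'build-job', selector: lastSuccessful()
sh './run-extensive-tests.sh'   // placeholder test command
```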