I am trying to come up with an idea to solve the following problem:
I have gitea and Jenkins hooked up via Jenkins’ gitea plugin, meaning that each org in gitea is represented by an organizational folder in Jenkins. Now I would like to do the following:
Create an organizational folder for a group of users.
Create some kind of pipeline code inside that folder, not inside each repo inside this folder.
Apply the pipeline to all repos which are being detected in this folder, but ignore a Jenkinsfile in there, if present.
Any repo-specific processing needs to be handled by the pipeline code. The idea is that users only commit application code to their repos and have no influence over the jobs Jenkins runs on them, for security reasons. Inside the pipeline code, which I expect to load from a different repo, I want to detect which repo the pipeline was called for, and then look up any repo-specific instructions, which are configured elsewhere, and by other people, not the various users of the repos.
Any pointers to relevant information or code would be great!
One solution could be to use shared libraries. This won’t entirely free you from the need to have a Jenkinsfile in all of the repos you want to have a pipeline for – but you could abstract that pipeline away down to a single doMagic() function call, with said function coming from a shared library that is implicitly (or explicitly, if you need it) available to any job in your Jenkins instance.
This is how Jenkins itself handles its plugin infrastructure: each plugin has its own Jenkinsfile, but the pipeline in it is reduced to a single buildPlugin(...) call with fairly uniform parameters. The whole setup is supplied by Jenkins’ own shared library, which contains the definition of buildPlugin.
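A minimal sketch of that setup (the library name `org-pipeline` and the step names `doMagic`/`call` contents are placeholders, not identifiers from this thread): each user repo carries only a trivial Jenkinsfile, while the actual pipeline lives in a shared library repo that only admins can write to.

```groovy
// Jenkinsfile in each user repo -- the only content users ever need:
@Library('org-pipeline') _
doMagic()

// ----------------------------------------------------------------
// vars/doMagic.groovy in the shared library repo (admin-controlled)
def call() {
    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    script {
                        // In an organization folder, JOB_NAME looks like
                        // "my-org/my-repo/main", so the repo name can be
                        // derived here and used to look up per-repo
                        // settings configured outside the user repo.
                        def repo = env.JOB_NAME.tokenize('/')[1]
                        echo "Running admin-defined pipeline for ${repo}"
                    }
                }
            }
        }
    }
}
```

If the library is registered under “Global Pipeline Libraries” with “Load implicitly” enabled, the `@Library` line can be dropped, leaving a one-line Jenkinsfile.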
Hi Artalus,
the goal is not to simplify the Jenkinsfile, but to prevent users from executing anything. Once they control a Jenkinsfile, they can, in theory, wreak all kinds of havoc, so I want to give them nothing and define and execute any pipelines on their behalf. This should still be automatic: when a user pushes to their repo, the pipeline should be triggered.
What I envision is a Jenkinsfile, or something like it, at the folder level which would make use of one or more Groovy libraries, with all Jenkinsfiles at lower levels, i.e. in the individual repos, being completely ignored.
The multibranch or organization folder job types that automatically create the corresponding jobs when a new project, branch, or PR appears do not support using a global Jenkinsfile. Maybe it is possible to write a plugin that can achieve this.
What you can possibly do is use the Job DSL plugin to generate the jobs based on a configuration that you maintain (either manually, or via another script that scans for new repos and updates the config). Those jobs can then react to pushes in Gitea.
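A rough sketch of that Job DSL approach, under stated assumptions: `repos.txt` (an admin-maintained config file, one `org/repo` slug per line), the `org-pipeline` library, and the `doMagicFor()` shared-library step are all hypothetical names. The seed job generates one pipeline job per listed repo, with the pipeline script defined in the seed script itself, so any Jenkinsfile in the user repo is simply never read.

```groovy
// Job DSL seed script (runs in a freestyle "seed" job with a
// "Process Job DSLs" build step).
def repos = readFileFromWorkspace('repos.txt')
        .readLines()
        .findAll { it.trim() }

repos.each { slug ->
    def (org, repo) = slug.tokenize('/')
    pipelineJob("${org}-${repo}") {
        definition {
            cps {
                sandbox()
                // The pipeline script lives here, not in the user repo,
                // so users cannot influence what Jenkins executes.
                script("""\
                    @Library('org-pipeline') _
                    doMagicFor('${slug}')
                """.stripIndent())
            }
        }
    }
}
```

The seed job itself can run on a schedule or on pushes to the config repo, so onboarding a new repo is a matter of adding one line to `repos.txt`.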
Hi, this is one of the features that CloudBees offers; it’s a sort of Enterprise thing. The only thing I managed to do is (as @Artalus suggested) shared pipelines, plus some sort of forced code review to prevent the Jenkinsfile from being tinkered with.