I’m using Pipeline script from SCM, with the Jenkinsfile in the same Git repository as the source code, and normally this works fine.
For example, when the Pipeline script from SCM branch is set to master, the Declarative: Checkout SCM step checks out the Jenkinsfile, and the steps in the Jenkinsfile then check out the source code via checkout scm, so both the Jenkinsfile and the code are up to date.
But if a tag is created on an old commit and a tag push event is triggered via the GitLab webhook, the checkout needs to target that specific commit, so checkout scm has to be modified like this:
checkout([
$class: 'GitSCM',
branches: [[name: "${env.gitlabAfter}"]],
userRemoteConfigs: [[
url: "${scmUrl}",
credentialsId: "${scmCredentialsId}"
]]
])
env.gitlabAfter (the commit SHA delivered by the webhook) is used here.
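For reference, the fallback I have in mind would look something like this (just a sketch: env.gitlabAfter is only set on webhook-triggered builds, and scmUrl / scmCredentialsId are variables defined elsewhere in my Jenkinsfile):

```groovy
// env.gitlabAfter is only populated by the GitLab plugin on webhook builds,
// so fall back to master for manual runs (Groovy Elvis operator).
def ref = env.gitlabAfter ?: 'master'
checkout([
    $class: 'GitSCM',
    branches: [[name: ref]],
    userRemoteConfigs: [[
        url: scmUrl,                       // assumed: repo URL defined elsewhere
        credentialsId: scmCredentialsId    // assumed: credentials ID defined elsewhere
    ]]
])
```

This only helps inside the Jenkinsfile, though; it does not change which branch the Pipeline script from SCM field itself checks out.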
The problem is that the Pipeline script from SCM branch is set to master, so the steps in the Jenkinsfile are always the latest version, which may not match the old tag’s source code.
For example, the latest Jenkinsfile runs a docker build command, but the old tag’s source tree contains no Dockerfile.
After some googling, I tried using a variable in Pipeline script from SCM: changing the branch to ${gitlabBranch}, which should achieve the desired effect.
But ${gitlabBranch} is only set when the GitLab webhook triggers the build. On the first manual run, no Jenkinsfile can be found because ${gitlabBranch} does not exist, and since the Jenkinsfile never runs, the triggers directive is never applied to the pipeline job:
triggers {
gitlab(
triggerOnPush: true,
triggerOnMergeRequest: true,
branchFilterType: 'All',
secretToken: 'passwd')
}
So how should I set up Pipeline script from SCM? Or how can I set a default value for ${gitlabBranch}, such as master? Or maybe keep the Pipeline script from SCM branch as master and, on the first run, change the pipeline job’s BranchSpec to ${gitlabBranch}?
Or should I set up a separate pipeline job just for tag pushes, with the trigger configured in advance?