Jenkins Declarative Pipeline - how to load Parameters value immediately in the first next build?

I have a Jenkins Declarative Pipeline with one parameter, DEPLOY_ENV. It has the default value staging:

  • if I run it for the first time, it fails because no parameters have been loaded yet
  • if I run it a second time, it runs the pipeline and prints the correct value, staging
  • if I change the value in the script to test and run it, it still prints staging
  • if I run it once again, it prints test, because only then has the value been updated in the parameters list of this build configuration

I do not want to change the value manually in the Build Configuration, because at that point I would have to run a build immediately in order to preserve that value. All my builds are triggered automatically (on new code changes).

  • I would like either to change the value in the script and have it apply immediately in the very next run,
  • or to change the value in “Build with Parameters” but not run the build, only save it for the next run.
  • In View Configuration the value is read-only, so I cannot change it there.

How can I change the value so that it applies immediately to the very next build (in case I also want to switch to other values like ‘production’, ‘qa’… and have them apply immediately)? Thanks!!!

pipeline {
    parameters {
        string(name: 'DEPLOY_ENV', defaultValue: 'staging')
    }
    agent any

    stages {
        stage('echo stage') {
            steps {
                script {
                    sh "echo $DEPLOY_ENV"
                }
            }
        }
    }
}
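A commonly suggested workaround for the first-run failure (a sketch, not from the original poster, under the assumption that an unset parameter resolves to null) is to read the parameter through `params` with a fallback, so even the very first build gets a usable value:

```groovy
pipeline {
    agent any
    stages {
        stage('echo stage') {
            steps {
                script {
                    // On the very first build, before Jenkins has stored the
                    // parameter definition, params.DEPLOY_ENV is null,
                    // so fall back to a hard-coded default.
                    def deployEnv = params.DEPLOY_ENV ?: 'staging'
                    sh "echo ${deployEnv}"
                }
            }
        }
    }
}
```

Note that this only prevents the first-run failure; it does not make a changed default take effect one build earlier.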

The default value is always applied to the next build.

There are a few plugins (Active Choices, for example) that let you run Groovy to compute the values, but I’ve never used them.

Hi Gavin,

the default value is NOT applied to the very next build but to the one after it, because only then is the new value stored in the Jenkins UI in the “Build with Parameters” section.
So, for example:
I have defaultValue = 'test'.
If I change it in the script to 'qa', it is NOT used in the very next build!!! I have tested it!!!
Only if I run the build again (a second time) is my new default value 'qa' used.

Can I somehow influence this behaviour?

Thanks!

I had the same problem. I solved it using a DSL seed job (a job that creates jobs and pipelines).

Here is an example of a DSL seed job definition. I build it in my casc.yaml file, but I guess it could also be created manually:


job('seed') {
    description('Job to build jobs')
    configure {
        // put here where you get seed job source code
    }
    logRotator {
        numToKeep(2)
    }
    label('my-agent')
    steps {
        jobDsl {
            targets 'jobs/*.groovy'
            removedJobAction 'DELETE'
            removedViewAction 'DELETE'
        }
    }
}

In the jobs folder referenced by targets there is a Groovy file that creates my pipeline with its parameters:

pipelineJob('MY-PIPELINE') {
    description('describe what pipeline do')
    parameters {
        stringParam('MY_PARAM', 'defaultvalue', 'description')
    }
    logRotator {
        numToKeep(2)
        artifactNumToKeep(1)
    }
    configure {
        // put here where you get pipeline jenkins file and the name of jenkinsfile
    }
}

When I want to change a default value, I update the Groovy seed job file and re-run the seed build to update my pipeline.

Hope this helps you.

Hi Letizia,

thanks for your kind assistance, but could you please clarify in more detail, since this is all new to me.

  1. DSL job definition - is that a new file you create, separate from the Jenkinsfile definition? What is the extension of this file? Where do you store it, and how is it triggered/invoked? Sorry, I am not familiar with this concept…
  2. can you post example code for the configure {} section in job('seed')?
  3. for the Groovy file, can you post example code for the configure {} section?
  4. can I still keep and use my standard Jenkinsfile along with these 2 new files? How is all of this linked to my Jenkinsfile? Is it through the configure section in the Groovy file, correct?

Job DSL is a plugin: https://plugins.jenkins.io/job-dsl/

Once you have installed the plugin, you can browse the full DSL API at this URL: https://hostnamejenkins/plugin/job-dsl/api-viewer/index.html

In my case I use Configuration as Code to build my Jenkins Docker image, and I have multiple pipelines for my projects, so I defined a seed job in the casc.yaml configuration and a centralized folder containing the job and pipeline definitions.

On the DSL plugin page you can see that it is also possible to create a seed job from the UI.

I used the configure section because I use TFS as source code repository and the DSL plugin doesn’t support it natively.
If you use Git, GitHub, or SVN, you can play with https://hostnamejenkins/plugin/job-dsl/api-viewer/index.html#path/pipelineJob-definition
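For example, with Git the whole pipeline definition can stay in plain Job DSL, with no configure block needed (a sketch; the repository URL, branch, and parameter values here are placeholders, not from the original post):

```groovy
pipelineJob('MY-PIPELINE') {
    parameters {
        // Changing this default and re-running the seed job updates the
        // stored parameter, so the very next pipeline build picks it up.
        stringParam('DEPLOY_ENV', 'staging', 'Target environment')
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://example.com/your/repo.git') // placeholder
                    }
                    branch('*/main')
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}
```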

Anyway, the configure section is used to create an XML portion of the job definition. Here is my example for the seed job:

configure { project ->
    project.remove(project / scm)
    project / scm(class: 'hudson.plugins.tfs.TeamFoundationServerScm', plugin: 'tfs@5.157.1') {
        serverUrl('http://hostname:8080/tfs/myCollection')
        projectPath('$/myproject/trunk/where/you/want/seed')
        localPath('.')
        workspaceName('myproject-${JOB_NAME}-${NODE_NAME}')
        credentialsConfigurer(class: 'hudson.plugins.tfs.model.AutomaticCredentialsConfigurer')
        useUpdate('false')
    }
}

The folder $/myproject/trunk/where/you/want/seed contains the jobs/*.groovy files referenced by targets in the seed job definition. These files contain the job and pipeline definitions.

The configure section of a pipeline contains:

configure { definition ->
    definition.remove(definition / delegate.definition)
    def myDef = definition / delegate.definition(class: 'org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition')
    myDef << scm(class: 'hudson.plugins.tfs.TeamFoundationServerScm', plugin: 'tfs@5.157.1') {
        serverUrl('http://hostname:8080/tfs/myCollection')
        projectPath('$/myproject/trunk')
        localPath('.')
        workspaceName('myproject-${JOB_NAME}-${NODE_NAME}')
        credentialsConfigurer(class: 'hudson.plugins.tfs.model.AutomaticCredentialsConfigurer')
        useUpdate('false')
    }
    myDef << scriptPath('Jenkinsfile')
}

The Jenkinsfile referenced here is your Jenkinsfile without the parameters definition.

Forgot to mention: when Jenkins runs a pipeline, it uses the parameters defined in the UI. The parameters defined in your Jenkinsfile are used to update them, but the current run still uses the previously stored ones. This is what I figured out empirically by doing some testing.

For this reason I moved the definition of the pipeline options, i.e. everything that can be defined from the UI, into the Groovy files used by the seed job to create them, and left in the Jenkinsfile only what is needed to compile/test/deploy the project.

Hi @cane ,

How did you resolve this?