Matrix job REST API doesn't work when building with parameters

I make use of Matrix jobs because I have a single codebase for multiple clients, with client-specific settings coming from env files. They work pretty well, with some downsides.

One issue I’m facing is that the REST API simply doesn’t work. I have the correct credentials (I can successfully use them in a normal job) and I get the URL from the REST API page, having clicked through to one of the client axis combinations.

It gives a URL like the one below for building with parameters:

http://myjenkins.com:8080/job/MatrixJob/Client=aClientName/buildWithParameters

It always gives a 500 status. Interestingly, the URL to initiate a build without parameters does work.

Now here is what is confusing. The clients are set up as a user-defined axis with the label “Client” and a list of client names. This acts as a parameter itself when I trigger a build: I select a git branch and the clients I want to deploy to.

Clicking into an axis (a client) gives me the specific REST API URL for that client, which is fine, but I’d also like to use the base job URL and pass the client names as a parameter for the axis. That never works either.

For example:

http://myjenkins.com:8080/job/MatrixJob/buildWithParameters?Client=aClientName&branch=master

This is accepted and does indeed trigger a build, but the client param doesn’t work and the job starts with no clients selected. I get the error below in the console:

No such property: aClientName for class: groovy.lang.Binding
09:52:45 groovy.lang.MissingPropertyException: No such property: aClientName for class: groovy.lang.Binding
09:52:45 	at groovy.lang.Binding.getVariable(Binding.java:63)
09:52:45 	at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:285)
09:52:45 	at org.kohsuke.groovy.sandbox.impl.Checker$7.call(Checker.java:375)
09:52:45 	at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:379)
09:52:45 	at org.kohsuke.groovy.sandbox.impl.Checker$checkedGetProperty.callStatic(Unknown Source)
09:52:45 	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallStatic(CallSiteArray.java:55)
09:52:45 	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callStatic(AbstractCallSite.java:197)
09:52:45 	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callStatic(AbstractCallSite.java:233)
09:52:45 	at Script1.run(Script1.groovy:1)
09:52:45 	at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovySandbox.runScript(GroovySandbox.java:195)
09:52:45 	at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovySandbox.run(GroovySandbox.java:363)
09:52:45 	at hudson.matrix.FilterScript.evaluate(FilterScript.java:45)
09:52:45 	at hudson.matrix.FilterScript.apply(FilterScript.java:85)
09:52:45 	at hudson.matrix.Combination.evalGroovyExpression(Combination.java:102)
09:52:45 	at hudson.matrix.Combination.evalGroovyExpression(Combination.java:91)
09:52:45 	at hudson.plugins.matrix_configuration_parameter.DefaultMatrixCombinationsParameterValue.combinationExists(DefaultMatrixCombinationsParameterValue.java:82)
09:52:45 	at hudson.plugins.matrix_configuration_parameter.MatrixCombinationsParameterMatrixBuildListener.doBuildConfiguration(MatrixCombinationsParameterMatrixBuildListener.java:57)
09:52:45 	at hudson.matrix.listeners.MatrixBuildListener.buildConfiguration(MatrixBuildListener.java:70)
09:52:45 	at hudson.matrix.DefaultMatrixExecutionStrategyImpl.filterConfigurations(DefaultMatrixExecutionStrategyImpl.java:188)
09:52:45 	at hudson.matrix.DefaultMatrixExecutionStrategyImpl.run(DefaultMatrixExecutionStrategyImpl.java:123)
09:52:45 	at hudson.matrix.MatrixBuild$MatrixBuildExecution.doRun(MatrixBuild.java:375)
09:52:45 	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:526)
09:52:45 	at hudson.model.Run.execute(Run.java:1895)
09:52:45 	at hudson.matrix.MatrixBuild.run(MatrixBuild.java:323)
09:52:45 	at hudson.model.ResourceController.execute(ResourceController.java:101)
09:52:45 	at hudson.model.Executor.run(Executor.java:442)

So, how do I fix this?

I would love to use pipelines, but no matter what I try, I am unable to create a pipeline job with a branch and a user-defined axis parameter, where I can pick my git branch and a checklist of clients to build the job for.

In classical Matrix jobs you can’t just trigger a single axis. You can only trigger the whole job, which will then build all axes.
An axis is not a parameter of the job, so this can’t work.

Interestingly, the URL to initiate a build without parameters does work.

Does it really trigger just the axis of the job? I tried it, but it does nothing for me. The job appears in the queue and then disappears.

My Matrix job (no idea if it’s “classical” or not) only has a single axis, a list of client names, where I am able to select one or more at any time. Selecting 5 would create 5 jobs, one for each client name.

Does it really trigger just the axis of the job?

Yup.

You’re using classical matrix jobs.
OK, so when you trigger the job it will start builds for all configured clients, but you want to start only a single client.

As you might have noticed, there is no Build Now link on a single axis configuration (i.e. when you go to http://myjenkins.com:8080/job/MatrixJob/Client=aClientName), so consequently you can’t trigger just this single configuration. You can POST to the URL http://myjenkins.com:8080/job/MatrixJob/Client=aClientName/build, but it does nothing for me. When I start it with a delay, I can see the job in the queue, but then it is never started.

With pipeline jobs it should be fairly easy to achieve what you want.

I would love to use pipelines, but the documentation is simply terrible. There are 2 different syntaxes to use, no idea which one is “correct”, and from the various attempts I’ve made, using examples I’ve Googled and asked ChatGPT for, there simply doesn’t seem to be a way to replicate this matrix job setup.

If you know a way to replicate it, I’d love your input, as pipelines are really making me feel dumb right now.

I need 2 parameters that I can manually specify: a git branch/tag from a single private repo (ideally using a list of the actual branches from the repo) and a list of client names that I can select one or more from. Sounds easy, but I’m struggling to understand how to write this script.

Just a small snippet with scripted pipeline.
This assumes you have installed the Active Choices plugin and defined a parameter named client that provides the list of clients that can be built. That plugin would even allow you to dynamically calculate the clients that can be built.
Active Choices lets you use Groovy to calculate the possible parameter values, so this could potentially also allow you to get the available branches of the repo.
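For example, the Groovy script behind that Active Choices parameter can be as simple as returning a hardcoded list (the client names here are made up):

// Active Choices "Groovy Script" for the client parameter - it just returns the selectable values
return ['client1', 'client2', 'client3']

Calculating the values dynamically (e.g. asking git for the branches) would be a similar script, just with more logic before the return.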

Then all you need to do is loop over the clients and execute them in parallel. Ideally you run each client on a different agent, so maybe you need some mapping from client to agent. But scripted pipeline is just Groovy (more or less):

clients = params.client.split(",")
branch = params.branch

clientBuilds = [:]

clients.each { client ->
    clientBuilds[client] = {
        node("linuxx86_64") {
            // Here are the steps that you would run in your matrix job under `Build Steps`.
            // But now you can also decide to run some things only for a certain client in a much simpler way.
            sh "echo build client $client on branch $branch"
        }
    }
}


parallel clientBuilds

I got a bit further with the following pipeline:

pipeline {
    agent any

    tools {
        nodejs 'Node 20.11.1'
    }
    
    parameters {
        gitParameter branchFilter: 'origin/(.*)', defaultValue: 'master', quickFilterEnabled: true, sortMode: 'DESCENDING_SMART', name: 'BRANCH', type: 'PT_BRANCH'
        extendedChoice multiSelectDelimiter: ',', name: 'CLIENT', quoteValue: false, saveJSONParameterToFile: false, type: 'PT_MULTI_SELECT', value: 'client1,client2,client3', visibleItemCount: 5
    }
    
    stages {
        stage('Git Checkout') {
            steps {
                git branch: params.BRANCH, changelog: false, credentialsId: '123', poll: false, url: 'git@github.com:myrepo.git'
            }
        }
        stage('Run Jobs for Each Client') {
            steps {
                script {
                    def parallelJobs = [:]
                    // Initialize empty lists to capture parameter values
                    def branches = []
                    def clients = []
                    // Iterate over each selected client
                    params.CLIENT.split(',').each { client ->
                        // Define a unique label for each parallel branch
                        def clientLabel = "$client"
                        // Start the job in parallel on a remote node
                        parallelJobs[clientLabel] = {
                            // Capture parameter values
                            branches.add(params.BRANCH)
                            clients.add(client)
                            
                            // Run job steps
                            node {
                                // tools {
                                //     nodejs 'Node 20.11.1'
                                // }
                                
                                stage("Job for ${client}") {
                                    checkout scmGit(branches: [[name: params.BRANCH]], extensions: [], userRemoteConfigs: [[credentialsId: '123', url: 'git@github.com:myrepo.git']])
                                    readProperties file: ".env.${client}.live"
                                    sh 'printenv'
                                    tool name: 'Node 20.11.1', type: 'nodejs'
                                    echo "Branch: ${params.BRANCH}"
                                    echo "Client: $client"
                                    sh 'npm --version'
                                }
                            }
                        }
                    }
                    // Wait for all parallel branches to complete
                    parallel parallelJobs
                    // Print parameter values to main job logs
                    echo "Branches: $branches"
                    echo "Clients: $clients"
                }
            }
        }
    }
}

However, the env files are not being loaded in. In my matrix job, I used the option “Inject environment variables to the build process”; I was hoping to be able to do this in the pipeline, but I can’t get it to work.

Being able to inject the env file data would get me a step closer, but then I still need to handle all the client access keys and the client list. Ideally, I want to be able to create the client list based on the env files in the repo, but that’s probably a huge ask.
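In my head it would be something like the sketch below, using findFiles from the same Pipeline Utility Steps plugin as readProperties, but I haven’t tried it and the glob/name handling is pure guesswork based on our .env.<client>.live naming:

// Guesswork: derive the client list from the env files in the checked-out repo
def envFiles = findFiles(glob: '.env.*.live')
def clientNames = envFiles.collect { f ->
    // strip the ".env." prefix and ".live" suffix to leave just the client name
    f.name.replaceFirst(/^\.env\./, '').replaceFirst(/\.live$/, '')
}
echo "Detected clients: ${clientNames}"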

withEnv is the pipeline equivalent of “Inject environment variables”.
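A minimal example (the variable name is made up):

withEnv(['MY_VAR=hello']) {
    // MY_VAR is only set for the steps inside this block
    sh 'echo $MY_VAR'
}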

I tried that, and couldn’t get it to work. I need to pass the client variable in as part of the filename and it was having none of it :weary:

You can’t use it one to one; withEnv doesn’t accept a file (though that might be an interesting extension, to allow a properties file).
But with readFile it should be easy to achieve:

    fileContent = readFile encoding: 'UTF-8', file: ".env.${client}.live"
    withEnv(fileContent.tokenize("\n")) {
        sh 'env'
    }

This assumes the file contains one line per env variable and no comments. If it is more like a properties file and contains multiline properties and comments, this will not work. In that case you would need to read the file with readProperties, construct the list manually, and pass it to withEnv.

Yes, the env files have one key/value pair per line. I’ll give this a go!

EDIT - just reread your comment; we have comments in the files, and some empty lines to split up sections. Wondering if a regex can be added to check that a line contains a real key/value pair?
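Something like this is what I had in mind, though I haven’t tried it yet:

// Untested idea: keep only lines that look like KEY=VALUE, skip comments and blank lines
def envLines = fileContent.readLines().findAll { line ->
    def l = line.trim()
    l && !l.startsWith('#') && l.contains('=')
}
withEnv(envLines) {
    sh 'env'
}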

EDIT EDIT - hmm, looks like it worked anyway? Oddly, the sh('env') command does list the env vars, but sh('printenv') doesn’t. Do I need to keep my other commands within the withEnv {} block?

Hmm, I can’t seem to be able to print out any of the env vars.

I’ve tried both

echo "${env.ONE_FOOTBALL_API_LOGIN}"

and

sh "echo $ONE_FOOTBALL_API_LOGIN"

inside the withEnv() block, and it’s not printing the value.

Maybe you can post what your Jenkinsfile looks like now.

Key part here:

stage("Job for ${client}") {
    checkout scmGit(branches: [[name: params.BRANCH]], extensions: [], userRemoteConfigs: [[credentialsId: '123', url: 'git@github.com:myrepo.git']])
    fileContent = readFile encoding: 'UTF-8', file: ".env.${client}.live"
    withEnv(fileContent.tokenize("\n")) {
        sh 'env'
        echo "${env.MY_ENV_VAR}"
    }
    tool name: 'Node 20.11.1', type: 'nodejs'
    echo "Branch: ${params.BRANCH}"
    echo "Client: $client"
    sh 'npm --version'
}

I think it’s the env var file. It’s key/value pairs, but with spaces around the = sign, e.g.

MY_ENV_VAR = foobar

I think it’s this, as I am seeing null output in the logs.

As you said you have comments and blanks, the approach with readFile is not so good. You used readProperties above, so I assume that is the file with the env vars you want to set. This might do the trick:

    props = readProperties file: ".env.${client}.live"
    list = []
    props.each{ entry -> list.add("${entry.key}=${entry.value}")}
    withEnv(list) {
        sh 'env'
    }

The env variables will only be available inside the withEnv, of course.
And this works when there are blanks before/after the = in the file with the env vars.

I fixed the issue of the spaces around the = symbols and it’s loading the values in now. Annoyingly, printing them out only seems to work using double quotes :weary:

If you do echo "${env.MY_ENV_VAR}" outside of a sh step then this is expected.
This should work and print the value:

withEnv(["MY_ENV_VAR=value"]) {
  sh '''
  echo $MY_ENV_VAR
'''
  echo "${env.MY_ENV_VAR}"  // This will print the value, because Groovy will interpolate the string
  echo '${env.MY_ENV_VAR}'  // This will not print the value, as there is no string interpolation done
}

Yeah, I figured this out not long after posting!

So I’ve got much further now, loading the env files and using credentials as well (stored in Jenkins). All looking good.
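For reference, I’m pulling the access keys in with withCredentials, roughly like this (the credential ID and variable name here are just placeholders):

// Placeholder credential ID and variable name, for illustration only
withCredentials([string(credentialsId: 'client-api-key', variable: 'CLIENT_API_KEY')]) {
    // the secret is exposed as an environment variable only inside this block (and masked in the console log)
    sh 'test -n "$CLIENT_API_KEY" && echo "API key is available"'
}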

My only issue now is that for each parallel job, the UI spits out a “block” for each stage, and with about 50 jobs with 3-4 stages each, the UI is going to be unusable. Is there a good way to “group” the stages, other than putting all commands in 1 stage?

Which UI are you talking about?
Stage View, Blue Ocean, or Pipeline Graph View?

It’s the default stage view.