Save the variable values and pass them to a downstream job

Hi All,

So, I’m trying to run a Python script to get values like prId, repoName, and branch information, and then I want to pass these values to a downstream job.
Can you please let me know if there is a way to do it?

Below is an example.

Get the parameters for the current job:
a=1
b=2
c=3

Execute Shell:
python test.py a b c
// The above script will generate the values d, e, f, but they are not stored anywhere.

Now I want to pass the values a, b, c, d, e, f to the downstream job.

Thanks in advance,
Shesha

The simplest way would be to make the downstream job accept parameters, e.g. by defining a string parameter named A in the downstream job’s configuration, so you could access it as ${A} in your pipeline.
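
For instance, a minimal sketch of such a parameterized downstream pipeline in declarative syntax (the parameter name A and the stage layout here are illustrative):

pipeline {
    agent any
    parameters {
        // string parameter the upstream job will fill in
        string(name: 'A', defaultValue: '', description: 'Value computed by the upstream job')
    }
    stages {
        stage('Use parameter') {
            steps {
                // parameters are also exposed as environment variables,
                // so a shell step could read $A as well
                echo "Received A=${params.A}"
            }
        }
    }
}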
In the upstream pipeline you’d use the build() step with parameters to trigger it:

build(
  job: 'folder/downstream',        // full path of the job to trigger
  parameters: [
    string(name: 'A', value: '1')  // name must match a parameter defined on the downstream job
  ]
)

The first question is whether you are running a freestyle job or a pipeline.
In a freestyle job, the Python script would need to write d, e, f to a properties file. The Parameterized Trigger plugin then offers several ways to pass parameters from the current job to a downstream job, e.g. by reading them from a properties file or by passing on the current job’s parameters, as sketched below.
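
As a sketch, assuming your Python script writes a file such as out.properties into the workspace (the file name and keys here are illustrative), the plugin’s “Parameters from properties file” option would then read something like:

d=4
e=5
f=6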
In a pipeline, there is the build step that allows you to trigger other jobs with parameters. Again, the Python script must persist the values somewhere (or print them) so you can read them in the pipeline script and pass them on to the downstream job; see the sketch below.
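
A minimal sketch of that upstream pipeline, assuming test.py prints one key=value pair per line on stdout (the job path and parameter names are illustrative):

node {
    // capture the script's stdout, e.g. lines like "d=4"
    def output = sh(script: 'python test.py 1 2 3', returnStdout: true).trim()

    // parse key=value lines into a map
    def values = [:]
    for (line in output.split('\n')) {
        def parts = line.split('=', 2)
        values[parts[0].trim()] = parts[1].trim()
    }

    // pass the original and the computed values downstream;
    // each name must match a parameter defined on that job
    build(
        job: 'folder/downstream',
        parameters: [
            string(name: 'A', value: '1'),
            string(name: 'D', value: values['d']),
            string(name: 'E', value: values['e']),
            string(name: 'F', value: values['f'])
        ]
    )
}

If the Pipeline Utility Steps plugin is available, writing the values to a properties file and parsing it with its readProperties step would be an alternative to parsing stdout by hand.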

In any case, all parameters must be defined in the downstream job.