Failed downstream job is not re-triggered by upstream job

Summary

I am unable to rebuild downstream jobs by manually triggering the upstream job

Steps to reproduce

I have pipelines A, B and C that are triggered by a launcher pipeline D upon its completion with SUCCESS.
The trigger added to each pipelineJob section in the YAML file is:

pipelineJob('A') {
    triggers {
        upstream('D', 'SUCCESS')
    }
}

When pipelines A, B and C are marked SUCCESS or FAILED while associated with a build version (e.g. v2023.x.x.x), the automatic trigger via pipeline D works correctly.
However, if a downstream job fails and is marked FAILED with only a build number #, the automatic trigger via pipeline D does not restart the build despite pipeline D completing with SUCCESS, unless I manually rebuild the downstream jobs separately.

The error that caused the downstream jobs to fail in this case was a checkout failure against the Git repository, so pipelines A, B and C could not be triggered by pipeline D:

fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2842)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandWithCredentials(CliGitAPIImpl.java:2185)
	at org.jenkinsci.plugins.gitclient.CliGitAPIImpl$1.execute(CliGitAPIImpl.java:635)
	at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:997)
	... 8 more
ERROR: Error fetching remote repo 'origin'
ERROR: Maximum checkout retry attempts reached, aborting
Finished: FAILURE

Expected results

When pipeline D is triggered and completes with SUCCESS, downstream jobs A, B and C should be triggered and start a new build despite having failed before.

Actual results

When pipeline D is triggered and completes with SUCCESS, downstream jobs A, B and C are not triggered.

Hello and welcome to this community, @Asgarzadeh. :wave:

I’m not so sure I understood everything, but here are my thoughts anyway.
By default, Jenkins does not re-trigger downstream jobs that have previously failed, even if the upstream job completes successfully.

One way to work around this issue would be to use the build step in your pipeline script to explicitly trigger the downstream jobs, regardless of their previous status.

Here is what I have in mind (untested):

pipeline {
    agent any

    stages {
        stage('Trigger Downstream Jobs') {
            steps {
                script {
                    build(job: 'A', propagate: false)
                    build(job: 'B', propagate: false)
                    build(job: 'C', propagate: false)
                }
            }
        }
    }
}

In this example, the build step is used to trigger the downstream jobs ‘A’, ‘B’, and ‘C’.
As far as I understand, the propagate: false option means that the status of the downstream jobs (whether they succeed or fail) does not affect the status of the upstream job.
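
If you also want to know how each downstream job ended, the build step returns a handle to the downstream run that you can inspect. A minimal, untested sketch (reusing job 'A' from above):

script {
    // propagate: false keeps a downstream failure from failing this job;
    // the returned object exposes the downstream result and build number.
    def downstream = build(job: 'A', propagate: false)
    echo "Job A finished with ${downstream.result} (build #${downstream.number})"
}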

Hello @poddingue, thanks for your answer!

However, in my case I am working with scripted pipelines that are defined and configured through JCasC YAML files.
So to trigger the downstream jobs I add a triggers {} block, as shown in my post, to each pipeline that should be triggered by the upstream pipeline.
So adding a build step defeats the purpose here. Unless I add what you suggested to a Jenkinsfile as a pre-build stage that checks whether the downstream jobs failed and then runs this stage, or something similar, like the untested sketch below? (Brainstorming here :slight_smile:)
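
Something along these lines, maybe, as a final stage in pipeline D's Jenkinsfile (completely untested; job names A, B and C are the ones from above, and the Jenkins API calls would probably need script approval outside the sandbox):

stage('Re-trigger failed downstreams') {
    for (name in ['A', 'B', 'C']) {
        // Look up each downstream job and check how its last build ended.
        def job = jenkins.model.Jenkins.get().getItemByFullName(name)
        def lastBuild = job?.getLastBuild()
        if (lastBuild != null && lastBuild.getResult() == hudson.model.Result.FAILURE) {
            // Only re-trigger the jobs that actually failed last time.
            build(job: name, wait: false, propagate: false)
        }
    }
}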

Thanks!