Jenkins running multiple jobs at once - kubernetes

Hi,
I am running Jenkins on a Kubernetes cluster.
Every time a new job is run, a new Kubernetes agent is created which runs the build.
I have no issues executing a single job, but if I try to execute two jobs one after another, it lets only one run and stops the older one.

I am sure it is a matter of settings, but I was unsuccessful in identifying where I can change this behavior and allow running multiple jobs at once.

Could someone please assist? @poddingue
To clarify: I have a job with a parameterized build, and I want to be able to run multiple builds of it (with different params). Currently, when I run the same build with different parameters, they are launched in the same workspace, so one of them gets cancelled.

I’m sorry, but I don’t know enough about Kubernetes and Jenkins yet. :person_shrugging:

How about on a non-Kubernetes setup? I am quite sure the general solution would be the same.

Could the issue you’re experiencing be that Jenkins by default uses the same workspace for the same job, even when the job is parameterized?

As far as I know, when two builds of the same job are running concurrently, they try to use the same workspace, which can lead to one build being canceled or the builds interfering with each other. :thinking:

To try and solve this issue, I think you could use the Throttle Concurrent Builds Plugin to prevent concurrent builds of the same job…
Except these solutions will queue the second build until the first one is finished, which might not be what you want. :thinking:

If you want to allow concurrent builds of the same job with different parameters, you could configure Jenkins to use a unique workspace for each build.
I’ve never tried it, but this may be achieved by using the ws step in your Jenkinsfile.

Here’s an example:

node {
    stage('Build') {
        // Use a unique workspace for each build
        ws("${env.WORKSPACE}-${env.BUILD_NUMBER}") {
            // Your build steps go here
        }
    }
}
Here, ${env.WORKSPACE}-${env.BUILD_NUMBER} creates a unique workspace for each build by appending the build number to the workspace path.
This should allow multiple builds of the same job to run concurrently without interfering with each other.

Of course, this will increase the disk space used by Jenkins, as each build will have its own workspace. :person_shrugging:
You might need to configure Jenkins to clean up old workspaces to prevent it from using too much disk space.

1 Like

No!
Jenkins will take care that you get separate workspaces, at least when they run on the same node. The second run should happen in a workspace with @2 appended.
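If I understand that behavior correctly, it can be observed with a minimal scripted-pipeline sketch (the stage name and echo are just illustrative): run two builds of the job at the same time and compare the printed paths; the second build's workspace should end with @2.

```groovy
node {
    stage('Show workspace') {
        // For concurrent builds on the same node, Jenkins appends @2, @3, ...
        // to the second and later builds' workspace paths.
        echo "This build runs in: ${env.WORKSPACE}"
    }
}
```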

This also applies to using the ws step

node('node') {
  ws('/data/test') {
    // build steps run in /data/test instead of the default workspace
  }
}

You only have a problem when you use the above script and it runs on different agents, but for both agents /data/test is actually the same physical directory (e.g. because it is an NFS mount or the same Docker volume).

Then the approach of including the build number in the path should prevent the builds from interfering.
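For that shared-mount case, a minimal sketch (the /data/test path is just illustrative) that bakes the build number into the fixed path:

```groovy
node('node') {
    // Append the build number so two concurrent builds that land on
    // different agents sharing the same mount never use the same directory.
    ws("/data/test-${env.BUILD_NUMBER}") {
        // build steps go here
    }
}
```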

1 Like

Thanks a lot for correcting, @mawinter69 . :hugs:

1 Like

btw, pipeline jobs have a configuration option that disallows concurrent builds and can abort previous builds.
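For declarative pipelines that option can also be set in the Jenkinsfile itself; a minimal sketch (the abortPrevious flag requires a reasonably recent Pipeline: Job plugin):

```groovy
pipeline {
    agent any
    options {
        // Queue new builds instead of running them concurrently;
        // abortPrevious: true additionally aborts a still-running older build.
        disableConcurrentBuilds(abortPrevious: true)
    }
    stages {
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
    }
}
```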

1 Like

Thank you for help!
A temporary fix is to disable concurrent builds, but the issue turned out to be more Kubernetes related than I thought initially.

Jenkins will take care that you get separate workspaces, at least when they run on the same node

This is exactly right! - When a build job is run, a new Pod (Jenkins agent) is created to run the job. If I rerun the same build with different parameters, a new pod is launched in the SAME workspace. Because the builds run on different agents (Pods), they do not know about each other's existence, so the @2 will not be appended to the second build's workspace, and instead they will attempt to run in the same workspace.

Do you know how can I fix that?

I was able to solve the issue:

  1. I used to have a cleanWs() step which deleted the subdirectories as they were created; I removed it.
  2. Most importantly, I forced the job with various build parameters to run in subdirectories.
    stage('Pre Build Cleanup') {
        // Remove all files and subdirectories within the component's workspace directory
        sh "rm -rf ${env.WORKSPACE}/${COMPONENT}/*"
        echo "Removed the subdirectories in ${env.WORKSPACE}/${COMPONENT}"
    }

    stage('Build') {
        ws("${env.WORKSPACE}/${COMPONENT}") {
            sh '''
            ....
            '''
        }
    }
1 Like

But you would still have the problem when two builds for the same component run concurrently.