Running files from inside image from host in Jenkins

So I have a Jenkins pipeline called Jenkins A, and another project called Jenkins B. They are both dockerized. I want to know how I can execute files that live inside Jenkins B from Jenkins A. You can think of both as separate projects. I am using a hub.docker link for Jenkins B.

I tried

docker.image(JenkinsB).inside() {
  sh " {arguments for py file} "
}

When I run it like that, it looks for the file in the Jenkins A file system and executes it inside of Jenkins B. But I want to find the file inside of Jenkins B and execute it with the arguments I’m passing in from Jenkins A. How can I do this? Do I need to use .withRun?

Or would it be easier if there were a way to copy the file from Jenkins B into Jenkins A and execute it that way? Essentially, copying a file from the image to the host?

Yes, you need one of the run variants. docker.inside mounts your current workspace and sets it as the workdir, which is why you see the Jenkins A files.
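As a hedged sketch of what a run variant could look like (the image tag, script path, and arguments below are hypothetical placeholders, not from this thread): with withRun the container starts with its own filesystem, so the command can reference a script baked into the Jenkins B image rather than a workspace file:

```groovy
// Hedged sketch: image tag, script path, and arguments are placeholders.
def args = "--foo bar"
docker.image('myorg/jenkinsb:latest').withRun('', "python /opt/scripts/job.py ${args}") { c ->
    // The container runs its own /opt/scripts/job.py; stream its output
    // and block until the container exits.
    sh "docker logs -f ${c.id}"
}
```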

So the run command executes the container’s files within the container?

There are no ‘container files’. There are workspace files, and you can access those with the docker.image.inside() block.

An sh step inside such a block is more or less equivalent to:

docker run -v "$PWD":"$PWD" <image> sh -c "cd $PWD; <command>"

with $PWD being the Jenkins workspace.
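To make the contrast concrete, a hedged sketch (the image and script names are made-up placeholders): the first form mirrors what .inside() does, so the script must exist in the mounted workspace; the second runs a file baked into the image:

```shell
# Workspace file: the host's $PWD is bind-mounted and used as the workdir,
# so ./script.py must exist in the Jenkins workspace.
docker run --rm -v "$PWD":"$PWD" -w "$PWD" some-image python ./script.py

# Image file: nothing is mounted, so /opt/scripts/script.py must have been
# baked into the image at build time (COPY in the Dockerfile).
docker run --rm some-image python /opt/scripts/script.py
```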

You should probably provide a bit more context about what you are really trying to do, and show the complete pipelines, to get better help.

So the workspace files you’re referring to are files that are within the image that you are calling .inside() on? So by saying docker.image(ImageB).inside(), I’m getting access to the workspace files that belong to ImageB? So let’s say ImageB has a file called “”, can I find and execute it by using the .inside() method?

I can’t post the whole pipeline, so the one I posted is a minimal reproducible example. These pipelines are really long.

Thanks again for what you’ve said so far, you’ve been pushing me in the right direction.

I think you are mixing up Jenkins jobs, workspaces, images, and containers too much.

A job has a workspace (where you typically check out a git repo).
In the job you have different steps, which could be inside an image.inside block.

The code inside such a block runs in a container with the workspace from the current job.

Images just have the files that they were built with (docker build …), but they never have access to the workspace of another job. Use archive artifact / use artifact to get files from another workspace.
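As a hedged sketch of the artifact route (the job names and file paths are placeholders, and copyArtifacts assumes the Copy Artifact plugin is installed):

```groovy
// In the Jenkins B pipeline: publish the script as a build artifact.
archiveArtifacts artifacts: 'scripts/job.py', onlyIfSuccessful: true

// In the Jenkins A pipeline: pull it from B's last successful build,
// then run it with A's arguments.
copyArtifacts projectName: 'JenkinsB', selector: lastSuccessful(), filter: 'scripts/job.py'
sh "python scripts/job.py --some-arg"
```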

Ahh, so I need to use artifacts to get files from another workspace, similar to how I can access the anchor gate files? I looked online and saw something called “template.xml”, but I don’t have that file in my artifacts folder. Can you point me in the direction of where I can learn how to programmatically use artifacts to get files from another workspace?

I actually figured out the solution. In the ImageA pipeline, I added ‘’ to the archived artifacts, and then in the current pipeline I downloaded ‘’ and executed it.


always {
    archiveArtifacts artifacts: '', onlyIfSuccessful: true
}

Current pipeline:

sh "curl {url to imageA artifact}/ -o"
docker.image(ImageA).inside {
    sh "python "
}

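For reference, a fuller hedged version of that consumer side (the job name, artifact path, and script name below are placeholders; the job/<name>/lastSuccessfulBuild/artifact/<path> URL pattern is standard Jenkins):

```groovy
// Placeholders throughout; adjust the job name and paths to your setup.
sh "curl -fsSL ${env.JENKINS_URL}job/ImageA/lastSuccessfulBuild/artifact/job.py -o job.py"
docker.image('myorg/imagea:latest').inside {
    sh "python job.py --some-arg"
}
```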

By image, I mean the different projects, not actual images.