I have Jenkins deployed on Kubernetes (EKS, Kubernetes 1.21): one node hosts the Jenkins controller pod and another node runs the worker tasks.
This is how my pipeline looked:
pipeline {
    agent {
        kubernetes {
            defaultContainer 'jnlp'
            yaml """
apiVersion: v1
kind: Pod
spec:
  nodeSelector:
    illumex.ai/noderole: jenkins-worker
  containers:
  - name: docker
    image: docker:latest
    imagePullPolicy: Always
    command:
    - cat
    tty: true
    volumeMounts:
    - mountPath: /var/run/docker.sock
      name: docker-sock
  volumes:
  - name: docker-sock
    hostPath:
      path: /var/run/docker.sock
"""
        }
    }
    stages {
        stage('build') {
            steps {
                container('docker') {
                    sh """
                    docker system prune -f
                    """
                }
            }
        }
    }
}
Then I upgraded Kubernetes from 1.21 to 1.24, and the pipeline now fails with the following error:
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
So a few questions:
- I guess this means the Docker socket can no longer be mounted from the node?
- What is the right approach in this scenario?
- I'm trying to use the docker:dind image (Docker-in-Docker), but then how can I keep my Docker builds cached?
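For reference, this is roughly the Docker-in-Docker pod spec I'm experimenting with; the persistent cache volume at the end is only my guess at how caching could work (the PVC name is made up):

```yaml
# Sketch of a dind-based pod template. The docker-cache volume and the
# jenkins-docker-cache PVC are assumptions, not something that exists yet.
apiVersion: v1
kind: Pod
spec:
  nodeSelector:
    illumex.ai/noderole: jenkins-worker
  containers:
  - name: docker
    image: docker:latest
    command:
    - cat
    tty: true
    env:
    - name: DOCKER_HOST          # point the CLI at the dind sidecar
      value: tcp://localhost:2375
  - name: dind
    image: docker:dind
    securityContext:
      privileged: true           # dind requires a privileged container
    env:
    - name: DOCKER_TLS_CERTDIR   # empty value disables TLS so plain tcp://localhost:2375 works
      value: ""
    volumeMounts:
    - name: docker-cache         # persist /var/lib/docker so image layers survive pod restarts
      mountPath: /var/lib/docker
  volumes:
  - name: docker-cache
    persistentVolumeClaim:
      claimName: jenkins-docker-cache   # assumed PVC; would have to exist in the namespace
```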
Thank you