Jenkins Kubernetes dynamic agent issue while running sh commands

Jenkins setup:

Jenkins 2.479.1 (running on AWS EC2)
directly running jenkins.war

Kubernetes Client API Plugin 6.10.0-240.v57880ce8b_0b_2

Kubernetes plugin 4296.v20a_7e4d77cf6

Hey Guys!

I have been stuck on a very weird issue after upgrading Jenkins.

The issue is as follows.

My Jenkins controller runs on an AWS EC2 instance (t2.small). I have an EKS cluster on which I want to spawn my agent when running the pipeline. Below is my pipeline script:

pipeline {
    agent {
        kubernetes {
            yaml '''
                apiVersion: v1
                kind: Pod
                metadata:
                  labels:
                    label: runner
                spec:
                  containers:
                  - name: maven
                    image: maven:3.6.3-openjdk-17-slim
                    command:
                    - cat
                    tty: true
                    volumeMounts:
                    - name: data-share
                      mountPath: /tmp/
                  - name: dind-daemon
                    image: docker:dind-rootless
                    command:
                    - cat
                    resources:
                      requests:
                        cpu: 100m
                        memory: 512Mi
                      limits:
                        cpu: 500m
                        memory: 1Gi
                    tty: true
                    securityContext:
                      privileged: true
                    volumeMounts:
                    - name: docker-graph-storage
                      mountPath: /var/lib/docker
                    - name: data-share
                      mountPath: /tmp/
                  volumes:
                  - name: docker-graph-storage
                    emptyDir: {}
                  - name: data-share
                    emptyDir: {}
            '''
        }
    }
    stages {
        stage('Checkout Application') {
            steps {
                container('maven') {
                    checkout scmGit(branches: [[name: '$branch_Name']], extensions: [], userRemoteConfigs: [[credentialsId: 'git-bwt-jenkins-token', url: 'URL']])
                }
            }
        }
        stage('Change Properties') {
            steps {
                container('maven') {
                    sh '''
                        ls -al
                        cd src/main/resources/
                        cat application-dev.properties
                    '''
                }
            }
        }
        stage('Maven Test & Build') {
            steps {
                container('maven') {
                    sh 'mvn clean install -Dactive.profile=dev'
                }
            }
        }
        
    }
}

The issue: the 'Checkout Application' stage completes successfully, but when the pipeline moves to the next stage ('Change Properties') it gets stuck on the sh step.

When I check the runners on my EKS cluster, all three containers (maven, jnlp, dind-daemon) are running. But when I exec into the maven container and check the running processes, I only see the cat command running. In other words, the container's idle command keeps running and the sh '' step is not overriding it.
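For reference, this is roughly how I inspected the agent pod. The pod name and namespace below are placeholders, not my real values:

```shell
# List the dynamically provisioned agent pods (namespace is a placeholder).
kubectl get pods -n <jenkins-agents-namespace>

# Exec into the maven container of the stuck agent pod and list processes.
kubectl exec -it <agent-pod-name> -n <jenkins-agents-namespace> -c maven -- ps aux

# At this point I only see the idle `cat` process from the pod template;
# no shell launched by the `sh` step is visible.
```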

PFA logs and screenshots.