Better pipeline development workflow?

Hi all,

I have been using Jenkins Pipeline for years now and was wondering if there is anything better than my current workflow. When I start a new pipeline:

  1. Create the Job DSL Groovy script with the parameters for the pipeline
    (Declarative does not parse the parameters correctly for new pipelines.)
  2. Commit, push, and run the seed job
  3. Edit the pipeline library to add new steps or helper functions
  4. Edit the Jenkinsfile in Vim/Neovim to use the new functions or add debug prints
  5. Commit and push the pipeline library, the pipeline, and any other dependent Git repos
  6. SSH to the controller, run the job, and read the output
    1. Fix syntax errors and typos
  7. Repeat the last four steps over and over
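
For context, step 1 for me is a seed script along these lines. This is only a sketch: the job name, repo URL, and parameter are made-up examples, not anything from a real setup.

```groovy
// Hypothetical Job DSL seed script; names and URLs are examples only.
pipelineJob('teams/my-app-build') {
    parameters {
        stringParam('LOG_LEVEL', 'INFO', 'Pipeline log level')
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://git.example.com/teams/my-app.git') }
                    branch('main')
                }
            }
            // The Jenkinsfile lives at the repo root in this example.
            scriptPath('Jenkinsfile')
        }
    }
}
```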

Basically, is there a way to edit the pipeline and the pipeline library and test them without having to commit to source control and push? Things would go so much more smoothly if I didn't have to wait around for Git network operations. In a busy shop like my work, the network can get slow sometimes.

I don't have a lot of suggestions, but:

There are lots of shell scripts out there that hit the linter API endpoint, though it only works with Declarative. GitHub - PeterMosmans/jenkinslint: Basic linter (validator) for Jenkinsfiles can easily be used in pre-commit hooks, and the pre-commit framework looks like a decently robust option.
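
You can also hit the Declarative linter endpoint directly with curl; here's a sketch. `JENKINS_URL` is a placeholder for your controller, and authentication (e.g. `-u user:apitoken`) is omitted but may be required on your instance.

```shell
# Sketch: validate a Declarative Jenkinsfile against a running controller's
# built-in linter endpoint (pipeline-model-converter/validate).
lint_jenkinsfile() {
  jenkinsfile="${1:?usage: lint_jenkinsfile path/to/Jenkinsfile}"
  # "jenkinsfile=<file" tells curl to read the file contents into the form field.
  curl -sS -X POST \
    -F "jenkinsfile=<${jenkinsfile}" \
    "${JENKINS_URL}/pipeline-model-converter/validate"
}

# Usage: JENKINS_URL=https://jenkins.example.com lint_jenkinsfile Jenkinsfile
```

On success the endpoint prints a "successfully validated" message; on failure it lists the parse errors, which makes it handy in a pre-commit hook.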

For shared pipeline code, you could use GitHub - jenkinsci/JenkinsPipelineUnit: Framework for unit testing Jenkins pipelines to confirm your shared libraries work, then update a tag that all your jobs point to.
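
A minimal JenkinsPipelineUnit test might look like the sketch below. The script path and the stubbed `sh` step are assumptions about your layout, and wiring up shared libraries takes extra configuration covered in the project's README.

```groovy
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class ExampleJobTest extends BasePipelineTest {

    @Before
    void setUp() {
        super.setUp()
        // Stub out side-effecting steps so the script can run outside Jenkins.
        helper.registerAllowedMethod('sh', [String], { String cmd -> null })
    }

    @Test
    void runsWithoutErrors() {
        // Path relative to the test's working directory (an assumption here).
        runScript('Jenkinsfile')
        printCallStack()
        assertJobStatusSuccess()
    }
}
```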

For small edits, there's a "Replay" link in the sidebar of a completed build, which lets you edit the pipeline script and re-run it. That lets you do quick edits that are not committed.
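
Replay is also exposed through the Jenkins CLI as `replay-pipeline`, which pairs nicely with editing locally. A sketch, where the job name, build number, and the `JENKINS_URL`/`JENKINS_AUTH` variables are placeholders, and the exact flags are worth double-checking against your CLI version:

```shell
# Sketch: replay a finished build with a locally edited script via the
# Jenkins CLI, so quick experiments never touch Git.
jenkins_cli() {
  # jenkins-cli.jar can be downloaded from ${JENKINS_URL}/jnlpJars/jenkins-cli.jar
  java -jar jenkins-cli.jar -s "${JENKINS_URL}" -auth "${JENKINS_AUTH}" "$@"
}

# Usage: jenkins_cli replay-pipeline my-job -n 42 < Jenkinsfile
```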


This plugin allows you to configure your shared library to be a local directory instead of an SCM:

Good for iterating on a shared library


I second GitHub - jenkinsci/JenkinsPipelineUnit: Framework for unit testing Jenkins pipelines; it can really help speed up development! You may also have side effects in your pipeline that you aren't interested in while developing, so being able to mock them and assert on the method calls is super helpful, and it's great for catching regressions. It has saved me many, many, many hours :+1: Keep in mind you may not always get parity between JenkinsPipelineUnit and a real Jenkins, due to the framework's limitations.

Using GitHub - jenkinsci/jenkinsfile-runner: A command line tool to run Jenkinsfile as a function together with File System SCM can also help speed up development and get you closer to the final environment. I often use jenkinsfile-runner for testing Job DSL, to avoid creating new jobs on the controller while testing. For unit testing Job DSL, I found this example helpful to get started: job-dsl-gradle-example/src/test/groovy/com/dslexample/builder at master · sheehan/job-dsl-gradle-example · GitHub
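
The simplest way I know to try jenkinsfile-runner is its Docker image. A sketch, where the image name/tag is an assumption; check the project's README for the current image and how plugins are baked in:

```shell
# Sketch: run a project's Jenkinsfile once through the jenkinsfile-runner
# Docker image instead of a full controller.
run_jenkinsfile() {
  dir="${1:?usage: run_jenkinsfile /path/to/project}"
  # The runner picks up /workspace/Jenkinsfile from the mounted project.
  docker run --rm -v "${dir}:/workspace" jenkins/jenkinsfile-runner
}

# Usage: run_jenkinsfile "$(pwd)"
```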

One last piece of advice when using jenkinsfile-runner (even though this is explicitly warned against in the Pipeline Best Practices) is to override built-in steps in another library that is loaded alongside the library being tested. I do this sometimes to mock sh or plugin steps. It's brittle and hard to maintain, but I've found it helpful on occasion, and it feels good to get that off my chest :stuck_out_tongue:
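
Concretely, overriding a built-in step looks something like the sketch below: a `vars/sh.groovy` in a throwaway "mock" library loaded alongside the one under test. The return values are assumptions about what your pipeline expects back.

```groovy
// vars/sh.groovy in a mock library loaded alongside the library under test.
// This shadows the built-in sh step, which the Pipeline docs warn against;
// use it only for throwaway local iteration.

def call(String script) {
    echo "MOCK sh: ${script}"
}

def call(Map args) {
    echo "MOCK sh: ${args.script}"
    // Pipelines using returnStdout expect a String back.
    return args.returnStdout ? '' : 0
}
```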


This looks interesting; I do the same thing, but in a shell script lol.

When developing locally:

def lib
node() {

    stage('Setup Dynamic Library') {
        // Copy the library source from the mounted volume into the workspace.
        sh(
            """\
            rm -rf lib && \
            mkdir lib && \
            cp -r /var/jenkins_home/shared_lib/* lib
            """
        )

        // A Git identity is needed for the throwaway commit below.
        sh("""git config --global user.email "shadycuz@xxx.com" && \
                git config --global user.name "shadycuz"
        """)

        String repoPath = sh(returnStdout: true, script: 'pwd').trim() + '/lib'

        // Turn the copied directory into a fresh single-commit repo, because
        // the library retriever needs an SCM to check out.
        sh("""cd lib && \
                rm -rf .git && \
                git init && \
                git add --all && \
                git commit -m init
        """)

        // Load the local repo as a dynamic shared library.
        lib = library identifier: 'local-lib@master',
                retriever: modernSCM([$class: 'GitSCMSource', remote: repoPath]),
                changelog: false
    }

    env.PIPELINE_LOG_LEVEL = 'DEBUG'

    // Quick check that multi-line sh scripts survive the CPS transform.
    String cps = sh(script: '#!/bin/bash\nset +x\necho Test for CPS issue', returnStdout: true)

    sh('mkdir -p images/ubuntu')

    // Classes from the dynamic library are reachable through the returned object.
    def path = lib.org.dsty.os.Path.new("${env.WORKSPACE}/images")
}

In CI/CD I use a Groovy init file:

/* groovylint-disable FileCreateTempFile, JavaIoPackageAccess */
import jenkins.plugins.git.GitSCMSource
import org.jenkinsci.plugins.workflow.libs.GlobalLibraries
import org.jenkinsci.plugins.workflow.libs.LibraryConfiguration
import org.jenkinsci.plugins.workflow.libs.SCMSourceRetriever

println('== Configuring the Jenkins Pipeline library')

final String libName = 'jenkins-std-lib'
final String repoBranch = 'master'
final LibraryConfiguration lc
final File file = new File('/var/jenkins_home/pipeline-library/src')

if (file.exists()) {
    println("===== Adding local ${libName}")

    StringBuilder stdout = new StringBuilder()
    StringBuilder stderr = new StringBuilder()

    File script = File.createTempFile('temp', '.sh')

    String lib = '/var/jenkins_home/lib'

    script.write(
        """#!/usr/bin/env bash
        mkdir ${lib} || true
        cp -r /var/jenkins_home/pipeline-library/* ${lib}

        git config --global user.email "admin@non.existent.email"
        git config --global user.name "Jenkinsfile-runner"

        cd ${lib}
        rm -rf .git || true
        git init
        git add --all
        git commit -m init
        """
    )

    Process proc = "bash ${script.absolutePath}".execute()

    proc.consumeProcessOutput(stdout, stderr)
    proc.waitForOrKill(5000)

    if (stderr) {
        println("stdout: ${stdout}")
        println("stderr: ${stderr}")
    }

    GitSCMSource scm = new GitSCMSource(libName, lib, null, null, null, false)

    lc = new LibraryConfiguration(libName, new SCMSourceRetriever(scm))
} else {
    println("===== Adding remote ${libName}")

    // TODO: change defaults once https://github.com/jenkins-infra/pipeline-library/pull/78 is merged
    String repoURL = 'https://github.com/DontShaveTheYak/jenkins-std-lib'

    println("===== Using the Pipeline library from ${repoURL}")

    GitSCMSource libSource = new GitSCMSource(libName, "${repoURL}.git", null, null, null, false)

    lc = new LibraryConfiguration(libName, new SCMSourceRetriever(libSource))
}

lc.with {
    implicit = true
    defaultVersion = repoBranch
}
GlobalLibraries.get().libraries.add(lc)

I might try out the File System SCM plugin and see how it works, but I'm mostly happy with the setup I have.

Do you have an example of configuring File System SCM with jenkinsfile-runner? This is my current setup: jenkins-std-lib/docker at master · DontShaveTheYak/jenkins-std-lib · GitHub

Lots of people have already dropped some valuable knowledge in this thread, but let me give some of my highly opinionated and often controversial opinions.

I try not to use parameters at all, and having lots of them is, to me, a code smell. This depends a lot on the type of jobs you have, but for IaC or CasC, the only parameter I typically have is the log level. Most other CI tools, like GitHub Actions, GitLab CI, Travis, etc., don't have parameters.

For local development I often just hit Replay over and over again, sometimes using my IDE and sometimes just using the web interface on a local Docker Jenkins.

Most of the stuff I do involves infra, so I can't even test locally; I just use our production Jenkins and play the replay game.

For our jenkins-job repo we have automation that merges the feature branch with the main branch and then seeds it. It works okay for one person, but the second someone else comes along, it wipes their jobs out. I have been meaning to make it merge all feature branches and seed that, but it's low priority.

Mad respect to the wizards who still use vi and Vim. If you want a little more help, like IntelliSense and auto-completion, you can try out VS Code and the Jenkins Extension Pack - Visual Studio Marketplace.

This is where the IDE comes in. I don’t have this problem anymore.

Lots of people have mentioned jenkinsfile-runner, and I also use it. My library is open source if you want to see how I have it set up: GitHub - DontShaveTheYak/jenkins-std-lib: Bringing the Zen of Python to Jenkins.

I made this one for it: GitHub - MadsJakobsen/jenkinsfile-runner-filescm-example: Example of using filescm with jenkinsfile-runner


This is pretty slick. What about on a normal Jenkins? My main development Jenkins is a Docker container; I only use the runner for CI/CD.

If I configure my Jenkins container for File System SCM, do I need to restart the container when I modify my library code, or will the changes take effect the next time I run a job?

You don't need to restart the container, assuming you've mounted the library directory into your container.
