Running a jenkins-ssh-agent image on a remote server accessed via SSH

Hello all, I was trying the tutorial on Using Agents while working on a remote server, which I connect to via SSH. In step 2, the docker run command maps the container’s internal port 22 to the host’s port 22; however, port 22 on the host is already in use by its own SSH daemon (the one my session is connected through).

I tried changing the flag to -p 7522:22 and updating the port in the Jenkins agent configuration on the dashboard accordingly, but the controller was not able to connect.
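For reference, the full command I tried looks roughly like this (the container name is arbitrary, [pubkey] is the key from the tutorial, and the image is the jenkins/ssh-agent one the tutorial uses, if I remember right):

docker run -d --rm --name=agent1 -p 7522:22 -e "JENKINS_AGENT_SSH_PUBKEY=[pubkey]" jenkins/ssh-agent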

My question is the following:
How do I run an agent image without having to change the server’s native SSH port?

Why wasn’t it able to connect? Do you have a firewall? Did it give an error message or anything?

It simply timed out. If I use the normal SSH port, 22, and connect the controller directly to the server itself instead of to the container on the server, it works just fine.

It sounds very much like a routing or firewall issue. If running it on port 22 works but another port doesn’t, I’d say firewall. I would recommend looking at the firewall settings for whatever flavour of OS you’re running.
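For example, on Ubuntu with ufw something like this would open the port (adjust for whatever firewall you’re actually running):

sudo ufw allow 7522/tcp
sudo ufw status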

Thank you very much, I don’t know why I didn’t even check that. Port 7522 was obviously closed.

I’m now having a problem passing the SSH key with the environment flag

-e "JENKINS_AGENT_SSH_PUBKEY=[pubkey]"

It doesn’t seem to be passing the public key to the container. It should be placed in /etc/ssh/, right? I’ve tried putting the public key directly after the =, and also wrapping it in [] or '', but nothing works. What am I doing wrong?
The error that appears on the controller:
The SSH key for this host is not currently trusted. Connections will be denied until this new key is authorised.
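To rule out the flag itself, the variable can be read back from inside the running container (agent1 here standing in for whatever --name was used):

docker exec agent1 printenv JENKINS_AGENT_SSH_PUBKEY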

I should have mentioned this earlier, but my setup is a Jenkins controller on one server and the Docker agent on another. I created the credential as per the aforementioned tutorial.

EDIT: If I trust host on the Jenkins controller dashboard, I get the following:

SSH host key matches key seen previously for this host. Connection will be allowed.
ERROR: Server rejected the 1 private key(s) for ubuntu (credentialId:jenkins/method:publickey)

So that has nothing to do with the public key. SSH has a host key assigned to the server, and when you connect, the client checks whether it’s in the trusted list; if not, it assumes you’ve connected to the wrong server and bails (on the CLI it prompts you). So trusting it seems like the right answer.
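If you want to compare, you can dump the host key the agent container is presenting with something like this (host and port are placeholders for whatever you mapped):

ssh-keyscan -p 7522 agent-host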

Does your public key start with ssh-? I’ve never even heard of this image before, but looking at https://github.com/jenkinsci/docker-ssh-agent/blob/0c1d128c33d2d7c0fabb34ebbccb29a6d9c4fcf6/setup-sshd#L46 it looks like it’s a requirement. Make sure your key(s) look like the ones at https://github.com/halkeye.keys (yours will likely be just a single one; I have 3 authorized keys at the time of writing).
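You can print the key you generated to double-check, e.g. (the exact filename depends on the key type you created in the tutorial):

cat ~/.ssh/id_ed25519.pub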

Does your public key start with ssh-?

It does; it was generated in the first part of the tutorial and then added to the controller’s credentials. It looks pretty similar to the ones in your link.

It seems that for some reason it isn’t even creating a .ssh folder in JENKINS_AGENT_HOME (which would be /home/jenkins/ in the container). I will look more into this.

EDIT: The SSH key must be passed directly after JENKINS_AGENT_SSH_PUBKEY= without any "" or ''. The key is now placed in ~/.ssh/authorized_keys, but I still get no access.
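To confirm where it lands, the key can be read back from inside the container (agent1 again standing in for the container name):

docker exec agent1 cat /home/jenkins/.ssh/authorized_keys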

I accidentally put the wrong username in the SSH credential on the controller (‘ubuntu’ instead of ‘jenkins’). After fixing that, and choosing the tag ‘jdk11’ instead of ‘alpine’ (the latter fails with a missing-Java error), I finally got it running. Thank you @halkeye for helping me through this!
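For anyone who lands on this thread later, the command that ended up working for me looks roughly like this (container name and key are placeholders; depending on your shell you may still need to quote the whole -e argument, since the key contains spaces):

docker run -d --rm --name=agent1 -p 7522:22 -e JENKINS_AGENT_SSH_PUBKEY=[pubkey] jenkins/ssh-agent:jdk11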