Hi, we’re running a Jenkins pod with slave pods (as a cloud, via the Kubernetes plugin).
Our Git repo is on Atlassian Bitbucket, served with a Let’s Encrypt SSL certificate.
As of last week, due to a change by Let’s Encrypt, our Jenkins cannot access the Bitbucket repos and throws this error:
server certificate verification failed. CAfile: none CRLfile: none
According to the Let’s Encrypt website, one needs to update the CA trust files in the running OS.
Since we’re using a Jenkins pod, we cannot simply update the container OS. We tried upgrading the Jenkins pod to the latest version (using Helm), but it didn’t help.
So now our R&D team is stuck.
Urgent help needed. Thanks!
Which Docker image are you using? I’m pretty sure the newer ones should be fine, but if you’re using an older one you’ll probably have to extend it: switch to root, update the certs, then switch back to the jenkins user.
Docker image updates shows how to extend any image you want.
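Something along these lines should do it (just a sketch; the base tag is an example, use whatever tag you’re actually on):

```
FROM jenkins/jenkins:lts-jdk11
USER root
# Refresh the OS CA bundle (Debian-based image assumed); newer ca-certificates
# packages drop the expired DST Root CA X3 and trust ISRG Root X1
RUN apt-get update \
    && apt-get install -y --no-install-recommends ca-certificates \
    && update-ca-certificates \
    && rm -rf /var/lib/apt/lists/*
# Return to the unprivileged user the image normally runs as
USER jenkins
```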
Reminder: please use the term agent instead of slave. The slave terminology was removed years ago.
In the past, an outdated Java version seemed to be associated with SSL certificate recognition failures.
As @halkeye noted, if you’re running the current versions of the Docker images, you’ll have a current Java version and a current operating system. If you’re not, you’ll benefit from updating to the current versions.
I’m hitting this issue as well; plugins will not update.
There were errors checking the update sites: SSLHandshakeException: Remote host terminated the handshake
java.io.EOFException: SSL peer shut down incorrectly
Caused: javax.net.ssl.SSLHandshakeException: Remote host terminated the handshake
Caused: javax.net.ssl.SSLHandshakeException: Remote host terminated the handshake
Caused: java.io.IOException: Failed to load https://updates.jenkins.io/download/plugins/stashNotifier/1.24/stashNotifier.hpi to /opt/jenkins/home/plugins/stashNotifier.jpi.tmp
Caused: java.io.IOException: Failed to download from https://updates.jenkins.io/download/plugins/stashNotifier/1.24/stashNotifier.hpi (redirected to: https://get.jenkins.io/plugins/stashNotifier/1.24/stashNotifier.hpi)
So am I thinking in the right direction, that this is because the cacerts keystore contains the expired root cert?
Updating Java should then bring in a more recent keystore without this issue?
Can someone confirm this?
Yes, that’s been the technique used by others.
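If you want to double-check what the bundled keystore actually contains, something like this works (assuming Java 9 or newer, where keytool has the -cacerts shortcut, and the default changeit store password):

```
keytool -list -cacerts -storepass changeit | grep -i isrg
```

You want to see an ISRG Root X1 entry; if it’s missing, the bundled keystore predates the fix.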
I am still facing the same issue after the Java update.
I checked the cacerts bundled with OpenJDK 11.0.13; it contains the ISRG Root X1 certificate. I even added the R3 intermediate from the chain, which did not help.
Is Jenkins using the cacerts from the Java that starts the process?
Yes, Jenkins relies on Java for the CA certificates. Jenkins uses Java; Java uses the CA certificates. If your operating system provides the CA certificates as a separate package (many do), then you need to ensure that the operating system’s CA certificates package has been updated.
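A quick way to see which trust store your JVM actually reads (a diagnostic sketch; the JAVA_HOME path depends on your image):

```
readlink -f "$JAVA_HOME/lib/security/cacerts"
```

On Debian-family images that often resolves to /etc/ssl/certs/java/cacerts, which is generated by the OS ca-certificates-java package; in that case updating the OS package, not the JDK, is what refreshes the keystore.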
For a test on my VM, I removed the ISRG Root X1 cert from cacerts and got a different error: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target.
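For reference, the removal was roughly this (the alias is whatever keytool -list reports on your JDK; the one here is an assumption):

```
keytool -delete -cacerts -storepass changeit -alias letsencryptisrgx1
```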
But the original issue still remains: SSLHandshakeException: Remote host terminated the handshake.
I finally managed to track down the difference between the environments: they are in different geographical locations, and updates.jenkins.io redirects to get.jenkins.io, which in turn redirects to a mirror close to the machine’s physical location, so the two environments hit different mirrors. It seems the remote host refusing the connection is that mirror.
Poking the servers with SSLPoke reveals that the issue seems to be with TLSv1.3:
some mirrors handshake correctly, but a few never send the server-side handshake when the protocol is not explicitly stated or is TLSv1.3. The only protocol that works every time is TLSv1.2.
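For anyone else chasing this, my test looked roughly like the following (SSLPoke is the well-known standalone test class; the mirror host below is an example, substitute whatever your redirect lands on):

```
java SSLPoke mirror.example.org 443
java -Djdk.tls.client.protocols=TLSv1.2 SSLPoke mirror.example.org 443
```

The jdk.tls.client.protocols property is a standard JDK switch, so appending -Djdk.tls.client.protocols=TLSv1.2 to the controller’s Java options should also work as a temporary workaround until the misbehaving mirrors are fixed.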