I'm setting up Jenkins with the master node running locally, using the EC2 Plugin to provision slave nodes on-demand in AWS. The master is meant to connect to the EC2 slave nodes via SSH.
I am able to manually SSH into the slave nodes from the master node without any issues using the slave's public IP address. However, when Jenkins attempts to connect to a slave node, the connection fails, showing the following log in Jenkins:
Nov 06, 2024 9:58:55 AM hudson.plugins.ec2.EC2Cloud
INFO: Connecting to 172.31.44.131 on port 22, with timeout 10000.
Nov 06, 2024 9:58:55 AM hudson.plugins.ec2.EC2Cloud
INFO: Failed to connect via ssh: The kexTimeout (10000 ms) expired.
It appears that Jenkins is attempting to connect to the slave node using its private IP address (172.31.44.131), assuming the slave is on the same local network as the master. This leads to a timeout, as the master is unable to reach the private IP of the EC2 instance over SSH from outside the AWS network.
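A quick check confirms that the address in the log is a private (RFC 1918) one, so it is only routable inside the VPC and not from a master outside AWS — a minimal Python sketch:

```python
import ipaddress

# IP address taken from the Jenkins log above
addr = ipaddress.ip_address("172.31.44.131")

# 172.31.x.x falls inside the RFC 1918 block 172.16.0.0/12,
# so it is reachable only from within the VPC.
print(addr.is_private)  # -> True
```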
How can I configure the EC2 Plugin or Jenkins to use the public IP address of the EC2 slave nodes rather than the private IP? Alternatively, is there a way to allow the Jenkins master node to reach the slave nodes on their private IP from outside the AWS network?
In the EC2 cloud's configuration, click "Advanced" and look for the "Connection Strategy" drop-down. Among its options are "Public IP" and "Public DNS", which make Jenkins connect to the slave over its public address instead of the private one (the default). Make sure the instance's security group allows SSH (port 22) from your master's IP.
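If you manage Jenkins with the Configuration as Code plugin, the same setting can be expressed in YAML. This is only a sketch under the assumption that you use the EC2 plugin's JCasC support; the cloud name, region, and AMI ID below are illustrative placeholders:

```yaml
jenkins:
  clouds:
    - amazonEC2:
        name: "aws-ec2"                    # illustrative cloud name
        region: "us-east-1"                # your region
        templates:
          - description: "on-demand build agent"
            ami: "ami-0123456789abcdef0"   # placeholder AMI ID
            connectionStrategy: PUBLIC_IP  # or PUBLIC_DNS
```

After applying, provision a fresh node; existing slave nodes keep the address they were launched with.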