I want to deploy sshfs on two machines using Ansible. I have the following hosts configuration:
--- # hosts
...
cloud:
  children:
    cloudalpine:
      hosts:
        web-alpine-1:
          ansible_host: Public IP
        web-alpine-2:
          ansible_host: Public IP
In my playbook I want to mount an sshfs volume, but I can't seem to reach only the OTHER node's private IP address.
The final configuration I want to achieve is:
web-alpine-1 > web-alpine-2:/etc/docker/storage
web-alpine-2 > web-alpine-1:/etc/docker/storage
--- # Playbook
...
# I want to create a playbook with an sshfs volume mount
- name: Volume mount for docker
  docker_volume:
    volume_name: "sshvolume"
    use_ssh_client: true
    driver: vieux/sshfs
    driver_options:
      # I am using a local private key
      IdentityFile: "{{ lookup('file', ansible_ssh_private_key_file) }}"
      port: 22
      # I can get the private IP of the machine running the task here,
      # but not the other machine's. Each host needs to point at the
      # other's private IP address.
      sshcmd: "<username>@{{ the other host's private IPv4 address }}:/etc/docker/storage"
How should I access the other node's IP address on each machine?
You can use a Jinja2 loop and condition:
- hosts: all
  tasks:
    - set_fact:
        other_ip: "{% for host in vars['play_hosts'] %}{% if host != inventory_hostname %}{{ hostvars[host].ansible_default_ipv4.address }}{% endif %}{% endfor %}"
    - debug:
        msg: "<username>@{{ other_ip }}:/etc/docker/storage"
My inventory:
test-001 # 192.168.1.24
test-002 # 192.168.1.27
Result:
TASK [debug] *************************************************
ok: [test-001] => {
    "msg": "<username>@192.168.1.27:/etc/docker/storage"
}
ok: [test-002] => {
    "msg": "<username>@192.168.1.24:/etc/docker/storage"
}
To get the hostname instead of the IP address:
other_ip: "{% for host in vars['play_hosts'] %}{% if host != inventory_hostname %}{{ host }}{% endif %}{% endfor %}"