Tags: linux, ssh, vagrant, reboot, ansible

Synced folders lost when rebooting a Vagrant machine using the Ansible provisioner


Vagrant creates a development environment using VirtualBox and then provisions it with Ansible. As part of the provisioning, Ansible reboots the machine and then waits for SSH to come back up. This works as expected, but because the machine is not being started by a "vagrant up" command, the synced folders are not mounted properly when the box comes back up from the reboot.
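
For reference, the reboot-and-wait step in such a playbook typically looks something like this sketch (the task name, module choice and timeout are illustrative, not taken from the actual setup); this is the point after which the VirtualBox shared folders are no longer mounted:

    # Illustrative only: reboot the guest and block until SSH is reachable again.
    # The reboot module handles both the reboot and the wait in a single task.
    - name: Reboot the VM and wait for SSH to come back
      ansible.builtin.reboot:
        reboot_timeout: 300
      become: true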

Running "vagrant reload" fixes the machine and mounts the shares again.

Is there a way of either telling Vagrant to reload the machine, or of doing all the bits 'n bobs that Vagrant would have done after a manual restart?

Simply running "sudo reboot" while SSH'd into the Vagrant box produces the same problem.


Solution

  • There is no way for Vagrant to know that the machine is being rebooted during the provisioning.

    If possible, the best option would be to avoid rebooting here altogether. For example, kernel updates should already be applied when building the base box.

    Another easy (but not very convenient) option is to handle it with log output or documentation, or with a wrapper script that invokes "vagrant up && vagrant reload".

    And finally, you could write a plugin that injects the needed mounting actions into Vagrant's middleware stack after provisioning, but you would still need to work out how to let the plugin know that the machine has been rebooted. Another challenge is that this easily becomes provider-specific.
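
    A very rough, hypothetical skeleton of that plugin idea could look like this, assuming the Vagrant 2.x plugin API. The class names, the hook name and the missing remount logic are all made up for illustration; the provider-specific remounting is exactly the part left as a comment:

        require "vagrant"

        # Middleware appended to the provision action, so it runs after the
        # provisioners (and therefore after the in-playbook reboot) finish.
        class RemountSyncedFolders
          def initialize(app, env)
            @app = app
          end

          def call(env)
            @app.call(env) # run the rest of the stack (including provisioning) first
            # Hypothetical: detect that the guest was rebooted and redo the synced
            # folder mounts for the provider in use (e.g. VirtualBox shared folders).
            # Vagrant offers no simple provider-independent call for this, which is
            # the main difficulty described above.
          end
        end

        class RemountAfterProvisionPlugin < Vagrant.plugin("2")
          name "remount_after_provision"

          # Hook the middleware onto the machine's provision action.
          action_hook(:remount_after_provision, :machine_action_provision) do |hook|
            hook.append(RemountSyncedFolders)
          end
        end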