elasticsearch, google-compute-engine, google-cloud-platform, backup-strategies

How can I configure a structure for backing up elasticsearch data on Google Compute Engine?


I have an elasticsearch environment configured on GCE (Google Compute Engine) with two nodes, and therefore two VMs, and I need to create a backup strategy for it. My first thought was to use the elasticsearch snapshot API to back up all my data to a given storage location, since the API supports several ways of storing snapshots.

I tried the shared filesystem option, but it requires that the store location be shared between the nodes. Is there a way I can do this on GCE?

curl -XPUT http://x.x.x.x:9200/_snapshot/backup -d '{
    "type": "fs",
    "settings": {
        "compress": true,
        "location": "/elasticsearch/backup"
    }
}'

nested: RepositoryVerificationException[[backup] store location [/elasticsearch/backup] is not shared between node
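(For context, the fs repository type only works if every node sees the same directory and that path is whitelisted in path.repo. On GCE that would mean something like an NFS export mounted on both VMs; a rough sketch, where the NFS server address and export path are placeholders:)

# on both VMs: mount the same NFS export at the repository location
sudo mount -t nfs 10.0.0.10:/exports/es-backup /elasticsearch/backup

# elasticsearch.yml on both nodes: whitelist the shared path, then restart
path.repo: ["/elasticsearch/backup"]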

I know there is an AWS plugin for elasticsearch for storing backups. Is there a similar plugin for Google Cloud Storage? Is it possible to do that?

If none of the alternatives above are possible, is there any other recommended strategy for backing up my data?


Solution

  • Elasticsearch now has a repository plugin for Google Cloud Storage (repository-gcs), so this is natively supported; see the sketch below.
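
A minimal sketch of how the GCS repository might be set up, assuming a recent elasticsearch version (5.x or later, where the plugin is named repository-gcs), a bucket called my-es-backups (hypothetical name), and that the GCE VMs' service account has write access to that bucket:

# install the plugin on every node, then restart each node
sudo bin/elasticsearch-plugin install repository-gcs

# register a "gcs" repository pointing at the bucket
curl -XPUT http://x.x.x.x:9200/_snapshot/gcs_backup \
    -H 'Content-Type: application/json' -d '{
    "type": "gcs",
    "settings": {
        "bucket": "my-es-backups",
        "compress": true
    }
}'

# take a snapshot of all indices into that repository
curl -XPUT 'http://x.x.x.x:9200/_snapshot/gcs_backup/snapshot_1?wait_for_completion=true'

Restoring is the mirror operation (_restore on the snapshot). Because the repository lives in Cloud Storage rather than on a local disk, it is visible to both nodes, which sidesteps the shared-filesystem requirement behind the RepositoryVerificationException above.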