I've recently backed up my InfluxDB database from a Docker container and now have a backup file in .tar.gz format. I want to import the data from this file into another InfluxDB database that is also running inside a Docker container.
What I tried was using Chronograf and its "Write Data" feature to import the contents of the backup, since it supports .gz files, but it turns out Chronograf only accepts files up to 25 MB in size, and this backup of mine is 70 MB.
I've searched for other possible methods in the "Docker Influx Documentation" and the "InfluxDB Shell Documentation". The only relevant thing I found was the "-import" option referenced in the Shell Documentation. I tried using it, but to no avail: any command that wasn't a direct query was rejected by the InfluxDB shell, and all I got was this error message: ERR: error parsing query: found influx, expected SELECT, DELETE, SHOW, CREATE, DROP, EXPLAIN, GRANT, REVOKE, ALTER, SET, KILL at line 1, char 1
Just to be clear, I'm on Windows 10 at the moment.
I figured it out:
Copy the backup file you want from your host to your container with: docker cp LOCAL_FILE CONTAINER_NAME:/etc/NEW_FILE
where LOCAL_FILE is the file on your host that you want to copy over, CONTAINER_NAME is the name of your Docker container, /etc/ is a default directory that already exists in the container, and NEW_FILE is the name the copied file will have inside the container.
Go inside your Docker container file system with: docker exec -it CONTAINER_NAME /bin/bash
and navigate to where you copied your NEW_FILE.
Make a new folder (name it "backup" for the sake of clarity) and extract the contents of NEW_FILE into it.
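Assuming the backup is a gzipped tar archive, the extraction step can be sketched like this (the file name NEW_FILE.tar.gz is a placeholder, and the demo builds a tiny stand-in archive so it is self-contained; in the real container you would run only the last three commands next to your copied file):

```shell
# Work in a scratch directory so the demo is self-contained.
set -e
dir=$(mktemp -d)
cd "$dir"

# Stand-in for the copied backup: a small gzipped tar (hypothetical content).
mkdir -p source
echo "demo" > source/meta.00
tar -czf NEW_FILE.tar.gz -C source .

# The actual step: make a "backup" folder and extract the archive into it.
mkdir backup
tar -xzf NEW_FILE.tar.gz -C backup

# List the extracted files.
ls backup
```

The -C flag tells tar to extract into the given directory instead of the current one, which keeps the backup files in one place for the restore step.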
Restore the backup to a new database with: influxd restore -portable -newdb NEW_DATABASE_NAME backup
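Putting the steps together, a full run might look like the following sketch (the container name influxdb, the file name backup.tar.gz, and the database name mydb_restored are all placeholders; note that -portable restores only work for backups that were taken with influxd backup -portable):

```shell
# On the host: copy the backup archive into the container (names are hypothetical).
docker cp backup.tar.gz influxdb:/etc/backup.tar.gz

# Open a shell inside the container.
docker exec -it influxdb /bin/bash

# Inside the container: extract the archive and restore it to a new database.
cd /etc
mkdir backup
tar -xzf backup.tar.gz -C backup
influxd restore -portable -newdb mydb_restored backup
```

The -newdb option writes the restored data into a database with a new name, which avoids clashing with any database of the same name that already exists on the target instance.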
For alternative options for the last step, see the documentation here