I currently have a set of applications running in separate Docker containers, which themselves run in the cloud. Sometimes a configuration update is necessary, and it requires me to send a file that is 300-500 MB in size. I am wondering what the best practice for this would be.
The containers each have a Swagger doc exposed, so I could potentially expose an endpoint to load the file in. Would I send the contents of the file via the request body in this case?
I do not think exposing an endpoint and uploading an entire file to the server is a great idea, due to both network and security concerns.
I do not know what those update files contain, but with your approach the container would be doing double work (its own job plus downloading updates).
For copying the updated files, you could use a secure protocol like scp to transfer them from your local machine to the Docker host or container. Assuming you keep backups of the update files in case of a network failure during the copy, this approach works; a sketch is shown below.
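For illustration, here is a minimal sketch that drives scp through Python's subprocess module; the host, paths, and SSH key are hypothetical placeholders, not values from your setup:

```python
import subprocess

def push_update(local_path: str, remote_host: str, remote_path: str, key_file: str) -> None:
    """Copy an update file to the Docker host over scp (host/paths are placeholders)."""
    subprocess.run(
        [
            "scp",
            "-i", key_file,                   # SSH private key used for authentication
            local_path,                       # e.g. "./config-update.bin"
            f"{remote_host}:{remote_path}",   # e.g. "ubuntu@docker-host:/opt/updates/"
        ],
        check=True,  # raise if scp exits with a non-zero status
    )

# Example usage (all values are hypothetical):
# push_update("./config-update.bin", "ubuntu@docker-host", "/opt/updates/", "~/.ssh/id_rsa")
```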
Here is a slightly better strategy.
Ideally, all your update files should live in a directory on the Docker host that is mounted into the container as a volume; a sketch of such a mount is shown below.
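As a rough sketch, assuming the Python Docker SDK and placeholder image and path names, the container could be started with the host's update directory mounted read-only:

```python
import docker  # pip install docker

client = docker.from_env()

# Start the application container with the host's update directory mounted read-only.
# "my-app:latest" and the paths are placeholders for your own image and layout.
container = client.containers.run(
    "my-app:latest",
    detach=True,
    volumes={
        "/opt/updates": {            # directory on the Docker host holding update files
            "bind": "/app/updates",  # path the application sees inside the container
            "mode": "ro",
        }
    },
)
```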
A better way would be to upload the update file to a highly scalable cloud storage service.
A storage trigger would then hit an endpoint that starts downloading the update file to your Docker host.
After a successful download, you can restart your container, which will pick up the latest update from the Docker host.
To conclude, you will have an endpoint outside Docker that downloads updates for you and handles the logic of stopping/starting the container with the correct update version.
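A minimal sketch of such an endpoint follows, assuming Flask for the HTTP layer, boto3 for S3-style storage, and the Python Docker SDK; the bucket, directory, and container name are placeholders:

```python
import boto3
import docker
from flask import Flask, request

app = Flask(__name__)
s3 = boto3.client("s3")
docker_client = docker.from_env()

UPDATE_DIR = "/opt/updates"   # host directory mounted into the container
CONTAINER_NAME = "my-app"     # placeholder container name

@app.route("/updates", methods=["POST"])
def apply_update():
    # The storage trigger posts the bucket and object key of the new update file.
    payload = request.get_json()
    bucket, key = payload["bucket"], payload["key"]

    # Download the update onto the Docker host, into the volume-mounted directory.
    local_path = f"{UPDATE_DIR}/{key.rsplit('/', 1)[-1]}"
    s3.download_file(bucket, key, local_path)

    # Restart the container so it picks up the new file from the mounted volume.
    container = docker_client.containers.get(CONTAINER_NAME)
    container.restart()

    return {"status": "ok", "file": local_path}, 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```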
It is also worth keeping backups of previous updates, which will help in case of a faulty update. A rolling update-file mechanism works well for this; a sketch is shown below.
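As a rough sketch of such a rolling mechanism (the retention count and paths are assumptions), you could archive the current file under a timestamp before a new one replaces it, and prune the oldest copies:

```python
import shutil
import time
from pathlib import Path

KEEP_LAST = 3  # how many previous updates to retain (an assumption, tune as needed)

def archive_previous(update_file: Path, backup_dir: Path) -> None:
    """Move the current update file into a timestamped backup before a new one replaces it."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    if update_file.exists():
        stamped = backup_dir / f"{update_file.stem}-{int(time.time())}{update_file.suffix}"
        shutil.move(str(update_file), str(stamped))

    # Prune the oldest backups so only the most recent KEEP_LAST remain.
    backups = sorted(backup_dir.iterdir(), key=lambda p: p.stat().st_mtime, reverse=True)
    for old in backups[KEEP_LAST:]:
        old.unlink()

# Example usage (paths are hypothetical):
# archive_previous(Path("/opt/updates/config-update.bin"), Path("/opt/updates/backups"))
```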
This approach works well from both a network and a security standpoint.
Hope this helps. Thanks.