Tags: docker, deployment, docker-compose, automated-deploy

Remote `docker-compose build`: working around slow connections


I'm using docker-compose to deploy into a remote host. This is what my config looks like:

# stacks/web.yml
version: '2'

services:
  postgres:
    image: postgres:9.6
    restart: always
    volumes:
      - db:/var/lib/postgresql/data

  redis:
    image: redis:3.2.3
    restart: always

  web_server:
    depends_on: [postgres]
    build: ../sources/myapp
    links: [postgres]
    restart: always
    volumes:
      - nginx_socks:/tmp/socks
      - static_assets:/source/public

  sidekiq:
    depends_on: [postgres, redis]
    build: ../sources/myapp
    links: [postgres, redis]
    restart: always
    volumes:
      - static_assets:/source/public

  nginx:
    depends_on: [web_server]
    build: ../sources/nginx
    ports:
      - "80:80"
    volumes:
      - nginx_socks:/tmp/socks
      - static_assets:/public
    restart: always

volumes:
  db:
  nginx_socks:
  static_assets:

# stacks/web.production.yml
version: '2'

services:
  web_server:
    command: bundle exec puma -e production -b unix:///tmp/socks/puma.production.sock
    env_file: ../env/production.env

  sidekiq:
    command: bundle exec sidekiq -e production -c 2 -q default -q carrierwave
    env_file: ../env/production.env

  nginx:
    build:
      args:
        ENV_NAME: production
        DOMAIN: production.yavende.com

I deploy using:

eval $(docker-machine env myapp-production)
docker-compose -f stacks/web.yml -f stacks/web.production.yml -p myapp_production build --no-deps web_server sidekiq
docker-compose -f stacks/web.yml -f stacks/web.production.yml -p myapp_production up -d

Although this works perfectly locally, and I did a couple of successful deploys in the past with this method, it now hangs while building the "web_server" service and finally fails with a timeout error, as I describe in this issue.

I think the problem comes from the combination of my slow connection (Argentina -> DigitalOcean servers in the USA) and the fact that I'm building the images and pushing them myself instead of using hub-hosted images.
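(One mitigation I'm aware of, while still building remotely, is shrinking the build context that `docker-compose build` ships to the remote daemon with a `.dockerignore` file. A minimal sketch; the entries are guesses for a typical Rails layout, not taken from my repo:

# sources/myapp/.dockerignore (hypothetical entries)
.git
log
tmp
node_modules
public/uploads

This only reduces the upload, though; it doesn't eliminate it.)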

I've been able to deploy by cloning my compose config onto the server and running docker-compose directly there.

The question is: is there a better way to automate this process? Is it good practice to use docker-compose to build images on the fly?

I've been thinking about automating this process of cloning the sources onto the server and running docker-compose there (a rough sketch of that is below), but there may be better tooling for this.
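For reference, roughly what I have in mind; the SSH alias and paths are assumptions, not my actual setup:

#!/usr/bin/env bash
# deploy.sh - sketch: build on the server itself instead of shipping
# the build context over a slow local connection.
set -euo pipefail

HOST="myapp-production"   # assumed SSH alias for the droplet
APP_DIR="/opt/myapp"      # assumed location of the cloned sources/config

ssh "$HOST" "
  set -e
  cd $APP_DIR
  git pull --ff-only
  docker-compose -f stacks/web.yml -f stacks/web.production.yml \
    -p myapp_production build --no-deps web_server sidekiq
  docker-compose -f stacks/web.yml -f stacks/web.production.yml \
    -p myapp_production up -d
"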


Solution

I was building the images remotely. This implies pushing the whole source needed to build each image over the net. For some images that was over 400 MB of data sent from Argentina to virtual servers in the USA, which proved to be terribly slow.

The solution is to totally change the approach to dockerizing my stack: instead of building the images on the remote host, I link the source repository to Docker Hub and let it build the images there.

This means I only push changes via git, not the whole source. Then Docker Hub builds the image.
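In compose terms, the `build:` entries give way to `image:` references pointing at the Hub repositories. A sketch of what the production override could look like; the image names are hypothetical:

# stacks/web.production.yml (sketch; image names are hypothetical)
version: '2'

services:
  web_server:
    image: myuser/myapp:latest
    command: bundle exec puma -e production -b unix:///tmp/socks/puma.production.sock
    env_file: ../env/production.env

  sidekiq:
    image: myuser/myapp:latest
    command: bundle exec sidekiq -e production -c 2 -q default -q carrierwave
    env_file: ../env/production.env

  nginx:
    image: myuser/myapp-nginx:production

Note that the nginx build args (ENV_NAME, DOMAIN) would then have to move into the hosted build configuration, since nothing is built locally anymore.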

Then I `docker-compose pull` and `docker-compose up -d` the site.
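With the same files and project name as before, that is just:

eval $(docker-machine env myapp-production)
docker-compose -f stacks/web.yml -f stacks/web.production.yml -p myapp_production pull
docker-compose -f stacks/web.yml -f stacks/web.production.yml -p myapp_production up -d

`pull` only downloads the image layers that changed, so the transfer is far smaller than shipping a 400 MB build context.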

Free alternatives are running your own self-hosted Docker registry, or possibly GitLab, since it recently released its own container registry: https://about.gitlab.com/2016/05/23/gitlab-container-registry/.
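For the self-hosted option, the official registry image is enough to get started. A sketch; the registry hostname is hypothetical, and a real setup also needs TLS (or an insecure-registry exception on the daemons):

# run the official registry on a server
docker run -d -p 5000:5000 --restart=always --name registry registry:2

# tag and push a locally built image to it
docker tag myapp_web_server registry.example.com:5000/myapp
docker push registry.example.com:5000/myapp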