Could you tell me if I am doing this the right way?
Before, I had this in `.gitlab-ci.yml`:
```yaml
same_task:
  stage: deploy
  image: python:3
  script:
    - python -V
```
Now I have:
```yaml
pep8:
  stage: deploy
  image: gitlablogin/projectname
  script:
    - python -V
```
and after this change the job failed:
```
Running with gitlab-runner 11.4.2 (cf91d5e1)
  on docker-auto-scale 72989761
Using Docker executor with image gitlablogin/projectname ...
Pulling docker image gitlablogin/projectname ...
ERROR: Job failed: Error response from daemon: pull access denied for gitlablogin/projectname, repository does not exist or may require 'docker login' (executor_docker.go:168:0s)
```
Is my usage of Docker in the context of GitLab CI and the GitLab registry correct? I also want to keep my Dockerfile in the same repo and build a new image whenever something changes in the Dockerfile. What would be the best way to do that?
Your approach isn't far from what you want to achieve. I believe this is what you are missing:
According to: https://docs.gitlab.com/ee/ci/docker/using_docker_images.html#what-is-an-image
in order to use the image you have built for your CI, you first need to add it to the runner's `config.toml` file as a service.
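A sketch of what that could look like in the runner's `config.toml`; the runner name is a placeholder, and the image name is taken from your job definition:

```toml
concurrent = 1

[[runners]]
  name = "my-runner"        # placeholder name for your registered runner
  executor = "docker"
  [runners.docker]
    image = "python:3"                       # default image for jobs
    services = ["gitlablogin/projectname"]   # your image made available as a service
```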
Once you have done that, you will be able to use the directive `image: my_image` in your jobs.
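Also note the error itself: a bare `gitlablogin/projectname` is resolved against Docker Hub, which is why the pull is denied. If the image lives in the GitLab Container Registry, the job usually needs the full registry path. A sketch, assuming the default gitlab.com registry and your `gitlablogin/projectname` namespace:

```yaml
pep8:
  stage: deploy
  # full path into the GitLab Container Registry; adjust the host if self-hosted
  image: registry.gitlab.com/gitlablogin/projectname:latest
  script:
    - python -V
```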
However, there is another option: log in to your Docker registry, pull and run the CI Docker image, and then exec inside the container, where your pipeline will run.
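For the second part of your question (rebuilding the image when the Dockerfile changes), one common pattern is a dedicated build job that uses Docker-in-Docker, pushes to the project's registry with the predefined `CI_REGISTRY*` variables, and is triggered only when the Dockerfile changes. A sketch, assuming the Dockerfile sits in the repository root and your GitLab version supports `only: changes` (introduced in 11.4):

```yaml
build_image:
  stage: build
  image: docker:latest
  services:
    - docker:dind
  script:
    # CI_REGISTRY, CI_REGISTRY_USER, CI_REGISTRY_PASSWORD and
    # CI_REGISTRY_IMAGE are predefined by GitLab for the project registry
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:latest" .
    - docker push "$CI_REGISTRY_IMAGE:latest"
  only:
    changes:
      - Dockerfile
```

With this in place, other jobs can use `image: $CI_REGISTRY_IMAGE:latest` and will pick up the rebuilt image after the build job runs.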