For example, I need to get database credentials from Vault to connect to a database in a Python script. I am using GitLab CI/CD, so I can use the .gitlab-ci.yml file. What is the best way to get some values and pass them to the Python script?
You can do this with GitLab CI/CD environment variables and the Vault integration provided by GitLab.
Step 1: Integrate GitLab with Vault. GitLab's documentation covers this setup.
Step 2: Define environment variables in GitLab CI/CD. In your GitLab project, navigate to Settings > CI/CD > Variables and define variables for the Vault address, the Vault token, and anything else needed to connect to Vault and retrieve the secrets. Mark the token as masked so it does not appear in job logs.
Step 3: Configure .gitlab-ci.yml. Use the before_script section to set up the environment, including installing dependencies and configuring any necessary tools.
Inside the script, retrieve the database credentials from Vault using the Vault CLI (or GitLab's native Vault integration), authenticating with the environment variables defined in step 2.
Here is an example .gitlab-ci.yml:
```yaml
stages:
  - deploy

before_script:
  # Install the Vault CLI (if not already installed)
  - apk add --no-cache vault

deploy:
  stage: deploy
  script:
    # Point the Vault CLI at your server and authenticate with the token
    # (vault login takes the token as a positional argument, not a flag)
    - export VAULT_ADDR=$VAULT_ADDRESS
    - vault login "$VAULT_TOKEN"
    # Retrieve database credentials from Vault
    - export DB_USERNAME=$(vault kv get -field=username secret/database)
    - export DB_PASSWORD=$(vault kv get -field=password secret/database)
    # Run the Python script with the credentials as command-line arguments
    - python my_script.py --db-username=$DB_USERNAME --db-password=$DB_PASSWORD
```
Here the retrieved credentials are passed to the Python script as command-line arguments; since they are also exported, the script could equally read them from the environment.
This way, you can securely retrieve sensitive values from Vault and pass them to your Python script in a GitLab CI/CD pipeline without committing them to your repository or printing them in pipeline logs.
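For completeness, here is a minimal sketch of what my_script.py might look like on the receiving end. The argument names match the pipeline above; the environment-variable fallback is an assumption you can drop if you only pass command-line arguments:

```python
import argparse
import os


def get_credentials(argv=None):
    """Read database credentials from CLI arguments,
    falling back to the DB_USERNAME / DB_PASSWORD environment variables."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--db-username", default=os.environ.get("DB_USERNAME"))
    parser.add_argument("--db-password", default=os.environ.get("DB_PASSWORD"))
    args = parser.parse_args(argv)
    return args.db_username, args.db_password


# Usage: connect to your database with the returned values, e.g.
#   username, password = get_credentials()
#   conn = your_db_driver.connect(user=username, password=password, ...)
```
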
If you would instead like the secrets inside a Docker container, you can do that directly. Modify your Dockerfile so the secrets are loaded from Vault at container startup, passing them to your Python script via environment variables (or a configuration file).
```dockerfile
# Base image (assumed; use whichever Python base image fits your project)
FROM python:3.12-alpine

# Install the Vault CLI
RUN apk add --no-cache vault

# Set up environment variables for the Vault address and authentication token
# (prefer injecting the token at runtime rather than baking it into the image)
ENV VAULT_ADDR="https://vault.example.com"
ENV VAULT_TOKEN="<your-vault-token>"

# Copy your Python script into the container
COPY my_script.py /app/my_script.py

# Set the working directory
WORKDIR /app

# Copy a script that retrieves secrets from Vault and exports them
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]

# Command to run the Python script (handed to the entrypoint as "$@")
CMD ["python", "my_script.py"]
```
Then create the entrypoint.sh shell script:
```sh
#!/bin/sh
# Alpine images ship /bin/sh rather than bash.
# Retrieve secrets from Vault and export them as environment variables
export DB_USERNAME=$(vault kv get -field=username secret/database)
export DB_PASSWORD=$(vault kv get -field=password secret/database)

# Run the command passed as CMD (the Python script)
exec "$@"
```
Here is an example .gitlab-ci.yml after editing the Dockerfile:
```yaml
stages:
  - build
  - deploy

build:
  stage: build
  script:
    - docker-compose build

deploy:
  stage: deploy
  script:
    - docker-compose up -d
```
And pass the variables through in your docker-compose.yml:
```yaml
services:
  app:
    build: .
    entrypoint: /entrypoint.sh
    environment:
      - DB_USERNAME=${DB_USERNAME}
      - DB_PASSWORD=${DB_PASSWORD}
```
I hope this helps!