Tags: docker, airflow, openshift, bitnami, okd

Bitnami Airflow scheduler cannot connect to the database while the webserver can, even though they have the same properties?


I want to configure Airflow on OpenShift.

I set up the database on OpenShift like below:

kind: Service
apiVersion: v1
metadata:
  name: airflow-database
  namespace: ersin-poc
spec:
  ports:
    - name: 5432-tcp
      protocol: TCP
      port: 5432
      targetPort: 5432
  selector:
    deployment: airflow-database
  type: ClusterIP
  sessionAffinity: None
  ipFamilies:
    - IPv4
  ipFamilyPolicy: SingleStack   
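
(As a sanity check, the Service's selector can be compared with the database pod's labels using commands like the ones below. The resource names and namespace are taken from the manifests in this post; the commands themselves are only a generic suggestion, not something from the Bitnami docs.)

# Show which pod IPs the Service currently points at; an empty list would
# mean the selector (deployment=airflow-database) matches no running pod.
oc get endpoints airflow-database -n ersin-poc

# Show the labels on the database pod for comparison with the selector.
oc get pods -n ersin-poc -l deployment=airflow-database --show-labels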

and my database Deployment is like below:

kind: Deployment
apiVersion: apps/v1
metadata:
  name: airflow-database
  namespace: ersin-poc 
  labels:
    deployment: airflow-database
spec:
  replicas: 1
  selector:
    matchLabels:
      deployment: airflow-database
  template:
    metadata:
      creationTimestamp: null
      labels:
        deployment: airflow-database
    spec:
      volumes:
        - name: generic
          persistentVolumeClaim:
            claimName: generic
        - name: empty1
          emptyDir: {}
      containers:
        - resources: {}
          name: airflow-database
          env:
            - name: POSTGRESQL_USERNAME
              value: 'bn_airflow'
            - name: POSTGRESQL_PASSWORD
              value: 'bitnami1'
            - name: POSTGRESQL_DATABASE
              value: 'bitnami_airflow'
          ports:
            - containerPort: 5432
              protocol: TCP
          volumeMounts:
            - name: generic
              mountPath: /bitnami/postgresql/
          image: >-
            bitnami/postgresql:latest
          hostname: airflow-database  

I can connect to this DB from my webserver, which is deployed like below:

kind: Deployment
apiVersion: apps/v1
metadata:
  name: airflow-webserver
  namespace: ersin-poc 
  labels:
    deployment: airflow-webserver
spec:
  replicas: 1
  selector:
    matchLabels:
      deployment: airflow-webserver
  template:
    metadata:
      creationTimestamp: null
      labels:
        deployment: airflow-webserver
    spec:
      volumes:
        - name: generic
          persistentVolumeClaim:
            claimName: generic
        - name: empty1
          emptyDir: {}
      containers:
        - resources: {}
          name: airflow-webserver
          env:
            - name: AIRFLOW_HOME
              value: /home/appuser
            - name: USER
              value: appuser
            - name: AIRFLOW_FERNET_KEY
              value: '46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho='
            - name: AIRFLOW_SECRET_KEY
              value: 'a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08='
            - name: AIRFLOW_EXECUTOR
              value: 'CeleryExecutor'
            - name: AIRFLOW_DATABASE_NAME
              value: 'bitnami_airflow'
            - name: AIRFLOW_DATABASE_USERNAME
              value: 'bn_airflow'
            - name: AIRFLOW_DATABASE_PASSWORD
              value: 'bitnami1'
            - name: AIRFLOW_LOAD_EXAMPLES
              value: 'yes'
            - name: AIRFLOW_PASSWORD
              value: 'bitnami123'
            - name: AIRFLOW_USERNAME
              value: 'user'
            - name: AIRFLOW_EMAIL
              value: 'user@example.com'
            - name: AIRFLOW_DATABASE_HOST
              value: 'airflow-database'
            - name: AIRFLOW_DATABASE_PORT_NUMBER
              value: '5432'
          ports:
            - containerPort: 8080
              protocol: TCP
          volumeMounts:
            - name: generic
              mountPath: /home/appuser
            - name: generic
              mountPath: /home/appuser/logs/
            - name: generic
              mountPath: /home/appuser/dags/
          image: >-
            bitnami/airflow:latest
          hostname: airflow-webserver

but when I try it with airflow-scheduler it gives an error:

airflow-scheduler 09:29:43.31 INFO ==> Trying to connect to the database server
airflow-scheduler 09:30:47.42 ERROR ==> Could not connect to the database

and my scheduler YAML is:

kind: Deployment
apiVersion: apps/v1
metadata:
  name: airflow-scheduler
  namespace: ersin-poc 
  labels:
    deployment: airflow-scheduler
spec:
  replicas: 1
  selector:
    matchLabels:
      deployment: airflow-scheduler
  template:
    metadata:
      labels:
        deployment: airflow-scheduler
    spec:
      volumes:
        - name: generic
          persistentVolumeClaim:
            claimName: generic
        - name: empty1
          emptyDir: {}
      containers:
        - resources: {}
          name: airflow-scheduler
          env:
            - name: AIRFLOW_HOME
              value: /home/appuser
            - name: USER
              value: appuser
            - name: AIRFLOW_FERNET_KEY
              value: '46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho='
            - name: AIRFLOW_SECRET_KEY
              value: 'a25mQ1FHTUh3MnFRSk5KMEIyVVU2YmN0VGRyYTVXY08='
            - name: AIRFLOW_EXECUTOR
              value: 'CeleryExecutor'
            - name: AIRFLOW_DATABASE_NAME
              value: 'bitnami_airflow'
            - name: AIRFLOW_DATABASE_USERNAME
              value: 'bn_airflow'
            - name: AIRFLOW_DATABASE_PASSWORD
              value: 'bitnami1'
            - name: AIRFLOW_DATABASE_HOST
              value: 'airflow-database'
            - name: AIRFLOW_DATABASE_PORT_NUMBER
              value: '5432'
            - name: AIRFLOW_WEBSERVER_HOST
              value: 'airflow-webserver'
            - name: AIRFLOW_WEBSERVER_PORT_NUMBER
              value: '8080'
            - name: REDIS_HOST
              value: 'airflow-redis'
            - name: REDIS_PORT_NUMBER
              value: '6379'
          ports:
            - containerPort: 8080
              protocol: TCP
          volumeMounts:
            - name: generic
              mountPath: /home/appuser
            - name: generic
              mountPath: /home/appuser/logs/
            - name: generic
              mountPath: /home/appuser/dags/
          image: >-
            bitnami/airflow-scheduler:latest
          hostname: airflow-scheduler 

So I can't understand why I get this error with the same properties.

Thanks in advance.

EDIT

I also tried these commands in the scheduler pod to see whether I can connect to the DB or not:

psql -h airflow-database -p 5432 -U bn_airflow -d bitnami_airflow -W

pass: bitnami1

select * from public.ab_user;

And yes, I can.
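
(For completeness, a plain TCP check of every host the scheduler is configured with can be run from the same pod using bash only, in case nc is not installed. This is just a generic sketch that assumes bash and timeout exist in the image; the hostnames and ports come from the scheduler's env vars above.)

# Quick TCP reachability check from inside the scheduler pod.
for target in airflow-database:5432 airflow-redis:6379 airflow-webserver:8080; do
  host=${target%:*}; port=${target#*:}
  if timeout 3 bash -c "</dev/tcp/$host/$port" 2>/dev/null; then
    echo "$target reachable"
  else
    echo "$target NOT reachable"
  fi
done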


Solution

  • After a lot of searching, I decided to do this with the apache/airflow images (PostgreSQL and Redis are still Bitnami; that doesn't matter).

    You can see all the YAML files for Airflow on OpenShift here:

    https://github.com/ersingulbahar/airflow_on_openshift

    It now works as expected.
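
For reference, a scheduler Deployment based on the apache/airflow image ends up looking roughly like the sketch below. This is a minimal, illustrative outline using Airflow's standard AIRFLOW__SECTION__KEY environment variables and the example values from this post; it is not copied from the linked repository, and the image tag and the [database] section name (Airflow 2.3+; older versions use AIRFLOW__CORE__SQL_ALCHEMY_CONN) are assumptions.

kind: Deployment
apiVersion: apps/v1
metadata:
  name: airflow-scheduler
  namespace: ersin-poc
spec:
  replicas: 1
  selector:
    matchLabels:
      deployment: airflow-scheduler
  template:
    metadata:
      labels:
        deployment: airflow-scheduler
    spec:
      containers:
        - name: airflow-scheduler
          image: apache/airflow:2.7.3   # example tag, pick the version you need
          args: ["scheduler"]           # the image entrypoint runs the given Airflow subcommand
          env:
            - name: AIRFLOW__CORE__EXECUTOR
              value: CeleryExecutor
            # Same credentials and hostnames as in the Bitnami setup above
            - name: AIRFLOW__DATABASE__SQL_ALCHEMY_CONN
              value: postgresql+psycopg2://bn_airflow:bitnami1@airflow-database:5432/bitnami_airflow
            - name: AIRFLOW__CELERY__BROKER_URL
              value: redis://airflow-redis:6379/0
            - name: AIRFLOW__CELERY__RESULT_BACKEND
              value: db+postgresql://bn_airflow:bitnami1@airflow-database:5432/bitnami_airflow
            - name: AIRFLOW__CORE__FERNET_KEY
              value: 46BKJoQYlPPOexq0OhDZnIlNepKFf87WFwLbfzqDDho=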