I am adding a Next.js frontend to my Kubernetes cluster. I added the following file:
```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: client-depl
spec:
  replicas: 1
  selector:
    matchLabels:
      app: client
  template:
    metadata:
      labels:
        app: client
    spec:
      containers:
        - name: client
          image: ldco2016/client
---
apiVersion: v1
kind: Service
metadata:
  name: client-srv
spec:
  selector:
    app: client
  ports:
    - name: client
      protocol: TCP
      port: 3000
      targetPort: 3000
```
to my `infra/k8s/` directory, and then reconfigured `ingress-srv.yml` like so:
```yaml
apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: ingress-service
  annotations:
    kubernetes.io/ingress.class: nginx
    nginx.ingress.kubernetes.io/use-regex: "true"
spec:
  rules:
    - host: ticketing.dev
      http:
        paths:
          - path: /api/users/?(.*)
            backend:
              serviceName: auth-srv
              servicePort: 3000
          - path: /?(.*)
            backend:
              serviceName: client-srv
              servicePort: 3000
```
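As an aside, once the pods are up, the two regex paths can be sanity-checked with `curl`. This is only a sketch: it assumes `ticketing.dev` resolves to the local cluster (e.g. via an `/etc/hosts` entry), and `/api/users/currentuser` is just an example route that the auth service may or may not expose:

```shell
# Requests under /api/users/... should be routed to auth-srv
# (-k skips certificate verification if the ingress serves a self-signed cert)
curl -k https://ticketing.dev/api/users/currentuser

# Everything else falls through to client-srv (the Next.js app)
curl -k https://ticketing.dev/
```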
and the `skaffold.yml` file:
```yaml
apiVersion: skaffold/v2alpha3
kind: Config
deploy:
  kubectl:
    manifests:
      - ./infra/k8s*
build:
  local:
    push: false
  artifacts:
    - image: ldco2016/auth
      context: auth
      docker:
        dockerfile: Dockerfile
      sync:
        manual:
          - src: "src/**/*.ts"
            dest: .
    - image: ldco2016/client
      context: client
      docker:
        dockerfile: Dockerfile
      sync:
        manual:
          - src: "**/*.js"
            dest: .
```
When I run `skaffold dev`, it hangs at about this point:
```
starting deploy...
 - deployment.apps/auth-depl created
 - service/auth-srv created
 - deployment.apps/auth-mongo-depl created
 - service/auth-mongo-srv created
 - deployment.apps/client-depl created
 - service/client-srv created
 - ingress.extensions/ingress-service created
Waiting for deployments to stabilize...
 - deployment/auth-depl: waiting for rollout to finish: 0 of 1 updated replicas are available...
 - deployment/auth-mongo-depl: waiting for rollout to finish: 0 of 1 updated replicas are available...
 - deployment/client-depl: waiting for rollout to finish: 0 of 1 updated replicas are available...
 - deployment/client-depl is ready. [2/3 deployment(s) still pending]
 - deployment/auth-mongo-depl is ready. [1/3 deployment(s) still pending]
```
Any ideas?
I am running Docker Desktop with Kubernetes enabled. Since this is a microservices application, I thought Kubernetes might need more resources, so I tried allocating more, but that did not solve the problem.
I suspected the issue was with one of my pods, so I ran `kubectl get pods`:
```
NAME                               READY   STATUS                       RESTARTS   AGE
auth-depl-5867ffb6bd-n5s6w         0/1     CreateContainerConfigError   0          2m7s
auth-depl-669fc8fd66-qr8kj         0/1     CreateContainerConfigError   0          6m11s
auth-mongo-depl-585f5f978c-tnc9w   1/1     Running                      0          2m7s
```
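For context, a `CreateContainerConfigError` status usually means the kubelet could not assemble the container's configuration, most often because a referenced Secret or ConfigMap does not exist. The pod's events name the exact cause; a quick way to check (using a pod name from the listing above):

```shell
# Show the events for a failing pod; the last lines typically name the
# missing object, e.g. something like: Error: secret "jwt-secret" not found
kubectl describe pod auth-depl-5867ffb6bd-n5s6w

# Or list recent failure events across the namespace
kubectl get events --field-selector reason=Failed
```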
So the issue seemed to be with `auth-depl`. I reviewed its YAML file, suspected the problem was the secret key I had added, and commented it out like so:
```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: auth-depl
spec:
  replicas: 1
  selector:
    matchLabels:
      app: auth
  template:
    metadata:
      labels:
        app: auth
    spec:
      containers:
        - name: auth
          image: ldco2016/auth
          # env:
          #   - name: JWT_KEY
          #     valueFrom:
          #       secretKeyRef:
          #         name: jwt-secret
          #         key: JWT_KEY
---
apiVersion: v1
kind: Service
metadata:
  name: auth-srv
spec:
  selector:
    app: auth
  ports:
    - name: auth
      protocol: TCP
      port: 3000
      targetPort: 3000
```
Then I ran `skaffold dev --cleanup=false` and saw:
```
Listing files to watch...
 - ldco2016/auth
Generating tags...
 - ldco2016/auth -> ldco2016/auth:latest
Some taggers failed. Rerun with -vdebug for errors.
Checking cache...
 - ldco2016/auth: Found Locally
Tags used in deployment:
 - ldco2016/auth -> ldco2016/auth:367e6b2171c5c8477a3f3458d23dd73030f35716df45a290aa54baa5f4dcdaa1
Starting deploy...
 - deployment.apps/auth-depl configured
 - service/auth-srv configured
 - deployment.apps/auth-mongo-depl configured
 - service/auth-mongo-srv configured
 - ingress.extensions/ingress-service configured
Waiting for deployments to stabilize...
 - deployment/auth-depl: waiting for rollout to finish: 1 old replicas are pending termination...
 - deployment/auth-mongo-depl: waiting for rollout to finish: 1 old replicas are pending termination...
 - deployment/auth-depl is ready. [1/2 deployment(s) still pending]
 - deployment/auth-mongo-depl is ready.
Deployments stabilized in 3.633465001s
Watching for changes...
[auth-depl-5c59699679-tnzk2 auth]
[auth-depl-5c59699679-tnzk2 auth] > auth@1.0.0 start /app
[auth-depl-5c59699679-tnzk2 auth] > nodemon ./src/index.ts
[auth-depl-5c59699679-tnzk2 auth]
[auth-depl-5c59699679-tnzk2 auth] [nodemon] 2.0.5
[auth-depl-5c59699679-tnzk2 auth] [nodemon] to restart at any time, enter `rs`
[auth-depl-5c59699679-tnzk2 auth] [nodemon] watching path(s): *.*
[auth-depl-5c59699679-tnzk2 auth] [nodemon] watching extensions: ts,json
[auth-depl-5c59699679-tnzk2 auth] [nodemon] starting `ts-node ./src/index.ts`
[auth-depl-5c59699679-tnzk2 auth] (node:40) UnhandledPromiseRejectionWarning: Error: JWT must be defined
```
That was the big clue, because when I ran `kubectl get secrets` I found that my JWT secret was no longer there. I believe that is because my machine was inadvertently restarted recently: I had meant to postpone the restart but forgot, so it rebooted later that evening, restarting my local Docker Desktop copy of Kubernetes along with it. So I ran the `kubectl create secret...` command again, then ran `kubectl get secrets` and saw my secret key in there again.
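For anyone hitting the same thing, the secret in this setup is a generic secret containing a `JWT_KEY` entry, and the secret name and key must match what the Deployment's `secretKeyRef` expects. A sketch of the commands (the value `changeme` is a placeholder, not the real key):

```shell
# Recreate the JWT secret; 'changeme' is a placeholder value
kubectl create secret generic jwt-secret --from-literal=JWT_KEY=changeme

# Confirm it now exists
kubectl get secrets
```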
I added those environment variables (with the value read from the secret) back into my `auth-depl.yml` file, ran `skaffold dev --cleanup=false` again, and got:
```
Listing files to watch...
 - ldco2016/auth
Generating tags...
 - ldco2016/auth -> ldco2016/auth:latest
Some taggers failed. Rerun with -vdebug for errors.
Checking cache...
 - ldco2016/auth: Found Locally
Tags used in deployment:
 - ldco2016/auth -> ldco2016/auth:367e6b2171c5c8477a3f3458d23dd73030f35716df45a290aa54baa5f4dcdaa1
Starting deploy...
 - deployment.apps/auth-depl configured
 - service/auth-srv configured
 - deployment.apps/auth-mongo-depl configured
 - service/auth-mongo-srv configured
 - ingress.extensions/ingress-service configured
Waiting for deployments to stabilize...
 - deployment/auth-depl: waiting for rollout to finish: 1 old replicas are pending termination...
 - deployment/auth-mongo-depl: waiting for rollout to finish: 1 old replicas are pending termination...
 - deployment/auth-depl is ready. [1/2 deployment(s) still pending]
 - deployment/auth-mongo-depl is ready.
Deployments stabilized in 3.612848017s
Watching for changes...
[auth-depl-5c59699679-tnzk2 auth] Error from server (BadRequest): container "auth" in pod "auth-depl-5c59699679-tnzk2" is terminated
[auth-depl-7d9bf44d9f-n9rcq auth]
[auth-depl-7d9bf44d9f-n9rcq auth] > auth@1.0.0 start /app
[auth-depl-7d9bf44d9f-n9rcq auth] > nodemon ./src/index.ts
[auth-depl-7d9bf44d9f-n9rcq auth]
[auth-depl-7d9bf44d9f-n9rcq auth] [nodemon] 2.0.5
[auth-depl-7d9bf44d9f-n9rcq auth] [nodemon] to restart at any time, enter `rs`
[auth-depl-7d9bf44d9f-n9rcq auth] [nodemon] watching path(s): *.*
WARN[0004] exit status 1
[auth-depl-7d9bf44d9f-n9rcq auth] [nodemon] watching extensions: ts,json
[auth-depl-7d9bf44d9f-n9rcq auth] [nodemon] starting `ts-node ./src/index.ts`
[auth-depl-7d9bf44d9f-n9rcq auth] Connected to MongoDB
[auth-depl-7d9bf44d9f-n9rcq auth] Listening on port 3000!!!!!
```
Back in business.