google-cloud-platform · google-cloud-build · packer

How would you execute a python script located in a GCP bucket, on a schedule?


Quick summary: Currently, each month when Google releases its latest native images (Windows 2016, 2019, 2022), we manually run Packer against those images and create our own customized images, which get uploaded back to GCP so we can deploy machines from them. The native images are named like Win-2016-20240702, where the last 8 characters are the release date.

We are trying to completely automate the process. So far I have set up:

1. A Docker container running the latest Packer image in Container Registry.
2. A Python script that compares the latest Google native images to our custom images; if the names differ (meaning new images dropped), it updates the Packer variable file in the GCP bucket with the new image source and our custom image name. It then submits a Cloud Build job containing the Packer config file and the updated variable file, which generates the new custom images (pulling the rest of the files from the GCP bucket).
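The name-comparison logic in step 2 can be sketched as a small helper. This is only a sketch: the function names and the custom-image naming scheme (`custom-win-2016-20240601` etc.) are assumptions, not the actual script; the real script would feed it names listed from GCP.

```python
import re

def image_date(name: str) -> str:
    """Extract the 8-digit release-date suffix from a name like 'Win-2016-20240702'."""
    m = re.search(r"(\d{8})$", name)
    if not m:
        raise ValueError(f"no date suffix in image name: {name}")
    return m.group(1)

def new_image_available(native_name: str, custom_name: str) -> bool:
    """True when Google's native image carries a newer date than our custom image.

    YYYYMMDD strings sort chronologically, so plain string comparison works.
    """
    return image_date(native_name) > image_date(custom_name)
```

If this returns `True`, the script would rewrite the Packer variable file and submit the Cloud Build.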

What is the best way to run the python script on a schedule, say every 6 hours?

Or better yet, is there a way to just have the Python script run automatically when the new images drop?

I was thinking of maybe using a Cloud Function together with Cloud Scheduler. Any help would be great, thanks!

I've tried the Cloud Function route, but I'm having problems getting the HTTP trigger to work, and I'm not sure it's the best approach anyway.
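For reference, the scheduler-to-Pub/Sub wiring can be created with `gcloud`. This is a sketch under assumptions: the topic name, job name, region, and the 6-hour cron schedule are placeholders, not values from the question.

```shell
# Hypothetical names; adjust project, location, and schedule to taste.
gcloud pubsub topics create packer-image-check

# Publish a message to the topic every 6 hours.
gcloud scheduler jobs create pubsub packer-image-check-job \
  --location=us-central1 \
  --schedule="0 */6 * * *" \
  --topic=packer-image-check \
  --message-body="check"
```

Whatever subscribes to that topic (a Cloud Function, or a Cloud Build trigger as in the accepted answer) then runs on that cadence without needing an HTTP trigger.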


Solution

  • After looking at my implementation and thinking it over, I realized it was more complex than it needed to be.

    I recreated everything in Cloud Build, removing the Cloud Run job and the Docker container. Now I have a Pub/Sub topic listening for an event, with Cloud Scheduler configured to send it a message once per week.

    Then I just configured a Cloud Build trigger that listens to that Pub/Sub topic, with my cloudbuild.yaml defined inline in the trigger. The YAML downloads the Python script that determines whether new images dropped, as well as all of the Packer files. The process is the same after that: if there are no new images, the build exits; if it detects new images, it updates the Packer variable file and executes Packer. Basically removed an entire layer of code. Cheers!
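    An inline cloudbuild.yaml along these lines could implement the steps above. This is a sketch, not the author's actual file: the bucket path, script name, and Packer file names are all placeholders, and the early exit relies on the check script returning a non-zero exit code when there is nothing to build (a failing step stops the build).

    ```yaml
    steps:
      # Pull the Python check script and the Packer files from the bucket.
      - name: gcr.io/cloud-builders/gsutil
        args: ["cp", "-r", "gs://YOUR_BUCKET/packer-files", "."]

      # Compare native vs. custom image names; exit non-zero if no new images,
      # otherwise rewrite the variable file in place.
      - name: python:3.11
        entrypoint: python
        args: ["packer-files/check_images.py"]

      # Build the new custom images with Packer.
      - name: hashicorp/packer
        args:
          - build
          - -var-file=packer-files/variables.pkrvars.hcl
          - packer-files/windows.pkr.hcl
    timeout: 3600s
    ```

    With the trigger subscribed to the Pub/Sub topic and Cloud Scheduler publishing weekly, the whole pipeline runs with no extra orchestration code.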