I have a repository, hosted in Google Cloud Source Repositories, with the following structure for Google Cloud Functions:
.
├── module.py
├── common
│   ├── module1.py
│   └── module2.py
├── cloudfunction1
│   ├── main.py
│   └── requirements.txt
└── cloudfunction2
    ├── main.py
    └── requirements.txt
Where each of the cloudfunction directories is deployed as a separate cloud function.
What I'd like to do is import modules from either the common directory or from the repository root; however, a sys.path.append('..') approach doesn't appear to work. I presume this is because the Cloud Functions deployment process only includes the files in the directory where main.py is located?
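For illustration, this is roughly what I've been attempting in cloudfunction1/main.py (the specific module name is just an example):

import sys

# Try to make the repository root importable so that common/ can be found
sys.path.append('..')

from common import module1  # fails after the function is deployed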
How can I resolve this?
If you find yourself needing to modify sys.path or otherwise import "beyond the top-level package", that's generally a code smell in Python, indicating that your project isn't structured correctly.
In this case with Cloud Functions, one thing you can do is structure your project like this:
.
├── common
│   ├── module1.py
│   └── module2.py
├── main.py
└── requirements.txt
Where main.py
contains both functions:
from common import module1, module2


def cloudfunction1(request):
    ...


def cloudfunction2(request):
    ...
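For example, a handler could use the shared code like this (the make_greeting helper is made up, since the question doesn't show what's inside common/):

# common/module1.py (hypothetical contents)
def make_greeting(name):
    return f"Hello, {name}!"

# main.py
from common import module1

def cloudfunction1(request):
    # HTTP-triggered Cloud Functions receive a Flask request object
    payload = request.get_json(silent=True) or {}
    return module1.make_greeting(payload.get("name", "world"))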
And you deploy those functions either directly by name:
$ gcloud functions deploy cloudfunction1 --runtime python37 --trigger-http
$ gcloud functions deploy cloudfunction2 --runtime python37 --trigger-http
Or by entrypoint:
$ gcloud functions deploy foo --runtime python37 --entry-point cloudfunction1 --trigger-http
$ gcloud functions deploy bar --runtime python37 --entry-point cloudfunction2 --trigger-http
Note that this has some downsides:
- The requirements.txt file needs to contain all the dependencies for both functions.
- If you make a change to the common directory, you'll need to redeploy both functions.

That said, if your functions are so related that they share common code and often need to be deployed together, a better option might be to make them part of a single App Engine app (this only applies if they both use HTTP triggers).
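As a rough sketch of that App Engine alternative (the route paths and handler internals here are illustrative, not prescriptive), both functions become routes of a single Flask app in main.py, deployed alongside an app.yaml that contains just runtime: python37:

# main.py for a single App Engine (Python 3.7 standard environment) service
from flask import Flask, request

from common import module1, module2

app = Flask(__name__)

@app.route('/cloudfunction1', methods=['POST'])
def cloudfunction1():
    # Hypothetical: delegate to the shared code in common/
    return module1.handle(request.get_json(silent=True))

@app.route('/cloudfunction2', methods=['POST'])
def cloudfunction2():
    return module2.handle(request.get_json(silent=True))

A single gcloud app deploy then ships both endpoints together, which fits the "often deployed together" case described above.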