I would like to be able to reuse code across multiple DigitalOcean Functions (their serverless tool). I do not want to publish a library if I can avoid it.
I have tried a couple of approaches, but am willing to do just about anything to get this to work.
My first attempt was using a single file with multiple Function entry points.
Under packages/tom/ I have the file tomsfns.py:
def t1():
    return { "body": make_msg("T1") }

def t2():
    return { "body": make_msg("T2") }

def make_msg(caller):
    return "Hello " + caller
Dumb as toast, but should work fine. Under packages in the project.yml I have the following:
- name: tom
  functions:
    - name: tomsfns
      main: t1
      binary: false
      runtime: python:3.11
      web: false
    - name: tomsfns
      main: t2
      binary: false
      runtime: python:3.11
      web: false
What actually happens when you deploy that is you get one Function, tom/tomsfns, containing the entirety of the code in that single file. What I hoped for was two Functions off the same codebase, even if they duplicated the code.
My second attempt was to break the desired utility out into its own Python file and then have separate files call it. This is the 'style' that makes the most sense to me, but it doesn't work either.
Both the structure and the entirety of their contents are:
packages
└── rick
    ├── r1.py
    ├── r2.py
    └── rickutils.py

r1.py:

    import make_msg

    def main():
        return { "body": make_msg("R1") }

r2.py:

    import make_msg

    def main():
        return { "body": make_msg("R2") }

rickutils.py:

    def make_msg(caller):
        return "Hello " + caller
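(For reference, the conventional spelling of that import would be `from rickutils import make_msg`, and outside the Functions build this layout works in plain Python. A self-contained sketch of the intended behavior, with the contents of both files inlined so it runs standalone:)

```python
# Contents of rickutils.py: the shared helper.
def make_msg(caller):
    return "Hello " + caller

# Contents of r1.py's entry point; in the split-file layout this line would
# be preceded by:  from rickutils import make_msg
def main():
    return { "body": make_msg("R1") }
```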
(Begin: added after the original post)
The relevant project.yml is:
- name: rick
  functions:
    - name: r1
      runtime: python:3.11
      web: false
    - name: r2
      runtime: python:3.11
      web: false
(End: added after the original post)
This results in what appears to be the right structure and code in the published functions:
Deployed functions ('doctl sls fn get <funcName> --url' for URL):
- rick/r1
- rick/r2
- rick/rickutils
- tom/tomsfns
However, invoking them fails:
stderr: Invalid function: No module named 'make_msg'
I have tried every combination of import statement and file location I can think of to get that method into the Function code but nothing has worked.
Does anybody have any thoughts or examples of this kind of thing?
Thanks.
(as I implemented it)
From the project root:
.
├── lib
│ └── utils.py
├── packages
│ └── sample
│ └── r1
│ ├── .include
│ └── r1.py
└── project.yml
There are two key points:
1. The files listed in .include will be copied into each function at build time.
2. Keeping the shared code in a top-level lib/ directory means you can 'see' the utility methods during development.
Mine looks like this:
../../../lib/utils.py
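With utils.py copied in next to r1.py by .include, the function can use a plain top-level import. A minimal sketch of what r1.py might look like (the helper name make_msg is my own illustration, inlined here in place of the real `from utils import make_msg` so the sketch runs standalone):

```python
# Stand-in for lib/utils.py, which .include copies into the function directory.
# The real r1.py would instead start with:  from utils import make_msg
def make_msg(caller):
    return "Hello " + caller

def main(args):
    # DigitalOcean Functions passes request parameters in the `args` dict;
    # this sketch ignores them.
    return { "body": make_msg("R1") }
```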
NOTE:
I was able to avoid a build.sh for now, but as soon as I start including external libraries I will have to do that in a build.sh script. However, I do not expect that to change any of this.
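For when that day comes, a hedged sketch of what such a build.sh might look like (untested here; the cp line mirrors the .include approach above, and the virtualenv/ vendoring pattern is the one the DigitalOcean build-process docs describe for Python dependencies, so check that page before relying on it):

```shell
#!/bin/bash
set -e

# Copy the shared utilities into the function directory (same effect as .include)
cp ../../../lib/utils.py .

# Vendor third-party dependencies into a virtualenv/ directory next to the
# function code, as described in the build-process documentation.
virtualenv --without-pip virtualenv
pip install -r requirements.txt --target virtualenv/lib/python3.11/site-packages
```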
The documentation page https://docs.digitalocean.com/products/functions/reference/build-process/ mentions an optional lib/ directory and optional build.sh build scripts.
I've (ab?)used the build script to include the library code in the function directory. I'm really not sure if this is the right way, but otherwise I couldn't find the library code anywhere in the runtime environment.
Here is what it looks like:
$ find lib/ packages/
lib/
lib/ricklib.py
packages/
packages/rick
packages/rick/r1
packages/rick/r1/ricklib.py
packages/rick/r1/build.sh
packages/rick/r1/__deployer__.zip
packages/rick/r1/__main__.py
packages/rick/r2
packages/rick/r2/__main__.py
$ cat packages/rick/r1/build.sh
cp ../../../lib/ricklib.py .
$ unzip -l packages/rick/r1/__deployer__.zip
Archive: packages/rick/r1/__deployer__.zip
Length Date Time Name
--------- ---------- ----- ----
1295 04-12-2024 22:10 __main__.py
19 04-12-2024 22:10 ricklib.py
--------- -------
1314 2 files
Note that __deployer__.zip and ricklib.py are in the function directory because of the build process; they shouldn't be there in a clean repository.
Note that using a remote build with something like `doctl serverless deploy . --remote-build` will leave the working directory clean (and the resulting function will still work as expected).