Tags: amazon-web-services, numpy, amazon-s3, aws-lambda, statsmodels

Is there a way to get around the 250 MB limit for an AWS lambda function?


I'm working on a Lambda function in AWS and tried to use Layers to load my dependencies (statsmodels, scikit-learn, pyLDAvis, pandas, numpy, nltk, matplotlib, joblib, gensim, and eli5). I'm not able to add them because I get an error saying that the maximum allowed size of the code and layers together is 262144000 bytes (250 MB). I managed to cut the package down to 264 MB, but that's still too large, and even if it were allowed, I'm not sure it would work properly.

Is there any way to add more space for the dependencies? Or, alternatively, is there a way for me to delete some of the subdirectories within the packages? For example, I only need the distributions from statsmodels, so could I delete everything else?


Solution

  • Is there any way to add more space for the dependencies?

    If you package your Lambda function as a container image, you get up to 10 GB for your code and dependencies. At runtime, your function still has only 512 MB of /tmp storage by default, though.
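
    As a minimal sketch of the container-image approach, a Dockerfile based on the AWS-provided Python base image might look like this (the Python version tag, `app.py` module, and `handler` function name are assumptions, not part of the original question):

    ```dockerfile
    # AWS-provided Python base image for Lambda (version tag is an assumption)
    FROM public.ecr.aws/lambda/python:3.11

    # Install the heavy dependencies into the Lambda task root
    COPY requirements.txt .
    RUN pip install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

    # Copy the function code (module name is an assumption)
    COPY app.py ${LAMBDA_TASK_ROOT}

    # Entry point, in module.function form
    CMD ["app.handler"]
    ```

    You would then build the image, push it to an Amazon ECR repository, and create the Lambda function with the image package type (for example, `aws lambda create-function --package-type Image ...`), rather than uploading a zip archive.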