heroku, redis, celery, heroku-redis

Heroku Redis: used memory > 'maxmemory' with a 20MB file on hobby:dev, where the limit should be 25MB


I am trying to upload a file on my Heroku website using Celery with Redis as the broker. The file is a .exe of about 20MB. Heroku's hobby:dev plan says the maximum memory is 25MB, yet when I push the file through Celery (turning it from bytes to base64, decoding it, and sending it to the task function) I get the error kombu.exceptions.OperationalError: OOM command not allowed when used memory > 'maxmemory'. Keep in mind that a smaller file, e.g. 5MB, uploads fine, but 20MB does not. I am using Python with the Flask framework.


Solution

  • There are two ways to store files in a DB (Redis is just an in-memory DB). You can either store the blob itself in the DB (only sensible for small files, say a few KBs), or you can store the file elsewhere (e.g. on disk) and keep only a pointer to it in the DB.


    So for your case, store the file on disk and place only the file pointer in the DB.

    The catch here is that Heroku has an ephemeral file system that gets erased at least once every 24 hours, and whenever you deploy a new version of the app.

    So you'll have to do something like this (a rough code sketch follows the list):

    1. Write a small function that stores the uploaded file on the local disk (this is temporary storage) and returns the path to the file.
    2. Enqueue a Celery task with that file path, i.e. the parameter to the Celery task is the file path, not a serialized 20MB blob.
    3. The Celery worker process picks up the task you just enqueued when it is free and executes it.
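
    A minimal sketch of those steps, assuming a Flask app with a Celery instance pointed at the Heroku Redis broker. The route, the process_exe task name, and the temp-file handling are illustrative assumptions, not the poster's actual code (REDIS_URL is the standard Heroku Redis config var):

    ```python
    import os
    import tempfile

    from celery import Celery
    from flask import Flask, request

    app = Flask(__name__)

    # Heroku Redis exposes its connection string via the REDIS_URL config var.
    celery = Celery(__name__, broker=os.environ.get("REDIS_URL", "redis://localhost:6379/0"))


    @celery.task
    def process_exe(file_path):
        # Only the short path string travelled through Redis; the worker
        # reads the actual bytes from the local (ephemeral) disk.
        with open(file_path, "rb") as f:
            data = f.read()
        # ... process `data` here ...
        os.remove(file_path)  # clean up the temporary copy


    @app.route("/upload", methods=["POST"])
    def upload():
        uploaded = request.files["file"]
        # Step 1: store the upload on local disk (temporary storage on Heroku).
        fd, path = tempfile.mkstemp(suffix=".exe")
        with os.fdopen(fd, "wb") as tmp:
            uploaded.save(tmp)
        # Step 2: enqueue the task with the file path, not a 20MB blob.
        process_exe.delay(path)
        return "queued", 202
    ```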

    If you need to access the file later, then, because the local Heroku disk is only temporary storage, you'll have to move the file to some permanent storage like AWS S3.

    (The reason we go through all these hoops instead of placing the file directly in S3 is that access to the local disk is fast, while S3 storage may live in a server farm in some other location, so saving the file there takes time. Your web process might appear slow or stuck if you try to write the file to S3 in your main process.)
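
    Moving the file to S3 from the worker could look roughly like the sketch below; boto3, the bucket name, and the key prefix are assumptions for illustration, and AWS credentials are read from the usual environment variables:

    ```python
    import os

    import boto3
    from celery import Celery

    celery = Celery(__name__, broker=os.environ.get("REDIS_URL", "redis://localhost:6379/0"))
    s3 = boto3.client("s3")  # picks up AWS credentials from the environment


    @celery.task
    def archive_file(file_path):
        # Upload the temporary local copy to permanent storage, then delete it from disk.
        key = "uploads/" + os.path.basename(file_path)
        s3.upload_file(file_path, "my-example-bucket", key)
        os.remove(file_path)
    ```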