python · file-upload · python-requests · large-file-upload

python requests upload large file with additional data


I've been looking around for ways to upload a large file with additional data, but there doesn't seem to be any solution. To upload a file, I've been using this code, and it's been working fine with small files:

with open("my_file.csv", "rb") as f:
    files = {"documents": ("my_file.csv", f, "application/octet-stream")}
    data = {"composite": "NONE"}
    headers = {"Prefer": "respond-async"}
    resp = session.post("my/url", headers=headers, data=data, files=files)

The problem is that this code loads the whole file into memory before sending, so I run into a MemoryError when uploading large files. I've looked around, and the way to stream the data is to pass the file object directly:

resp = session.post("my/url", headers=headers, data=f)

but I also need to include {"composite": "NONE"} in the data; otherwise, the server won't recognize the file.
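
For reference, the streaming version I mean looks roughly like this; it sends the raw file body in chunks without loading it into memory, but there is nowhere to attach the extra form field (URL and headers are the same placeholders as above):

import requests

session = requests.Session()
headers = {"Prefer": "respond-async"}

# Passing the open file object as `data` makes requests stream the request
# body instead of reading the whole file into memory first.
with open("my_file.csv", "rb") as f:
    resp = session.post("my/url", headers=headers, data=f)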


Solution

  • You can use the requests-toolbelt library to do this:

    import requests
    from requests_toolbelt.multipart import encoder
    
    session = requests.Session()
    with open('my_file.csv', 'rb') as f:
        # MultipartEncoder builds the multipart/form-data body lazily,
        # reading the file in chunks instead of loading it all into memory.
        form = encoder.MultipartEncoder({
            "documents": ("my_file.csv", f, "application/octet-stream"),
            "composite": "NONE",
        })
        # The encoder generates the multipart boundary, so take the
        # Content-Type header from it.
        headers = {"Prefer": "respond-async", "Content-Type": form.content_type}
        resp = session.post(url, headers=headers, data=form)
    session.close()
    

    This will cause requests to stream the multipart/form-data upload for you.
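
    If you also need progress reporting for a large upload, the same library has a MultipartEncoderMonitor that wraps the encoder and invokes a callback as bytes are sent. A minimal sketch, using the same url placeholder (the callback body here is just illustrative):

    import requests
    from requests_toolbelt.multipart import encoder

    def progress(monitor):
        # monitor.bytes_read is the number of bytes sent from the form so far.
        print(f"Uploaded {monitor.bytes_read} bytes")

    session = requests.Session()
    with open('my_file.csv', 'rb') as f:
        form = encoder.MultipartEncoder({
            "documents": ("my_file.csv", f, "application/octet-stream"),
            "composite": "NONE",
        })
        monitor = encoder.MultipartEncoderMonitor(form, progress)
        headers = {"Prefer": "respond-async", "Content-Type": monitor.content_type}
        resp = session.post(url, headers=headers, data=monitor)
    session.close()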