Tags: python, nginx, tornado

Transferring big files to a limited-memory server


I want my web service to accept large file transfers from customers. To do this, I am planning to put nginx in front of Tornado so the server's limited memory isn't exhausted during file uploads. Is this a good plan? Or should I use some other framework/protocol to transfer large files from users to my server?


Solution

  • Tornado needs some work before it can stream very large uploads; see issue 231. I'd suggest Nginx's HttpUpload module: Nginx writes the uploaded file into a server-side temp file, then notifies your application with the file's metadata so you can decide what to do with it. Your backend never has to hold the file body in memory.
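
    To make the division of labor concrete, here is a minimal sketch of an nginx-upload-module configuration. The paths, the backend port, and the `/upload` location are illustrative assumptions, not values from the question:

    ```nginx
    location /upload {
        # Buffer the incoming file to disk instead of holding it in memory
        upload_store /var/tmp/nginx_uploads;

        # Once the upload completes, hand the request off to the backend
        upload_pass @tornado;

        # Forward the original filename and the temp-file path as form fields
        upload_set_form_field $upload_field_name.name "$upload_file_name";
        upload_set_form_field $upload_field_name.path "$upload_tmp_path";

        # Delete the temp file if the backend responds with an error status
        upload_cleanup 400 404 499 500-505;
    }

    location @tornado {
        # Tornado receives only small form fields, never the file body
        proxy_pass http://127.0.0.1:8888;
    }
    ```

    With this setup your Tornado handler just reads the `.path` form field and moves or processes the temp file, so memory usage stays flat regardless of upload size.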