I'm trying to upload files from a submitted form that has multiple image files under different input names.
For example, assuming a multipart form with text inputs and file inputs, where it's not practical to group all the file inputs under the same name (e.g. images[]):
<form ...>
some text inputs...
<input type='file' name='logo'/>
<input type='file' name='image-1'/>
<input type='file' name='image-2'/>
</form>
I want to create an Express/Sails middleware to handle the uploads, so I can access them directly in my controllers.
If I do this (assuming the upload method on req.file() is promisified):
async function(req, res, next){
  const maxTimeToBuffer = 9500
  const dirname = '/some-upload-path'
  const uploads = [
    await req.file('logo').upload(dirname, maxTimeToBuffer),
    await req.file('image-1').upload(dirname, maxTimeToBuffer),
    await req.file('image-2').upload(dirname, maxTimeToBuffer)
  ]
  // attaching the uploads to the request body ...
  next()
}
This will throw an error:
EMAXBUFFER: An Upstream (image-1) timed out before it was plugged into a receiver. It was still unused after waiting 9500ms. You can configure this timeout by changing the maxTimeToBuffer option.
It doesn't matter if I increase maxTimeToBuffer to 30000, even on the skipper module itself.
Processing each file sequentially throws this error no matter what, presumably because while the first upload is being awaited the remaining upstreams sit unconsumed until they hit the buffer timeout. Processing the files in parallel works, as long as I know the input field names in advance:
async function(req, res, next){
  const maxTimeToBuffer = 9500
  const dirname = '/some-upload-path'
  const uploads = await Promise.all([
    req.file('logo').upload(dirname, maxTimeToBuffer),
    req.file('image-1').upload(dirname, maxTimeToBuffer),
    req.file('image-2').upload(dirname, maxTimeToBuffer)
  ])
  // attaching the uploads to the request body ...
  next()
}
Now the question is: how do I achieve the same thing as in the last snippet, but without knowing the input field names in advance? Something like this:
async function(req, res, next){
  const maxTimeToBuffer = 9500
  const dirname = '/some-upload-path'
  const uploads = await Promise.all(
    req._fileparser.upstreams.map((stream) => stream.upload(dirname, maxTimeToBuffer))
  )
  // attaching the uploads to the request body ...
  next()
}
The problem is that req._fileparser.upstreams gets updated every time the parser receives a new stream, and a new stream only arrives after the previous file has been fully received (the multipart body is parsed sequentially). But if I wait for the upstreams array to be complete before starting any uploads, I get a timeout error too.
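To make the timing visible, here is a minimal diagnostic sketch (not a fix; it only assumes the middleware runs while skipper is still parsing the form, and it reads the same req._fileparser internals used in the snippets here):

async function(req, res, next){
  // Entries appear in `upstreams` one by one as each file finishes
  // arriving, and `closed` flips to true only once the whole form
  // has been parsed.
  const intervalId = setInterval(() => {
    const {upstreams, closed} = req._fileparser
    console.log('upstreams so far:', upstreams.length, '- form closed:', closed)
    if(closed) clearInterval(intervalId)
  }, 100)
  next()
}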
Is there any workaround for this?
I know there is a lot to digest; if you need further clarification, let me know. Any ideas are appreciated.
OK, after a lot of banging my head against the wall, I came up with this. We start the uploads without waiting for them, and each time a new stream is added we start its upload too. Then an interval checks every 100ms whether skipper has marked the parser/form as closed; once that's true we can safely wait for all the streams and resolve the promise, or reject it if the timeout fires first. I've simplified my code, since mine does a lot of extra parsing; I hope it helps somebody else.
async function(req, res, next){
  const maxTimeToBuffer = 9500
  const dirname = '/some-upload-path'
  const uploads = await new Promise((resolve, reject) => {
    const TIMEOUT = 20000
    const {upstreams} = req._fileparser
    // Start uploading the streams that have already arrived, without awaiting them
    const tasks = upstreams.map((stream) => stream.upload(dirname, maxTimeToBuffer))
    // Proxy the original push method, so every stream added later
    // starts its upload the moment skipper pushes it onto the array
    upstreams.push = (stream) => {
      tasks.push(stream.upload(dirname, maxTimeToBuffer))
      Array.prototype.push.call(upstreams, stream)
    }
    const startTime = Date.now()
    const intervalId = setInterval(async function(){
      if(req._fileparser.closed){
        // The form is fully parsed, so no more streams will arrive
        // and it is now safe to wait for every pending upload
        clearInterval(intervalId)
        try {
          const fileObjects = await Promise.all(tasks)
          resolve(fileObjects)
        } catch(err){
          reject(err)
        }
      }
      else if(Date.now() - startTime > TIMEOUT){
        clearInterval(intervalId)
        reject(new Error('Timeout triggered before form closed'))
      }
    }, 100)
  })
  // attaching the uploads to the request body ...
  next()
}
I don't think skipper should behave like this and force this kind of hack, so maybe I'm missing something; if I am, let me know.
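For completeness, this is roughly how the middleware above could be wired into a plain Express app. The uploadAll name and the req.uploads property are placeholders of mine (the snippets above leave the middleware anonymous and only hint at attaching the uploads to the request), and skipper has to be mounted first so that req.file() and req._fileparser exist:

const express = require('express')
const skipper = require('skipper')

const app = express()
app.use(skipper()) // skipper provides req.file() and req._fileparser

// uploadAll is the middleware from the previous snippet, assumed to
// end with `req.uploads = uploads` before calling next()
app.post('/profile', uploadAll, (req, res) => {
  // Every uploaded file descriptor, regardless of its input field name
  res.json(req.uploads)
})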