I'm having trouble reading a file via FileReader and properly uploading it to S3 via PutObject without the file being corrupted.
I've been using MongoDB Stitch as my app's serverless platform, and it comes with a built-in S3 service provider. Text files upload correctly, but if I upload any binary file (Excel, PDF, images, etc.), download it straight from my S3 bucket, and open it in its respective program, my computer reports it as corrupted.
I've looked at both files, the original on my computer and the one uploaded to S3, and they're very similar, but it seems some data was changed in the upload.
I'm using FileReader to open a read stream and FileReader.readAsBinaryString() to read the file. The ContentType in the headers is set to the original file's MIME type via File.type (e.g. text/plain for .txt, image/jpeg for .jpg).
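One caveat worth noting (my own addition, not something Stitch requires): File.type can come back as an empty string when the browser doesn't recognize the extension, so a fallback MIME type keeps the ContentType header from being empty:

// File.type may be "" for unrecognized extensions; fall back to a generic
// binary MIME type so ContentType is never empty. (Illustrative sketch.)
const contentType = file.type || 'application/octet-stream'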
I've seen that the W3C recommends using FileReader.readAsArrayBuffer() instead of readAsBinaryString(), but S3 reports an error when trying to upload the resulting data: value for field "Body" should be of type "io.ReadSeeker".
I've also tried FileReader.readAsDataURL(), but I don't know of a way to convert the base64 URL for any file type back to its original format. The only examples I can find all involve using base64 to encode/decode images.
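For reference, decoding a data URL back to raw bytes works the same way for any MIME type, not just images; here's a minimal sketch using atob() (the helper name is mine, not from any SDK):

// Decode a data URL ("data:<mime>;base64,<payload>") back into raw bytes.
// Works for any file type, not just images; helper name is illustrative.
const dataUrlToBytes = (dataUrl) => {
  const base64 = dataUrl.slice(dataUrl.indexOf(',') + 1) // strip "data:...;base64," prefix
  const binaryString = atob(base64)                      // base64 -> binary string
  const bytes = new Uint8Array(binaryString.length)
  for (let i = 0; i < binaryString.length; i++) {
    bytes[i] = binaryString.charCodeAt(i)                // one byte per character code
  }
  return bytes
}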
const readFile = (file) => {
  return new Promise((resolve, reject) => {
    const fileReader = new FileReader()
    fileReader.onload = (event) => {
      resolve(event.target.result)
    }
    fileReader.onerror = reject // surface read failures instead of hanging
    fileReader.readAsBinaryString(file)
  })
}
async s3Upload(file) {
  const fileData = await readFile(file)
  const putObjectArgs = {
    Body: fileData,
    Key: file.name,
    ContentType: file.type,
    ACL: 'public-read',
    Bucket: 'BUCKET_NAME',
  }
  // call the Stitch services to upload
}
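For context, here's roughly what that elided upload call looks like through Stitch's generic AWS service. This is a sketch, not my exact code: the service name 'aws', the region, and stitchClient (the initialized Stitch app client) are placeholders of mine.

// Sketch of the Stitch generic AWS service call (mongodb-stitch-browser-services-aws).
// 'aws' is whatever service name was configured in the Stitch UI; region is a placeholder.
import { AwsServiceClient, AwsRequest } from 'mongodb-stitch-browser-services-aws'

const aws = stitchClient.getServiceClient(AwsServiceClient.factory, 'aws')
const request = new AwsRequest.Builder()
  .withService('s3')
  .withAction('PutObject')
  .withRegion('us-east-1')
  .withArgs(putObjectArgs)
  .build()
await aws.execute(request)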
Is there a way to upload any type of file to S3 exactly as it's stored on my filesystem? Or is there a generic method to encode/decode any file type to/from base64?
I found FilePond and similar libraries, and since FilePond supports any type of server upload, I decided to use it rather than writing the file upload logic myself.
UPDATE:
Even FilePond makes you define your own AWS upload logic... But I finally figured it out.
When using MongoDB Stitch, you have to convert the ArrayBuffer to a Uint8Array, then to a BSON Binary type using their SDK. Here's the code:
// BSON is exported by the Stitch SDK (e.g. mongodb-stitch-browser-sdk)
const reader = new FileReader()
reader.onload = (e) => {
  const fileData = new Uint8Array(reader.result) // raw bytes from the ArrayBuffer
  const fileBson = new BSON.Binary(fileData)     // wrap the bytes for the Stitch service
  // upload to S3
  // ...
}
reader.readAsArrayBuffer(file)
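To round out the fix, the elided upload step then passes fileBson as the Body. Here's a sketch under the same assumptions as the earlier one (placeholder service name, region, and bucket):

// Same hypothetical Stitch AWS service setup as before; the only change is
// that Body is now the BSON.Binary wrapper, which Stitch serializes correctly.
const request = new AwsRequest.Builder()
  .withService('s3')
  .withAction('PutObject')
  .withRegion('us-east-1')
  .withArgs({
    Bucket: 'BUCKET_NAME',
    Key: file.name,
    ACL: 'public-read',
    ContentType: file.type,
    Body: fileBson, // BSON.Binary payload instead of a raw binary string
  })
  .build()
await aws.execute(request)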