amazon-s3, next.js, blob, buffer, formidable

formidable: formData received and parsed, but the file uploaded to the S3 bucket has 0 bytes


I may need some help here. After receiving this file from the client using the code below and sending it to S3, the uploaded file has 0 bytes. I read about a few approaches using Buffer and ArrayBuffer; when I tried them, all I got was `field.ArrayBuffer() is not a function`.

The `{ file }` object received is this:

```
[
  PersistentFile {
    _events: [Object: null prototype] { error: [Function (anonymous)] },
    _eventsCount: 1,
    _maxListeners: undefined,
    lastModifiedDate: 2023-11-27T19:51:36.822Z,
    filepath: 'C:\\Users\\evega\\AppData\\Local\\Temp\\b400159781466e32b7810ac20',
    newFilename: 'b400159781466e32b7810ac20',
    originalFilename: '87085734.png',
    mimetype: 'image/png',
    hashAlgorithm: false,
    size: 704634,
    _writeStream: WriteStream {
      fd: 5,
      path: 'C:\\Users\\evega\\AppData\\Local\\Temp\\b400159781466e32b7810ac20',
      flags: 'w',
      mode: 438,
      start: undefined,
      pos: undefined,
      bytesWritten: 704634,
      _writableState: [WritableState],
      _events: [Object: null prototype],
      _eventsCount: 1,
      _maxListeners: undefined,
      [Symbol(kFs)]: [Object],
      [Symbol(kIsPerformingIO)]: false,
      [Symbol(kCapture)]: false
    },
    hash: null,
    [Symbol(kCapture)]: false
  }
]
```


This is the code running on the server. It has been a nightmare trying to get the file into a Buffer so I can stream it into the S3 bucket. Not sure what else to do.

```
import formidable from 'formidable';
import { getServerSession } from 'next-auth/next';
// Adjust this path to wherever your NextAuth options are defined
import { authOptions } from './auth/[...nextauth]';

export default async function AddBlogPost(req, res) {

    const session = await getServerSession(req, res, authOptions);

    if (session && req.method === 'POST') {
        const form = formidable({ multiples: true });
        form.parse(req, async (err, fields, files) => {
            if (err) {
                console.error('Error parsing form data:', err);
                res.status(500).json({ error: 'Internal Server Error' });
                return;
            }
            const { file } = files,
                  { title, paragraph, user } = fields,
                  fileName = file[0].originalFilename,
                  bucketName = process.env.AWS_S3_BUCKET_NAME,
                  key = `appluex-blogs/${fileName}`; // set the desired key for the S3 object

            // Call the handler to stream the file into the S3 bucket
            try {
                await uploadImageToS3(file, bucketName, key);
                res.status(200).json({ message: 'Image uploaded successfully' });
            } catch (uploadError) {
                console.error('Error uploading image to S3:', uploadError);
                return res.status(500).json({ error: 'Internal Server Error' });
            }
        });
    }
}
```
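One note, assuming this is a Pages Router API route: Next.js parses request bodies by default, which consumes the multipart stream before formidable can read it, so the route also needs the built-in body parser turned off:

```
// Let formidable receive the raw multipart stream
export const config = {
    api: { bodyParser: false },
};
```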


The file seems to arrive fine from the frontend client, since its size is 704634 bytes. Still, I think I'm missing some kind of buffer implementation; I have tried multiple things, and all I got was a new error each time.


Solution

  • I found what I was missing. I had to read the image into a Buffer before sending it to uploadImageToS3. I solved it by adding the code below:

```
import fs from 'fs';

// Get the file into a Buffer (formidable already wrote it to disk at filepath)
const fileBuffer = fs.readFileSync(file[0].filepath);

// Create a Uint8Array from the buffer
const uint8Array = new Uint8Array(fileBuffer);
```
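For completeness, here is a minimal sketch of what the `uploadImageToS3` side could look like with the AWS SDK v3, passing the bytes as the object's `Body`. The question never shows that function, so the client setup and the signature below are assumptions:

```
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

// Region and credentials are assumed to come from the environment
const s3 = new S3Client({ region: process.env.AWS_REGION });

// Hypothetical signature: the file's bytes (Buffer or Uint8Array),
// the bucket name, the object key, and the MIME type
async function uploadImageToS3(body, bucketName, key, contentType) {
    await s3.send(new PutObjectCommand({
        Bucket: bucketName,
        Key: key,
        Body: body,               // the actual bytes, not the PersistentFile object
        ContentType: contentType, // e.g. 'image/png' from file[0].mimetype
    }));
}
```

For larger uploads, `fs.createReadStream(file[0].filepath)` can also be passed as `Body`, which avoids reading the whole file into memory.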