javascript node.js amazon-s3 coffeescript

Uploading base64 encoded Image to Amazon S3 via Node.js


Yesterday I did a late-night coding session and created a small Node.js app (well, actually CoffeeScript, but CoffeeScript is just JavaScript, so let's say JS).

What's the goal:

  1. client sends a canvas datauri (png) to server (via socket.io)
  2. server uploads image to amazon s3

Step 1 is done.

The server now has a string like:

data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAADICAYAAACt...

My question is: what are my next steps to "stream"/upload this data to Amazon S3 and create an actual image there?

knox (https://github.com/LearnBoost/knox) seems like an awesome lib for PUTting something to S3, but what I'm missing is the glue between the base64-encoded image string and the actual upload action.

Any ideas, pointers, and feedback welcome.
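To make the "glue" concrete: the missing step is decoding the base64 payload of the data URI into a Buffer, which knox (or any S3 client) can PUT directly. A minimal sketch, assuming a data URI in the same shape as above (the `dataUriToBuffer` helper name and the tiny payload are mine, for illustration):

```javascript
// Decode a base64 image data URI into a Buffer.
function dataUriToBuffer(dataUri) {
  // Strip the "data:image/png;base64," prefix, keep only the payload.
  var base64 = dataUri.replace(/^data:image\/\w+;base64,/, '');
  return Buffer.from(base64, 'base64');
}

// Hypothetical payload; any canvas.toDataURL() output has this shape.
var uri = 'data:image/png;base64,aGVsbG8=';
var buf = dataUriToBuffer(uri);

// With knox, the Buffer can then be uploaded via putBuffer, e.g.:
//   client.putBuffer(buf, '/image.png',
//     { 'Content-Type': 'image/png' },
//     function (err, res) { /* res.statusCode === 200 on success */ });
```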


Solution

  • For people who are still struggling with this issue, here is the approach I used with the native aws-sdk:

    // Load credentials and region from a local JSON file,
    // then create an S3 client bound to the target bucket.
    var AWS = require('aws-sdk');
    AWS.config.loadFromPath('./s3_config.json');
    var s3Bucket = new AWS.S3( { params: {Bucket: 'myBucket'} } );
    

    Inside your route handler (ContentType should be set to the content type of the image file):

      // Strip the data URI prefix and decode the base64 payload into a Buffer
      var buf = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ''), 'base64');
      var data = {
        Key: req.body.userId,
        Body: buf,
        ContentEncoding: 'base64',
        ContentType: 'image/jpeg'
      };
      s3Bucket.putObject(data, function(err, data){
          if (err) {
            console.log('Error uploading data:', err);
          } else {
            console.log('Successfully uploaded the image!');
          }
      });
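    Since the content type is hardcoded as `image/jpeg` above, a PNG data URI would be stored with the wrong ContentType. A small sketch that derives both the type and the body from the data URI itself (the `parseDataUri` helper name is my own, not part of aws-sdk):

    ```javascript
    // Extract the MIME type and decoded body from a base64 image data URI.
    function parseDataUri(dataUri) {
      var match = dataUri.match(/^data:(image\/\w+);base64,(.+)$/);
      if (!match) throw new Error('Not a base64 image data URI');
      return {
        contentType: match[1],                 // e.g. "image/png"
        body: Buffer.from(match[2], 'base64')  // raw image bytes
      };
    }

    var parsed = parseDataUri('data:image/png;base64,aVZCT1J3MEtH');
    // parsed.contentType and parsed.body can then be passed to putObject:
    //   { Key: req.body.userId, Body: parsed.body, ContentType: parsed.contentType }
    ```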
    

    The s3_config.json file:

    {
      "accessKeyId":"xxxxxxxxxxxxxxxx",
      "secretAccessKey":"xxxxxxxxxxxxxx",
      "region":"us-east-1"
    }