118

In a Node.js project I am attempting to get data back from S3.

When I use getSignedURL, everything works:

aws.getSignedUrl('getObject', params, function(err, url){
    console.log(url); 
}); 

My params are:

var params = {
              Bucket: "test-aws-imagery", 
              Key: "TILES/Level4/A3_B3_C2/A5_B67_C59_Tiles.par"

If I take the URL output to the console and paste it in a web browser, it downloads the file I need.

However, if I try to use getObject I get all sorts of odd behavior. I believe I am just using it incorrectly. This is what I've tried:

aws.getObject(params, function(err, data){
    console.log(data); 
    console.log(err); 
}); 

Outputs:

{ 
  AcceptRanges: 'bytes',
  LastModified: 'Wed, 06 Apr 2016 20:04:02 GMT',
  ContentLength: '1602862',
  ETag: '"9826l1e5725fbd52l88ge3f5v0c123a4"',
  ContentType: 'application/octet-stream',
  Metadata: {},
  Body: <Buffer 01 00 00 00  ... > }

  null

So it appears that this is working properly. However, when I put a breakpoint on one of the console.logs, my IDE (NetBeans) throws an error and refuses to show the value of data. While this could just be the IDE, I decided to try other ways to use getObject.

aws.getObject(params).on('httpData', function(chunk){
    console.log(chunk); 
}).on('httpDone', function(data){
    console.log(data); 
});

This does not output anything. Putting a breakpoint in shows that the code never reaches either of the console.logs. I also tried:

aws.getObject(params).on('success', function(data){
    console.log(data); 
});

However, this also does not output anything and placing a breakpoint shows that the console.log is never reached.

What am I doing wrong?

Sara Fuerst

10 Answers

254

When doing a getObject() from the S3 API, per the docs, the contents of your file are located in the Body property, which you can see in your sample output. You should have code that looks something like the following:

const aws = require('aws-sdk');
const s3 = new aws.S3(); // Pass in opts to S3 if necessary

var getParams = {
    Bucket: 'abc', // your bucket name,
    Key: 'abc.txt' // path to the object you're looking for
}

s3.getObject(getParams, function(err, data) {
    // Handle any error and exit
    if (err)
        return err;

    // No error happened
    // Convert Body from a Buffer to a String
    let objectData = data.Body.toString('utf-8'); // Use the encoding necessary
});

You may not need to create a new Buffer from the data.Body object, but if you do, you can use the sample above to achieve that.

@aws-sdk/client-s3 (2021 Update)

Since I wrote this answer in 2016, Amazon has released a new JavaScript SDK, @aws-sdk/client-s3. This new version improves on the original getObject() by always returning a promise, instead of opting in by chaining .promise() to getObject(). In addition, response.Body is no longer a Buffer but one of Readable | ReadableStream | Blob, which changes the handling of the response body a bit. This should be more performant, since we can stream the data returned instead of holding all of the contents in memory, with the trade-off that it is a bit more verbose to implement.

In the example below, the response.Body data is streamed into an array and then returned as a string. This is the equivalent of my original answer. Alternatively, response.Body could be piped via stream.Readable.pipe() to an HTTP response, a file, or any other kind of stream.Writable for further usage; this is the more performant approach when getting large objects (see the sketch just below).
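For example, here is a minimal sketch of streaming an object straight to disk; the bucket name, key, and local path are placeholders, and stream/promises requires Node 15+.

const { GetObjectCommand, S3Client } = require('@aws-sdk/client-s3')
const { createWriteStream } = require('fs')
const { pipeline } = require('stream/promises')

const client = new S3Client() // Pass in opts to S3 if necessary

// Stream the object body straight to a file instead of buffering it in memory
async function downloadToFile (Bucket, Key, localPath) {
  const response = await client.send(new GetObjectCommand({ Bucket, Key }))
  // pipeline() wires the Readable to the Writable and resolves when done
  await pipeline(response.Body, createWriteStream(localPath))
}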

If you wanted a Buffer, like the original getObject() response, this can be done by wrapping responseDataChunks in Buffer.concat() instead of using Array#join() (a one-line swap, shown after the example below); this is useful when interacting with binary data. Note that since Array#join() returns a string, each Buffer instance in responseDataChunks has Buffer.toString() called on it implicitly, using the default encoding of utf8.

const { GetObjectCommand, S3Client } = require('@aws-sdk/client-s3')
const client = new S3Client() // Pass in opts to S3 if necessary

async function getObject (Bucket, Key) {
  const getObjectCommand = new GetObjectCommand({ Bucket, Key })
  const response = await client.send(getObjectCommand)

  return new Promise((resolve, reject) => {
    // Store all of the data chunks returned from the response data stream
    // into an array, then use Array#join() to return the contents as a String
    const responseDataChunks = []

    // Handle an error while streaming the response body
    response.Body.once('error', err => reject(err))

    // Attach a 'data' listener to add the chunks of data to our array
    // Each chunk is a Buffer instance
    response.Body.on('data', chunk => responseDataChunks.push(chunk))

    // Once the stream has no more data, join the chunks into a string and return it
    response.Body.once('end', () => resolve(responseDataChunks.join('')))
  })
}
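As noted above, the only change needed to resolve with a Buffer instead of a string is the 'end' handler; for example:

// Resolve with a single Buffer, useful for binary data
response.Body.once('end', () => resolve(Buffer.concat(responseDataChunks)))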


peteb
  • So the data coming back does seem to be a `Buffer` object, which I'm not familiar with. Theoretically I could use `new Buffer(data.Body).toString('utf-8');` to get to the content? – Sara Fuerst Apr 29 '16 at 17:49
  • 5
    If the content is already a Buffer, no need to create a new Buffer from that. Simply just do `data.Body.toString('utf-8');`. A Buffer is a representation of Binary data in node, if you need more info [here are the docs](https://nodejs.org/api/buffer.html) – peteb Apr 29 '16 at 17:49
  • Thank you! This would be so much easier to work through if I could actually get a breakpoint working. For whatever reason (and this is the only place I've had this happen), within the function Netbeans refuses to show any variable values – Sara Fuerst Apr 29 '16 at 17:54
  • 4
    This works for text, but is there a generic solution for handling text files as well as .png, .jpg, etc.? – carter Feb 23 '18 at 21:12
  • 4
    @carter This is a general solution. Just change the `.toString('utf8')` when accessing `data.Body` to `.toString('binary')` if you want a binary string for images. If the `Buffer` in `data.Body` doesn't need to be converted to a String like in this question, then you can just return `data.Body` and work with the `Buffer` directly. – peteb Feb 23 '18 at 21:17
  • 8
    "Convert Body from a Buffer to a String"... would be great if the AWS docs made this a bit more clear. I'm getting pretty fed up wrestling with AWS. – osullic Jun 22 '20 at 23:22
  • 18
    It is absolutely insane that I can't find any documentation within AWS docs or on their NPM repo that just has this code. Like, did they really have to make it that hard? They couldn't just have a "for dummies" method (or even a stupid command class) that just gets the file contents b64 encoded or something with the rest of the object data? Like it grabs an absurd amount of garbage metadata that I, and I imagine most people, will NEVER NEED. But it can't return the object data itself? – Ethan Standel Aug 26 '21 at 05:00
  • 2
    @peteb, your 2021 Update saved me after two days of being stuck on this exact problem. I ended up using a Buffer.concat() as per your suggestion to get a Buffer (which I later displayed as a Base64 image following this post: https://stackoverflow.com/questions/57699628/how-to-convert-a-file-buffer-to-img-tag-src). Thanks! – SSF Nov 11 '21 at 08:33
  • @peteb I am getting some random set of symbols from the `responseDataChunks`. Any pointers for me? – Divyanshu Juneja Apr 28 '22 at 18:48
55

Based on the answer by @peteb, but using Promises and Async/Await:

const AWS = require('aws-sdk');

const s3 = new AWS.S3();

async function getObject (bucket, objectKey) {
  try {
    const params = {
      Bucket: bucket,
      Key: objectKey 
    }

    const data = await s3.getObject(params).promise();

    return data.Body.toString('utf-8');
  } catch (e) {
    throw new Error(`Could not retrieve file from S3: ${e.message}`)
  }
}

// To retrieve you need to use `await getObject()` or `getObject().then()`
const myObject = await getObject('my-bucket', 'path/to/the/object.txt');
Arian Acosta
9

For someone looking for a NestJS TypeScript version of the above:

    /**
     * to fetch a signed URL of a file
     * @param key key of the file to be fetched
     * @param bucket name of the bucket containing the file
     */
    public getFileUrl(key: string, bucket?: string): Promise<string> {
        var scopeBucket: string = bucket ? bucket : this.defaultBucket;
        var params: any = {
            Bucket: scopeBucket,
            Key: key,
            Expires: signatureTimeout  // const value: 30
        };
        return this.account.getSignedUrlPromise(getSignedUrlObject, params); // const value: 'getObject'
    }

    /**
     * to get the downloadable file buffer of the file
     * @param key key of the file to be fetched
     * @param bucket name of the bucket containing the file
     */
    public async getFileBuffer(key: string, bucket?: string): Promise<Buffer> {
        var scopeBucket: string = bucket ? bucket : this.defaultBucket;
        var params: GetObjectRequest = {
            Bucket: scopeBucket,
            Key: key
        };
        var fileObject: GetObjectOutput = await this.account.getObject(params).promise();
        return fileObject.Body as Buffer; // in Node the Body is already a Buffer; a toString() round-trip would corrupt binary data
    }

    /**
     * to upload a file stream onto AWS S3
     * @param stream file buffer to be uploaded
     * @param key key of the file to be uploaded
     * @param bucket name of the bucket 
     */
    public async saveFile(file: Buffer, key: string, bucket?: string): Promise<any> {
        var scopeBucket: string = bucket ? bucket : this.defaultBucket;
        var params: any = {
            Body: file,
            Bucket: scopeBucket,
            Key: key,
            ACL: 'private'
        };
        var uploaded: any = await this.account.upload(params).promise();
        if (uploaded && uploaded.Location && uploaded.Bucket === scopeBucket && uploaded.Key === key)
            return uploaded;
        else {
            throw new HttpException("Error occurred while uploading a file stream", HttpStatus.BAD_REQUEST);
        }
    }
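For context, here is a minimal sketch of the surrounding class these methods assume; this.account, this.defaultBucket, signatureTimeout, and getSignedUrlObject are not shown in the answer, so the definitions below are inferred and hypothetical:

    import { S3 } from 'aws-sdk';
    import { HttpException, HttpStatus } from '@nestjs/common';

    const signatureTimeout = 30; // seconds a signed URL remains valid (assumed)
    const getSignedUrlObject = 'getObject'; // S3 operation name for getSignedUrlPromise

    export class S3Service {
        private account: S3 = new S3(); // pass credentials/options if necessary
        private defaultBucket = 'my-default-bucket'; // placeholder bucket name

        // ...the three methods above go here...
    }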
Chaos Legion
6

Extremely similar answer to @ArianAcosta above, except I'm using import (for Node 12.x and up), adding AWS config, sniffing for an image payload, and applying base64 processing to the return.

// using v2.x of aws-sdk
import aws from 'aws-sdk'

aws.config.update({
  accessKeyId: process.env.YOUR_AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.YOUR_AWS_SECRET_ACCESS_KEY,
  region: "us-east-1" // or whatever
})

const s3 = new aws.S3();

/**
 * getS3Object()
 * 
 * @param { string } bucket - the name of your bucket
 * @param { string } objectKey - object you are trying to retrieve
 * @returns { string } - data, formatted
 */
export async function getS3Object (bucket, objectKey) {
  try {
    const params = {
      Bucket: bucket,
      Key: objectKey 
    }

    const data = await s3.getObject(params).promise();

    // Check for image payload and formats appropriately
    if( data.ContentType === 'image/jpeg' ) {
      return data.Body.toString('base64');
    } else {
      return data.Body.toString('utf-8');
    }

  } catch (e) {
    throw new Error(`Could not retrieve file from S3: ${e.message}`)
  }
}
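Usage would then look something like this; the bucket and key below are hypothetical placeholders:

// 'my-bucket' and 'images/photo.jpg' are placeholder values
const contents = await getS3Object('my-bucket', 'images/photo.jpg')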
serraosays
6

Updated (2022)

Node.js v17.5.0 added Readable.toArray. If this API is available in your Node version, the code can be very short:

const buffer = Buffer.concat(
    await (
        await s3Client
            .send(new GetObjectCommand({
                Key: '<key>',
                Bucket: '<bucket>',
            }))
    ).Body.toArray()
)

If you are using TypeScript, you are safe to cast the .Body part as Readable (the other types, ReadableStream and Blob, are only returned in a browser environment; moreover, in the browser, Blob is only used in the legacy fetch API when response.body is not supported):

(response.Body as Readable).toArray()

Note that Readable.toArray is an experimental (yet handy) feature; use it with caution.


=============

Original answer

If you are using AWS SDK v3, it returns a Node.js Readable (precisely, an IncomingMessage, which extends Readable) instead of a Buffer.

Here is a Typescript version. Note that this is for node only, if you send the request from browser, check the longer answer in the blog post mentioned below.

import {GetObjectCommand, S3Client} from '@aws-sdk/client-s3'
import type {Readable} from 'stream'

const s3Client = new S3Client({
    apiVersion: '2006-03-01',
    region: 'us-west-2',
    credentials: {
        accessKeyId: '<access key>',
        secretAccessKey: '<access secret>',
    }
})
const response = await s3Client
    .send(new GetObjectCommand({
        Key: '<key>',
        Bucket: '<bucket>',
    }))
const stream = response.Body as Readable

return new Promise<Buffer>((resolve, reject) => {
    const chunks: Buffer[] = []
    stream.on('data', chunk => chunks.push(chunk))
    stream.once('end', () => resolve(Buffer.concat(chunks)))
    stream.once('error', reject)
})
// If Readable.toArray() is supported:
// return Buffer.concat(await stream.toArray())

Why do we have to cast response.Body as Readable? The answer is too long to include here; interested readers can find more information in my blog post.

transang
4

Alternatively, you could use the minio-js client library; see get-object.js:

var Minio = require('minio')

var s3Client = new Minio({
  endPoint: 's3.amazonaws.com',
  accessKey: 'YOUR-ACCESSKEYID',
  secretKey: 'YOUR-SECRETACCESSKEY'
})

var size = 0
// Get a full object.
s3Client.getObject('my-bucketname', 'my-objectname', function(e, dataStream) {
  if (e) {
    return console.log(e)
  }
  dataStream.on('data', function(chunk) {
    size += chunk.length
  })
  dataStream.on('end', function() {
    console.log("End. Total size = " + size)
  })
  dataStream.on('error', function(e) {
    console.log(e)
  })
})
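Since dataStream is a regular Node.js readable stream (there is no dataStream.Body property), a sketch of collecting the whole object into a Buffer, with placeholder bucket and object names, looks like this:

s3Client.getObject('my-bucketname', 'my-objectname', function(e, dataStream) {
  if (e) {
    return console.log(e)
  }
  var chunks = []
  // Accumulate the streamed chunks, then concatenate into one Buffer at the end
  dataStream.on('data', function(chunk) {
    chunks.push(chunk)
  })
  dataStream.on('end', function() {
    var buffer = Buffer.concat(chunks) // full object contents
    console.log('Got ' + buffer.length + ' bytes')
  })
  dataStream.on('error', function(e) {
    console.log(e)
  })
})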

Disclaimer: I work for MinIO. It's open-source, S3-compatible object storage written in Go, with client libraries available in Java, Python, JavaScript, and Go.

koolhead17
  • Tried mino, but how to get buffer data, when I print dataStream.Body its giving 'undefined'. ie console.log('datastream', dataStream.Body); //undefined – Dibish Mar 07 '19 at 12:15
3

At first glance it doesn't look like you are doing anything wrong, but you don't show all your code. The following worked for me when I was first checking out S3 and Node:

var AWS = require('aws-sdk');

if (typeof process.env.API_KEY == 'undefined') {
    var config = require('./config.json');
    for (var key in config) {
        if (config.hasOwnProperty(key)) process.env[key] = config[key];
    }
}

var s3 = new AWS.S3({accessKeyId: process.env.AWS_ID, secretAccessKey:process.env.AWS_KEY});
var objectPath = process.env.AWS_S3_FOLDER +'/test.xml';
s3.putObject({
    Bucket: process.env.AWS_S3_BUCKET, 
    Key: objectPath,
    Body: "<rss><data>hello Fred</data></rss>",
    ACL:'public-read'
}, function(err, data){
    if (err) console.log(err, err.stack); // an error occurred
    else {
        console.log(data);           // successful response
        s3.getObject({
            Bucket: process.env.AWS_S3_BUCKET, 
            Key: objectPath
        }, function(err, data){
            console.log(data.Body.toString());
        });
    }
});
bknights
-1

This is the async/await version:

// Assumes an SDK v2 client created earlier, e.g.:
// var AWS = require('aws-sdk');
// var s3 = new AWS.S3();
var getObjectAsync = async function(bucket, key) {
  try {
    const data = await s3
      .getObject({ Bucket: bucket, Key: key })
      .promise();
    var contents = data.Body.toString('utf-8');
    return contents;
  } catch (err) {
    console.log(err);
  }
}
var getObject = async function(bucket,key) {
    const contents = await getObjectAsync(bucket,key);
    console.log(contents.length);
    return contents;
}
getObject(bucket,key);
loretoparisi
-1

Converting GetObjectOutput.Body to Promise<string> using node-fetch

In aws-sdk-js-v3 @aws-sdk/client-s3, GetObjectOutput.Body is a subclass of Readable in nodejs (specifically an instance of http.IncomingMessage) instead of a Buffer as it was in aws-sdk v2, so resp.Body.toString('utf-8') will give you the wrong result “[object Object]”. Instead, the easiest way to turn GetObjectOutput.Body into a Promise<string> is to construct a node-fetch Response, which takes a Readable subclass (or Buffer instance, or other types from the fetch spec) and has conversion methods .json(), .text(), .arrayBuffer(), and .blob().

This should also work in the other variants of aws-sdk and platforms (@aws-sdk v3 node Buffer, v3 browser Uint8Array subclass, v2 node Readable, v2 browser ReadableStream or Blob)

npm install node-fetch
import { Response } from 'node-fetch';
import * as s3 from '@aws-sdk/client-s3';

const client = new s3.S3Client({})
const s3Response = await client.send(new s3.GetObjectCommand({Bucket: '…', Key: '…'}));
const response = new Response(s3Response.Body);

const obj = await response.json();
// or
const text = await response.text();
// or
const buffer = Buffer.from(await response.arrayBuffer());
// or
const blob = await response.blob();

Reference: GetObjectOutput.Body documentation, node-fetch Response documentation, node-fetch Body constructor source, minipass-fetch Body constructor source

Thanks to kennu's comment in the GetObjectCommand usability issue.

yonran
-1

The Body.toString() method no longer works with the latest version of the S3 API. Use the following instead:

const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

const streamToString = (stream) =>
    new Promise((resolve, reject) => {
      const chunks = [];
      stream.on("data", (chunk) => chunks.push(chunk));
      stream.on("error", reject);
      stream.on("end", () => resolve(Buffer.concat(chunks).toString("utf8")));
    });

(async () => {
  const region = "us-west-2";
  const client = new S3Client({ region });

  const command = new GetObjectCommand({
    Bucket: "test-aws-sdk-js-1877",
    Key: "readme.txt",
  });

  const { Body } = await client.send(command);
  const bodyContents = await streamToString(Body);
  console.log(bodyContents);
})();

Copied and pasted from here: https://github.com/aws/aws-sdk-js-v3/issues/1877#issuecomment-755387549

Not sure why this solution hasn't already been added as I think it is cleaner than the top answer.

tnrich