Note: This site is currently "Under construction". I'm migrating to a new version of my site building software. Lots of things are in a state of disrepair as a result (for example, footnote links aren't working). It's all part of the process of building in public. Most things should still be readable though.

Read And Load A File From S3 In JavaScript

This is how I read a JSON file from S3 and load it into a variable. There are a couple of other options below, though I haven't fully tested the different approaches yet.

(TODO: Verify this works, or get it working, in next.js, etc...)

(TODO: Note this is for v3 of the AWS SDK and maybe put a v2 example in?)

(TODO: Check out this code from the official docs:

https://docs.aws.amazon.com/AmazonS3/latest/userguide/example_s3_GetObject_section.html )

(TODO: Note that even if the code from the docs works, that page doesn't have good SEO; it took several tries to get to it)

Code

This code doesn't work in emacs yet. I need to figure that out, but copying and pasting it into an individual file worked. Not sure if there's async stuff to deal with or not.

Code

#!/usr/bin/env node

// Top-level await requires an ES module (e.g. a `.mjs` file or
// `"type": "module"` in package.json).
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3'

const params = {
  Bucket: `aws-test-sandbox`,
  Key: `example.json`,
}

// Uses the default credential chain and region from the environment.
const client = new S3Client()
const command = new GetObjectCommand(params)
const response = await client.send(command)

// In Node, response.Body is a Readable stream. Collect the raw chunks
// and join them with Buffer.concat so multi-byte UTF-8 characters
// split across chunk boundaries don't get mangled.
const chunks = []
for await (const chunk of response.Body) {
  chunks.push(chunk)
}
const s3ResponseBody = Buffer.concat(chunks).toString('utf8')

const data = JSON.parse(s3ResponseBody)
console.log(data)
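The stream-handling half of that script can be exercised locally without S3 by swapping any Readable stream in for `response.Body`. A sketch (the JSON payload here is a made-up stand-in, deliberately split so a chunk boundary falls mid-document):

```javascript
import { Readable } from 'node:stream'

// Stand-in for response.Body: any async-iterable stream of Buffers
// behaves the same way. This runs entirely locally.
const fakeBody = Readable.from([
  Buffer.from('{"greeting": '),
  Buffer.from('"Hello, World"}'),
])

// Same pattern as the S3 version: collect chunks, concat, parse.
const chunks = []
for await (const chunk of fakeBody) {
  chunks.push(chunk)
}
const body = Buffer.concat(chunks).toString('utf8')

const data = JSON.parse(body)
console.log(data.greeting) // Hello, World
```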

** TODO

Try this (via: https://github.com/aws/aws-sdk-js-v3/issues/1877#issuecomment-793028742)

Code

npm install node-fetch
npm install --save-dev @types/node-fetch

Code

import { Response } from 'node-fetch'

// s3Response here is the result of client.send(command) from above
const response = new Response(s3Response.Body)
const data = await response.json()
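On Node 18+ the same trick should work without node-fetch, since a global `Response` is built in; the Node stream just needs converting to a web stream first with `Readable.toWeb()`. A sketch, tested here against a local stand-in stream rather than a real S3 response:

```javascript
import { Readable } from 'node:stream'

// Stand-in for response.Body from GetObjectCommand; any Node
// Readable works the same way.
const body = Readable.from([Buffer.from('{"ok": true}')])

// The built-in Response (global since Node 18) takes a web
// ReadableStream, which Readable.toWeb() provides.
const res = new Response(Readable.toWeb(body))
const data = await res.json()
console.log(data) // { ok: true }
```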

** TODO:

Go through this and test the different methods out and write them up

https://github.com/aws/aws-sdk-js-v3/issues/1877

** OLD Notes and links to review:

- https://stackoverflow.com/questions/67366381/aws-s3-v3-javascript-sdk-stream-file-from-bucket-getobjectcommand

- https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/Request.html#createReadStream-property

Make a project and install the SDK:

Code

mkdir some_thing
cd some_thing
npm init
# fill stuff out
yarn add @aws-sdk/client-s3

echo 'console.log("Hello, World")' > index.js

Notes:

- In boto3 in Python, you install a single package and get access to all the services. Here, AWS split the v3 SDK so each service is its own package (e.g. @aws-sdk/client-s3).

- Look at this one for a sample test file: https://github.com/aws/aws-sdk-js-v3/issues/1877 - But I'm thinking: why not just always read it the same way, as binary, so you don't have to worry about switching?

- https://stackoverflow.com/questions/36942442/how-to-get-response-from-s3-getobject-in-node-js#36944450

- https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/welcome.html

- https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/index.html

- https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/s3-example-photo-album-full.html

- https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/classes/getobjectcommand.html

- https://dev.to/ldsrogan/aws-sdk-with-javascript-download-file-from-s3-el2 - this one gets a URL instead of reading it directly

- https://docs.aws.amazon.com/AmazonS3/latest/userguide/download-objects.html

- https://www.mydatahack.com/uploading-and-downloading-files-in-s3-with-node-js/

- https://stackoverflow.com/questions/43799246/s3-getobject-createreadstream-how-to-catch-the-error - see if this is V2 or V3 (I think it's 2)