How to upload all files from a directory to S3 - node.js

I am writing a Node.js script which should send images from the directory '/images/' to Amazon S3. I know knox is a very good library, but how can I upload all files from a directory while keeping the original file names? I could probably use the fs module to get all the names and upload them in a loop. Is there a function in knox which can do this?

Knox does not provide any functionality for client-side file handling; you need to enumerate your files yourself and upload them one after another.
Unfortunately, it is impossible to upload multiple files at once: S3 requires that you send the Content-Length header for every file.
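For example, here is a minimal sketch of the fs + loop approach using knox's putFile, which sets the Content-Length header for you (credentials and bucket name are placeholders):

    var fs = require('fs');
    var path = require('path');
    var knox = require('knox');

    // placeholder credentials/bucket -- substitute your own
    var client = knox.createClient({
      key: '<api-key-here>',
      secret: '<secret-here>',
      bucket: 'my-bucket'
    });

    var dir = './images/';
    fs.readdir(dir, function (err, files) {
      if (err) throw err;
      files.forEach(function (file) {
        // putFile streams the file and sends Content-Length for us
        client.putFile(path.join(dir, file), '/images/' + file, function (err, res) {
          if (err) throw err;
          console.log('uploaded %s (%d)', file, res.statusCode);
        });
      });
    });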

Why not use the command-line tool s3cmd ( http://s3tools.org/s3cmd )? If you really want to do it in Node.js, you can spawn a child process that executes s3cmd from your JavaScript code.
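A rough sketch of that approach, assuming s3cmd is already installed and configured (the bucket name is a placeholder):

    var spawn = require('child_process').spawn;

    // recursively upload the images directory with s3cmd
    var s3cmd = spawn('s3cmd', ['put', '--recursive', 'images/', 's3://my-bucket/images/']);
    s3cmd.stdout.on('data', function (data) { console.log(String(data)); });
    s3cmd.on('exit', function (code) {
      console.log('s3cmd exited with code ' + code);
    });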

Related

Is there any way to load a CSV file into AWS OpenSearch?

Hi, does anyone know how to upload a CSV file to AWS OpenSearch directly using an API call (like AWS's Bulk API)? I want to do this using Node.js. I don't want to use Kinesis or Logstash, and the upload must happen in chunks. I tried a lot but couldn't make it happen.
OpenSearch provides a JavaScript client. You can use the Bulk API to upload documents in chunks.
Update 1:
Since you mentioned you want to index a CSV file directly, use the elasticsearch-csv NPM package.
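As a rough sketch of the chunked Bulk API approach with the official client (the endpoint, index name, and chunk size are placeholders, and the naive comma splitting below does not handle quoted fields -- use a real CSV parser in practice):

    const { Client } = require('@opensearch-project/opensearch');
    const fs = require('fs');

    const client = new Client({ node: 'https://my-domain.example.amazonaws.com' });

    async function indexCsv(file, index, chunkSize) {
      const lines = fs.readFileSync(file, 'utf8').trim().split('\n');
      const headers = lines.shift().split(',');

      for (let i = 0; i < lines.length; i += chunkSize) {
        // the bulk body alternates action lines and document lines
        const body = [];
        for (const line of lines.slice(i, i + chunkSize)) {
          const doc = {};
          line.split(',').forEach((v, j) => { doc[headers[j]] = v; });
          body.push({ index: { _index: index } });
          body.push(doc);
        }
        const { body: res } = await client.bulk({ body });
        if (res.errors) console.error('some documents failed in chunk starting at line', i);
      }
    }

    indexCsv('data.csv', 'my-index', 500).catch(console.error);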

JavaScript: How can I upload to S3 directly without saving the output file locally?

I have a problem developing a crawler using Node.js/Puppeteer. The old crawler was:
Crawl pages
Store the output file locally with the fs module
Since I'm going to introduce a UI on the server, I have set up a scenario that uploads the output to S3 instead of storing it locally and shows the result in the UI:
Crawl pages
Stream output files to the server with the fs module
Get the output file back and upload it to the S3 bucket
The above is the scenario I know works, and I'd like to know whether the following is possible:
Crawl pages
Upload the data held in memory directly to the S3 bucket as a stream
If you have experience with a scenario like this, I would like some guidance. I would really appreciate a comment or reply :)
This is definitely possible: if you pipe from your input stream on the server straight up to S3, that completes the loop. It works because you can stream uploads to S3 even without knowing the size of the file beforehand.
This answer should help you out: S3 file upload stream using node js
If you post some code we can answer this a little better, but hopefully this points you in the right direction.
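For instance, a minimal sketch of the in-memory approach using the AWS SDK's upload() (which accepts a stream of unknown length) together with a PassThrough stream; the bucket and key names are placeholders:

    const AWS = require('aws-sdk');
    const { PassThrough } = require('stream');

    const s3 = new AWS.S3();

    function uploadStream(key) {
      const pass = new PassThrough();
      // upload() performs a managed (multipart) upload, so no
      // Content-Length is needed up front
      const done = s3.upload({ Bucket: 'my-crawler-bucket', Key: key, Body: pass }).promise();
      return { writable: pass, done: done };
    }

    // in the crawler, write output straight to S3 instead of to fs
    const { writable, done } = uploadStream('output/page-1.json');
    writable.end(JSON.stringify({ crawledAt: Date.now() }));
    done.then((res) => console.log('uploaded to', res.Location));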

AWS: What Happens to Static S3 Files When a New Instance of a Website is Deployed?

A little background: we have a website (JS, jQuery, LESS, Node) that is hosted on Amazon S3 and distributed using CloudFront. In the past we have stored our resources statically in the assets folder, both locally within the app and on S3.
Recently we set up a Node Lambda that listens to Kinesis events and generates a JSON file that is then stored within the assets folder in S3. Currently, the file in the bucket with the same key is overwritten, and the site uses the generated file as it should.
My question is: what happens to that JSON file when we deploy a new instance of our website? Even if we remove the JSON file from the local assets folder, if the deployment overwrites the whole assets directory in the S3 project, does that result in the JSON file being removed?
Thanks in advance!
Please let me know if you need any more clarification.
That will depend on how you're syncing files. I recommend you use the "sync" command so that only new or changed files are uploaded; a file that exists in S3 but not in your repo is deleted only if you explicitly ask for it, otherwise it is left alone.
See the CLI command docs here: http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html ... as you can see, files are deleted only if you pass --delete (e.g. aws s3 sync ./assets s3://your-bucket/assets --delete).
But I'm not sure what your use case is. Do you want that file to get deleted? It seems that you don't :)

Upload file and folder structure to S3

The users on my site need to be able to upload a bunch of files and folders to S3 while maintaining the folder structure.
Say they have the following files in their local boxes.
/file1.jpg
/some_folder/file2.jpg
After upload, I need their s3 urls to be
http://s3bucket.amazon.com/usersfolder/file1.jpg
http://s3bucket.amazon.com/usersfolder/some_folder/file2.jpg
How can I do this? To make matters a little more complicated, the upload from the client side can be initiated only after they download an upload policy.
Edit: I would like to know a solution for the front-end part of this question. It looks like on the server I can use a wildcard character to specify access permissions, so I am good on that part.
I am using Node.js/Express as the backend.
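One hedged front-end sketch: a directory-capable file input exposes each file's relative path via webkitRelativePath, and since S3 "folders" are just slashes in the object key, using that path as part of the key preserves the structure. The policy, signature, access key, and bucket URL below are placeholders that the Express backend would supply:

    // assumes <input type="file" webkitdirectory multiple> in the page
    const input = document.querySelector('input[type=file]');

    input.addEventListener('change', () => {
      for (const file of input.files) {
        const form = new FormData();
        // e.g. "usersfolder/some_folder/file2.jpg" -- the slashes in
        // the key are what create the "folder" structure in S3
        form.append('key', 'usersfolder/' + file.webkitRelativePath);
        form.append('AWSAccessKeyId', ACCESS_KEY_ID);   // from your server
        form.append('policy', POLICY_DOCUMENT);         // from your server
        form.append('signature', POLICY_SIGNATURE);     // from your server
        form.append('file', file);                      // must be the last field

        fetch('https://s3bucket.amazon.com/', { method: 'POST', body: form })
          .then((res) => console.log(file.webkitRelativePath, res.status));
      }
    });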

Saving Images to S3 from External URL with Node.js and MongoDB

I'm trying to save the images from a third-party API to my own S3 bucket using Node.js and MongoDB. The API provides a URL to the image on the third-party servers. I've never done this before but I'm assuming I have to download the image to my own server and then upload it to S3?
Should I save the image to mongodb with GridFS and then delete it once it is on S3? If so, what's the best way to do that?
I've read and re-read this previous question:
Problem with MongoDB GridFS Saving Files with Node.JS
But I can't find good documentation on how I should determine buffer/chunk size and other attributes for a JPEG image file.
Once I've saved the file on my server/database, it seems like I should use
https://github.com/appsattic/node-awssum
to upload it to S3. Is that a good idea?
I apologize if this is an obvious answer, I'm pretty green when it comes to databases and server scripting.
The easiest thing to do would be to save the image onto disk and then stream the upload from there using AwsSum's S3 PutObject operation. Alternatively, if you have the file contents in a Buffer you can just use that.
Have a look at the following two examples and they should help you figure out what to do:
https://github.com/appsattic/node-awssum/blob/master/examples/amazon/s3/put-bucket.js
https://github.com/appsattic/node-awssum/blob/master/examples/amazon/s3/put-object-streaming.js
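For the download half, here is a minimal sketch using only Node's core modules (the URL and paths are placeholders, and it doesn't follow redirects); the resulting file can then be streamed to S3 as in the put-object-streaming example above:

    var https = require('https');
    var fs = require('fs');

    function download(url, dest, cb) {
      var file = fs.createWriteStream(dest);
      https.get(url, function (res) {
        res.pipe(file);
        file.on('finish', function () {
          file.close(cb); // file is complete and ready to upload
        });
      }).on('error', cb);
    }

    download('https://example.com/photo.jpg', '/tmp/photo.jpg', function (err) {
      if (err) throw err;
      // next step: stream /tmp/photo.jpg to S3 with AwsSum's PutObject
    });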
Let me know if you need any more help. :)
Cheers,
Andy
Disclaimer: I'm the author of AwsSum.
