How to write a CSV file to S3 using Express and Node.js - node.js

Can anyone help with writing a CSV file and then uploading it to S3 using Node.js?
I was trying
fs.writeFileSync(path, data)
but it does not work for me.
Please guide me; a demo would help me a lot.
Thanks

You don't upload the file directly; first you need to add the AWS S3 SDK module to your project and use it. You can find a good example here.
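For reference, a minimal sketch, assuming the aws-sdk v2 package with credentials configured via the environment; the bucket name and key are placeholders:
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Build the CSV in memory rather than writing it to disk first.
const rows = [['id', 'name'], ['1', 'alice'], ['2', 'bob']];
const csv = rows.map((r) => r.join(',')).join('\n');

// Upload the CSV string directly as the object body.
s3.upload(
  { Bucket: 'my-bucket', Key: 'reports/data.csv', Body: csv, ContentType: 'text/csv' },
  (err, data) => {
    if (err) return console.error('Upload failed:', err);
    console.log('Uploaded to', data.Location);
  }
);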

Related

Firebase upload raw data to bucket

I am creating an application in nodejs/typescript that uses Firebase Functions, and I basically need to upload a JSON object to a bucket. I am having issues because the JSON I am creating exists in memory and is not an actual file, as the application is a serverless one.
I know Firebase is just a wrapper for Google Cloud Functions and have looked for solutions everywhere, but I cannot seem to get this working. Is anyone able to give me any guidance or suggestions, please?
If I cannot upload the in-memory file to a bucket, does anyone know if it's possible to programmatically export a database document as JSON to a bucket using Firebase? (as I can easily just upload the JSON to a database document)
Below is one example of what I have tried; however, the code is obviously invalid.
await storage()
  .bucket()
  .file('test.json') // A random string filename and not an existing file
  .createWriteStream()
  .write(JSON.stringify(SOME_VALID_JSON)) // write() returns a boolean, so there is nothing to await
Thanks!
You can use save() to write in-memory data to a bucket; it accepts a string or Buffer and returns a promise.
await storage()
  .bucket()
  .file('test.json')
  .save(JSON.stringify(SOME_VALID_JSON))
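A fuller sketch for the serverless context, assuming firebase-admin and firebase-functions are installed and the function runs with default credentials; the function name and payload are placeholders:
const admin = require('firebase-admin');
const functions = require('firebase-functions');

admin.initializeApp();

exports.saveJson = functions.https.onRequest(async (req, res) => {
  const payload = { hello: 'world' }; // stands in for SOME_VALID_JSON
  await admin
    .storage()
    .bucket() // the project's default bucket
    .file('test.json')
    .save(JSON.stringify(payload), { contentType: 'application/json' });
  res.send('saved');
});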

HTML to PDF creation in AWS Lambda using Python

I am trying to create a PDF file that contains images and tables from HTML data in AWS Lambda using Python. I searched a lot on Google and didn't find any good solution. I tried some libraries locally (FPDF, pdfkit), but they don't work on AWS. Is there any simple tool to create a PDF and upload it to an S3 bucket? Thanks in advance.
You can use the reportlab PDF Python module. It is good for all the things you have asked for: you can add images, create tables, etc. There are a lot of styling options available as well. You can find more about it here: https://www.reportlab.com/docs/reportlab-userguide.pdf
I am using this in production and it works pretty well for my use case, where I have to create an invoice. You can create the invoice in the /tmp directory and then upload it to S3.
The pdfkit library works with AWS Lambda. pdfkit internally needs the wkhtmltopdf binaries installed; you can add them as a Lambda layer. You can download the binaries from https://wkhtmltopdf.org/downloads.html.
Once you add the Lambda layer, you can set the config path as follows:
import pdfkit

config = pdfkit.configuration(wkhtmltopdf="/opt/bin/wkhtmltopdf")
pdfkit.from_string(html, "/tmp/output.pdf", configuration=config)  # pdfkit.from_file() works the same way
You can upload the file generated in your Lambda temp location to an S3 bucket using upload_file(). See https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.upload_file for how to upload to an S3 bucket.

How to upload downloaded file to s3 bucket using Lambda function

I saw different questions/answers but I could not find one that worked for me. Since I am really new to AWS, I need your help. I am trying to download a gzip file, load it into a JSON file, and then upload it to an S3 bucket using a Lambda function. I wrote the code to download the file and convert it to JSON, but I am having problems uploading it to the S3 bucket. Assume that the file is ready as x.json. What should I do then?
I know it is a really basic question, but help is still needed :)
This code will upload to Amazon S3:
import boto3

s3_client = boto3.client('s3', region_name='us-west-2')  # Change as appropriate
s3_client.upload_file('/tmp/foo.json', 'my-bucket', 'folder/foo.json')
Some tips:
In Lambda functions you can only write to /tmp/
There is a limit of 512 MB on /tmp storage
At the end of your function, delete the files (zip, json, etc) because the container can be reused and you don't want to run out of disk space
If your Lambda has the proper permissions to write a file to S3, then simply use the boto3 package, which is the AWS SDK for Python.
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html
Be aware that if the Lambda is located inside a VPC, it cannot access the public internet, and therefore cannot reach the S3 API endpoints. Thus, you may require a NAT gateway to route the Lambda's traffic to the public internet.

Jimp write image to Google cloud storage node js

I'm currently writing images to my filesystem using jimp.
For example: image.write('path')
This cannot be used on Google App Engine because of its read-only filesystem.
How can I write this to a Google Storage bucket? I've tried write streams but keep getting read-only errors, so I feel like jimp is still writing to the drive.
Thanks heaps
As I understand it, you are modifying some images on your App Engine instance and want to upload them to a bucket, but you didn't mention whether you are using the standard or flexible environment. If you want the bucket to serve the files, you also need to make it publicly readable.
Following this Google Cloud Platform Node.js documentation, you can see that to upload a file to a bucket you need to create the object first using:
const blob = bucket.file(yourFileName);
Then, using createWriteStream, you can upload the file.
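To avoid touching the read-only filesystem entirely, here is a minimal sketch, assuming @google-cloud/storage and jimp are installed; the bucket name and object path are placeholders. It renders the jimp image to a buffer in memory and saves that buffer to the bucket:
const { Storage } = require('@google-cloud/storage');
const Jimp = require('jimp');

const storage = new Storage();
const bucket = storage.bucket('my-bucket'); // placeholder bucket name

async function uploadImage(image) { // image is a Jimp instance
  // getBufferAsync renders the image in memory instead of writing to disk
  const buffer = await image.getBufferAsync(Jimp.MIME_PNG);
  await bucket.file('images/output.png').save(buffer, { contentType: 'image/png' });
}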

Upload filestream to cloud-files with Node.js

I know that it's possible to upload files to my Cloud Files account in Node.js using the following module: node-cloudfiles.
But is it also possible to upload a filestream directly?
In my case I am downloading an image from a certain location in Node.js and want to upload it directly to my Cloud Files account without temporarily saving the image on my server.
Of course it is possible: you can just read the Rackspace Cloud Files API documentation (http://docs.rackspacecloud.com/files/api/cf-devguide-latest.pdf) and implement the necessary parts yourself.
However, I'd suggest waiting until https://github.com/nodejitsu/node-cloudfiles/pull/11 gets merged into the trunk; then the node-cloudfiles library will support uploading files using streams, so you won't have to create files before uploading.
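Independent of node-cloudfiles, the general pattern looks like the sketch below: pipe the HTTP response of the download straight into whatever writable upload stream the storage client exposes. The getUploadStream helper here is hypothetical, standing in for the stream API the linked pull request would add.
const https = require('https');

https.get('https://example.com/image.jpg', (res) => {
  const uploadStream = getUploadStream('container', 'image.jpg'); // hypothetical helper
  res.pipe(uploadStream); // stream the download directly into the upload, no temp file
  uploadStream.on('finish', () => console.log('upload complete'));
});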