AWS S3 standard upload scheme vs. browser-based uploads in Node.js

I'm trying to understand the implementation differences between the standard AWS S3 upload scheme (e.g. using aws-sdk) and browser-based uploads, particularly in Node.js.
I understand that in any case, there needs to be a server that will store my AWS credentials and sign the requests to S3.
But there are a bunch of things I don't seem to understand:
If I use a browser-based upload, I'll have an HTML form on the client side with the signature and the policy values in hidden fields that I get from my server. But if I use the standard scheme for uploading files, i.e. completely through my server, how exactly is that implemented? There are plenty of code examples for the server side, but what should happen on the client side? There will be an HTML form with an action attribute pointing to my server's URL designated for file uploads, right? But what will actually happen? Will the file first be uploaded to my server's storage and then to S3? Or will it somehow use streaming? It really confuses me, and I'd appreciate a code example that shows both the server-side and the client-side code.
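To make the second variant concrete, here is a minimal sketch of how I currently picture the "completely through my server" flow, assuming Express, multer and aws-sdk v2 (the route, bucket name and form field names are placeholders I made up):

```js
const express = require('express');
const multer = require('multer');
const AWS = require('aws-sdk');

const app = express();
const s3 = new AWS.S3({ region: 'us-east-1' });
const upload = multer({ storage: multer.memoryStorage() }); // whole file buffered in RAM

// The client side is just a plain form posting to this server:
//   <form action="/upload" method="post" enctype="multipart/form-data">
//     <input type="file" name="video" /><button>Upload</button>
//   </form>
app.post('/upload', upload.single('video'), (req, res) => {
  // Forward the received file to S3 (bucket name is a placeholder)
  s3.upload(
    { Bucket: 'my-bucket', Key: req.file.originalname, Body: req.file.buffer },
    (err, data) => {
      if (err) return res.status(500).send(err.message);
      res.send(`Stored at ${data.Location}`);
    }
  );
});

app.listen(3000);
```

With this setup the file does land on the server first (in memory here, or on disk with multer's disk storage); to avoid buffering the whole file you would pipe the incoming request straight into s3.upload, e.g. with busboy or multer-s3.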
What are the pros and cons of the two uploading schemes? When should I favour one approach over the other (my personal use case: video uploads in a multi-account system)?

Related

Rest API - Uploading large files to S3 using Pre-signed URLs and updating backend on success

Context
I am building stateless REST APIs for a browser-based platform that needs to store some user-generated files. These files could potentially be in the GBs.
I am using AWS S3 for storage. I have used the AWS SDK in the past to route the file uploads through the Node.js server (basically: upload to server, then the server uploads to S3).
I am trying to figure out how to improve this using pre-signed URLs. I understand the dynamics and the flow of how to get the pre-signed URLs and how to upload the file to S3 directly.
I cannot use SQS or Lambda to trigger an object-created event.
The architecture needs to be AWS independent.
Question
The simplest of flows I need to achieve is pretty common -
User --> Opens Profile
Clicks Upload Photo
Client Sends Request to /getSignedUrl
Server Returns the signedURL for the file name/type
The client executes the PUT/POST request to upload the file to the signedUrl
Upload Successful
After this - my understanding is -
Client tells the server - File Uploaded Successfully
Server associates the S3 Url for the Photo to the User.
...and that's my problem. How do I associate the successfully uploaded file back to the user on the server in a secure way?
Not sure what I've been missing. It seems like a trivial use case but I haven't been able to find anything regarding it.
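For reference, this is roughly how I picture the first part of the flow (aws-sdk v2 and Express assumed; bucket name, key prefix, expiry and route are placeholders):

```js
// Server: hands out a pre-signed PUT URL
const AWS = require('aws-sdk');
const express = require('express');

const app = express();
const s3 = new AWS.S3({ region: 'us-east-1' });

app.get('/getSignedUrl', (req, res) => {
  const url = s3.getSignedUrl('putObject', {
    Bucket: 'my-bucket',
    Key: `uploads/${req.query.fileName}`,
    ContentType: req.query.fileType,
    Expires: 300, // URL valid for 5 minutes
  });
  res.json({ url });
});

app.listen(3000);

// Client: PUTs the file straight to S3 with the returned URL
// fetch(`/getSignedUrl?fileName=${file.name}&fileType=${file.type}`)
//   .then(r => r.json())
//   .then(({ url }) =>
//     fetch(url, { method: 'PUT', headers: { 'Content-Type': file.type }, body: file }));
```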
1/ I think for the avatar, you should set it to public-read.
When you create the signed upload URL in
GET: /signed-upload-url
you need to set the image's ACL to public-read. After that you are free to interact with the image through its direct URL. Since this is an avatar, you can also compress it and reduce the image size with an AWS Lambda function.
2/ If you don't want the object to be public-read, you need to go through the server and get a signed download URL to interact with the image:
GET: /signed-download-url
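A rough sketch of both variants, assuming aws-sdk v2 and placeholder bucket/key names (not a drop-in implementation):

```js
const AWS = require('aws-sdk');
const s3 = new AWS.S3({ region: 'us-east-1' });

// 1/ Signed upload URL that makes the avatar publicly readable once uploaded
function signedUploadUrl(key, contentType) {
  return s3.getSignedUrl('putObject', {
    Bucket: 'my-bucket',
    Key: key,
    ContentType: contentType,
    ACL: 'public-read', // the object can then be read via its plain S3 URL
    Expires: 300,
  });
}

// 2/ If the object stays private, hand out a signed download URL instead
function signedDownloadUrl(key) {
  return s3.getSignedUrl('getObject', {
    Bucket: 'my-bucket',
    Key: key,
    Expires: 300,
  });
}
```

Note that when the ACL is included in the pre-signed putObject request, the client typically has to send the matching x-amz-acl: public-read header along with the PUT, otherwise S3 rejects the signature.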

Use a Node.js server with Symfony

I have a huge Symfony app and I want to add a feature that I can only implement with a Node.js server.
So I have a big JSON file produced by my Node.js run, and this file has to go to Symfony.
And Symfony has to be able to send a PDF file to the Node server (the one that will be transformed into JSON by my Node server).
Does anyone have an idea of where to start?
Thanks for the help :D
No one is going to be able to provide a full answer with so few details, but generally speaking, messaging and remote procedure calls are excellent for interop between parts of a large app.
You could send a message from Symfony (which includes the path of the PDF, or its contents), and Node will produce the result. You can encode that as JSON and send it back as the answer.
RabbitMQ is widely supported and allows both produce/consume and RPC-style use.
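For example, the Node side could be a small RPC consumer built with the amqplib package (the queue name and message shape below are just assumptions to illustrate the pattern):

```js
const amqp = require('amqplib');

async function main() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  const queue = 'pdf_to_json'; // assumed queue name
  await ch.assertQueue(queue, { durable: false });

  ch.consume(queue, async (msg) => {
    // Symfony publishes the PDF path (or the raw bytes) in the message body
    const { pdfPath } = JSON.parse(msg.content.toString());
    const result = await convertPdfToJson(pdfPath); // your existing Node logic

    // Reply on the queue Symfony named in replyTo, echoing the correlationId
    ch.sendToQueue(
      msg.properties.replyTo,
      Buffer.from(JSON.stringify(result)),
      { correlationId: msg.properties.correlationId }
    );
    ch.ack(msg);
  });
}

// Placeholder for the existing Node processing
async function convertPdfToJson(pdfPath) {
  return { source: pdfPath, pages: [] };
}

main().catch(console.error);
```

On the Symfony side, Symfony Messenger with its AMQP transport (or php-amqplib directly) can publish the request and consume the JSON reply.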

Using uploaded files in a Node.js backend

Sorry, this might be a very novice problem, but I am new to Node and web apps and have been stuck on this for a couple of days.
I have been working with an API called "Face++" that requires users to upload images to detect faces. So basically users need to upload images to my web app's backend, and my backend does an API request with that image. I managed to upload the files to my Node backend using the tutorial linked below, but now I am struggling with how to use those image files. I really don't know how to access those files. I thought passing just the file path/file name would help, but it did not. I am really new to web apps.
I used the tutorial from here: https://coligo.io/building-ajax-file-uploader-with-node/
to upload my files on the back end.
thanks
You can also use the Face++ REST API Node client:
https://www.npmjs.com/package/faceppsdk
As per the documentation, it requires a live URL on the web, so you would have to upload your files to a remote location first (for example an Amazon S3 bucket).
You can also check the sample code in the documentation, which shows how to upload directly to Face++.
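If you prefer to skip the intermediate hosting step, the detect call also accepts the file itself as multipart form data, so you can read the file that the tutorial saved on disk and post it directly. A rough sketch using the form-data package (the file path is hypothetical and the endpoint/parameter names should be verified against the current Face++ docs):

```js
const fs = require('fs');
const path = require('path');
const FormData = require('form-data');

// Hypothetical location where the AJAX-uploader tutorial saved the file on disk
const imagePath = path.join(__dirname, 'uploads', 'photo.jpg');

const form = new FormData();
form.append('api_key', process.env.FACEPP_API_KEY);
form.append('api_secret', process.env.FACEPP_API_SECRET);
// Attach the stored file itself instead of a public URL
form.append('image_file', fs.createReadStream(imagePath));

// Detect endpoint as described in the Face++ v3 docs; double-check before relying on it
form.submit('https://api-us.faceplusplus.com/facepp/v3/detect', (err, res) => {
  if (err) return console.error(err);
  let body = '';
  res.on('data', chunk => (body += chunk));
  res.on('end', () => console.log(JSON.parse(body)));
});
```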

How to send or stream base64 images with AJAX to Node.js Express server?

I am new to working with images in web development. We have a Node.js Express server that will run on Heroku and uses Cloudinary to store images.
Ideally we could save images directly to Cloudinary, but I am not sure if that's possible and we are afraid of putting our Cloudinary credentials on the client.
Assuming we must send image data to our server first instead of sending it directly to Cloudinary: if the images are encoded as base64 on the client, is it possible to stream the images from the client to the server, or must we send all the data at once? Either way, what headers do we use to send binary/base64 data?
Is it possible to send or even stream binary data from the client to the server?
Since it is a Node.js server, it would be ideal to use streams and to stream the file from our server to Cloudinary.
Hope this makes sense; any info would be very helpful.
Why not use direct uploads from the client side to Cloudinary using the jQuery plugin?
This method supports both signed and unsigned uploads; for signed uploads, the signature can (and should) be generated on your server before rendering the page, for privacy reasons. Uploading from a Base64 URI is also possible with this mechanism.
Note that Cloudinary's client libraries also wrap this plugin and provide you with "off the shelf" solutions for embedding the upload fields in your web app, with the signature already included.
Let us know if you need any further guidance.
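For signed uploads, the signature is a SHA-1 over the alphabetically sorted upload parameters plus your API secret, per Cloudinary's documented signing scheme, so the server-side part can be as small as this sketch (the folder value is only an example):

```js
const crypto = require('crypto');

// Assumed environment variable holding your Cloudinary API secret (never send it to the client)
const API_SECRET = process.env.CLOUDINARY_API_SECRET;

// SHA-1 over the sorted "key=value" pairs joined with '&', with the secret appended
function signUploadParams(params) {
  const toSign = Object.keys(params)
    .sort()
    .map(key => `${key}=${params[key]}`)
    .join('&');
  return crypto.createHash('sha1').update(toSign + API_SECRET).digest('hex');
}

// Example: sign a timestamped upload request before rendering the page
const params = { timestamp: Math.floor(Date.now() / 1000), folder: 'avatars' };
const signature = signUploadParams(params);
// Expose params, signature and your API key (not the secret!) to the client-side upload widget
console.log({ ...params, signature });
```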

Managing security on file uploads to Node.js

I have an image upload view on my client (Ember.js) that sends the resized image to a Node.js REST API;
it works well, but it is easy for someone expert to force the upload of a non-resized image;
I would like to keep the resize process on the client because this allows users to select heavyweight images that are resized locally and only uploaded after that, when they are lightweight;
if someone else uses something like this, I'm interested in how to make it as safe as possible.
As a rule of thumb when developing web applications: never ever trust any data coming from the client side; always check it on the server side!
Use authentication; this ensures that users are only allowed to upload data to their own account and cannot fiddle with other users' files.
Add some message passing between your server and client. A simple example would be:
i. first send a POST API request (containing the image information and the targeted compressed size) to your server, indicating that your client is starting to compress the picture
ii. when uploading, add metadata including the complete compressed image, and have your server check whether the uploaded image is within the accepted threshold, else discard it
You could make the message passing more sophisticated to further enhance security!
This would be my simple security measure; anyone else got a better solution? :)
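A rough Express sketch of that handshake (endpoint names, the in-memory store and the 10% tolerance are assumptions, not a hardened implementation):

```js
const express = require('express');
const multer = require('multer');

const app = express();
app.use(express.json());

const pendingUploads = new Map(); // uploadId -> declared target size in bytes

// i. the client declares the compressed size it is about to upload
app.post('/uploads/intent', (req, res) => {
  const uploadId = Date.now().toString(36) + Math.random().toString(36).slice(2);
  pendingUploads.set(uploadId, req.body.targetSizeBytes);
  res.json({ uploadId });
});

// ii. the client uploads; the server compares the actual size with the declared one
const upload = multer({ limits: { fileSize: 5 * 1024 * 1024 } }); // hard cap as a backstop
app.post('/uploads/:uploadId', upload.single('image'), (req, res) => {
  const declared = pendingUploads.get(req.params.uploadId);
  if (!declared || req.file.size > declared * 1.1) {
    return res.status(400).json({ error: 'Image exceeds the declared size' });
  }
  pendingUploads.delete(req.params.uploadId);
  res.json({ ok: true });
});

app.listen(3000);
```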
Approaches here also work for file uploads. You can use a combination of checking:
the content-length header (i.e. req.headers['content-length'] > x), and/or
reading the stream size as it's being read by the server (i.e. req.on('data'))
If the stream data exceeds a certain size you can respond accordingly. Check out something like Multer for file uploads, specifically the limits section. The best approach would probably be the second option.
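A minimal sketch of the second option, counting bytes as the request streams in and aborting past a threshold (the 5 MB limit and the route are placeholders):

```js
const express = require('express');

const app = express();
const MAX_UPLOAD_BYTES = 5 * 1024 * 1024; // placeholder limit

app.post('/upload', (req, res) => {
  // Option 1: reject early when the client declares an oversized body
  if (Number(req.headers['content-length']) > MAX_UPLOAD_BYTES) {
    return res.status(413).send('Payload too large');
  }

  // Option 2: count bytes as they arrive and abort past the threshold
  let received = 0;
  req.on('data', chunk => {
    received += chunk.length;
    if (received > MAX_UPLOAD_BYTES) {
      res.status(413).send('Payload too large');
      req.destroy(); // stop reading the rest of the stream
    }
  });
  req.on('end', () => {
    if (!res.headersSent) res.send('Upload accepted');
  });
});

app.listen(3000);
```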
