Node.js: downloading a tar.gz file with HTTP GET

I'm a bit stuck with my project and I'm not finding the right answers to my questions...
So first of all I have this client, a Linux machine, that pokes my server with curl trying to download this tar.gz file.
On the Node.js server, my purpose is to download this tar.gz file from a third-party server with an HTTP GET and pass it on to my client. There are a few things I don't understand.
Firstly: is the file that I am downloading from the third-party server in the HTTP response's body, and does the response even have a body when downloading a file?
Secondly: I don't want to save the file to my server, so how do I stream it into the client's HTTP response? Is there some file stream I should use, or just regular streams?
At the moment I'm trying to pipe the third-party response into the client's response body.
I've played around with the server and gotten some gibberish on the client side. When I run "file upload.tar.gz" on the result, it reports "data" and not gzip. Is this the tar.gz file in a different format? Am I doing something wrong in the HTTP request, or is my stream piping causing this?
I would appreciate if someone could brief me a bit or point me to the right direction! :)

Related

How do I upload a file to a REST endpoint?

Using Twitter as an example: Twitter has an endpoint for uploading file data. https://developer.twitter.com/en/docs/media/upload-media/api-reference/post-media-upload-append
Can anyone provide an example of a real HTTP message containing, for example, image file data, showing how it is supposed to be structured? I'm fairly sure Twitter's documentation is nonsense, as their "example request" is the following:
POST https://upload.twitter.com/1.1/media/upload.json?command=APPEND&media_id=123&segment_index=2&media_data=123
Is the media_data really supposed to go in the URL? What if you have raw binary media data? Would it go in the body? How is the REST service to know how the data is encoded?
You're looking at the chunked uploader - it's intended for sending large files, breaking them into chunks, so a network failure doesn't mean you have to re-upload a 100 MB .mp4. It is, as a result, fairly complicated. (Side note: The file data goes in the request body, not the URL as a GET parameter... as indicated by "Requests should be multipart/form-data POST format.")
There's a far less complicated unchunked uploader that'll be easier to work with if you're just uploading a regular old image.
All of this gets a lot easier if you use one of Twitter's recommended libraries for your language.
To upload a file, you need to send it in a form; on the Node.js server you can accept the incoming file using formidable.
You can also use express-fileupload or multer

Is there a way to know how large a file being streamed via http is?

I'm sending a fairly large file via express in a little node app of mine. As the file is so large, I send the file in a stream.
On the receiving node app that is downloading the file, I would really like the ability to track the progress of the download.
This SO question uses the 'content-length' header to determine how big a file is. However after a little more research, it seems that when streaming a file the content-length cannot be known ahead of time.
Is there something obvious I'm missing here? I'm a bit surprised that there is no way to know how big a file being streamed is before it's finished downloading...
If you upload via multipart/form-data, you can find the Content-Length included.

How to upload a file from an HTTP client to an HTTP server?

I am trying to convert the protocol my clients and servers use from FTP to HTTP, but I have no idea where to begin with the plethora of modules that exist. Should I be using the request module? The http module? The act of uploading a single text file is so simple, yet I cannot seem to find a straight answer.
The request module is a wrapper over the http module...
Anyway, you can use either, but the point with an upload is that you have to set the Content-Type header correctly; mostly it is multipart/form-data.
You could use something like Postman or Advanced REST Client (ARC) from the Chrome extensions, try the request out against your server, and then use the same set of headers with the http module.

How to send stream of data from client to server?

I am trying to upload a PDF file from the client to my server.
I know how to read a file using the Node.js "fs" module, but how do I read a file that is not on my server (i.e. is on the client's disk)? There is an upload button which chooses the file, and from the client side I want to send that file as a stream to my server. Then I can write the stream into a file on my server.
How are packages like ostrio:Files doing it?
Maybe this could help you: https://github.com/VeliovGroup/Meteor-Files
It's a great package to ease the upload and storage of files :)
Hope it helps,
Regards,
Yann

Image upload: Base64 to server in POST request, or Express.js middleware

I need to upload a local file to S3 and save its link in the database. Right now I am converting the image to base64 and sending it to my Rails server, which saves it on S3 and returns a URL; I then send this URL in the next HTTP request. How about I save it via Express instead, get a link, and then use that for the request? Which would be the better approach: the middleware or the backend server?
For file upload I suggest you use the multer middleware, because a native multipart implementation is a little bit tricky. Middleware is also used for the interaction with Amazon S3.
To send the file somewhere else you could use pipes:
fs.createReadStream(rqPath).pipe(res);
In the above example, the file is read from the local filesystem and piped into the response.
All the mentioned modules can be found on npm.
If you're still trying to figure this out: I was struggling with the same issue, decided to POST the binary data (converted from base64) directly in the body without dealing with multipart forms, and whipped up the base64-image-upload package to make this easy.
