I'm working on a REST server script written in Node.js. The script lets users make POST requests to upload files. Since my server is not responsible for storing files (that is handled by another remote server, which is also a REST server), I would like to forward/redirect the file-upload stream to that remote server. What is the best approach to do that?
Should I buffer the stream and wait until I receive the file in its entirety before streaming it to the remote server? Or
Should I just pipe the chunks to the remote server as I receive them? I tried piping in my upload route with this statement:
req.pipe(request.post(connection)).pipe(res)
but I received an error from the remote server: "Connection has been cancelled by the client". Or
Is it possible to redirect the incoming upload stream to that remote server so that my script wouldn't be a middleman?
Thanks
I would like to send progress updates for a running HTTP request to the client, to tell it what stage the request-triggered process is currently at.
The process behind the request currently does the following (in this order):
Client-side:
Client sends an HTTP Request (upload of a file) to the server
Server-side:
Takes the uploaded file
Encrypts it
Uploads it to an archive storage
Returns a response to the client
(Meanwhile, the client does not know what currently happens)
Client-side:
Get response and show it to the user
I want to tell the client at what stage the process is, like “Uploading done. Encrypting…” and so on.
Is there a way to realize that, or am I missing something? Is it even possible to do?
Frameworks I'm using:
Client: Next.js
Server: Hapi.dev for API development
Thanks
You can send non-final 1xx informational responses for your HTTP request, as described here.
On a server with Node.js + Express I have an API whose purpose is to export a million records from the DB to a file, which then needs to be downloaded by the client. How I am doing it now: create the file while the client waits for a response, which takes around 2 minutes, and when the file is ready do res.download(file).
It works fine locally, but when deployed I get a 504 Gateway Timeout error. Initially I tried to pipe data to the client while iterating over the DB cursor, but it was too slow, so I implemented a multithreading mechanism, and piping is no longer possible.
Is there a way I can work around this? Perhaps by opening the response stream to the client but without sending anything until the file is ready to be downloaded. Not sure if this is possible?
I need to create a Node.js HTTP server and an Angular client.
The server listens for incoming HTTP requests in which clients specify a query parameter:
/api/pictures?query=some query
Let’s say I have a folder ‘pictures’ that contains:
pic1.jpg
pic2.jpg
pic3.jpg
instruction.jpg
manual Instruction.jpg
…
Whenever the client sends a request with a URL like:
/api/pictures?query=pic
The server should return the files whose names contain the query specified by the client. In this case:
pic1.jpg
pic2.jpg
pic3.jpg
And if the client sends a request with a URL like:
/api/pictures?query=instruction
The server should return:
instruction.jpg
manual Instruction.jpg
My question is, how to send these files in the most efficient way?
I was thinking about streaming these files, but it's rather impossible for the browser client to pick individual files out of such a stream.
Or maybe just read all the pictures that match the criteria into memory, zip them, and then send the archive.
But I believe there is a more efficient way to do this. Any ideas? :)
The following case:
I have a web server which downloads files when requested by clients and acts as a file cache.
The client requests a file and passes the file URL as a parameter. The web server checks whether it has the file cached. If not, it downloads the file and serves it once the download completes.
The response to the client has to be the file itself. It is not possible to close the response with a "downloading, please check back later" and have the client open a second request a couple of minutes later.
No, I won't switch to sockets, as the client does not support them. The client has to use .NET WebClient.DownloadFile.
The problem is that the HTTP request to the web server is on hold while the file downloads. The file can be any size, so the client's request can be cancelled with a timeout if the file can't be downloaded and returned to the client in time.
I don't want to set a timeout on the client, as this would be too much of a hack.
Does anybody have an idea how to tackle this problem? I have read about HTTP status 102 (processing), but I have no idea how to set that status.
I am using node.js on the webserver, but interested in any kind of (tcp level) solution.
I solved the problem by streaming the download to a temporary file and serving the content of that file to the requesting clients.
As the file grows during the download, I use the npm package growing-file to open the temp file and pipe its data into the clients' response streams.
The web server should start delivering the file content to the client as soon as any of it is received from wherever it is downloading it from ... rather than downloading the entire file before it sends any of it to the client, which wastes time and space.
I have implemented a read/write stream to read a buffer, manipulate the data (like adding headers and footers during output-file creation), and write it to a file. What should I do to write to a file in a remote location instead of a local one, given that I only have FTP access to the remote server?
I wrote a client using POCO to transfer the file to the FTP server, but that is a two-step process. How can I implement a solution that writes directly to the FTP server? I can't work out how to connect a source stream (which is actually a ReadFile call) to the FTP network stream.
Thanks.
You need an FTP client library that you can call directly from your app, to avoid the need to write the file to disk and then send it via a separate process.
This earlier question has some useful info.