How to handle a large JSON response payload in FastAPI?

A GET call that returns many lines of JSON takes some time to respond in Swagger UI.
How can I reduce this time? I still want every attribute from my big response model.
I have tried gzip content encoding, but it did not solve my problem, because of the sheer number of lines in the response.
For example: fetching all job details (note: one job returns a 36,000-line response).
I'm a beginner.

You don't have an issue with FastAPI. Your question is really about how to handle a large response body in Swagger UI.
"A GET call that returns many lines of JSON takes some time to respond in Swagger UI."
This is a known issue with Swagger UI; large response bodies can sometimes even cause it to hang (see).
"How can I reduce this time?"
In your case, using a tool like Postman or Insomnia could fix this.
"I have tried gzip content encoding, but it did not solve my problem."
Expected. Gzip will have no effect in Swagger. Yes, it can reduce latency when you are dealing with large response bodies, but in the end Swagger still has to render the result as JSON, so it will not change your Swagger experience.

Related

What happens if I do not use body parser or express.json()?

I am new to the whole backend stuff. I understood that both body-parser and express.json() will parse the incoming request (the body from the client) into the request object.
But what happens if I do not parse the incoming request from the client?
Without middleware parsing your requests, req.body will not be populated. You will then need to manually dig into the req variable and find out how to get the values you want.
body-parser acts as an interpreter, transforming the HTTP request into an easily accessible format based on your needs.
You can read more about HTTP requests here (you can even write your own HTTP server):
https://nodejs.org/api/http.html#http_class_http_incomingmessage
You will just lose the data; the request.body field will be empty.
The data is still sent to you, so it is transferred to the server, but since you have not processed it you won't have access to it.
You can parse it yourself, by the way. The request is a Readable stream, so you can listen for the 'data' and 'end' events to collect and then parse the data.
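For illustration, here is a minimal sketch of parsing the body yourself in Express, assuming no body-parsing middleware is mounted (the /login route and the byte-count response are hypothetical):

const express = require('express');
const app = express();

// no express.json() mounted, so req.body stays undefined
app.post('/login', (req, res) => {
  const chunks = [];
  req.on('data', (chunk) => chunks.push(chunk)); // collect the raw chunks
  req.on('end', () => {
    const raw = Buffer.concat(chunks).toString(); // the complete raw body
    // for a JSON payload you would now call JSON.parse(raw) yourself
    res.send(`received ${raw.length} bytes`);
  });
});

app.listen(3000);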
If you do not convert the data, you receive exactly what was sent: raw data that looks somewhat like this: username=scott&password=secret&website=stackabuse.com. Now, that isn't so bad, but you will manually have to work out what is a route param, what is a query string, and where the actual data is inside those two.
Unless it is a project requirement, all that heavy lifting is taken care of by Express, and you get a nicely formatted object looking like this:
{
  username: 'scott',
  password: 'secret',
  website: 'stackabuse.com'
}
For situations where you DO need the raw data, Express gives you a convenient way of accessing that as well. All you need to do is use
express.raw([options]) along with express.json([options])
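A minimal sketch of combining the two, assuming a hypothetical /webhook route that needs the untouched bytes (say, for signature verification) while the rest of the app wants parsed JSON:

const express = require('express');
const app = express();

// raw Buffer for this route only; note it is registered before express.json()
app.post('/webhook', express.raw({ type: 'application/json' }), (req, res) => {
  console.log(Buffer.isBuffer(req.body)); // true: the untouched request bytes
  res.sendStatus(200);
});

// parsed JSON everywhere else
app.use(express.json());
app.post('/login', (req, res) => {
  res.send(`Hello ${req.body.username}`);
});

app.listen(3000);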

Sending large amounts of data from an ESP8266 server

I am building a web server from an ESP8266 that will send environmental data to any web client as a web page. I'm using the Arduino IDE.
The problem is that the data can get rather large at times, and all of the examples I can find show assembling a web page in memory and sending it all at once to the client via ESP8266WebServer.send(). This is ok for small web pages, but won't work with the amount of data I need to send.
What I want to do is send the first part of the web page, then send the data directly as I gather it, then send the closing parts of the web page. Is this even possible? I've looked unsuccessfully for documentation, and there don't seem to be any examples anywhere.
For future reference, I think I figured out how to do it, with help from this page: https://gist.github.com/spacehuhn/6c89594ad0edbdb0aad60541b72b2388
The gist of it is that you still use ESP8266WebServer.send(), but you first send an empty string with the Content-Length header set to the size of your data, like this:
server.sendHeader("Content-Length", (String)fileSize);
server.send(200, "text/html", "");
Then you send buffers of data using ESP8266WebServer.sendContent() repeatedly until all of the data is sent.
Hope this helps someone else.
I was having a big issue, and a headache, serving big strings concatenated with other string variables to the ESP32 Arduino web server with
server.send(200, "text/html", BIG_WEBPAGE);
which often resulted in a blank page, as I reported in my initial error.
What was happening was this error:
E (369637) uart: uart_write_bytes(1159): buffer null
I don't recommend using the server.send() call above for big pages.
After quite a lot of research I found this piece of code, which simply works like a charm. I just chunked my web page into pieces (WEBPAGE_BIG_0 through WEBPAGE_BIG_5), like you see below.
server.sendHeader("Cache-Control", "no-cache, no-store, must-revalidate");
server.sendHeader("Pragma", "no-cache");
server.sendHeader("Expires", "-1");
server.setContentLength(CONTENT_LENGTH_UNKNOWN);
// here begin chunked transfer
server.send(200, "text/html", "");
server.sendContent(WEBPAGE_BIG_0);
server.sendContent(WEBPAGE_BIG_1);
server.sendContent(WEBPAGE_BIG_2);
server.sendContent(WEBPAGE_BIG_3);
server.sendContent(WEBPAGE_BIG_4);
server.sendContent(WEBPAGE_BIG_5);
server.client().stop();
I really owe much to this post. Hope the answer helps someone else.
After some more experiments I realized the code is faster and more efficient if you do not feed a string variable into the server.sendContent function. Instead, you just paste the actual string value there:
server.sendContent("<html><head>my great page</head><body>");
server.sendContent("my long body</body></html>");
It is very important that when you chunk the web page you don't split HTML tags and you don't split an expression in JavaScript code (like cutting a while or an if in half); when chunking scripts, split after a semicolon or, better, between two function declarations.
Chunked transfer encoding is probably what you want, and it's helpful in the situation where the web page you are sending is being dynamically created on-the-fly and is also too large to fit into memory. In this situation, you have two problems. One, you can't send it all at once, and two, you don't know ahead of time how big the result is going to be. Both problems can be fixed like this:
String webPageChunk = "some html";
server.setContentLength(CONTENT_LENGTH_UNKNOWN);
server.send(200, "text/html", webPageChunk);
while (<page is being generated>) {
  webPageChunk = "some more html";
  server.sendContent(webPageChunk);
}
server.sendContent("");
Sending an empty chunk tells the client the transfer is complete. Be careful not to send one in your loop before you're done generating the whole page.

Nodejs: Do additional stuff after res.send

I'm using Node as a web server and I want to log every request to it in a database. I also want the user to receive the response as quickly as possible, so I came up with this code:
// ... putting together the response_data
res.send(response_data);
// ... now log the request into the DB and maybe do additional stuff
It works, and I like the idea of putting some of the (time-)expensive stuff after the send. But as I'm new to Node, I'm asking: is this a common pattern?
On Stack Overflow I just find people having problems because they try to send additional data after res.send, but I never heard anybody say "yeah, this is a great feature for your responsiveness", so I'm not sure if there's a major flaw with this solution that I just don't see yet...
As long as you don't need to send anything back to the user as a result of the "additional" stuff, your approach is fine.
The problem most people run into is trying to send data down the response after the response has already been sent, e.g.
res.send(response_data);
// do additional stuff
res.send(additional_data); // KABOOM!
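For illustration, a minimal sketch of the fire-and-forget pattern; logRequest is a hypothetical stand-in for your actual DB call:

const express = require('express');
const app = express();

// hypothetical async DB logger; swap in your real insert
async function logRequest(method, url) {
  // e.g. INSERT INTO request_log (method, url) VALUES (...)
}

app.get('/data', (req, res) => {
  const response_data = { ok: true }; // ... putting together the response_data
  res.send(response_data);            // the user gets the answer immediately

  // fire-and-forget: this runs after the response has been handed off
  logRequest(req.method, req.originalUrl)
    .catch((err) => console.error('logging failed:', err)); // never throw after send
});

app.listen(3000);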

MarkLogic 8 - XQuery write large result set to a file efficiently

UPDATE: See MarkLogic 8 - Stream large result set to a file - JavaScript - Node.js Client API for someone's answer on how to do this in JavaScript. This question is specifically asking about XQuery.
I have a web application that consumes rest services hosted in node.js.
Node simply proxies the request to XQuery which then queries MarkLogic.
These queries already have paging setup and work fine in the normal case to return a page of data to the UI.
I need to have an export feature such that when I put a URL parameter of export=all on a request, it doesn't lookup a page anymore.
At that point it should get the whole result set, even if it's a million records, and save it to a file.
The actual request needs to return immediately saying, "We will notify you when your download is ready."
One suggestion was to use xdmp:spawn to call the XQuery in the background which would save the results to a file. My actual HTTP request could then return immediately.
For the spawn piece, I think the idea is that I run my query with different options in order to get all results instead of one page. Then I would loop through the data and create a string variable to call xdmp:save with.
Some questions: is this a good idea? Is there a better way? If I loop through the result set and it does happen to be very large (gigabytes), it could cause memory issues.
Is there no way to stream the results directly to a file in XQuery?
Note: another idea I had was to intercept the request at the proxy (Node) layer, do an xdmp:estimate to get the record count, and then loop through, querying each page and flushing it to disk. In that case I would need some way to return my request immediately yet process in the background in Node, which this article has some ideas on: http://www.pubnub.com/blog/node-background-jobs-async-processing-for-async-language/
One possible strategy would be to use a self-spawning task that, on each iteration, gets the next page of the results for a query.
Instead of saving the results directly to a file, however, you might want to consider using xdmp:http-post() to send each page to a server:
http://docs.marklogic.com/xdmp:http-post?q=xdmp:http-post&v=8.0&api=true
In particular, the server could be a Node.js server that appends each page as it arrives to a file or any other data sink.
That way, Node.js could handle the long-running asynchronous IO with minimal load on the database server.
When the self-spawned task hits the end of the query, it can again use an HTTP request to notify Node.js to close the file and report that the export is finished.
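For illustration, a minimal sketch of the Node.js side; the /export-page and /export-done endpoints are hypothetical names that the xdmp:http-post() calls from the spawned tasks would target:

const express = require('express');
const fs = require('fs');
const app = express();

const out = fs.createWriteStream('export.json', { flags: 'a' }); // append mode

// each spawned MarkLogic task posts one page of results here
app.post('/export-page', express.text({ type: '*/*', limit: '50mb' }), (req, res) => {
  out.write(req.body); // append the page to the file
  res.sendStatus(200);
});

// the final task calls this when the query is exhausted
app.post('/export-done', (req, res) => {
  out.end(); // close the file and report completion
  console.log('export finished');
  res.sendStatus(200);
});

app.listen(3000);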
Hoping that helps,

Piping a readstream into a writestream does not work

I (as the client) am trying to post an image with restify, and the server just needs to save it.
req.pipe(fs.createWriteStream('test.jpg'));
is not working. An empty file is created, but nothing more. It works when I copy req.body into a buffer and then use fs.writeFile(...). I have also tried req.body.pipe, but this throws an error.
You're probably using a body parser middleware that is already reading all of the data from the request so there is nothing left to read. Try adjusting the placement of your route handler and/or body parsing middleware if you want to read directly from the request object.
However, that will only work if the request contains only the image data. Typically a request is formatted as multipart/form-data if it contains at least one file, so you cannot just pipe the request and expect image data only.
So something else in your middleware chain, probably restify.bodyParser(), is already streaming the request body into a buffer or string as req.body and you can't stream something twice. Find the middleware and disable it for this route if you want to handle the streaming straight to the filesystem yourself.
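A minimal sketch of the streaming approach, assuming the request body is the raw image bytes and that no body parser runs for this route (the /upload path is just an example):

const restify = require('restify');
const fs = require('fs');

const server = restify.createServer();
// note: no bodyParser plugin mounted, so the request stream is still unread

server.post('/upload', (req, res, next) => {
  const file = fs.createWriteStream('test.jpg');
  req.pipe(file); // stream the raw body straight to disk
  file.on('finish', () => {
    res.send(201);
    next();
  });
  file.on('error', (err) => next(err)); // surface write failures
});

server.listen(8080);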
