In a Node.js request, how can we get the HTML after an AJAX load? - node.js

Doing
const http = require("http");

var page_url = "http://skiferie.danskbilferie.dk/sidste_chance_uge7_norge_sverige.html";

// Wrapped in a Promise so the resolve() call below has something to resolve.
new Promise((resolve, reject) => {
    http.get(page_url, (http_res) => {
        var data = "";
        http_res.on("data", function (chunk) {
            data += chunk;
        });
        http_res.on("end", function () {
            resolve({ data });
        });
    }).on("error", reject);
});
gets the correct HTML from that page, but how can I wait for the table of deals to be filled?
As that page only fills in the data I need after it loads, by calling an AJAX method, is there a way for me to wait for that action to complete?

To wait until the AJAX content has loaded, a plain HTTP request is not enough. Your goal is to somehow tell the script to let the page's JavaScript finish (or wait for some period of time) and then get the rendered HTML.
You can implement such behaviour, as @GabrielBleu said, with Puppeteer, and here is a nice tutorial with an example: tutorial.
Or you can do it with WebDriver, or you can try this resource.
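A minimal sketch of the Puppeteer approach (assuming Puppeteer is installed via npm install puppeteer; the "table" selector is an assumption and should be replaced with the selector of the actual deals table on that page):

const puppeteer = require("puppeteer");

async function getRenderedHtml(url) {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: "networkidle0" }); // wait until the page's AJAX requests settle
    await page.waitForSelector("table");                 // assumed selector for the deals table
    const html = await page.content();                   // fully rendered HTML, including the AJAX-filled rows
    await browser.close();
    return html;
}

getRenderedHtml("http://skiferie.danskbilferie.dk/sidste_chance_uge7_norge_sverige.html")
    .then((html) => console.log(html.length));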

Related

expressJS/multer multiple file upload, render for each file

I am using expressJS with multer, and want to create a website to upload multiple files.
I already managed to get this working. Currently I am using XMLHttpRequest for the POST requests on the client side, and I also update elements on the page from the client-side script. For 5 files selected with one click on the submit button, I make 5 POST requests from the client side and update the feedback one by one.
// load file
const formData = new FormData();
formData.append("multi_files", file1); // field name must match the server's upload.array() name

let req = new XMLHttpRequest();
req.onreadystatechange = function () {
    if (req.readyState === XMLHttpRequest.DONE) {
        updateView(); // change the HTML layout when the POST is done
    }
};
req.open("POST", "/upload");
req.send(formData);
Repeat for multiple files [file1, file2, file3, ...].
Question:
So now, I would like to use res.render() with parameters instead. I am wondering if it is possible to receive one POST request and render the page multiple times. If I POST 5 files at once, I want to render every time one file is processed, so the user on the client side sees the feedback. I don't need a progress bar; I just want to show some basic information and the status of each file. I played around a bit with res.render(), but didn't find anything working the way I wished.
That way I can avoid adding any HTML code in my JavaScript, and just use Handlebars on the backend.
Front end:
const formData = new FormData();
formData.append("multi_files", file1);
formData.append("multi_files", file2);
formData.append("multi_files", file3);

let req = new XMLHttpRequest();
req.open("POST", "/upload");
req.send(formData);
And for the backend I want something like this:
router.post('/upload', upload.array('multi_files'), async function (req, res, next) {
    const files = req.files;
    for (const file of files) {
        let result = processFile(file);
        res.render('/', result);
    }
});
But unfortunately I cannot res.render multiple times.
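As a point of reference, Express sends exactly one response per request, so a minimal sketch (an assumption, not part of the original question) would collect the results first and render once with all of them:

// Sketch only: processFile and the 'status' view name are assumptions carried over from the question.
// Express allows a single response per request, so render once with everything collected.
router.post('/upload', upload.array('multi_files'), async function (req, res, next) {
    const results = [];
    for (const file of req.files) {
        results.push(await processFile(file)); // collect each file's processing result
    }
    res.render('status', { results });         // one render with all results
});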

HTTP GET Request, Response

I have a file from which I send a GET request to another file. The response shows up under the Network tab of Google Dev Tools, but it does not display in my browser.
This is what I do to display the response in my browser.
xmlhttp.onreadystatechange = function() {
    if (this.readyState == 4) {
        res = xmlhttp.responseText;
        document.getElementById('table3').innerHTML = res;
    }
}
And I want to display the response inside the table cell with id="table3", like below.
<td id="table3">
</td>
The content inside is supposed to be filled from the response of the GET request.
Any help is appreciated. Thank you.
I think that the if statement in your onreadystatechange callback is wrong. xmlhttp is the instance of the XHR class, which means you would have to use xmlhttp instead of this; this in the context of your program is likely the window object.
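A minimal sketch of the handler, consistently referring to the xmlhttp instance so there is no ambiguity about what this points to (the request URL is an assumption):

var xmlhttp = new XMLHttpRequest();
xmlhttp.onreadystatechange = function () {
    // refer to the xmlhttp instance directly instead of relying on `this`
    if (xmlhttp.readyState === 4 && xmlhttp.status === 200) {
        document.getElementById('table3').innerHTML = xmlhttp.responseText;
    }
};
xmlhttp.open("GET", "/other-file.html"); // assumed URL of the file being requested
xmlhttp.send();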

Is there a way to limit the amount of data that I get from a response?

Hello, I've got a small challenge where I have to display some data that I get from an API. The main page will display the first 20 results, and clicking a button will add 20 more results to the page.
The API call that I was given returns an array with around 1500 elements, and the API doesn't have a parameter to limit the number of elements in the array, so my question is whether I can limit it somehow with axios, or should I just fetch all of these elements and display them?
This is the API: https://api.chucknorris.io/
There are two answers to your question.
The short answer is: on your side, there's nothing you can do until pagination is implemented on the API side.
The second answer is: you can handle it using the http module, like this:
// `opts` holds the usual http.request options (host, path, etc.)
http.request(opts, function (response) {
    var request = this;
    console.log("Content-length: ", response.headers['content-length']);
    var str = '';
    response.on('data', function (chunk) {
        str += chunk;
        if (str.length > 10000) {
            request.abort();
        }
    });
    response.on('end', function () {
        console.log('done', str.length);
        ...
    });
}).end();
This will abort the request at around 10,000 bytes, since the data arrives in chunks of various sizes.
Since the API has no parameter to limit the number of results, you are responsible for modifying the response.
Since you're using axios, you could do this with a response interceptor, so that the response is modified before it reaches your application.
You may want to consider where the best place to do this is, though. If you allow the full response to come back to your application and store it somewhere, it may be easier to return the next page of 20 results at the user's request rather than repeatedly calling the API.
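A minimal sketch of the interceptor approach (the /jokes/search endpoint and the result field are assumptions about the payload shape; adjust them to the actual response):

const axios = require("axios");

const api = axios.create({ baseURL: "https://api.chucknorris.io" });

// Response interceptor: trim the array before it reaches the rest of the application.
api.interceptors.response.use((response) => {
    if (Array.isArray(response.data.result)) {
        response.data.result = response.data.result.slice(0, 20); // keep only the first 20 items
    }
    return response;
});

// Usage: callers now see at most 20 elements.
api.get("/jokes/search", { params: { query: "kick" } })
    .then((res) => console.log(res.data.result.length));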

Send variable from mongoose query to page without reload on click

I have a link on my site. When clicked, it calls a function that performs a mongoose query.
I'd like the results of that query to be sent to the same page in a variable, without reloading the page. How do I do that? Right now it just renders the page again with the new query result data.
// List comments for the specified chapter id.
commentController.list = function (req, res) {
    var chapterId = req.params.chapterId;
    var query = { chapterId: chapterId };
    Chapter.find({ _id: chapterId }).then(function (chapter) {
        Comment.find(query).then(function (data) {
            console.log(chapter);
            Chapter.find().then(function (chapters) {
                return res.render('chapterlinks', { commentList: data, user: req.user, chapterq: chapter, chapters: chapters });
            });
        });
    });
};
You just need to make that request from your browser via AJAX:
https://www.w3schools.com/xml/ajax_intro.asp
This would be in the code for your client (browser), not the code for your server (nodejs).
UPDATE:
Here's a simple example, which uses jQuery to make things easier:
(1) create a function that performs and handles the ajax request
function getChapterLinks(chapterId) {
    $.ajax({
        url: "/chapterLinks/" + chapterId,
    }).done(function (data) {
        // here you should do something with data
        console.log(data);
    });
}
(2) bind that function to a DOM element's click event
$( "a#chapterLinks1" ).click(function() {
getChapterLinks(1);
});
(3) make sure that DOM element is somewhere in your HTML
<a id="chapterLinks1">Get ChapterLinks 1</a>
Now when this a#chapterLinks1 element is clicked, it will use AJAX to fetch the response of /chapterLinks/1 from your server without reloading the page.
references:
http://api.jquery.com/jquery.ajax/
http://api.jquery.com/jquery.click/
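On the server side, the route that answers this AJAX call should return data instead of rendering a whole page. A minimal sketch reusing the Comment model from the question (the router object, route path, and JSON shape are assumptions):

// Sketch only: path and response shape are assumptions.
// Returning JSON lets the .done() callback above receive the query results directly.
router.get('/chapterLinks/:chapterId', function (req, res) {
    var chapterId = req.params.chapterId;
    Comment.find({ chapterId: chapterId }).then(function (comments) {
        res.json({ commentList: comments });
    });
});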

Multiple clients posting data in node js

I've read that in Node.js one should treat POST requests carefully, because the POST data may arrive in chunks, so it has to be handled like this, by concatenating:
function handleRequest(request, response) {
    if (request.method == 'POST') {
        var body = '';
        request.on('data', function (data) {
            body += data;
        });
        request.on('end', function () {
            // data is complete here
        });
    }
}
What I don't understand is how this code snippet handles several clients at the same time. Let's say two separate clients start uploading large POST data. Will their chunks be added to the same body, mixing up the data?
Or does the framework handle this by triggering a separate instance of the handleRequest function for each request, so that they do not get mixed up in the body variable?
Thanks.
Given the request, response signature of your method, it looks like it is a listener for the request event.
Assuming that's correct, this event is emitted for every new request, so as long as you only concatenate new data to a body variable that is unique to that handler invocation (as in your current example), you're good to go.
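A minimal sketch showing why the bodies cannot mix: every incoming request triggers a separate call to handleRequest, and each call gets its own body variable in its own closure (the port number is an arbitrary choice for the sketch):

const http = require('http');

function handleRequest(request, response) {
    if (request.method === 'POST') {
        let body = '';                      // local to this single invocation, i.e. this single request
        request.on('data', (chunk) => {
            body += chunk;
        });
        request.on('end', () => {
            response.end('received ' + body.length + ' bytes');
        });
    } else {
        response.end('send a POST request');
    }
}

// handleRequest runs once per request, so two concurrent uploads never share a body.
const server = http.createServer(handleRequest);
server.listen(3000);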
