Can't create a listener on the database while also being able to update it (using Reference.on and Reference.push) - node.js

I'm trying to display a list of comments on my react page.
For this I have set up a NodeJS server which loads the data from Firebase and passes it on to React. I am able to get it to load the comments list and display them, but when I try to add a comment, the server crashes with the following error:
@firebase/database: FIREBASE WARNING: Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
This is because I am using:
firebase.database().ref('my-path').on("value", ...)
However, if I use firebase.database().ref('my-path').once("value", ...) then I lose the ability to update the comments as soon as a new comment is posted. Is there a way to keep a listener attached to the database and still be able to update the contents of that database?
Here is my NodeJS code:
app.get("/comments/:id", (req, res) => {
const itemsRef = firebase.database().ref(`comments/${req.params.id}`);
itemsRef.on('value', (snapshot) => {
let comments = snapshot.val();
return res.status(200).json(comments);
})
})
app.post("/comments/:id", (req, res) => {
const itemsRef = firebase.database().ref(`comments/${req.params.id}`);
itemsRef.push(req.body);
})
The error occurs after the post request is called.

You're sending a response back to the client with:
res.status(200).json(comments)
This sets at least two headers (the status and the content type) and then sends the response. The next time you get an update from the database, this callback runs again and again tries to send those headers. But in HTTP, all headers must be sent before the body of the response, so the second time this code runs it throws an error.
If you want to keep sending more data to the client on the same response, you'll need to drop down to more primitive methods of the response object so that you don't send headers (or other illegal data) again. While possible, it's more complex than you may think, as the client also needs to handle this response stream, which most clients won't.
I'd highly recommend looking at Doug's alternative, which is to use the Firebase Realtime Database from the client directly. That way you can use its client SDK, which handles this (and many more complexities) behind the scenes.
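If the comments do have to go through the Node server, a minimal sketch of a request/response-friendly version (my own illustration, not code from the question) would read the list once per request and always answer the POST:

// Respond exactly once per request by using once() instead of on()
app.get("/comments/:id", (req, res) => {
  firebase.database().ref(`comments/${req.params.id}`)
    .once("value")
    .then(snapshot => res.status(200).json(snapshot.val()))
    .catch(err => res.status(500).json({ error: err.message }));
});

// Acknowledge the write so the client isn't left waiting
app.post("/comments/:id", (req, res) => {
  firebase.database().ref(`comments/${req.params.id}`)
    .push(req.body)
    .then(ref => res.status(201).json({ id: ref.key }))
    .catch(err => res.status(500).json({ error: err.message }));
});

Note that this trades away the realtime updates: the client only sees new comments when it asks for them again, which is exactly why using the client SDK directly (or polling) is the better fit here.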

Related

How to access json response data using Axios, node/express backend

I have this project I’m working on and I am using node/express + Axios to retrieve data from a third-party API.
I am attaching an image of the response I am getting from Postman, but I am having an issue figuring out a way to access and manipulate a specific set of data.
If there are any resources anyone could share that would help, I would appreciate it.
as of now I just have:
axios.get('apiUrl')
  .then((response) => {
    const cardData = response.data;
    res.send(cardData);
  });
This is the response I get:
for example, I’d like to access the “abilities” property.
Since that property is within the “0” object within the response object, I’m a bit confused as to how to navigate this in the code.
I’ve tried response.data.0 but that doesn’t seem to work.
function retrieve(callback){
  // I don't get why you are using res.send here. Are you routing the response elsewhere?
  // If you are just consuming a service, use Axios with a callback instead.
  // If you're not routing it, you won't need Express.
  axios.get('apiUrl').then(response => callback(response));
}

function clbk(response){
  // Axios already parses JSON for you, so response.data is a plain object.
  // If you ever receive a stringified body, you would need JSON.parse(response.data) here.
  // If a key is a number, access it with bracket notation. So, considering your example:
  const first = response.data[0]; // e.g. first.abilities
  // Do whatever you want with the data
}
//CALL:
retrieve(clbk);
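If the goal is to expose just that nested field from the node/express backend, a rough sketch could look like the following (the '/cards' route path and the res.json shape are my own assumptions, and 'apiUrl' stands in for the real third-party endpoint):

// Hypothetical Express route returning only the "abilities" of the first item
app.get('/cards', async (req, res) => {
  try {
    const response = await axios.get('apiUrl');
    // Axios has already parsed the JSON body into response.data
    const abilities = response.data[0].abilities;
    res.json(abilities);
  } catch (err) {
    res.status(502).json({ error: err.message });
  }
});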

Is it possible to detect an immediate rejection when sending a chunky POST request with Axios?

I am using Axios in a web client in order to POST a file to the express backend as an upload. Since both the file's size and the client user's bandwidth are variable, it may take a certain amount of time for the POST request to finish. On the backend some logic applies and the request is promptly rejected.
The problem is that the client receives the response only after the request is finished, which can be several seconds.
I have already tested that it is not the backend's fault, as the behavior is the same when POSTing to any arbitrary post-enabled URL on the web, regardless of the technology. Here is an (over)simplified example of the case.
Here's the post action. Notice the commented-out request to the arbitrary post-enabled URL. It behaves exactly the same:
try {
  console.log("posting....")
  const res = await axios.post("http://localhost:4000/upload", formData)
  // const res = await axios.post("https://github.com/logout", formData)
  console.log("result:")
  console.log(res)
} catch (err) {
  console.error(err)
}
And the demo express backend route:
app.post("/upload", (req, res) => {
console.log("Rejecting...")
res.status(403).send()
console.log("Rejected.")
return
})
For testing purposes I chose a 3.7 MB file and throttled my browser's bandwidth down to the Fast 3G preset.
The backend immediately outputs:
Rejecting...
Rejected.
Whereas the request is pending for about 43 seconds before returning the 403 error:
Am I missing something obvious here? It is such common functionality that it makes me doubt this is the correct way to handle it. And if it really is, do we have any information on whether express's thread is active during that time, or is it just a client inconvenience?
Thanks in advance!
I believe you could just use res.status(403) rather than res.status(403).send().
You could also try using res.status(403).end(), and I am not sure why you need a return statement in the route handler.
It seems that first sending the response headers and then manually destroying the request does the trick:
app.post("/upload", (req, res) => {
console.log("Rejecting...")
res.status(403).send("Some message")
return req.destroy()
})
The Axios request stays pending only until the current chunk is uploaded, and then immediately resolves with the correct status and message. In the throttled Fast 3G example, the pending time went down from 43 s to about 900 ms.
Also, this solution emerged through trial and error, so it may not be best practice.
I would still be interested in an Axios-oriented solution, if one exists.
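For completeness on the Axios side: when the server does answer early, the non-2xx status surfaces through the normal error path, so the existing catch block can distinguish a real rejection from a network failure. A small sketch reusing the try/catch from the question:

try {
  const res = await axios.post("http://localhost:4000/upload", formData)
  console.log("result:", res.status)
} catch (err) {
  if (err.response) {
    // The server replied before the upload finished, e.g. 403 with "Some message"
    console.log("rejected:", err.response.status, err.response.data)
  } else {
    // No response at all: network error or the socket was torn down
    console.error(err.message)
  }
}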

running function after res.send

I'm trying to run this code
module.exports = async (req, res, next) => {
  res.set('Content-Type', 'text/javascript');
  const response = {};
  res.status(200).render('/default.js', { response });
  await fn(response);
};
fn is a function that calls an API service that will output something to the client, but it is dependent on the default.js file being loaded first. How can I do something like
res.render('/default.js', { response }).then(async () => {
  await fn(response);
});
I tried it, but it doesn't seem to like the then().
Also, fn doesn't return data to the client; it calls an API service that is connected to the web sockets opened by the code from default.js that is rendered.
Do I have to make an Ajax request for the fn call instead of calling it internally?
Any ideas?
Once you call res.render(), you can send no more data to the client; the HTTP response has been sent and the HTTP connection is done. So it does you no good to try to add something more to the response after you call res.render().
It sounds like you're trying to put some data INTO the script that you send to the browser. Your choices for that are to either:
1. Get the data you need with let data = await fn() before you call res.render(), and then pass that data to res.render() so your template engine can put it into the script file before you send it to the browser (see the sketch after this list). You will need to change the script file template so it has appropriate directives to insert data into the script file, and you will have to be very careful to format the data as Javascript data structures.
2. Have a script in the page make an ajax call to get the desired data and then do your task in client-side Javascript after the page is already up and running.
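A minimal sketch of the first option, under the assumption that fn() can be made to resolve with the data the script needs and that a template engine is already configured for the route shown in the question:

// Hypothetical route serving the script; names are illustrative
module.exports = async (req, res, next) => {
  try {
    // Fetch the data first, then render the script template with it
    const data = await fn();
    res.set('Content-Type', 'text/javascript');
    res.status(200).render('/default.js', { data });
  } catch (err) {
    next(err);
  }
};

Inside the default.js template, the engine's own directive syntax would then embed data as a Javascript literal before the file is sent.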
It looks like it might be helpful for you to understand the exact sequence of things between browser and server.
1. Browser is displaying some web page.
2. User clicks on a link to a new web page.
3. Browser requests the new web page from the server for a particular URL.
4. Server delivers the HTML page for that URL.
5. Browser parses that HTML page and discovers some other resources required to render the page (script files, CSS files, images, fonts, etc...).
6. Browser requests each of those other resources from the server.
7. Server gets a request for each separate resource and returns each one of them to the browser.
8. Browser incorporates those resources into the HTML page it previously downloaded and parsed.
9. Any client-side scripts it retrieved for that page are then run.
So, the code you show appears to be a route for one of those script files (step 5 above). This is where it fits into the overall scheme of loading a page. Once you've returned the script file to the client with res.render(), it has been sent and that request is done. The browser isn't connected to your server anymore for that resource, so you can't send anything else on that same request.

Microsoft Graph API calendarView delta endpoint does not respond on initial request

When sending the first request to the calendarView API, the request does not return or time out. This only happens for some of the requests and seems to happen only on the first request (perhaps because the first request has a larger response size).
An example request:
GET /me/calendarView/delta?startDateTime=2019-06-27T22:00:00.000Z&endDateTime=2019-09-08T13:17:30.659Z
The current solution I found was reducing odata.maxpagesize to a very small number (currently 2 is the highest value that works for all the calendars I have tested).
The requests are sent using the Node.js client "@microsoft/microsoft-graph-client": "1.7.0".
// Initialize client with credentials
const client = graph.Client.init({
  authProvider: done => {
    done(null, credentials.access_token);
  }
});

const url = "/me/calendarView/delta?startDateTime=2019-06-27T22:00:00.000Z&endDateTime=2019-09-08T13:17:30.659Z";

console.log("Request start");
const result = await client
  .api(url)
  .header("prefer", "odata.maxpagesize=10")
  .get();
console.log("Got result", result);
Here the last console.log is never called.
The expected result is that the request returns, at least with an error code. I also expect the API to be able to handle far more than 2 items per page.
The current solution of setting a small maxpagesize works temporarily; however, I expect there is another root-cause issue.
Any idea what is wrong, and how this can be resolved?
After a lot of debugging I traced the issue to the Node library. When asking for a raw response from the API, I got back the result regardless of page size.
The solution was to manually parse the response myself, after asking for the raw response from the library. This was based on the implementation in the library at https://github.com/microsoftgraph/msgraph-sdk-javascript/blob/dev/src/GraphResponseHandler.ts#L98
I could not find the root cause issue in the library, and ended up just parsing it on my end. I also analysed the raw response from the API, but the content-type header was correctly application/json, and the response status was 200 OK.
const { ResponseType } = require("@microsoft/microsoft-graph-client");
const rawResponse = await client.api(url).responseType(ResponseType.RAW).get();
const parsedResponse = await rawResponse.json();
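As a follow-up, once the body is parsed the usual delta pattern still applies: keep following @odata.nextLink until the service returns @odata.deltaLink. A rough sketch building on the raw-response workaround above (the loop itself is my own illustration, and it assumes client.api() also accepts the absolute nextLink URL):

let events = [];
let deltaLink = null;
let nextUrl = url; // initial delta URL from the question
while (nextUrl) {
  const raw = await client.api(nextUrl)
    .header("prefer", "odata.maxpagesize=10")
    .responseType(ResponseType.RAW)
    .get();
  const page = await raw.json();
  events = events.concat(page.value);
  deltaLink = page["@odata.deltaLink"] || deltaLink; // present on the final page
  nextUrl = page["@odata.nextLink"] || null;         // present while more pages remain
}
// deltaLink can be stored and used later for an incremental sync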

How can I send information from NodeJS server to client side?

For example, I want to signal to the client side that a username sent via the POST method in an HTML form already exists in my database.
I know how to retrieve POST data with body-parser and I know how to look it up in a MySQL database.
I know that I could use Ajax to write an error message directly on the form. What does my NodeJS server need to send, and how does it send this information?
I've searched through numerous tutorials and only found solutions where they send a new HTML page. I want to keep my web page the same and use functions like appendChild() to post the error message.
There are a couple of ways you could send data from the server side (NodeJS) to the client side, which I assume in your case would be some JavaScript file like main.js that handles DOM manipulation.
So, the 1st way you could send data is through a templating engine like Handlebars, for example. There is an easy to use module for express you could get here: hbs.
To quickly summarize how an engine like that works: we still send an HTML file like you probably saw in the tutorials, but a templating engine like Handlebars lets us send actual data with that file dynamically. We render a specific Handlebars template (which at its core is just HTML) and pass a JavaScript object to the render call containing all the data you want to pass into that file, and then access it in the .hbs file.
So on the server side, we would write something like this, assuming we have a file called home.hbs and have set up Handlebars as the templating engine:
router.get('/home', function(req, res) {
  var dataToSendObj = {'title': 'Your Website Title', 'message': 'Hello'};
  res.render('home', dataToSendObj);
});
And access in home.hbs like this:
<html>
  <header>
    {{title}}
  </header>
  <body>
    message from server: {{message}}
  </body>
</html>
Now, the issue with this approach is that if you wanted to update the data on the page dynamically, without having to reload the page, using a templating engine would not be ideal. Instead, like you said, you would use AJAX.
So, the 2nd way you could send data from your NodeJS server to the front-end of your website, is using an asynchronous AJAX call.
First, add a route to whatever route handler you are using for AJAX to make a call to. This is where you have some logic to perhaps access the database, make some checks, and return some useful information back to the client.
router.get('/path/for/ajax/call', function(req, res) {
  // make some calls to database, fetch some data, information, check state, etc...
  var dataToSendToClient = {'message': 'error message from server'};
  // convert whatever we want to send (preferably an object) to JSON
  var JSONdata = JSON.stringify(dataToSendToClient);
  res.send(JSONdata);
});
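As a side note, Express can do the stringifying for you; a shorter equivalent of the route above (same hypothetical path) would be:

router.get('/path/for/ajax/call', function(req, res) {
  // res.json() stringifies the object and sets the Content-Type header to application/json
  res.json({'message': 'error message from server'});
});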
Assuming you have some file such as main.js, create an AJAX request with callbacks to listen to certain event responses like this:
var req = new XMLHttpRequest();
var url = '/path/for/ajax/call';
req.open('GET', url, true); // set this to POST if you would like
req.addEventListener('load', onLoad);
req.addEventListener('error', onError);
req.send();

function onLoad() {
  var response = this.responseText;
  var parsedResponse = JSON.parse(response);
  // access your newly received data here and update your DOM with appendChild(), getElementById(), etc...
  var messageToDisplay = parsedResponse['message'];
  // append a child (with text value of messageToDisplay, for instance) here or do some more stuff
}

function onError() {
  // handle the error here, print a message perhaps
  console.log('error receiving async AJAX call');
}
To summarize the above approach using AJAX, this would be the flow of the interaction:
1. An action is triggered on the client side (like a button press).
2. The event handler for that creates a new AJAX request, sets up the callback so it knows what to do when the response comes back from the server, and sends the request.
3. The GET or POST request sent is caught by our route handler on the server.
4. Server-side logic is executed to get data from the database, state, etc...
5. The new data is fetched, placed into a JSON object, and sent back by the server.
6. The client's AJAX event listener for either load or error catches the response and executes the callback.
7. In the case of a successful load, we parse the response and update the client-side UI.
Hope this is helpful!
