Microsoft Graph API calendarView delta endpoint does not respond on initial request - node.js

When sending the first request to the calendarView delta API, the request never returns and never times out. This only happens to some requests, and seemingly only on the initial request (perhaps because the initial request has a larger response size).
An example request:
GET /me/calendarView/delta?startDateTime=2019-06-27T22:00:00.000Z&endDateTime=2019-09-08T13:17:30.659Z
The only workaround I have found is reducing odata.maxpagesize to a very small number (currently 2 is the highest value that works for all calendars I have tested).
The requests are sent using the Node.js client "@microsoft/microsoft-graph-client": "1.7.0".
// Initialize client with credentials
const graph = require("@microsoft/microsoft-graph-client");

const client = graph.Client.init({
  authProvider: done => {
    done(null, credentials.access_token);
  }
});

const url = "/me/calendarView/delta?startDateTime=2019-06-27T22:00:00.000Z&endDateTime=2019-09-08T13:17:30.659Z";

console.log("Request start");
const result = await client
  .api(url)
  .header("prefer", "odata.maxpagesize=10")
  .get();
console.log("Got result", result);
Here the last console.log is never reached.
The expected result is that the request returns, at least with an error code. I would also expect the API to handle far more than 2 items per page.
Setting a small maxpagesize works as a temporary workaround, but I suspect there is a different root cause.
Any idea what is wrong, and how it can be resolved?

After a lot of debugging I traced the issue to the Node library. When asking the library for the raw response from the API, I got the result back regardless of page size.
The solution was to request the raw response from the library and parse it manually. This was based on the library's own implementation at https://github.com/microsoftgraph/msgraph-sdk-javascript/blob/dev/src/GraphResponseHandler.ts#L98
I could not find the root cause in the library, so I ended up just parsing the response on my end. I also analysed the raw response from the API, but the content-type header was correctly application/json, and the response status was 200 OK.
const { ResponseType } = require("@microsoft/microsoft-graph-client");

const rawResponse = await client.api(url).responseType(ResponseType.RAW).get();
const parsedResponse = await rawResponse.json();
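For completeness, a sketch of what the manual paging can look like when following the delta query's pages (the helper name and page size are mine, not from the original post; "@odata.nextLink" and "@odata.deltaLink" are the standard Graph delta properties, and as far as I know the SDK's .api() accepts the full nextLink URL):

// Hypothetical helper: page through the delta query with raw responses,
// collecting events until the service stops returning a nextLink.
async function fetchAllEvents(client, initialUrl) {
  const events = [];
  let url = initialUrl;
  while (url) {
    const raw = await client
      .api(url)
      .header("prefer", "odata.maxpagesize=50")
      .responseType(ResponseType.RAW)
      .get();
    const page = await raw.json();
    events.push(...(page.value || []));
    // "@odata.nextLink" points at the next page; when it is absent,
    // "@odata.deltaLink" marks the end of this sync round.
    url = page["@odata.nextLink"];
  }
  return events;
}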

Related

How to access json response data using Axios, node/express backend

I have a project I'm working on, using Node/Express + Axios to retrieve data from a third-party API.
I am attaching an image of the response I am getting in Postman, but I am having an issue figuring out how to access and manipulate a specific set of data.
If there are any resources anyone could share that would help, I would appreciate it.
As of now I just have:
axios.get('apiUrl')
  .then((response) => {
    const cardData = response.data;
    res.send(cardData);
  });
This is the response I get:
For example, I'd like to access the "abilities" property.
Since that property is within the "0" object of the response object, I'm a bit confused as to how to navigate this in the code.
I've tried response.data.0 but that doesn't seem to work.
function retrieve(callback) {
  // I don't get why you are using res.send here. Are you routing the response elsewhere?
  // If you are just consuming a service, use Axios with a callback instead.
  // If you're not routing it you won't need Express.
  axios.get('apiUrl').then(response => callback(response));
}

function clbk(response) {
  // Axios parses JSON for you; only JSON.parse in case you are
  // receiving a stringified JSON payload.
  const data = typeof response.data === 'string' ? JSON.parse(response.data) : response.data;
  // Do whatever you want with the data.
  // If you have a number as a key, access it by using []. So, considering your example:
  console.log(data[0]);
}
//CALL:
retrieve(clbk);
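Putting this together for the property the question asks about, a hypothetical Express route could look like the following (assuming, as in the screenshot, that the first entry sits under the numeric key 0 and carries an abilities property):

// Hypothetical sketch: fetch the card data and return only the
// "abilities" property of the first entry.
app.get('/abilities', async (req, res) => {
  try {
    const response = await axios.get('apiUrl');
    // Numeric keys cannot be reached with dot notation (response.data.0
    // is a syntax error); bracket notation works instead.
    res.send(response.data[0].abilities);
  } catch (err) {
    res.status(500).send(err.message);
  }
});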

NodeJs with Express not parsing form data from node-fetch

I'm creating two APIs with Node.js, Express, and TypeScript. One API takes a request with a content type of multipart/form-data and uses the body of that request to make another request to a second API.
I'm using Postman, so the chain of requests looks something like this:
Postman -> First API -> Second API
I use node-fetch to make the request from the first API to the second one. The body of the request is a FormData, which contains some files and key-value pairs.
const form = new FormData();

// File for profile picture
const profilePictureBuffer = await (await fetch(user.profilePicture)).buffer();
form.append('profilePicture', profilePictureBuffer);

// File for ID card
const idCardBuffer = await (await fetch(user.idCardUrl)).buffer();
form.append('idCard', idCardBuffer);

// This part iterates over the 'user' object,
// which contains other key-value pairs
Object.entries(user).forEach((data) => {
  form.append(data[0], data[1]);
});

// Make the POST request to the second API
const pinterUser = await fetch(secondAPIURL, {
  method: 'post',
  body: form,
  headers: form.getHeaders()
});
I run both APIs on localhost so that I can monitor the logs for any bugs. When I make a request from Postman to the first API, and the first API makes its request to the second API, I get the following error in the second API's terminal:
TypeError: Cannot read property '0' of undefined
After some investigation, I found out that in the second API, req.body and req.files are empty objects. This means that Express did not parse the incoming request. Note that I've also already added a multer middleware to handle the files in the request.
Furthermore, I have added the following lines of code in my server.ts file for the second API
/** Parse the body of the request */
router.use(express.urlencoded({ extended: true }));
router.use(express.json());
However, when I make the same request directly from Postman, it returns a successful response.
I'm not really sure what's going on here. I've looked for answers to similar issues, but most of them suggest adding urlencoded and json, or using some other library to parse form data.
In my case the first suggestion doesn't solve the problem, since I added those middlewares from the start, and the latter is what I'm trying to avoid.
Could anybody point out what I'm missing here? Thanks in advance.
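One detail that may be worth checking, offered as a guess rather than a confirmed fix: when a raw Buffer is appended to a form-data instance without options, no filename is attached to that part, and multipart parsers such as multer may not register it as a file. A sketch of appending with explicit options (the filenames and content types are made up for illustration):

// Hypothetical sketch: give each buffer a filename and content type so
// the receiving multipart parser can recognize it as a file upload.
form.append('profilePicture', profilePictureBuffer, {
  filename: 'profile-picture.jpg', // illustrative name
  contentType: 'image/jpeg',
});
form.append('idCard', idCardBuffer, {
  filename: 'id-card.jpg', // illustrative name
  contentType: 'image/jpeg',
});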

Is it possible to detect an immediate response when sending a chunky POST request with Axios

I am using Axios in a web client to POST a file to the Express backend as an upload. Since both the file's size and the client user's bandwidth are variable, it may take a certain amount of time for the POST request to finish. On the backend some logic applies and the request is promptly rejected.
The problem is that the client receives the response only after the request body has finished uploading, which can take several seconds.
I have already verified that it is not the backend's fault, as the behavior is the same when POSTing to any arbitrary POST-enabled URL on the web, regardless of the technology. Here is an (over)simplified example of the case.
Here's the post action. Notice the commented-out request to an arbitrary POST-enabled URL; it behaves exactly the same:
try {
  console.log("posting....");
  const res = await axios.post("http://localhost:4000/upload", formData);
  // const res = await axios.post("https://github.com/logout", formData)
  console.log("result:");
  console.log(res);
} catch (err) {
  console.error(err);
}
And the demo Express backend route:
app.post("/upload", (req, res) => {
console.log("Rejecting...")
res.status(403).send()
console.log("Rejected.")
return
})
For testing purposes I chose a 3.7 MB file and throttled my browser's bandwidth down to the Fast 3G preset.
The backend immediately outputs:
Rejecting...
Rejected.
Whereas the request stays pending for about 43 seconds before returning the 403 error.
Am I missing something obvious here? This is such common functionality that I doubt this is the correct way to handle it. And if it really is, is there any information on whether Express's thread is active during that time, or is it just a client inconvenience?
Thanks in advance!
I believe you could just use res.status(403) rather than res.status(403).send().
You could also try using res.status(403).end(), and I am not sure why you need a return statement in the route handler.
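For reference, a sketch of the .end() variant this answer suggests (note that res.status(403) on its own only sets the status code; nothing is sent until send() or end() is called):

// Minimal variant: set the status and end the response without a body.
app.post("/upload", (req, res) => {
  res.status(403).end();
});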
It seems that first sending the response headers and then manually destroying the request does the trick:
app.post("/upload", (req, res) => {
console.log("Rejecting...")
res.status(403).send("Some message")
return req.destroy()
})
The Axios request stays pending only until the current chunk finishes uploading, and then immediately resolves with the correct status and message. In the throttled Fast 3G example, the pending time went down from 43 s to 900 ms.
Also, this solution emerged through trial and error, so it may well not be best practice.
I would still be interested in an Axios-oriented solution, if one exists.

Can't create a listener to the database while also being able to update it. (using References.on and References.put)

I'm trying to display a list of comments on my React page.
For this I have set up a Node.js server which loads the data from Firebase and passes it on to React. I can get it to load the comments list and display them, but when I try to add a comment, the server crashes with the following error:
@firebase/database: FIREBASE WARNING: Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
This is because I am using:
firebase.database().ref('my-path').on("value", ...)
However, if I use firebase.database().ref('my-path').once("value", ...), then I lose the ability to update the comments as soon as a new comment is posted. Is there a way to keep a listener attached to the database while still being able to update the contents of that database?
Here is my NodeJS code:
app.get("/comments/:id", (req, res) => {
const itemsRef = firebase.database().ref(`comments/${req.params.id}`);
itemsRef.on('value', (snapshot) => {
let comments = snapshot.val();
return res.status(200).json(comments);
})
})
app.post("/comments/:id", (req, res) => {
const itemsRef = firebase.database().ref(`comments/${req.params.id}`);
itemsRef.push(req.body);
})
The error occurs after the post request is called.
You're sending a response back to the client with:
res.status(200).json(comments)
This sets at least two headers (the status and the response type) and then sends the response. The next time you get an update from the database, this code runs again and again tries to send the two headers. But in HTTP, all headers must come before the main body of the response, so the second time this code runs, it throws an error.
If you want to keep sending more data to the client, you'll need to use more primitive methods of the response object and avoid sending headers or other illegal data. While possible, it's more complex than you may think, as the client needs to handle this response stream, which most clients won't.
I'd highly recommend looking at Doug's alternative, which is to just use the Firebase Realtime Database from the client directly. That way you can use the client SDK that it has, which handles this (and many more complexities) behind the scenes.
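If the endpoint only needs to answer each HTTP request once, the once() variant the question already mentions fits Express's one-response-per-request model; a minimal sketch:

// Sketch: read the value once per HTTP request instead of subscribing.
app.get("/comments/:id", (req, res) => {
  firebase.database().ref(`comments/${req.params.id}`)
    .once('value')
    .then((snapshot) => res.status(200).json(snapshot.val()))
    .catch((err) => res.status(500).json({ error: err.message }));
});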

Express request is called twice

To learn Node.js I'm creating a small app that gets some RSS feeds stored in MongoDB, processes them, and creates a single feed (ordered by date) from them.
It parses a list of ~50 RSS feeds with ~1000 blog items, so parsing the whole set takes quite a while. I therefore added req.connection.setTimeout(60*1000); to get a long enough timeout to fetch and parse all the feeds.
Everything runs fine, but the request is called twice. (I checked with Wireshark; I don't think it's the favicon here.)
I really don't get it.
You can test it yourself here: http://mighty-springs-9162.herokuapp.com/feed/mde/20 (it should create an RSS feed with the last 20 articles about "mde").
The code is here: https://github.com/xseignard/rss-unify
And if we focus on the interesting bits:
I have a route defined like this: app.get('/feed/:name/:size?', topics.getFeed);
And topics.getFeed looks like this:
function getFeed(req, res) {
  // 1 minute timeout to get enough time for the request to be processed
  req.connection.setTimeout(60 * 1000);
  var name = req.params.name;
  var callback = function(err, topic) {
    // if the topic has been found
    if (topic) {
      // aggregate the corresponding feeds
      rssAggregator.aggregate(topic, function(err, rssFeed) {
        if (err) {
          res.status(500).send({error: 'Error while creating feed'});
        }
        else {
          res.send(rssFeed);
        }
      },
      req);
    }
    else {
      res.status(404).send({error: 'Topic not found'});
    }
  };
  // look for the topic in the db
  findTopicByName(name, callback);
}
So nothing fancy, but still, this getFeed function is called twice.
What's wrong there? Any idea?
This annoyed me for a long time. It's most likely the Firebug extension, which sends a duplicate of each GET request in the background. Try turning off Firebug to make sure that's not the issue.
I faced the same issue while using the Google Cloud Functions Framework (which uses Express to handle requests) on my local machine. Each fetch request (in the browser console and within the web page) resulted in two requests to the server. The issue was related to CORS (because I was using different ports): Chrome made an OPTIONS method call before the actual call. Since the OPTIONS method was not necessary in my code, I used an if-statement to return an empty response.
if (req.method == "OPTIONS") {
  res.set('Access-Control-Allow-Origin', '*');
  res.set('Access-Control-Allow-Headers', 'Content-Type');
  res.status(204).send('');
}
I spent nearly 3 hours banging my head against this. Thanks to user105279's answer for the hint.
If you have a favicon on your site, remove it and try again. If that resolves your problem, fix your favicon URL.
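A common way to rule the favicon out (a generic sketch, not from this answer) is to short-circuit the browser's automatic favicon request before any other routes:

// Answer /favicon.ico with 204 No Content so the request never
// reaches the application routes.
app.get('/favicon.ico', (req, res) => res.status(204).end());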
I'm doing more or less the same thing now and noticed the same behavior.
I'm testing my server by entering the API address in Chrome, like this:
http://127.0.0.1:1337/links/1
My Node.js server then responds with a JSON object depending on the id.
I set up a console.log in the GET handler and noticed that when I change the id in Chrome's address bar, Chrome sends a request before I hit Enter, and the server accepts another request after I actually hit Enter. This happens with and without the Chrome dev console open.
IE 11 doesn't seem to behave this way, but I don't have Firefox installed right now to compare.
Hope that helps someone, even if this is a kind of old thread :)
/J
I managed to fix this with listen.setTimeout and axios.defaults.timeout = 36000000.
Node.js
var timeout = require('connect-timeout'); // express v4

// in cors, return status 200 for OPTIONS and disable preflight continuation
app.use(cors({ preflightContinue: false, optionsSuccessStatus: 200 }));

// put this middleware after all the other middleware
app.use(timeout(36000000)); // 10 hours
app.use((req, res, next) => {
  if (!req.timedout) next();
});

var listen = app.listen(3333, () => console.log('running'));
listen.setTimeout(36000000); // 10 hours

React
import axios from 'axios';
axios.defaults.timeout = 36000000; // 10 hours
After 2 days of trying.
You might have to increase the timeout even more. I haven't looked at the Express source, but it sounds like it retries on timeout.
Make sure you call res.send(); Axios expects a response from the server and otherwise sends the request again after 120 seconds.
I had the same issue doing this with Express 4. I believe it has to do with how it resolves request params. The solution is to ensure your params are resolved, for example by checking them in an if block:
app.get('/:conversation', (req, res) => {
  let url = req.params.conversation;
  // Only handle the request when the params have resolved
  if (url) {
    res.redirect(301, 'http://' + url + '.com');
  }
});
In my case, my Axios POST requests were received twice by Express: the first one without a body, the second one with the correct payload. The same request sent from Postman was only received once, correctly. It turned out that Express was running on a different port, so my requests were cross-origin. This caused Chrome to send a preflight OPTIONS request to the same URL (the POST URL), and my app.all routing in Express processed that one too.
app.all('/api/:cmd', require('./api.js'));
Separating POST from OPTIONS solved the issue:
app.post('/api/:cmd', require('./api.js'));
app.options('/', (req, res) => res.send());
I ran into the same problem. Adding a bare return didn't work, but it did work when I added return res.redirect('/path');
I had the same problem. Then I opened the Chrome dev tools and found out that favicon.ico was being requested from my Express.js application. I needed to fix the way I registered the middleware.
Screenshot of Chrome dev tools
I also had double requests. In my case it was the forwarding from the http to the https protocol. You can check if that's the case by looking at
req.headers['x-forwarded-proto']
It will either be 'http' or 'https'.
I could fix my issue simply by adjusting the order in which my middlewares trigger.
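A minimal sketch of what that check and fix might look like, assuming an app running behind a proxy that sets x-forwarded-proto:

// Log the forwarded protocol to see whether each incoming request is
// the original HTTP request or the HTTPS redirect.
app.use((req, res, next) => {
  console.log(req.method, req.url, req.headers['x-forwarded-proto']);
  next();
});

// Redirect HTTP to HTTPS once, before any route handlers run, so the
// handlers only ever see a single HTTPS request.
app.use((req, res, next) => {
  if (req.headers['x-forwarded-proto'] === 'http') {
    return res.redirect(301, 'https://' + req.headers.host + req.url);
  }
  next();
});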
