Do we need multithreading for a Facebook chatbot server?

We are working on a Facebook chatbot. It looks like Facebook sends a bunch of messages together in a JSON array and waits until that request is processed. At least in the development phase, it does not appear to make multiple concurrent web requests to the server. So do I need to consider multithreading or some other strategy to serve concurrent users?
Example payload:
{
  "object": "page",
  "entry": [
    {
      "id": "PAGE_ID",
      "time": 1458692752478,
      "messaging": [
        {
          "sender": {
            "id": "USER_ID"
          },
          "recipient": {
            "id": "PAGE_ID"
          },
          ...
        }
      ]
    }
  ]
}
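Whatever concurrency strategy the server ends up using, the handler first has to unpack this batched payload, since one request can carry many messages. A minimal sketch of that unpacking, based only on the structure shown above (the function name is hypothetical):

```javascript
// Flatten Facebook's batched webhook body into individual messaging events.
function extractMessages(body) {
  const messages = [];
  if (body.object !== 'page') return messages;
  for (const entry of body.entry) {
    for (const event of entry.messaging) {
      messages.push({ senderId: event.sender.id, event });
    }
  }
  return messages;
}
```

Each element can then be handled independently once the batch is acknowledged.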

Related

How to apply filter for Gmail watch push notification api

I am trying to subscribe/register for push notifications and have them pushed to my server.
Here is what I am trying:
axios.post(`https://www.googleapis.com/gmail/v1/users/me/watch`, {
  "topicName": "projects/gpubsub/topics/subcription"
}, util.authHeader(token))
The API is working correctly. Now I have some questions:
How do I apply a filter so that I am notified only for a particular user or subject? I have seen filters being used in Microsoft Flow for Gmail.
How do I send extra data so that I can receive it in the attributes property of the push notification to my server?
Filters are sent as part of the optional params.
watch (cb) {
  const params = {
    userId: 'me',
    resource: {
      labelIds: ['INBOX'],
      labelFilterAction: 'include',
      topicName: configure.Google.TopicName
    }
  };
  this.gmail.users.watch(params, cb);
}
Remember, labelId is the ID of the label, not its name; use labels.list to find the IDs.
I recommend you look into using the official Google APIs Node.js client library rather than coding this yourself.
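Since labelIds must be IDs rather than names, a small helper on top of labels.list can resolve a label name to its ID. A sketch (assumes the same authorised googleapis `gmail` client as in the watch() example; the helper name is hypothetical):

```javascript
// Map a human-readable label name to the id that watch() expects.
function findLabelId(labels, name) {
  const match = (labels || []).find((label) => label.name === name);
  return match ? match.id : null;
}

// Usage, callback-style like the watch() example above:
// gmail.users.labels.list({ userId: 'me' }, (err, res) => {
//   if (err) return console.error(err);
//   console.log(findLabelId(res.labels, 'My Filter Label'));
// });
```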

React app with Server-side rendering crashes with load

I'm using react-boilerplate (with react-router, sagas, express.js) for my React app and on top of it I've added SSR logic so that once it receives an HTTP request it renders react components to string based on URL and sends HTML string back to the client.
While react rendering is happening on the server side, it also makes fetch request through sagas to some APIs (up to 5 endpoints based on the URL) to get data for components before it actually renders the component to string.
Everything works great if I make only several requests to the Node server at the same time, but once I simulate a load of 100+ concurrent requests and it starts processing them, at some point it crashes with no indication of any exception.
What I've noticed while trying to debug the app is that once the Node server starts processing 100+ incoming requests, it fires all the API requests at the same time but receives no actual responses until it stops stacking up new requests.
The code that's used for rendering on the server side:
async function renderHtmlDocument({ store, renderProps, sagasDone, assets, webpackDllNames }) {
  // 1st render phase - triggers the sagas
  renderAppToString(store, renderProps);
  // send signal to sagas that we're done
  store.dispatch(END);
  // wait for all tasks to finish
  await sagasDone();
  // capture the state after the first render
  const state = store.getState().toJS();
  // prepare style sheet to collect generated css
  const styleSheet = new ServerStyleSheet();
  // 2nd render phase - the sagas triggered in the first phase are resolved by now
  const appMarkup = renderAppToString(store, renderProps, styleSheet);
  // capture the generated css
  const css = styleSheet.getStyleElement();
  const doc = renderToStaticMarkup(
    <HtmlDocument
      appMarkup={appMarkup}
      lang={state.language.locale}
      state={state}
      head={Helmet.rewind()}
      assets={assets}
      css={css}
      webpackDllNames={webpackDllNames}
    />
  );
  return `<!DOCTYPE html>\n${doc}`;
}

// The code that's executed by express.js for each request
function renderAppToStringAtLocation(url, { webpackDllNames = [], assets, lang }, callback) {
  const memHistory = createMemoryHistory(url);
  const store = createStore({}, memHistory);
  syncHistoryWithStore(memHistory, store);
  const routes = createRoutes(store);
  const sagasDone = monitorSagas(store);
  store.dispatch(changeLocale(lang));
  match({ routes, location: url }, (error, redirectLocation, renderProps) => {
    if (error) {
      callback({ error });
    } else if (renderProps) {
      renderHtmlDocument({ store, renderProps, sagasDone, assets, webpackDllNames })
        .then((html) => {
          callback({ html });
        })
        .catch((e) => callback({ error: e }));
    } else {
      callback({ error: new Error('Unknown error') });
    }
  });
}
So my assumption is that something goes wrong once the server receives too many HTTP requests, which in turn generate even more requests to the API endpoints needed to render the React components.
I've noticed that the event loop is blocked for 300ms after renderAppToString() for every client request, so with 100 concurrent requests it is blocked for about 10 seconds. I'm not sure whether that's normal or a bad thing, though.
Is it worth trying to limit simultaneous requests to Node server?
I couldn't find much information on the topic of SSR + Node crashes. So I'd appreciate any suggestions as to where to look at to identify the problem or for possible solutions if anyone has experienced similar issue in the past.
On the client I call ReactDOM.hydrate(...), and I can also load my initial and required state and send it down for hydration.
I have written a middleware file which decides, based on the URL, which file to send in the response.
In that middleware I build the HTML string of whichever file was requested, then return it using Express's res.render.
The requested URL path is compared against a dictionary of path-file associations; once the URL matches, I use ReactDOMServer's renderToString to convert the component into HTML, which can then be sent with a Handlebars file via res.render as discussed above.
This is how I have managed to do SSR on most of my web apps built with the MERN stack.
Hope my answer helps; please write a comment for discussion.
1. Run express in a cluster
A single instance of Node.js runs in a single thread. To take
advantage of multi-core systems, the user will sometimes want to
launch a cluster of Node.js processes to handle the load.
As Node is single-threaded, the problem may also be in a file lower down the stack where you are initialising Express.
There are a number of best practices for running a Node app that are not generally mentioned in React threads.
A simple way to improve performance on a server with multiple cores is to use the built-in Node cluster module:
https://nodejs.org/api/cluster.html
This will start multiple instances of your app, one per core, giving you a significant performance improvement for concurrent requests (if you have a multi-core server).
See the Express performance best practices for more information:
https://expressjs.com/en/advanced/best-practice-performance.html
You may also want to throttle your incoming connections, as response times drop rapidly once the process is overloaded and constantly context switching; this can be done by putting something like NGINX or HAProxy in front of your application.
2. Wait for the store to be hydrated before calling render to string
You don't want to render your layout until your store has finished updating; as other comments note, rendering blocks the thread.
Below is an example taken from the redux-saga repo which shows how to run the sagas without rendering the template until they have all resolved:
store.runSaga(rootSaga).done.then(() => {
  console.log('sagas complete')
  res.status(200).send(
    layout(
      renderToString(rootComp),
      JSON.stringify(store.getState())
    )
  )
}).catch((e) => {
  console.log(e.message)
  res.status(500).send(e.message)
})
https://github.com/redux-saga/redux-saga/blob/master/examples/real-world/server.js
3. Make sure node environment is set correctly
Also ensure you are setting NODE_ENV=production when bundling and running your code, as both Express and React optimise for it.
The calls to renderToString() are synchronous, so they block the thread while running. It's no surprise, then, that with 100+ concurrent requests you end up with an extremely backed-up queue hanging for ~10 seconds.
Edit: It was pointed out that React v16 natively supports streaming via the renderToNodeStream() method. It produces the same markup as renderToString(), but streams it instead, so you don't have to wait for the full HTML to be rendered before you start sending data to the client.
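A sketch of what that streaming might look like (assumes React v16+ and an Express-style `res`; the helper name and the lazy require are illustrative choices, not part of any library API):

```javascript
// Stream server-rendered HTML instead of buffering the full string.
function streamHtml(res, element) {
  // require here so the file only loads react-dom where streaming is used
  const { renderToNodeStream } = require('react-dom/server');
  res.write('<!DOCTYPE html>\n');
  renderToNodeStream(element).pipe(res); // bytes flow as React renders
}
```

`element` stands for a React element such as the `<HtmlDocument />` built in the code above.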

Request/response "language" design when using Web Sockets

I am using socket.io with nodejs on the server side to provide two-way communication between my web application and the server.
I am using the socket.io API to issue commands and receive responses, but I am not sure if there is a more methodical way of defining a "language" for sending commands to the server and receiving results from it.
For sending commands to the server, I am emitting events from the web application like the following (I am using pseudo code below):
socket.emit('commandRequest', {
  msg_id: '...',
  username: '...',
  command: '...'
});
The server evaluates the command and emits responses like the following:
socket.on('commandRequest', (data) => {
  // parse and execute data.command
  socket.emit('commandResponse', {
    msg_id: data.msg_id,
    username: data.username,
    response: ...,
    error: ...
  });
})
Finally, the web application is listening to responses from the server and it updates the app content accordingly:
socket.on('commandResponse', (data) => {
  if (data.error) {
    ...
  } else {
    // interpret data.response
  }
})
So I am using the commandRequest/commandResponse event naming paradigm and the event data structure has corresponding {command: ...} and {response: ...} properties.
Is there a more formal way of defining a request/response "language" that can be used for more complex client/server interactions? Something similar to what REST APIs achieve with HTTP requests, but over web sockets?
You could try looking into Primus and its Responder plugin as described in this blog post: Request-Response Oriented Websockets
An excerpt from the post:
It allows to request the opposite peer for a response using a simple API known from other libraries.
Primus Responder adds two things to Primus' core functionality:
A new event called request is emitted as soon as a peer requests a response.
The new method named writeAndWait(data, fn) writes data to a peer and runs given callback function as the peer sends its response.
Primus Responder is available on the client too.
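If you stay with plain socket.io events, the msg_id correlation from the question can also be wrapped in a small promise-based helper. A sketch (field names follow the pseudo code above; the helper itself is an assumption, not a socket.io API):

```javascript
// Promise-based request/response over emit/on, correlated by msg_id.
let nextId = 0;
function request(socket, command) {
  const msg_id = String(++nextId);
  return new Promise((resolve, reject) => {
    const onResponse = (data) => {
      if (data.msg_id !== msg_id) return; // someone else's reply
      socket.removeListener('commandResponse', onResponse);
      data.error ? reject(data.error) : resolve(data.response);
    };
    socket.on('commandResponse', onResponse);
    socket.emit('commandRequest', { msg_id, command });
  });
}
```

Anything with emit/on/removeListener (a socket.io socket, or a plain EventEmitter in tests) can be passed as `socket`.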

emitting a nodejs event when an event on google calendar is added or updated

Is it possible to emit a Node.js event when a Google Calendar event is added or modified? On my Node.js server I can get/list all the events on the calendar, but I would like to retrieve events in response to a Node event rather than checking manually at a regular interval. Appreciate any ideas!
Actually, it is possible, as stated here: https://developers.google.com/google-apps/calendar/v3/push
You can register a callback URL so your app receives information when your event changes or gets modified. With the Node.js API it's something like:
calendar.events.watch({
  auth: jwtClient,
  resource: {
    id: "yourChannelId",
    type: 'web_hook',
    address: "https://www.yoursite.com/notifications"
  },
  calendarId: "yourcalendarId"
}, function (error, response) {
  if (error) {
    console.log(error);
    return;
  }
  console.log(response);
});
Not possible. Something like that would require Google to know about your app (to send events or push data to). Google's APIs are only meant to be accessed; they cannot "tell" your app anything. Your app has to be the one that "asks" Google whether something it wants exists or has happened.

Using firebase authentication for a nodejs application

I don't know if this will work out, or whether it is the right thing to do.
I have created an AngularJS application and used Firebase to provide my application a "backend", i.e. to hold any data that my application needs.
I also do not want to bother with authentication myself, and FirebaseSimpleLogin is just an awesome tool for the job.
I could do:
resolve: {
  'isAuthenticated': isLoggedIn
}
in my routes, so I can prevent users from moving to secured routes. So there is no problem there; I already have an authenticated user.
The problem is, I only used Firebase to save user data and for auth, and nothing else.
Now I want to do some tasks on my own server, but I want only authenticated users to be able to do that.
How would I determine that a user is authenticated with Firebase?
Is this what the Firebase token generator is for?
Or should I just create an authentication system using Node.js?
Check out the queue pattern. Have the user write items to the queue, have the server respond to them.
The really great part of using Firebase as the API/middle-man is that the worker (i.e. server) does not need to worry about if the client has authenticated. Security rules take care of this.
Just write a rule to only allow logged-in users to write into the queue:
{
  "rules": {
    "queue": {
      "in": {
        // I can only write if logged in
        ".write": "auth !== null",
        "user_id": {
          // I can only write to the queue as myself; this tells the server which
          // out/ queue the user will be listening on
          ".validate": "auth.uid === newData.val()"
        }
      },
      "out": {
        "$userid": {
          // I can only listen to my out queue
          ".read": "auth.uid === $userid"
        }
      }
    }
  }
}
Now the user simply writes a record to in/ using push(), then listens on out/ until the server replies.
The server reads records out of the in/ queue, processes them, and writes them back to the out/user_id path.
No RESTful protocols, no express servers, no headaches.
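A sketch of the two sides of that queue with the classic Firebase JS client of that era (the queue/in and queue/out paths match the rules above; the function names, job shape, and reply payload are assumptions):

```javascript
// Client: push a work item as myself, then listen on my out/ queue.
function sendCommand(rootRef, userId, payload) {
  rootRef.child('queue/in').push({ user_id: userId, payload });
  rootRef.child('queue/out').child(userId).on('child_added', (snap) => {
    console.log('server replied:', snap.val());
  });
}

// Server (privileged connection, so it bypasses the security rules):
// process each item and reply on out/<user_id>.
function serveQueue(rootRef) {
  rootRef.child('queue/in').on('child_added', (snap) => {
    const job = snap.val();
    rootRef.child('queue/out').child(job.user_id).push({ done: true });
  });
}
```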
