I have a mock backend in nodejs/express that I need to get working. There is an SSE setup like this:
app.get("/api/sseConnect", (req, res) => {
  const headers = {
    "Content-Type": "text/event-stream",
    Connection: "keep-alive",
    "Access-Control-Allow-Origin": "*",
    "Cache-Control": "no-transform",
  };
  res.writeHead(200, headers);

  let intervalID = setInterval(() => {
    res.write(`data: ${JSON.stringify(Math.random())}\n\n`);
  }, 5000);

  res.on("close", () => {
    clearInterval(intervalID);
    res.end();
  });
});
This is working great - the client hits this route, the connection is established, and the client receives the message every 5 seconds.
There are other standard routes, which when accessed, modify some data in the database. Once the data is modified, I need to send a server-sent-event. So for example:
app.post("/api/notifications/markasread", (req, res) => {
  let { ids } = req.body;
  ids.forEach(id => database.notifications[id].read = true); // or whatever
  // Send a server-sent event with some information that contains id, message status, etc.
});
I know this seems really silly (why not just send a response???), but this is the way the live API is set up: there is no response body from this POST route (or certain other routes). They need to trigger an SSE instead, which is listened for on the front end with an EventSource instance. Based on what is heard in the eventSource.onmessage listener, a whole bunch of things happen on the front end (react-redux).
How can I 'hijack' the SSE stream and push an event from a standard POST or GET route?
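One common pattern, sketched here under the assumption that all routes live in the same process (the names clients, broadcast, sseConnect, and markAsRead are illustrative, not from the live API): keep every open SSE response in a shared collection, and let ordinary routes write to it.

```javascript
// Hedged sketch: a shared registry of open SSE responses that ordinary
// routes can push events into. Wire these up with
//   app.get("/api/sseConnect", sseConnect);
//   app.post("/api/notifications/markasread", express.json(), markAsRead);
const clients = []; // every currently-open SSE response

function sseConnect(req, res) {
  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    Connection: "keep-alive",
    "Cache-Control": "no-transform",
  });
  clients.push(res);
  // Drop the response from the registry when the client disconnects.
  res.on("close", () => clients.splice(clients.indexOf(res), 1));
}

// Write one SSE message to every connected client.
function broadcast(payload) {
  for (const res of clients) {
    res.write(`data: ${JSON.stringify(payload)}\n\n`);
  }
}

const database = { notifications: {} }; // stand-in for the real store

function markAsRead(req, res) {
  const { ids } = req.body;
  ids.forEach((id) => {
    database.notifications[id] = { ...database.notifications[id], read: true };
  });
  broadcast({ ids, status: "read" }); // the SSE carries the "response"
  res.end(); // close the POST without a body, as the live API does
}
```

Since the POST handler never answers with data, the front end's eventSource.onmessage sees the broadcast instead.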
Main Question:
Do I need to pass the header on every internal microservice request in order to have it available in the other microservice, if it is set by a common proxy server?
Example:
Structure: an Nginx proxy in front of the microservices (ms1, ms2).
In Nginx.conf:
proxy_set_header X-Request-ID $request_id;
and when I send a request to any of these microservices, I can read the header value using req.headers["x-request-id"]
But, when I do like:
// calling ms2 from ms1
const response = await fetch("http://ms2/");
In ms1 (the first hop) I can read the header as shown above, but in ms2 I can't.
I understand why: the Express server processes many requests concurrently, and the internal fetch is a brand-new HTTP request, so the header is not attached automatically. Still, it feels awkward to have to pass the header along manually with every internal request:
// calling ms2 from ms1
const response = await fetch("http://ms2/", {
headers: {
"X-Request-ID": req.headers["x-request-id"],
},
});
// when done like this, I can read the value in ms2.
Is there a better way, or is it simply how it is?
If you need to perform the request to ms2 for every request to (a) specific endpoint(s) on ms1, you could create an Express middleware that would at least save you the trouble of having to perform the request each time manually:
const makeMs2Request = async (req, res, next) => {
  const response = await fetch("http://ms2/", {
    headers: {
      "X-Request-ID": req.headers["x-request-id"],
    },
  });
  req.ms2data = await response.json();
  next();
};
app.get('/my/endpoint/on/ms1', makeMs2Request, (req, res) => {
  // here you can use `req.ms2data`
  …
});
I'm setting up a Nodejs express server in Firebase. I have a Dashboard page. The user can save items to their dashboard at any time using a Chrome Extension. I want their new saved items to appear regularly on the dashboard.
Busy polling seems straightforward to run with
setInterval(() => {
  // async call to api with paging cursor
}, pollingIntervalMs);
But that seems to be a waste of resources.
I read about Server Side Events and tried to implement them with code. All the SSE examples I've seen look something like this:
var clients = {}; // <- Keep a map of attached clients
var clientId = 0;

app.get('/events/', function (req, res) {
  req.socket.setTimeout(Number.MAX_VALUE);
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  res.write('\n');
  (function (clientId) {
    clients[clientId] = res;
    req.on("close", function () {
      delete clients[clientId];
    });
  })(++clientId);
});

setInterval(function () {
  var msg = Math.random();
  console.log("Clients: " + Object.keys(clients) + " <- " + msg);
  for (var clientId in clients) {
    clients[clientId].write("data: " + msg + "\n\n"); // <- Push a message to each attached client
  }
}, 2000);
Getting the SSE events to work is no problem in the test examples, and it works for my use cases.
However, my concerns:
Every Express response object is stored in memory; 100k users is a lot of memory.
The API POST method needs access to the clients/responses, and a response object can't easily be stored in a DB.
Ideally, each request would be authenticated with an Authorization bearer token. This does not seem to be possible with the browser's EventSource, which cannot set custom headers.
How do enterprise level apps actually implement effective SSE?
We are using microservices architecture for our project. Our project is similar to a blog. There is an activities service, which logs all the activities done by the user, like, adding post, comment, replying to a comment and so on.
Now for each activity, we need to send an SSE notification to the involved users. For this, we are using another service called notifications. So whenever an activity occurs, an HTTP request will be sent to notifications service which handles sending various SSE events.
However, we are facing some issues in sending SSE. The two major issues we are facing are a memory leak and the error Error: write after end
Route
router
.route("/")
.get(controller.eventStream)
.post(controller.sendNotification);
Controller
import axios from "axios";
import eventEmitter from '../services';
const controller = {
  eventStream: (req, res, next) => {
    console.log("Inside event stream");
    res.writeHead(200, {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      "Connection": "keep-alive"
    });
    eventEmitter.on("sse", (event, data) => {
      console.log("Event Triggered");
      res.write(`event: ${event}\ndata: ${JSON.stringify(data)}\n\n`);
    });
    req.on("close", () => {
      console.log("Inside Close");
      res.end();
    });
  },
  sendNotification: (req, res, next) => {
    try {
      const { userId, action, type, item_id } = req.body;
      // First check the type of activity that has been performed
      switch (type) {
        case "topic":
          // Then check the type of action that has been done
          switch (action) {
            case "edit":
              console.log("Topic edited");
              const data = "John Doe has edited a topic";
              eventEmitter.emit("sse", "edit-topic", data);
              break;
          }
          break;
      }
      res.send("sse successfully sent");
    } catch (error) {
      res.status(500).json({ error: "SSE failed" });
    }
  }
};

export default controller;
Service
export default new events.EventEmitter();
Initially, the client side sends a GET request which executes the eventStream controller.
Now for each activity, an SSE needs to be sent through this stream
I believe Error: write after end occurs because the event is triggered after the response has been sent. This can be fixed by removing res.send('sse successfully send'); however, Node will then throw a timeout error.
I am not sure how to send an SSE from the sendNotification controller. Moreover, I am not sure whether this is the right approach. Any guidance would be much appreciated.
I have two Express web applications that run on two different ports (88 and 8443). One application needs to send a POST request to the other. I am using axios (I have also tried requestjs) but I'm not getting any response from the other server.
I'm not sure whether the server is sending any request at all, since after a few minutes it fails with a "connect ETIMEDOUT" error.
Is there a better way to communicate with another application on a different port?
CODE
// Posting Server
const data = JSON.stringify({ key: 'pair' });
axios.post(url, data, {
  headers: {
    'Content-Type': 'application/json',
  },
})
  .then((resp) => {
    if (resp.status == 200)
      console.log('Done');
  })
  .catch((error) => console.log(error));

// Receiving Server
app.post('/', (req, res) => {
  res.sendStatus(200);
});
I'm trying to use SSE with node + express: I intercept requests using an express route, then I initiate an SSE session by directly writing the headers:
res.writeHead(200, {
  "content-type": "text/event-stream",
  "cache-control": "no-cache"
});
I proceed with writing intermittent payloads using "res.write()"s.
This works well with Chrome's EventSource, up until the time when I call ".close()" to end the session. Then, the connection keeps hanging: Chrome doesn't reuse the connection to initiate additional EventSource requests (or any other requests), and node never triggers a "close" event on the IncomingMessage instance.
My question is: How do I handle "eventSource.close()" properly using node's http API?
It's worth noting that:
Since I don't set a "content-length", Node automatically assumes "chunked" transfer encoding (this shouldn't be a problem AFAIK). It also defaults to "connection: keep-alive".
The session terminates OK when I'm using Firefox.
When the browser closes the event source it does let the server side know. On the server side, the response socket object (res.socket) will generate an end event followed by a close event. You can listen to this event and respond appropriately.
E.g.
res.socket.on('end', e => {
  console.log('event source closed');
  sseResponses = sseResponses.filter(x => x != res);
  res.end();
});
If your server is trying to write to a socket closed on the browser, it should not raise an error, but will return false from res.write.
If both your server side code and client side code are hanging after you close the event source, you may have bugs on both sides.
Here is a more complete prototype, using your writeHead code from above.
var express = require('express');
var app = express();
var responses = [];

app.get("/", (req, res) => {
  res.status(200).send(`
    <html>
      <script>
        var eventsource = null;
        function connect() {
          if (!eventsource) {
            eventsource = new EventSource("/sse");
            eventsource.onmessage = function (e) {
              var logArea = window.document.getElementById('log');
              logArea.value += e.data;
            };
          }
        }
        function disconnect() {
          if (eventsource) {
            var myeventsource = eventsource;
            eventsource = null;
            myeventsource.close();
          }
        }
      </script>
      <div>
        <span>
          <button onclick="connect()">Connect</button>
          <button onclick="disconnect()">Disconnect</button>
        </span>
      </div>
      <textarea id="log" style="width: 500px; height: 500px"></textarea>
    </html>`);
});

app.get("/sse", (req, res) => {
  res.writeHead(200, {
    "content-type": "text/event-stream",
    "cache-control": "no-cache"
  });
  res.socket.on('end', e => {
    responses = responses.filter(x => x != res);
    res.end();
  });
  responses.push(res);
});

app.listen(8080);

setInterval(() => {
  responses.forEach(res => {
    res.write('data: .\n\n');
  });
}, 100);