We are using a microservices architecture for our project, which is similar to a blog. There is an activities service that logs all the activities performed by a user, such as adding a post, commenting, replying to a comment, and so on.
For each activity, we need to send an SSE notification to the involved users. For this, we use another service called notifications: whenever an activity occurs, an HTTP request is sent to the notifications service, which handles sending the various SSE events.
However, we are facing some issues in sending SSE. The two major issues are a memory leak and the error Error: write after end.
Route
router
  .route("/")
  .get(controller.eventStream)
  .post(controller.sendNotification);
Controller
import axios from "axios";
import eventEmitter from "../services";

const controller = {
  eventStream: (req, res, next) => {
    console.log("Inside event stream");
    res.writeHead(200, {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      "Connection": "keep-alive"
    });
    eventEmitter.on("sse", (event, data) => {
      console.log("Event Triggered");
      res.write(`event: ${event}\ndata: ${JSON.stringify(data)}\n\n`);
    });
    req.on("close", () => {
      console.log("Inside Close");
      res.end();
    });
  },
  sendNotification: (req, res, next) => {
    try {
      const { userId, action, type, item_id } = req.body;
      // First check the type of activity that has been performed
      switch (type) {
        case "topic":
          // Then check the type of action that has been done
          switch (action) {
            case "edit": {
              console.log("Topic edited");
              const data = "John Doe has edited a topic";
              eventEmitter.emit("sse", "edit-topic", data);
              break;
            }
          }
          break;
      }
      res.send("sse successfully send");
    } catch (error) {
      res.status(500).json({ error: "SSE failed" });
    }
  }
};

export default controller;
Service
import events from "events";

export default new events.EventEmitter();
Initially, the client sends a GET request, which executes the eventStream controller. Then, for each activity, an SSE needs to be sent through this stream.
I believe Error: write after end occurs because the event is triggered after the response has been sent. This can be fixed by removing res.send('sse successfully send'); however, Node will then throw a timeout error.
I am not sure how to send an SSE from the sendNotification controller. Moreover, I am not sure whether this is the right approach. Any guidance would be much appreciated.
Related
I created an adapter-node SvelteKit API endpoint which streams quotes using a readable stream. When I quit the client route, the streaming has to stop. This works fine in development using SvelteKit "npm run dev" (vite dev) or using a Windows desktop container (node build).
onDestroy(async () => {
  await reader.cancel(); // stop streaming
  controller.abort(); // signal fetch abort
});
But when I build and deploy the node container on Google Cloud Run, the streaming works fine, except when I quit the client route: the API endpoint keeps on streaming. The log shows enqueues for 5 more minutes, followed by a delayed ReadableStream cancel() on the API server.
Why the 5 minutes between the client cancel/abort and the cancel on the server?
The API +server.js
import { YahooFinanceTicker } from "yahoo-finance-ticker";
/** @type {import('./$types').RequestHandler} */
export async function POST({ request }) {
  const { logging, symbols } = await request.json();
  const controller = new AbortController();
  const ticker = new YahooFinanceTicker();
  ticker.setLogging(logging);
  if (logging) console.log("api ticker", symbols);
  const stream = new ReadableStream({
    start(controller) {
      (async () => {
        const tickerListener = await ticker.subscribe(symbols);
        tickerListener.on("ticker", (quote) => {
          if (logging) console.log("api", JSON.stringify(quote, ["id", "price", "changePercent"]));
          controller.enqueue(JSON.stringify(quote, ["id", "price", "changePercent"]));
        });
      })().catch((err) => console.error(`api listen exception: ${err}`));
    },
    cancel() { // arrives after 5 minutes !!!
      console.log("api", "cancel: unsubscribe ticker and abort");
      ticker.unsubscribe();
      controller.abort();
    },
  });
  return new Response(stream, {
    headers: {
      'content-type': 'text/event-stream',
    }
  });
}
Route +page.svelte
const controller = new AbortController();
let reader = null;
const signal = controller.signal;

async function streaming(params) {
  try {
    const response = await fetch("/api/yahoo-finance-ticker", {
      method: "POST",
      body: JSON.stringify(params),
      headers: {
        "content-type": "application/json",
      },
      signal: signal,
    });
    const stream = response.body.pipeThrough(new TextDecoderStream("utf-8"));
    reader = stream.getReader();
    while (true) {
      const { value, done } = await reader.read();
      if (logging) console.log("resp", done, value);
      if (done) break;
      // ... and more to get the quotes
    }
  } catch (err) {
    if (!["AbortError"].includes(err.name)) throw err;
  }
}
...
The behavior you are observing is expected: Cloud Run does not support client-side disconnects yet.
It is mentioned in this article that:
Cloud Run (fully managed) currently only supports server-side streaming. Having only "server-side streaming" basically means that when the "client" disconnects, the "server" will not know about it and will carry on with the request. This happens because the "server" is not connected directly to the "client" and the request from the "client" is buffered (in its entirety) and then sent to the "server".
You can also check this similar thread.
It is a known issue; a public issue already exists for it. You can follow that issue for future updates and also add your concerns there.
I have a mock backend in nodejs/express that I need to get working. There is an SSE setup like this:
app.get("/api/sseConnect", (req, res) => {
  const headers = {
    "Content-Type": "text/event-stream",
    Connection: "keep-alive",
    "Access-Control-Allow-Origin": "*",
    "Cache-Control": "no-transform",
  };
  res.writeHead(200, headers);
  let intervalID = setInterval(() => {
    res.write(`data: ${JSON.stringify(Math.random())}\n\n`);
  }, 5000);
  res.on("close", () => {
    clearInterval(intervalID);
    res.end();
  });
});
This is working great - the client hits this route, the connection is established, and the client receives a message every 5 seconds.
There are other standard routes, which when accessed, modify some data in the database. Once the data is modified, I need to send a server-sent-event. So for example:
app.post("/api/notifications/markasread", (req, res) => {
  let { ids } = req.body;
  ids.forEach(id => database.notifications[id].read = true); // or whatever
  // Send a server-sent event with some information that contains id, message status, etc.
});
I know this seems really silly (why not just send a response???), but this is the way the live API is set up - there is no response from this POST route (or certain other routes). They need to trigger an SSE, which is listened for on the front end with an EventSource instance. Based on what is heard in the eventSource.onmessage listener, a whole bunch of things happen on the front end (react-redux).
How can I 'hijack' the SSEs and trigger a response from a standard POST or GET route?
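One common pattern is a registry of open SSE responses that any route handler can write to. A sketch under assumptions: the `clients` set, `registerClient`, and `broadcast` helpers are illustrative names, not part of the question's code.

```javascript
// Hypothetical registry pattern: the GET /api/sseConnect handler calls
// registerClient(res) after writing the SSE headers; any other route
// (e.g. POST /api/notifications/markasread) calls broadcast() to push
// an event over every open stream instead of sending a response body.
const clients = new Set();

function registerClient(res) {
  clients.add(res);
  // drop the response when the client disconnects
  res.on("close", () => clients.delete(res));
}

function broadcast(event, payload) {
  for (const res of clients) {
    res.write(`event: ${event}\ndata: ${JSON.stringify(payload)}\n\n`);
  }
}
```

In the markasread handler you would then call something like broadcast("notifications", { ids, read: true }) and end the response without a body, matching the live API's behavior.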
How do we implement server-sent events in FeathersJS? Normally, for Express-based applications we have:
app.get('/sse', (req, res, next) => {
  res.status(200).set({
    "Connection": "keep-alive",
    "cache-control": "no-cache",
    "Content-Type": "text/event-stream"
  });
});
How can we do the same in FeathersJS? Any help would be appreciated.
I got SSE running in a FeathersJS app - it works great, except that the app seems to hang while a SSE request is pending (listening on events):
import { createSession, createChannel } from 'better-sse';
import { IncomingMessage, ServerResponse } from 'http';

export default function (app: any): void {
  const debugSseSource = require('debug')('sseSource');
  const sseChannel = createChannel();

  // Order creations
  const orders: any = app.service('orders'); // use endpoint URL

  app.get('/v1/admin/orders', async (req: IncomingMessage, res: ServerResponse) => {
    // on new connections, add SSE session to SSE channel
    const sseSession = await createSession(req, res);
    sseChannel.register(sseSession);
    debugSseSource('Client (orders) has joined.');
    //res.end(); // #TODO/PROBLEM: without this the whole FeathersJS app hangs - but it stops the request...
  });

  orders.on('created', (order: any) => {
    // broadcast to all clients
    debugSseSource('order created');
    sseChannel.broadcast(order, 'created');
  });
}
Related FeathersJS GitHub issue: https://github.com/feathersjs/feathers/issues/369
I'm building an Express app using Twilio to allow a group of people to communicate via SMS without having to install an app or deal with the limitations on group texts that some phones/carriers seem to have. It's deployed via Azure, but I'm reasonably sure I'm past the configuration headaches. As an early test that I can make this work and for a bit of flavor, I am trying to set up a feature so you can text "joke" (ideally case-insensitive) and it will send a random joke from https://icanhazdadjoke.com/. If anything else is texted, for now it should basically echo it back.
I get the sense this has to do with JS being asynchronous and the code moving on before the GET comes back, so I'm trying to use promises to make the code wait, but the conditional nature is a new wrinkle for me. I've been looking for answers, but nothing seems to work. I've at least isolated the problem: the non-joke arm works correctly.
Here is the function for retrieving the joke, the console.log is outputting correctly:
const rp = require('request-promise-native');

var options = {
  headers: {
    'Accept': 'application/json'
  }
};

function getJoke() {
  rp('https://icanhazdadjoke.com/', options) // add in headers
    .then(joke => {
      theJoke = JSON.parse(joke).joke;
      console.log(theJoke);
      return theJoke;
    });
}
Here is the part of my router that isn't working quite right. If I text something that isn't "joke", I get it echoed back via SMS. If I text "joke", I don't get a reply SMS, I see "undefined" in the Kudu log (from below), and then I see the log of the POST, and then afterward I see the joke from the function above having run.
smsRouter.route('/')
  .post((req, res, next) => {
    const twiml = new MessagingResponse();
    function getMsgText(request) {
      return new Promise(function (resolve, reject) {
        if (req.body.Body.toLowerCase() == 'joke') {
          resolve(getJoke());
        } else {
          resolve('You texted: ' + req.body.Body);
        }
      });
    }
    getMsgText(req)
      .then(msg => {
        console.log(msg);
        twiml.message(msg);
        res.writeHead(200, {'Content-Type': 'text/xml'});
        res.end(twiml.toString());
      });
  });
How can I make it so that getMsgText() waits for the getJoke() call to fully resolve before moving on to the .then?
I think this is what you're looking for.
Note that I've used async/await rather than promise chaining.
// joke.get.js
const rp = require('request-promise-native');

var options = {
  headers: {
    'Accept': 'application/json'
  }
};

async function getJoke() {
  const data = await rp('https://icanhazdadjoke.com/', options); // add in headers
  return JSON.parse(data).joke;
}

// route.js
smsRouter.route('/')
  .post(async (req, res, next) => {
    const twiml = new MessagingResponse();
    async function getMsgText(request) {
      if (req.body.Body.toLowerCase() === 'joke') {
        return await getJoke();
      }
      return `You texted: ${req.body.Body}`;
    }
    const msg = await getMsgText(req);
    twiml.message(msg);
    res.status(200).send(twiml.toString());
  });
async/await in JS
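To see why the original getJoke() logged the joke but the route got undefined, here is a minimal illustration (the JSON string is a stand-in for the API response, not a real request):

```javascript
// Broken: there is a return inside .then(), but the outer function
// itself returns nothing, so callers receive undefined immediately.
function getJokeBroken() {
  Promise.resolve('{"joke":"a joke"}').then((j) => JSON.parse(j).joke); // value lost
}

// Fixed: returning the promise chain (or using async/await) lets the
// caller await the resolved value.
function getJokeFixed() {
  return Promise.resolve('{"joke":"a joke"}').then((j) => JSON.parse(j).joke);
}
```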
I'm trying to create a basic app in Node.js that a) tracks a keyword on Twitter and temporarily stores messages relating to that keyword, and b) after enough messages have accumulated, returns them to the user. I'm using the ntwitter library.
I have a basic long-polling system implemented on my client and server side, but I'm having some trouble with verification. The way I set it up currently, it verifies the user each time /api/streamfeed is called, so potentially every 30 seconds (since I have a 30s timeout schedule) before checking the stream. I'm thinking this will get me into trouble since I believe verification is rate-limited. Is there a way to check whether I'm verified without having to ping Twitter's API (perhaps store a boolean after the first attempt)?
Client side:
//upon receiving a response, poll again
function getStreamFeed() {
  console.log('calling getStreamFeed');
  $http.get('/api/streamfeed').success(function (data) {
    console.log(data);
    getStreamFeed();
  });
}

setTimeout(getStreamFeed, 1000);
Server side:
app.get('/api/streamfeed', function (req, res) {
  /*
  ...
  polling code
  ...
  */
  twit.verifyCredentials(function (err, data) {
    if (err) return res.send(404);
    twit.stream('statuses/filter', {
      track: 'justin bieber'
    }, function (stream) {
      stream.on('data', function (data) {
        console.log(data.text);
        messages.push(data.text);
      });
    });
  });
});
I'd send the credentials back and resend them on the next poll. This could be a bool, or the actual credentials to use; these aren't your private keys or anything, only the user's.
They could also be sent in headers or cookies and properly hashed, etc.
This simply shows a pattern that should work.
client side:
function getStreamFeed(credentials) {
  //upon receiving a response, poll again
  console.log('calling getStreamFeed');
  var url = '/api/streamfeed';
  if (credentials) {
    url += '?credentials=' + credentials; // first query param uses '?', not '&'
  }
  $http
    .get(url)
    .success(function (data) {
      console.log(data);
      getStreamFeed(true);
    });
}

setTimeout(getStreamFeed, 1000);
Server side:
app.get('/api/streamfeed', function (req, res) {
  function twitStream() {
    twit.stream('statuses/filter', {track: 'justin bieber'}, function (stream) {
      stream.on('data', function (data) {
        console.log(data.text);
        messages.push(data.text);
      });
    });
  }
  var credentials = req.query.credentials;
  if (credentials) {
    // already verified on a previous poll: skip the rate-limited check
    return twitStream();
  }
  twit.verifyCredentials(function (err, data) {
    if (err) return res.send(404);
    twitStream();
  });
});
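The "store a boolean after the first attempt" idea from the question can also live entirely server-side. A sketch under assumptions: `makeCachedVerifier` is a hypothetical helper, and `verifyCredentials` below is a stand-in for twit.verifyCredentials with the same (err, data) callback shape.

```javascript
// Hypothetical cache: hit the Twitter API once, remember the outcome,
// and answer later polls from the cached flag.
function makeCachedVerifier(verifyCredentials) {
  let verified = null; // null = unknown; true/false after the first check
  return function (callback) {
    if (verified !== null) return callback(null, verified);
    verifyCredentials(function (err) {
      verified = !err; // cache the result, success or failure
      callback(null, verified);
    });
  };
}
```

The route would then call the cached verifier instead of twit.verifyCredentials directly, so only the first /api/streamfeed request counts against the rate limit.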