I'm setting up a Node.js Express server on Firebase. I have a Dashboard page, and the user can save items to their dashboard at any time using a Chrome extension. I want their newly saved items to appear on the dashboard shortly after they are saved.
Busy polling seems straightforward:
setInterval(() => {
  // async call to the API with a paging cursor
}, 10000); // e.g. poll every 10 seconds
But that seems to be a waste of resources.
I read about Server-Sent Events and tried to implement them. All the SSE examples I've seen look something like this:
var clients = {};  // <- keep a map of attached clients
var clientId = 0;  // <- incrementing id assigned to each connection

app.get('/events/', function (req, res) {
  req.socket.setTimeout(Number.MAX_VALUE);
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  res.write('\n');

  (function (id) {
    clients[id] = res;
    req.on("close", function () {
      delete clients[id];
    });
  })(++clientId);
});

setInterval(function () {
  var msg = Math.random();
  console.log("Clients: " + Object.keys(clients) + " <- " + msg);
  for (var id in clients) {
    clients[id].write("data: " + msg + "\n\n"); // <- push the message to each attached client
  }
}, 2000);
Getting SSE to work is no problem in the test examples, and it works for my use case.
However, my concerns:
- Every Express response object is stored in memory; 100k users is a lot of memory.
- The API POST handler needs access to the clients/responses, and a response object can't easily be stored in a DB.
- Ideally, each request is authenticated with an Authorization: Bearer header, but this does not seem to be possible with the browser's native EventSource.
How do enterprise-level apps actually implement effective SSE?
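For the last concern, one common workaround (a sketch, not the only option) is to pass a short-lived token in the query string, since the native EventSource API cannot set custom headers:

// Client: EventSource cannot send an Authorization header, so pass a
// short-lived token as a query parameter instead (shortLivedToken is a
// hypothetical variable obtained from your normal auth flow).
const es = new EventSource('/events/?token=' + encodeURIComponent(shortLivedToken));
es.onmessage = (e) => console.log('saved item event:', e.data);

// Server: validate the token at the top of the SSE handler, before the
// text/event-stream headers are written (isValidToken is a hypothetical helper).
app.get('/events/', function (req, res) {
  if (!isValidToken(req.query.token)) return res.sendStatus(401);
  // ... proceed with the event-stream setup shown above ...
});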
Related
I have a mock backend in Node.js/Express that I need to get working. It has an SSE setup like this:
app.get("/api/sseConnect" ,(req, res) => {
headers = {
"Content-Type": "text/event-stream",
Connection: "keep-alive",
"Access-Control-Allow-Origin": "*",
"Cache-Control": "no-transform",
};
res.writeHead(200, headers);
let intervalID = setInterval(() => {
res.write(`data: ${JSON.stringify(Math.random())}\n\n`);
}, 5000);
res.on("close", () => {
clearInterval(intervalID);
res.end();
});
});
This is working great: the client hits this route, the connection is established, and the client receives the message every 5 seconds.
There are other standard routes which, when accessed, modify some data in the database. Once the data is modified, I need to send a server-sent event. For example:
app.post("/api/notifications/markasread", (req, res) => {
let { ids } = req.body;
ids.forEach(id => database.notifications[id].read = true) // or whatever
// Send a server-sent-event with some information that contains id, message status, etc
})
I know this seems really silly (why not just send a response?), but this is the way the live API is set up: there is no response from this POST route (or certain other routes). They need to trigger an SSE, which is listened for on the front end with an EventSource instance. Based on what is heard in the eventSource.onmessage listener, a whole bunch of things happen on the front end (react-redux).
How can I 'hijack' the SSEs and trigger a response from a standard POST or GET route?
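One way this could work (a sketch, assuming a single Node process): keep the SSE response objects in a module-level registry that any route handler can publish to:

// Module-level registry of open SSE connections.
const sseClients = new Set();

app.get("/api/sseConnect", (req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    Connection: "keep-alive",
    "Cache-Control": "no-transform",
  });
  sseClients.add(res);
  req.on("close", () => sseClients.delete(res));
});

// Push an event to every connected client.
function broadcast(payload) {
  for (const res of sseClients) {
    res.write(`data: ${JSON.stringify(payload)}\n\n`);
  }
}

app.post("/api/notifications/markasread", (req, res) => {
  const { ids } = req.body;
  ids.forEach(id => database.notifications[id].read = true);
  broadcast({ ids, status: "read" }); // the SSE carries the result...
  res.end();                          // ...and the POST closes with no body
});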
A bit of background: I come from a Java background, and I already have this working there. However, I'm trying to transition to Node/Express/Mongoose for the same problem.
To summarize, I have a REST endpoint to fetch a list of events (meetings, parties, etc.) sorted by date. However, I'd like to stream these results from the Mongo database one JSON document at a time with a delay of 100ms.
Right now, my code returns all the events as one result instead of chopping them up into individual documents and streaming them one at a time.
We're using Angular with EventSource() as the client. I know my Angular code works because everything works fine with my Java backend code.
Here's what I presently have:
app.get('/events', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  Event.find({}).sort({ date: 1 }).then(events => {
    console.log(events);
    res.write(JSON.stringify(events) + '\n');
  }).catch(err => {
    // Note: the 200 header was already sent above, so this status
    // can no longer reach the client.
    res.status(500).send({
      message: err.message
    });
  });
});
I also tried this:
var stream = Event.find({}).sort({ date: 1 }).stream();
stream.on('data', function (doc) {
  //console.log(doc);
  res.write(JSON.stringify(doc));
});
stream.on('end', function () {
  res.end();
});
And this:
var cursor = Event.find().sort({ date: 1 }).cursor();
cursor.on('data', function (doc) {
  res.write(JSON.stringify(doc));
});
cursor.on('close', function () {
  res.end();
});
I'm hoping the answer is simple, but since I'm new to Node, I'm not even sure what I'm looking for. Any help would be great.
Thanks.
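For reference, a minimal sketch of one way this could work, assuming a Mongoose version whose cursors support async iteration (and Node 10+). The key detail is that EventSource only delivers frames written in the data: ...\n\n format, one frame per document:

const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

app.get('/events', async (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  try {
    const cursor = Event.find({}).sort({ date: 1 }).cursor();
    for await (const doc of cursor) {
      res.write(`data: ${JSON.stringify(doc)}\n\n`); // one document per SSE frame
      await delay(100);                              // 100ms between documents
    }
  } catch (err) {
    console.error(err); // headers are already sent, so just log and end
  }
  res.end();
});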
UPDATE
This issue is partially resolved; the problem now lies in authenticating the API Gateway request. I am unsure how to acquire the necessary tokens to send with the request so that it is valid, because this is a [serverless-framework] service, so I can't use the AWS Console to copy-paste the tokens into the request's JSON data. Moreover, I wouldn't know what JSON key they'd have to be under anyway. So this question has changed considerably in scope.
I need to respond to or delete an active WebSocket connection established through AWS ApiGatewayV2, in a Lambda. How do I use Node.js to send a POST request that API Gateway can understand?
I saw on the WebSocket support announcement video that you can issue an HTTP POST request to respond to a WebSocket, and a DELETE request to disconnect one. The full table from the video is transcribed here:
Connection URL: https://abcdef.execute-api.us-west-1.amazonaws.com/env/@connections/connectionId

Operation   Action
POST        Sends a message from the server to the connected WS client
GET         Gets the latest connection status of the connected WS client
DELETE      Disconnects the connected client from the WS connection
(this is not documented anywhere else, AFAIK)
Seeing as the AWS SDK does not provide a deleteConnection method on ApiGatewayManagementApi, I need to be able to issue requests directly to API Gateway anyway.
const https = require('https');

const connect = async (event, context) => {
  const connection_id = event.requestContext.connectionId;
  const host = event.requestContext.domainName;
  const path = '/' + event.requestContext.stage + '/@connections/';
  const json = JSON.stringify({ data: "hello world!" });
  console.log("send to " + host + path + connection_id + ":\n" + json);
  await new Promise((resolve, reject) => {
    const options = {
      host: host,
      port: '443',
      path: path + connection_id,
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Content-Length': Buffer.byteLength(json)
      }
    };
    const req = https.request(options, (res) => {
      res.on('data', (data) => {
        console.error(data.toString());
      });
      res.on('end', () => {
        console.error("request finished");
        resolve();
      });
      res.on('error', (error) => {
        console.error(error, error.stack);
        reject(error);
      });
    });
    req.on('error', reject); // surface request-level errors too
    req.write(json);
    req.end();
  });
  return { statusCode: 200 }; // a plain Lambda proxy-style success response
};
When I use wscat to test it out, this code results in the console.log showing up in CloudWatch:
send to ********.execute-api.us-east-2.amazonaws.com/dev/@connections/*************:
{
"data": "hello world!"
}
...
{
"message": "Missing Authentication Token"
}
...
request finished
And wscat says:
connected (press CTRL+C to quit)
>
But does not print hello world! or similar.
Edit
I was missing
res.on('data', (data) => {
console.error(data.toString());
});
in the response handler, which was breaking things. This still doesn't work, though.
You're likely missing two things here.
You need to make an IAM-signed request to the API Gateway, per the documentation here: Use @connections Commands in Your Backend Service
You'll need to give this Lambda permission to invoke the API Gateway, per the documentation here: Use IAM Authorization
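For example, a rough sketch of a signed call using the third-party aws4 package (an assumption on my part; any SigV4 signer works). Per point 2, the Lambda's role also needs permission to invoke the API (the execute-api:ManageConnections action):

// Sign the @connections request with SigV4. aws4 picks up the Lambda's
// credentials (including the session token) from the environment.
const https = require('https');
const aws4 = require('aws4');

function sendToConnection(event, connectionId, payload) {
  const body = JSON.stringify(payload);
  const opts = aws4.sign({
    host: event.requestContext.domainName,
    path: '/' + event.requestContext.stage + '/@connections/' + encodeURIComponent(connectionId),
    service: 'execute-api',
    region: process.env.AWS_REGION,
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: body
  });
  return new Promise((resolve, reject) => {
    const req = https.request(opts, (res) => {
      res.on('data', () => {}); // drain the response
      res.on('end', resolve);
    });
    req.on('error', reject);
    req.write(body);
    req.end();
  });
}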
I hope this helps!
Background:
Angular recommends observables for app development. They differ from Promises in that a promise completes on a single resolution or rejection (one event), while observables can handle a stream of data (multiple events). My question is not about choosing one over the other; it is about reading a stream of data from Node.js.
Digging deeper into observables
we see observables reading from a stream of events, from data generated on the client side (a form array, a setInterval callback), etc.
My question is about observables (a publisher-subscriber model) being used to subscribe to a Node.js/Express backend, which is the publisher here, sending simple res.write calls at a fixed interval.
What I tried and what happened...
When we use an observable with this stream of string data, the results are only displayed once the backend completes; until then, the observable call is left hanging.
What I'd like to happen
Using Angular and observables, what is the way to subscribe to this data and show it on the client side as and when it is generated on the server side?
Server-side code using Node.js/Express:
app.get('/obs/responseWriteNEW', cors(), function (req, res) {
  try {
    const headers = {
      'Content-Type': 'text',
      'Access-Control-Allow-Origin': '*',
      'Access-Control-Allow-Methods': 'OPTIONS, POST, GET',
      'Access-Control-Max-Age': 2592000, // 30 days
    };
    res.writeHead(200, headers);
    var i = 0;
    var intervalId = setInterval(function () {
      if (i < 100) {
        var myObj = {
          responseWriteNEW: i
        };
        var myStr = JSON.stringify(myObj);
        console.log(myStr);
        res.write(myStr);
        i++;
      } else {
        clearInterval(intervalId); // stop the timer, or it keeps firing after res.end()
        res.end();
      }
    }, 250);
  } catch (expp) {
    console.log(expp);
  }
});
Client-side code:
https://stackblitz.com/edit/angular-44sess
Using vanilla JS, we only got the first chunk of data printed after the length of the streamed data went over about 1030 characters:
var xhr = new XMLHttpRequest();
xhr.addEventListener("progress", function (ev) {
  console.log(ev.target.responseText);
  $("#ajaxLog").append(ev.target.responseText);
});
xhr.open("GET", "http://localhost:2025/obs/responseWriteNEW");
xhr.send();
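As a point of comparison, here is a sketch that reads the chunked response incrementally using fetch and the Streams API (assuming a modern browser; this could also be wrapped in an Angular Observable):

// Read the response body chunk by chunk as it arrives, instead of
// waiting for the request to complete.
fetch('http://localhost:2025/obs/responseWriteNEW').then(function (response) {
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  function read() {
    return reader.read().then(function ({ done, value }) {
      if (done) return;
      console.log(decoder.decode(value, { stream: true })); // one chunk
      return read();
    });
  }
  return read();
});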
I'm trying to use reCAPTCHA on my website, a Node.js server with the Express framework. The site isn't being hosted yet; I'm still working on it locally. On the homepage, after the user enters their info to create an account and solves the reCAPTCHA, I send the result of
$("#g-recaptcha-response").val()
to the server. And on my server,
https.get("https://www.google.com/recaptcha/api/siteverify?secret=" + SECRET + "&response=" + key, function(res) {
var data = "";
res.on('data', function (chunk) {
data += chunk.toString();
});
res.on('end', function() {
try {
var parsedData = JSON.parse(data);
console.log(parsedData);
callback(parsedData.success);
} catch (e) {
callback(false);
}
});
});
where key is the response and SECRET is the secret key they give you. I declared a variable SECRET and stored the secret key in it as a string.
Every single time, the
console.log(parsedData);
prints:
{ success: false, 'error-codes': [ 'invalid-input-secret' ] }
I copied and pasted the secret key, so how could it be invalid? It's only supposed to show this error if "The secret parameter is invalid or malformed", as it says on their website. I followed this tutorial.
I followed the tutorial too and bumped into the same error you have reported here. Looking closely at the screenshot in the tutorial, it shows:
Send a GET request with these parameters
And checking the Google reCaptcha website it says
Send a POST request with these parameters
I am curious whether Google changed their mind about POST instead of GET, or whether the screenshot in the tutorial is from a different source.
Regardless, I have tweaked the code from the tutorial to make a POST request (the code below uses the querystring module):
var https = require('https');
var querystring = require('querystring');

var SECRET = "YourSecretHere";

// Helper function to make the API call to reCAPTCHA and check the response
function verifyRecaptcha(key, callback) {
  var post_data = querystring.stringify({
    'secret': SECRET,
    'response': key
  });
  var post_options = {
    host: 'www.google.com',
    port: '443',
    method: 'POST',
    path: '/recaptcha/api/siteverify',
    headers: {
      'Content-Type': 'application/x-www-form-urlencoded',
      'Content-Length': Buffer.byteLength(post_data)
    }
  };
  var req = https.request(post_options, function (res) {
    var data = "";
    res.on('data', function (chunk) {
      data += chunk.toString();
    });
    res.on('end', function () {
      try {
        var parsedData = JSON.parse(data);
        callback(parsedData.success);
      } catch (e) {
        callback(false);
      }
    });
  });
  req.on('error', function (err) {
    console.error(err);
  });
  req.write(post_data);
  req.end();
}
I also wanted to add that the remoteip field is optional, but you can pass that value too if you want. To do that, you need to retrieve the remote IP address from the connection object, or simply enable trust proxy on your app as shown below:
app.enable('trust proxy');
and then pass the IP address to verifyRecaptcha (note the extra ip parameter added to its signature); the call would then look like this:
verifyRecaptcha(req.ip, req.body["g-recaptcha-response"], function (success) {
  if (success) { ... } else { ... }
});
You then need to modify post_data to include the remoteip field as follows:
var post_data = querystring.stringify({
  'secret': SECRET,
  'response': key,
  'remoteip': ip
});
app.enable('trust proxy'); enables req.ip and req.ips (an array of IP addresses). For more info on getting the IP address of a request, see this SO question.
If you are developing and get fed up with solving the tricky and famously annoying street-name reCAPTCHAs, I recommend using the test site and secret keys provided by Google to bypass captcha solving and speed up development. See here.
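For reference, these are the v2 test keys Google documents in its reCAPTCHA FAQ at the time of writing (every verification request passes with them, so they are for development only):

// Google's documented reCAPTCHA v2 test key pair (development only).
var TEST_SITE_KEY = "6LeIxAcTAAAAAJcZVRqyHh71UMIEGNQ_MXjiZKhI";
var TEST_SECRET = "6LeIxAcTAAAAAGG-vFI1TnRWxMZNFuojJ4WifJWe";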
This is really stupid, and I can't believe I wasted this much time on it, but instead of using the variable SECRET, I just added my secret key directly to the URL and it worked.