Getstream.io how to include notification count in realtime subscription - getstream-io

When subscribing to a notification feed, is there any way to get the unseen and unread notification counts from the payload given by the getstream realtime subscription promise? If not, are there plans to implement this in the future?
I'm aware I can do a notificationFeed.get({limit: 0}) to retrieve that data, however, within our current system, it would be a lot more convenient if that count came with the subscription payload.
this.notificationFeed
  .subscribe((payload) => {
    console.log(payload);
  })
  .then(() => {
    // console.log('Full (Global Feed Flat): Connected to faye channel, waiting for realtime updates');
  }, (err) => {
    console.error('Full (Notification Feed): Could not establish faye connection', err);
  });
Currently, the payload doesn't include the unseen or unread counts when giving back data from a notification feed.

The realtime notification payload does not include unseen / unread counts for a feed. If you listen for changes on a feed and want to update the counters, you need to perform an API call and retrieve the counters from there.
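A minimal sketch of that pattern, assuming the getstream JS client: each realtime event triggers a cheap `feed.get({ limit: 0 })` whose response carries the `unseen` and `unread` counters. The helper name `refreshCounts` is an illustration, not part of the getstream API; only the feed's `get()` method is used.

```javascript
// Re-fetch the feed with limit 0 and hand the counters to a callback.
// `feed` stands in for a getstream notification feed object.
function refreshCounts(feed, onCounts) {
  return feed.get({ limit: 0 }).then((response) => {
    onCounts({ unseen: response.unseen, unread: response.unread });
    return response;
  });
}
```

Wiring it into the subscription from the question would then look roughly like `this.notificationFeed.subscribe(() => refreshCounts(this.notificationFeed, updateBadge))`.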

Related

io.to(targetSocketID).emit() is there a way to emit to just sender and receiver

I'm trying to create a private messaging functionality using socket.io with React and Node.
I am able to send a message to a particular socket like so:
io.to(targetSocketID).emit('privateMessage', message)
It successfully sends it to that specific user, but not to the sender. Is there a way to emit a message to just the sender and the target user? From what I can see, there are two approaches here.
When a message is created, push the sender's messages into a messages array with setMessages([...messages, senderMessages]) and also push the socket message into that array with setMessages([...messages, receivedMessages]). This approach seems sloppy and I'd like to avoid it, as it can become problematic.
generate a unique room for each user and send the room to the server and join it:
//server
socket.on('joinRoom', room => {
  socket.join(room)
  socket.on('privateMessage', message => {
    io.to(room).emit('messageResponse', message)
  })
})
I would like to know if there is a better way to do this that allows me to emit a message to just the sender AND the targeted receiver.
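One hedged sketch of the room approach: derive a single deterministic room name per user pair, join both sockets to it, and emit with `io.to(room)`, which reaches every socket in the room, including the sender (`socket.to(room)` would exclude it). The event names and payload fields here are illustrative assumptions.

```javascript
// Sorting the two ids makes the room name identical no matter who initiates.
function privateRoom(userA, userB) {
  return ['dm', ...[String(userA), String(userB)].sort()].join(':');
}

// Server wiring (assumes a socket.io Server instance named io):
// io.on('connection', (socket) => {
//   socket.on('joinPrivate', ({ me, them }) => socket.join(privateRoom(me, them)));
//   socket.on('privateMessage', ({ me, them, text }) => {
//     io.to(privateRoom(me, them)).emit('messageResponse', text);
//   });
// });
```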

How to send notifications from node.js server to android client.

What technologies do I need to send notifications from a node.js server to an Android client? For example, user A adds user B as a friend; at this moment user B should receive a notification on his Android device that user A wants to add him as a friend. I'm new to node.js, so could you help me with what exactly I should use to implement sending such notifications?
You could use MQTT or AMQP messaging, these are very flexible technologies, well suited to push messages to clients.
https://en.wikipedia.org/wiki/MQTT
https://en.wikipedia.org/wiki/Advanced_Message_Queuing_Protocol
Node.js has very good support for both.
Android has an MQTT client available with an example here: http://androidkt.com/android-mqtt/.
Essentially you can push messages to clients with something like:
client.publish(topic, message)
And clients would subscribe like:
const mqtt = require('mqtt')
const client = mqtt.connect('mqtt://test.mosquitto.org')
client.on('connect', () => client.subscribe('friends-topic'))
client.on('message', function (topic, message) {
  // Messages are Buffer objects.
  console.log(message.toString())
  client.end()
})
Clients receive messages either via a callback or by polling.
Both technologies use a Broker that acts as a go between for the messages.
There are free online Brokers you can use to test messaging, e.g. mqtt://test.mosquitto.org
In Express, once you have your messaging client initialised, you can publish a message on new events: POSTs, PUTs, etc.
app.post("/addFriend", function(req, res, next) {
  console.log("Friend request added");
  // Write to db.
  // Send a message.
  mqttClient.publish('friends-topic', JSON.stringify({event: 'newfriend', id: '10122', name: 'Mark'}));
  res.status(200).send('ok');
});
On the server side you need something to work with Google's Cloud Messaging service, for instance the node-gcm module
https://github.com/ToothlessGear/node-gcm
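As a rough sketch of that last step, here is one way to shape the friend-request push payload. The helper and its field names (`event`, `fromId`) are assumptions for illustration, not part of node-gcm; only the notification/data split mirrors the FCM message format.

```javascript
// Build the payload object for a friend-request push notification.
function friendRequestPayload(fromUser) {
  return {
    notification: {
      title: 'New friend request',
      body: fromUser.name + ' wants to add you as a friend',
    },
    data: { event: 'newfriend', fromId: String(fromUser.id) },
  };
}

// Sending it with node-gcm would look roughly like:
// const gcm = require('node-gcm');
// const sender = new gcm.Sender(process.env.SERVER_API_KEY);
// sender.send(new gcm.Message(friendRequestPayload({ id: 10122, name: 'Mark' })),
//             { registrationTokens: [deviceToken] },
//             (err, response) => console.log(err || response));
```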

How To Rate-Limit Google Cloud Pub/Sub Queue

I'm using Google's Pub/Sub queue to handle messages between services. Some of the subscribers connect to rate-limit APIs.
For example, I'm pushing street addresses onto a pub/sub topic. I have a Cloud function which subscribes (via push) to that topic, and calls out to an external rate-limited geocoding service. Ideally, my street addresses could be pushed onto the topic with no delay, and the topic would retain those messages - calling the subscriber in a rate-limited fashion.
Is there any way to configure such a delay, or a message distribution rate limit? Increasing the ack window doesn't really help: I've architected this system to prevent long-running functions.
Because there's no answer so far describing workarounds, I'm going to answer this now by stating that there is currently no way to do this. There are workarounds (see the comments on the question that explain how to create a queueing system using Cloud Scheduler), but there's no way to just set a setting on a pull subscription that creates a rate limit between it and its topic.
I opened a feature request for this though. Please speak up on the tracked issue if you'd like this feature.
https://issuetracker.google.com/issues/197906331
One approach to solve your problem is to use async.queue.
It has a concurrency attribute with which you can manage the rate limit.
// create a queue object with concurrency 2
var q = async.queue(function(task, callback) {
  console.log('hello ' + task.name);
  callback();
}, 2);
// assign a drain callback, fired once all items have been processed
q.drain = function() {
  console.log('all items have been processed');
};
// add some items to the queue
q.push({name: 'foo'}, function(err) {
  console.log('finished processing foo');
});
// (quoted from the async documentation; note that in async v3 the drain
// handler is registered with q.drain(fn) instead of assignment)
GCP's Cloud Tasks queues let you limit the rate at which tasks are dispatched. Check this doc
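A minimal sketch of that Cloud Tasks option, assuming the gcloud CLI is installed; the queue name and rate values below are placeholders to match against the external API's limit.

```shell
# Create a queue whose dispatch rate matches the rate-limited geocoder.
gcloud tasks queues create geocode-queue \
  --max-dispatches-per-second=5 \
  --max-concurrent-dispatches=2
```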

Send batches to web API

I have a mongodb and NodeJS setup on expressJS. What this API basically does is storing e-mail adresses and other information about users.
These are called personas and are stored in a MongoDB database. What I'm trying to do now is calling a url in my app, which sends all personas to the Mailchimp API.
However, as the number of stored personas is quite high (144,000), I cannot send them to the Mailchimp API in one batch. I'm trying to send them in batches, without much luck.
How would I go about to set this up? Currently I'm using the Async package to limit the simultaneous sends to the Mailchimp API. But I'm not sure if this is the correct way to go.
I guess the code below is not working because the personas array I collect is too large to fit in memory. But I'm not sure how to chunk it up correctly.
//This is a model function which searches the database to collect all personas
Persona.getAllSubscriptions(function(err, personas) {
  //Loop send each persona to mailchimp
  var i = 1;
  //This is the async module I'm using to limit the simultaneous requests to Mailchimp
  async.forEachLimit(personas, 10, function (persona, callback) {
    //This is the function to send one item to mailchimp
    mailchimpHelper.sendToMailchimp(persona, mailchimpMergefields, function(err, body) {
      if (err) {
        callback(err);
      } else if (!body) {
        callback(new Error("No response from Mailchimp"));
      } else {
        console.log(i);
        i++;
        callback();
      }
    });
  }, function(err) {
    if (err) console.log(err);
    //Set a success message
    res.json({error: false, message: "All personas updated"});
  });
});
I ran into a similar problem with a query to a collection that could return more than 170,000 documents. I ended up using the "stream" API to build batches to be processed. You could do something similar to "build" batches to send to MailChimp.
Here's an example.
var stream = db.collection.find().stream(); // be sure find() is returning a cursor
var batch = [];
stream.on('data', function(data) {
  batch.push(data);
  if (batch.length >= maxBatchSize) {
    stream.pause();
  }
});
stream.on('pause', function() {
  // send the batch to MailChimp, then empty it and resume
  // (sendBatchToMailchimp is a placeholder for your own send logic)
  sendBatchToMailchimp(batch, function() {
    batch = [];
    stream.resume();
  });
});
stream.on('end', function() {
  // data finished; flush any remaining partial batch
  if (batch.length > 0) sendBatchToMailchimp(batch, function() {});
});
You can look at the documentation for cursor and stream here
Hope this helps.
Cheers.
There are some things here that I would not do the way you described. You are doing quite heavy processing inside the node server, and triggering it by URL could cause you a lot of problems if you do not secure it.
Also, a heavy process like this is better implemented with a queue-worker approach, separated from the server. That gives you more control over the process: some of the email sends might fail, or an error might occur on the Mailchimp side (the API is down, etc.). So instead of triggering the sending directly, just trigger a worker and process the emails in chunks, as @jackfrster described.
Make sure you have checked the Mailchimp API limits. Have you considered alternatives, like creating a campaign and sending out the campaign, so you would not need to send to each person in the list?
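The chunking step mentioned above can be sketched like this; the function name and chunk size are arbitrary choices, not from any library.

```javascript
// Split the full list into fixed-size slices so a worker can process
// one slice (e.g. one queue job) at a time.
function chunkArray(items, size) {
  var chunks = [];
  for (var i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// e.g. chunkArray(personas, 500) -> one worker job per 500-persona batch
```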

How to send a notification to multiple subscribers using web-push package of Node.js?

How can we send a notification to multiple subscribers using the web-push package of Node.js?
We can loop through the list of subscribers, but that won't scale to millions of subscribers.
Example:
for (var myKey in subData) {
  endp = subData[myKey]['ENDPOINT'];
  enckey = subData[myKey]['ENCKEY'];
  webPush.sendNotification(endp, 200, enckey, JSON.stringify({
    title: 'Test Title',
    msg: 'Test Notification',
  }));
}
Unfortunately, the Web Push standard currently doesn't support sending a notification to multiple subscribers.
Indeed, the key is related to a specific user. If all users shared the same key, the encrypted data would be readable by others. Read [1] for a more thorough explanation.
I'd suggest implementing a solution that sends notifications in batches, waiting a little between the batches. Notifications usually don't have to be real-time, so this is generally feasible.
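A hedged sketch of that batching suggestion: send in fixed-size batches with a pause between them. The send callback is injected so the actual `webPush.sendNotification` call (and its arguments) remains your choice; the helper name and defaults are assumptions.

```javascript
// Send `payload` to all subscriptions, batchSize at a time, pausing
// delayMs between batches and tolerating individual failures.
async function sendInBatches(subscriptions, payload, send, opts) {
  var batchSize = (opts && opts.batchSize) || 100;
  var delayMs = (opts && opts.delayMs) || 1000;
  for (var i = 0; i < subscriptions.length; i += batchSize) {
    var batch = subscriptions.slice(i, i + batchSize);
    await Promise.allSettled(batch.map(function (sub) { return send(sub, payload); }));
    if (i + batchSize < subscriptions.length) {
      await new Promise(function (resolve) { setTimeout(resolve, delayMs); });
    }
  }
}

// usage sketch (assuming the current web-push API):
// sendInBatches(subs, JSON.stringify({ title: 'Hi' }),
//               function (sub, p) { return webPush.sendNotification(sub, p); });
```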