Better way to send email to multiple recipients with AdonisJs - node.js

I want to send newsletters to multiple users, but with my current code every recipient can see everyone else's email address. If I send the emails in a loop instead, I'm afraid it will take a long time because there are many users. Is there a good way to send email to multiple recipients?
This is my code (I use SMTP):
await Mail.send('newsletter', data, (message) => {
  message
    .from('newsletter#web.com', 'Admin')
    .subject('New Events Notification')
    .to(emails) // array of recipient email addresses
})

You can use the sendLater method to send the emails. AdonisJS uses an in-memory queue inside this method, so it will not block your HTTP request.
After replacing the method, your code will look like this:
await Mail.sendLater('newsletter', data, (message) => {
  message
    .from('newsletter#web.com', 'Admin')
    .subject('New Events Notification')
    .to(emails) // array of recipient email addresses
})
Now you can add your loop and AdonisJS will send the emails in the background.
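For completeness, a rough sketch of what that loop might look like, assuming users is an array of objects with an email field (the variable and function names here are illustrative, not from the original answer). Sending one message per recipient also keeps the addresses hidden from the other users:

// Hypothetical per-recipient loop: each user gets an individual queued email,
// so no recipient ever sees the other addresses.
async function sendNewsletter (users, data) {
  for (const user of users) {
    await Mail.sendLater('newsletter', data, (message) => {
      message
        .from('newsletter#web.com', 'Admin')
        .subject('New Events Notification')
        .to(user.email) // a single recipient per message
    })
  }
}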

Related

Changing Event Arguments while Executing

I am currently developing a Discord bot focused on sharing Twitter tweets to Discord channels, coded in Node.js and using the npm module Twit for the Twitter API.
Twit basically creates a stream that emits an event every time one of the users in the IDs array tweets something; the .on handler then receives the tweet information as tweet.
stream = twitterClient.stream('statuses/filter', { follow: followArray })
stream.on('tweet', tweet => {
})
The followArray is the array containing the user IDs, defined like: followArray = ['3083945176', '34507480']
However, if I want to add or remove an ID from the array, I have to restart the bot instead of being able to just followArray.push(userID). How can I update the array without having to restart the bot?
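One workaround, sketched here only as an assumption since the filter parameters are fixed when the stream is created, is to stop the current stream and open a new one with the updated array whenever it changes:

// Assumed approach (not from the original post): restart the stream with the updated follow list.
let stream = twitterClient.stream('statuses/filter', { follow: followArray })
stream.on('tweet', tweet => {
  // handle the tweet
})

function addUser (userID) {
  followArray.push(userID)
  stream.stop() // stop the old stream before replacing it
  stream = twitterClient.stream('statuses/filter', { follow: followArray })
  stream.on('tweet', tweet => {
    // handle the tweet
  })
}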

Sequentially execute webhooks received in node application

I have a Node application using Koa. It receives webhooks from an external application about specific resources.
To illustrate, let's say the webhook sends me, with a POST request, an object of this type:
{
  'resource_id': '<SomeID>',
  'resource_origin': '<SomeResourceOrigin>',
  'value': '<SomeValue>'
}
I would like to process webhooks coming from the same origin sequentially, to avoid the related resources getting out of sync.
I was thinking of using the database as a lock and a cron job to process the resources of each origin sequentially, but I'm not sure that is the most efficient method.
So my question is:
Do you know of a method/package/service that would let me keep a queue per origin, ensuring that resources from the same origin are processed sequentially, without forcing all webhooks to be processed sequentially? Preferably one that does not use the database.
If I were you I would start by serializing the handling of all your webhooks. In other words, I suggest you handle them one at a time, no matter their origin. Use a simple queue inside your Node.js application.
(Once you've convinced yourself that works correctly, you can then serialize them based on origin.)
First, structure your function for handling incoming webhooks (let's call it handleOneWebhook()) as a Promise or an async function. Then you could invoke it using code with this outline.
let busy = false

async function handleManyWebhooks (queue) {
  if (busy) return
  busy = true
  while (queue.length > 0) {
    const item = queue.shift()
    await handleOneWebhook(item)
  }
  busy = false
}
The queue you pass to handleManyWebhooks is a simple array, where each element is the object from a POST request. You use it as a queue: push() each object to put it into the queue, and shift() to remove it.
Then, whenever you receive a webhook POST object you use code with this outline.
const queue = []
...
function handlePostObject (postObject) {
  queue.push(postObject)
  handleManyWebhooks(queue)
}
Even though you call handleManyWebhooks once for each incoming object, the busy flag makes sure it handles only one at a time.
Notice this is a very simple solution. Once you have it working correctly, two possible refinements suggest themselves.
Use something more efficient for your queue than a simple array. shift() is not very fast.
Create a separate queue object with its own busy flag for each separate origin. Then you will be able to parallelize the handling of webhooks from different origins while still serializing the stream of webhooks from each origin.
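A minimal sketch of that second refinement, keeping one queue object per origin, assuming origins are identified by the resource_origin field from the question (the Map-based structure is illustrative):

// Hypothetical per-origin queues: webhooks from different origins run in
// parallel, while webhooks from the same origin stay strictly ordered.
const queues = new Map() // origin -> { items: [], busy: false }

async function handleManyWebhooks (queue) {
  if (queue.busy) return
  queue.busy = true
  while (queue.items.length > 0) {
    const item = queue.items.shift()
    await handleOneWebhook(item)
  }
  queue.busy = false
}

function handlePostObject (postObject) {
  const origin = postObject.resource_origin
  if (!queues.has(origin)) queues.set(origin, { items: [], busy: false })
  const queue = queues.get(origin)
  queue.items.push(postObject)
  handleManyWebhooks(queue)
}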
Solution I decided to use
Brief summary of the discussion
As Ivan Rubinson pointed out, my problem is just a producer-consumer problem.
So I finally chose RabbitMQ because I have a huge number of webhooks to process. For people who have a small number of requests to process and do not want to use external tools, O. Jones' answer is a really good way to solve the problem.
Solution design
I installed and configured a RabbitMQ server, then created one queue for each origin of my webhooks.
Producer
On the producer side, when I receive the webhook data, I send a message to the queue corresponding to the origin of the webhook. The message contains only the serialized information needed for processing (in fact, the id of the row in the database) to keep messages as light as possible.
Consumer
On the consumer side, I create a consumer function for each origin queue and set the prefetch to one so messages are processed one by one in each queue. Finally, I set the channel to wait for an acknowledgement before delivering the next message. With this configuration, consumers proceed message by message, which solves the initial problem.
Implementation
Producer
async function create(){
  await amqp.connect(RBMQ_CONNECTION_STRING).then(async (conn) => {
    await conn.createChannel().then(async (ch) => {
      global.channel_publisher = ch;
    });
  });
}

async function sendtask(queue, task){
  if(!global.channel_publisher){
    await create();
  }
  global.channel_publisher.assertQueue(queue).then((ok) => {
    global.channel_publisher.sendToQueue(queue, Buffer.from(task));
  });
}
I call the sendtask(queue, task) function at the place where I receive the webhook.
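For illustration only, a call site might look roughly like this; the Koa route, the router object, and the saveWebhook helper are assumptions, not part of the original solution:

// Hypothetical webhook endpoint: store the payload, then queue only the row id
// on the queue named after the webhook's origin.
router.post('/webhook', async (ctx) => {
  const { resource_origin, resource_id, value } = ctx.request.body;
  const rowId = await saveWebhook(resource_origin, resource_id, value); // assumed DB helper
  await sendtask(resource_origin, String(rowId));
  ctx.status = 202;
});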
Consumer
async function create(){
  await amqp.connect(RBMQ_CONNECTION_STRING).then(async (conn) => {
    await conn.createChannel().then(async (ch) => {
      ch.prefetch(1);
      global.channel_consumer = ch;
    });
  });
}

async function consumeTask(queue){
  if(!global.channel_consumer){
    await create();
  }
  global.channel_consumer.assertQueue(queue).then((ok) => {
    global.channel_consumer.consume(queue, async (message) => { // async so await is valid here
      const args = message.content.toString().split(';');
      await processWebhooks(args);
      global.channel_consumer.ack(message);
    });
  });
}
I call consumeTask(queue) whenever I have to process webhooks from a new origin. I also use it to initialize my application with all the origins already known in the database.
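As a rough sketch of that start-up step (the getKnownOrigins helper is an assumption, not from the original post):

// Hypothetical bootstrap: start one consumer per origin already stored in the database.
async function startConsumers(){
  const origins = await getKnownOrigins(); // assumed DB query returning the origin names
  for (const origin of origins) {
    await consumeTask(origin);
  }
}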

Returning multiple asynchronous responses

I'm currently looking to set up an endpoint that accepts a request and returns the response data in increments as they become available.
The use case is that, given one upload of data, I would like to calculate a number of different metrics for that data. As each metric finishes calculating asynchronously, I want to return its value to the front end to render.
For testing, my controller looks as follows, trying to use res.write:
uploadData = (req, res) => {
  res.write("test");
  setTimeout(() => {
    res.write("test 2");
    res.end();
  }, 3000);
}
However, I think the issue stems from my client side, which I'm writing in React/Redux and which calls that route through Axios. From my understanding, the Axios request resolves once it receives the first response, and the connection doesn't stay open. Here is what my Axios call looks like:
axios.post('/api', data)
  .then((response) => {
    console.log(response);
  })
  .catch((error) => {
    console.log(error);
  });
Is there an easy way to do this? I've also thought about streaming, but my concern with streaming is that I would like each connection to be direct and unique between clients, and only open for a short amount of time (i.e. only while the metrics are being calculated).
I should also mention that the resource being uploaded is a database, and I would like to avoid parsing it and opening a connection multiple times as a result of having multiple endpoints.
Thanks in advance, and please let me know if I can provide any more context.
One way to handle this while still using a traditional API would be to store the metrics in an object somewhere, in a database or Redis for example, and then just long-poll the resource.
For a real-world example, say you want to calculate the following metrics for foo: time completed, length of request, bar, foobar.
You could create an object in storage that looks like this:
{
  id: 1,
  lengthOfRequest: 123,
  .....
}
Then you would create an endpoint in your API, something like metrics/{id}, that returns the object. Just keep calling the route until everything completes.
There are some obvious drawbacks to this, of course, but once you have enough information to know how long the metrics take to complete on average, you can tweak the time between the calls to your API.
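A minimal sketch of that polling loop on the client, assuming an /api/metrics/{id} route and a complete flag on the stored object (both names are illustrative, not part of the original answer):

// Hypothetical long-poll loop: keep fetching the metrics object until it is complete.
async function pollMetrics(id, intervalMs = 2000) {
  while (true) {
    const { data } = await axios.get(`/api/metrics/${id}`);
    renderMetrics(data); // assumed function that updates the UI with whatever is ready so far
    if (data.complete) return data;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}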

Error: Can't set headers after they are sent only on page refresh

I have this problem only when I refresh the page, and I cannot solve it; I have tried everything, but the same thing keeps happening. It began when I added socket.io to the project. The project runs on several servers which are connected to one another through sockets.
TEST CASES: The first time I render the page everything goes well, but if I refresh the same page, I get this error:
ERROR: "Error: Can't set headers after they are sent. at ServerResponse.OutgoingMessage.setHeader (_http_outgoing.js:344:11)"
ATTENTION: when it goes into the IF() and sends "return res.end('The Activation Code is INVALID!');", it DOESN'T HAPPEN! I can refresh again and again and everything goes well. My problem is with the RENDER.
MY CODE BELOW:
activationUser = function(req, res, next){
  var data = {
    activationCode : req.params.activationCode,
    now : new Date().valueOf(),
    ip : req.connection.remoteAddress,
    fId : frontalId
  }
  socketCore.emit('activationUser', data);
  socketCore.on(frontalId + 'activationUserResp', function(data){
    if(data.msg == "CHECKED!"){
      next();
    }else{
      return res.end(data.msg);
    }
  });
}
router.get('/activationUser/:activationCode', activationUser, function(req, res){
  var data = {
    activationCode : req.params.activationCode,
    fId : frontalId
  }
  socketCore.emit('step2', data);
  socketCore.on(frontalId + 'step2Resp', function(data){
    if(data.msg == 'err'){
      return res.end('The Activation Code is INVALID!');
    }else{
      return res.render('registro2', {title: 'title | ' + data.name + ' ' + data.lastname, user: data});
    }
  });
});
Thank you!
The particular error you are getting happens when you try to send anything on the res object after the complete response has already been sent. This often occurs because of errors in asynchronous logic. In your particular case, it appears to be because you are assigning a new event handler with socketCore.on() every single time the route is hit. Those event handlers accumulate, and after the first time the route is hit they execute multiple times, triggering the sending of multiple responses on the same response object and thus triggering that error.
The main ways to fix your particular problem are:
Use .once() instead of .on() so the event handler automatically removes itself after being triggered (a sketch of this option follows after this list).
Manually remove the .on() event handler after you get the response.
Move the event handler outside of the route so it's only ever installed once.
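A minimal sketch of the first option, applied to the route from the question; the only change is .on() becoming .once(), so each request registers a handler that fires and removes itself exactly once (this does not yet address the race condition discussed below):

// Sketch of fix #1: .once() removes the listener after it fires,
// so handlers no longer accumulate across requests.
router.get('/activationUser/:activationCode', activationUser, function(req, res){
  var data = {
    activationCode : req.params.activationCode,
    fId : frontalId
  }
  socketCore.emit('step2', data);
  socketCore.once(frontalId + 'step2Resp', function(data){
    if(data.msg == 'err'){
      return res.end('The Activation Code is INVALID!');
    }
    return res.render('registro2', {title: 'title | ' + data.name + ' ' + data.lastname, user: data});
  });
});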
In your particular case, since socketCore is a shared object available to all requests, it appears that you also have a race condition. If multiple users trigger the '/activationUser/:activationCode' route in the same general time frame, then you will register two event handlers with socketCore.on() (one for each route that is hit) and you will do two socketCore.emit('step2', data);. But, you have no way of associating which response belongs with which request and the two responses could easily get mixed up - going to the wrong request.
This highlights how socket.io connections are not request/response. They are message/answer, but unless you manually code a correspondence between a specific message request and a specific answer, there is no way to correlate which goes with which. So, without assigning some particular responseID that lets you know which response belongs to which message, you can't use a socket.io connection like this in a multi-user environment. It will just cause race conditions. It's actually simpler to use an HTTP request/response for this type of data fetching because each response goes only with the request that made it in the HTTP architecture.
You can change your architecture for making the socketCore request, but you will have to manually assign an ID to each request and make sure the server is sending back that ID with the response that belongs to that request. Then, you can write a few lines of code on the receiving side of things that will make sure the right response gets fed to the code with the matching request.
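As a rough illustration of that correlation idea (the requestId field, the pendingRequests map, and the requirement that the server echo the id back are all assumptions, not code from the question):

// Hypothetical correlation layer: tag each emit with an id and route the
// matching answer back to the request that asked for it.
const pendingRequests = new Map();
let nextRequestId = 1;

socketCore.on(frontalId + 'step2Resp', function(data){
  const callback = pendingRequests.get(data.requestId); // the server must send requestId back
  if (callback) {
    pendingRequests.delete(data.requestId);
    callback(data);
  }
});

function requestStep2(activationCode, callback){
  const requestId = nextRequestId++;
  pendingRequests.set(requestId, callback);
  socketCore.emit('step2', { activationCode: activationCode, fId: frontalId, requestId: requestId });
}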

Getting just email body using mail-listener-2

I am using mail-listener-2 and Node.js to receive emails. I have it all set up doing what I want, except for one thing.
I cannot figure out how to get just the text of the current message. This is what I mean:
1. Send an email in my application with a unique id in the subject.
2. User receives email in inbox and replies.
3. Mail-Listener-2 grabs the email reply and saves it under the unique id.
Code:
mailListener.on("mail", function(mail, seqno, attributes){
var body = mail.text
});
Now when step 3 occurs, it grabs the entire thread, instead of just the message the user sent. Meaning, my original message plus the user's new message is included in the reply.
Is there a way to grab just the user's reply? Or do I just need to do something like putting a marker line in the message and then parsing everything before that line?
Try this one:
mailListener.on("mail", function(mail, seqno, attributes){
  var text = mail.text;
});
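That still yields the full thread text on its own. A minimal sketch of the marker-line approach suggested in the question, assuming the outgoing email ends with a fixed separator line (the separator string is illustrative):

// Hypothetical reply extraction: keep only the text above a known marker line
// that was placed in the original outgoing email.
var REPLY_SEPARATOR = '--- Please reply above this line ---'; // assumed marker

mailListener.on("mail", function(mail, seqno, attributes){
  var fullText = mail.text || '';
  var replyOnly = fullText.split(REPLY_SEPARATOR)[0].trim();
  // replyOnly now holds just the user's new message (everything before the marker)
});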
