I have a problem understanding how AWS SQS works in Node.js. I'm creating a simple endpoint which receives data sent to my server and adds it to an SQS queue:
export const receiveMessageFromSDK = async (req: Request, res: Response) => {
  const payload = req.body;
  try {
    await sqs.sendMessage({
      QueueUrl: process.env.SQS_QUEUE_URL,
      MessageBody: JSON.stringify(payload) // MessageBody must be a string, not an object
    }).promise();
  } catch (error) {
    // handle the error
  }
}
OK, this works and the MessageBody is added to my SQS queue. But now I have a problem with processing the data from this queue... how do I do this?
In Google Cloud I create a simple task queue and include (in the body payload) the URL of the endpoint that should process the data; Google Cloud then sends the body to that endpoint and my server starts its business logic on the data. But how do I do this with SQS? How do I receive data from the queue and start processing it on my side?
I'm thinking about the QueueUrl param, but the docs say this value should be a URL from the AWS console like https://sqs.eu-central-1.amazonaws.com..., so I have no idea how to process the data from the queue on my side.
So... can anybody help me?
Thanks in advance for any help!
Should I launch such a function on a cron, or can AWS e.g. send a request with the data to an endpoint I provide, etc.?
SQS is a pull-based system, which means you have to have a cron job (or an equivalent loop in your app) that repeatedly polls your queue for messages using receiveMessage, e.g. every 10 seconds.
If you want a push-type messaging system, you have to use SNS instead of SQS. SNS can push messages to your HTTP/HTTPS endpoint automatically if your application exposes such an endpoint for the messages.
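A minimal sketch of such a pull loop (assuming the aws-sdk v2 `.promise()` style from the question; the `sqs` client and the message handler are injected here, and the queue URL is a placeholder, so the loop logic can be exercised without AWS):

```javascript
// One polling iteration: receive a batch, process each message, then
// delete it so it is not redelivered.
async function pollOnce(sqs, queueUrl, handleMessage) {
  // Long-poll: wait up to 20s for messages instead of getting
  // an immediate empty response.
  const data = await sqs.receiveMessage({
    QueueUrl: queueUrl,
    MaxNumberOfMessages: 10,
    WaitTimeSeconds: 20,
  }).promise();

  for (const message of data.Messages || []) {
    await handleMessage(message);
    // Delete only after successful processing; otherwise the message
    // becomes visible again when its visibility timeout expires.
    await sqs.deleteMessage({
      QueueUrl: queueUrl,
      ReceiptHandle: message.ReceiptHandle,
    }).promise();
  }
  return (data.Messages || []).length;
}

module.exports = { pollOnce };
```

You would call pollOnce from a cron job or a simple `while` loop in a worker process.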
Related
I have a NodeJS server running in a container in Google Cloud Run. It publishes messages to PubSub.
// during handling a request,
const topic = pubsub.topic(topicName, {
  batching: {
    maxMessages: 1000,
    maxMilliseconds: 1000,
  },
});

// publish some messages
someArray.forEach((item) => {
  topic.publishJSON(item);
});
Let's assume someArray.length is less than maxMessages.
What happens if the Node sends a response before maxMilliseconds has elapsed? Are the messages sent? Does Google Cloud Run kill the container upon the http response, or does it somehow know that the PubSub library has set a timeout?
Cloud Run doesn't kill the container immediately, but it limits the container's access to CPU once the request has been handled. As the documentation suggests, finish all asynchronous tasks within the request handler; try async/await as in the example.
I am currently using nodemailer to send email from my Node.js application.
What I am looking for is "something" that queues and schedules the sending of a large number of emails (500+ per user) by multiple users.
This should run as a separate process in background and must be triggered by my node.js application.
Email sending should not prevent my Node.js application (server) from responding to requests.
I know the above reads more like a software requirement than a problem to fix, but I need a robust solution. Please give a brief solution (i.e., what "somethings" to use).
Create a Lambda function that sends a single email using your nodemailer script.
In this Lambda I would cache the transporter outside the actual exports function, like:
const nodemailer = require('nodemailer');

// created outside the handler so warm invocations reuse the transporter
const transporter = nodemailer.createTransport({ /* SMTP config */ });

exports.handler = (event, context, callback) => {
  // ...nodemailer sendMail call using the cached transporter
  callback(null, event);
};
Then trigger this lambda with an SQS event as seen here:
SQS as an event source to trigger Lambda | Medium
You can add a message to the queue using the JS AWS-SDK from your node js app as seen here:
Calling the sendMessage operation | AWS Docs
In my scenario I'm trying to implement a serverless backend that runs pretty time-consuming calculations. The calculations are managed by a Lambda that calls some external API.
To request this I'm using Amazon API Gateway, which has a 10-second execution limit. However, the Lambda runs for about 100 seconds.
To work around this limitation I'm using a 2nd Lambda function to execute the time-consuming calculation and report that the calculation has started.
It looks very similar to this:
var AWS = require('aws-sdk');
var colors = require('colors');

var functionName = 'really-long';
var lambda = new AWS.Lambda({apiVersion: '2015-03-31'});
var params = {
  FunctionName: functionName,
  InvocationType: 'Event'
};

lambda.invoke(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(functionName.green + " was successfully executed and returned:\n" + JSON.stringify(data, null, 2).gray); // successful response
});

console.log("All done!".rainbow);
This code is executed via AWS API Gateway by thousands of client browsers independently.
To inform each particular client that his Lambda function execution completed successfully, I've planned to use AWS SQS (because of long polling and some other useful functionality out of the box).
So my question is:
How can I determine on the client which message in the queue belongs to that particular client? Or should I iterate over the whole queue to find the proper message by some request-ID parameter in every client's browser? I guess this method will be inefficient when 1000 clients are simultaneously waiting for their results.
I understand that I could write the results to DynamoDB, for example, and periodically poll the DB for the result via some homemade API. But is there a more elegant way to notify a browser-based client about the completion of a long-running Lambda function, based on some Amazon PaaS solution?
Honestly the DynamoDB route is probably your best bet. You can generate a uuid in the first Lambda function executed by the API Gateway. Pass that uuid to the long-running Lambda function. Before the second function completes have it write to a DynamoDB table with two columns: uuid and result.
The API Gateway responds to the client with the uuid it generated. The client then long-polls with a getItem request against your DynamoDB table (either via the aws-sdk directly or through another API Gateway request). Once it responds successfully, remove said item from the DynamoDB table.
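A sketch of that client-side poll (the table name `results`, the key `uuid`, and the injected DocumentClient-style `db` object are illustrative assumptions, which also keeps the loop testable without AWS):

```javascript
// Poll DynamoDB until the long-running Lambda has written the result
// for this uuid, or give up after `attempts` tries.
async function waitForResult(db, uuid, { attempts = 30, delayMs = 2000 } = {}) {
  for (let i = 0; i < attempts; i++) {
    const { Item } = await db.get({ TableName: 'results', Key: { uuid } }).promise();
    if (Item) {
      return Item.result;
    }
    // Not ready yet: wait before the next getItem call.
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error('timed out waiting for result ' + uuid);
}

module.exports = { waitForResult };
```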
The context object of the Lambda function will have the AWS request ID that is returned to the client that invoked the Lambda function.
So the client will have the request ID of Lambda 1, and Lambda 1's context object will carry the same request ID (the request ID stays the same irrespective of Lambda retries). Pass this request ID on to Lambda 2, so the actual request ID is chained through to the end.
Polling with that request ID from the client is fairly easy against any data store like DynamoDB.
My question is short, but I think is interesting:
I have a queue from the Amazon SQS service, and I'm polling it every second. When there's a message I process it, and after processing I go back to polling the queue.
Is there a better way to do this? Some sort of trigger? Which approach would be best in your opinion, and why?
Thanks!
A useful and easy-to-use library for consuming messages from SQS is sqs-consumer:
const Consumer = require('sqs-consumer');

const app = Consumer.create({
  queueUrl: 'https://sqs.eu-west-1.amazonaws.com/account-id/queue-name',
  handleMessage: (message, done) => {
    console.log('Processing message: ', message);
    done();
  }
});

app.on('error', (err) => {
  console.log(err.message);
});

app.start();
It's well documented if you need more information. You can find the docs at:
https://github.com/bbc/sqs-consumer
Yes, there is:
http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-long-polling.html
You can configure the SQS queue to have a "receive message wait time" and do long polling.
So you can set it to, say, 10 seconds, and the call will return only when a message arrives or after the 10-second timeout expires. You can continuously poll the queue in this scenario.
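A sketch of one such long-poll call (the per-request WaitTimeSeconds parameter overrides the queue-level setting; the client is injected and the queue URL is a placeholder so the call can be exercised without AWS):

```javascript
// One long-poll receive: blocks server-side for up to 10 seconds,
// returning early as soon as a message arrives.
async function longPoll(sqs, queueUrl) {
  const data = await sqs.receiveMessage({
    QueueUrl: queueUrl,
    WaitTimeSeconds: 10,
  }).promise();
  // An empty response after the timeout has no Messages field.
  return data.Messages || [];
}

module.exports = { longPoll };
```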
As mentioned by Mircea, long polling is one option.
When you ask for a 'trigger', I believe you are looking for something other than continuously polling SQS on your own. If that is the case, I'd suggest you look at AWS Lambda. It allows you to put code in the cloud which gets triggered automatically by the events you configure, such as an SNS notification, a file pushed to S3, etc.
http://aws.amazon.com/lambda/
Folks,
I would like to set up a message queue between our Java API and NodeJS API.
After reading several examples of using aws-sdk, I am not sure how to make the service watch the queue.
For instance, this article Using SQS with Node: Receiving Messages Example Code tells me to use the sqs.receiveMessage() to receive and sqs.deleteMessage() to delete a message.
What I am not clear about is how to wrap this into a service that runs continuously, constantly taking messages off the SQS queue, passing them to the model, storing them in Mongo, etc.
Hope my question is not entirely vague. My experience with Node lies primarily with Express.js.
Is the answer as simple as using something like sqs-poller? How would I implement it in an already running Node.js Express app? Quite possibly I should look into SNS to avoid any delay in message transfers.
Thanks!
For a start, Amazon SQS is a pseudo-queue that guarantees the availability of messages but not their FIFO ordering. You have to implement sequencing logic in your app if you want it to work that way.
Coming back to your question: SQS has to be polled from within your app to check whether new messages are available. I implemented this in an app using setInterval(). I would poll the queue for items, and if no items were found I would delay the next call; if items were found, the next call would fire immediately, bypassing the setInterval(). This is obviously a very raw implementation, and you can look into alternatives. How about a child process on your server that pings your Node.js app when a new item is found in SQS? I think you could implement the child process as a watcher in Bash without using Node.js. You can also look into npm modules to see if one already exists for this.
In short, there are many ways you can poll but polling has to be done one way or the other if you are working with Amazon SQS.
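The poll-or-back-off idea above can be sketched as pure logic, with the actual receiveMessage call injected as `fetchMessages` (all the names and defaults here are illustrative, which also makes the loop testable without AWS):

```javascript
// Poll repeatedly: when the queue is empty, back off before the next
// call; when items are found, process them and poll again immediately.
async function pollLoop(fetchMessages, handle, { idleDelayMs = 5000, rounds = Infinity } = {}) {
  for (let i = 0; i < rounds; i++) {
    const messages = await fetchMessages();
    if (messages.length === 0) {
      // Queue empty: delay the next poll.
      await new Promise((resolve) => setTimeout(resolve, idleDelayMs));
      continue;
    }
    for (const message of messages) {
      await handle(message);
    }
  }
}

module.exports = { pollLoop };
```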
I am not sure about this, but if you want to be notified of new items, you might want to look into Amazon SNS.
When writing applications to consume messages from SQS I use sqs-consumer:
const Consumer = require('sqs-consumer');

const app = Consumer.create({
  queueUrl: 'https://sqs.eu-west-1.amazonaws.com/account-id/queue-name',
  handleMessage: (message, done) => {
    console.log('Processing message: ', message);
    done();
  }
});

app.on('error', (err) => {
  console.log(err.message);
});

app.start();
See the docs for more information (well documented):
https://github.com/bbc/sqs-consumer