Creating SQS queues with Lambda - Node.js

I'm working on a collaborative document project (basically a clone of Google Docs), where client programs post their actions to an API on Amazon's API Gateway, then receive messages about other clients' actions via an SQS queue. The API calls trigger Node.js Lambda functions that create a message and publish it to an SNS topic, which then notifies each client's SQS queue.
My current hurdle is dynamically creating/destroying SQS queues for these clients as they join/leave a document; however, my google-fu is weak and I have failed to find anything that could help me. I'd like to keep the queue management server-side, ideally in Lambda, but if that's impossible I will accept other solutions.

You can simply use the AWS SDK for JavaScript in your AWS Lambda function (it's already pre-installed there) to manage any kind of AWS resource, e.g. the requested creation and deletion of SQS queues.
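As a rough illustration, a handler like the one below could create or delete a per-client queue. This is a minimal sketch, assuming the v3 SQS client that ships with current Node.js Lambda runtimes, an execution role allowed to call sqs:CreateQueue/sqs:DeleteQueue, and a hypothetical event shape:

```javascript
// Sketch only: queue naming scheme and event fields are made up for illustration.
const { SQSClient, CreateQueueCommand, DeleteQueueCommand } = require('@aws-sdk/client-sqs');

const sqs = new SQSClient({});

exports.handler = async (event) => {
  if (event.action === 'join') {
    // Create a per-client queue when the client joins a document
    const { QueueUrl } = await sqs.send(new CreateQueueCommand({
      QueueName: `doc-${event.documentId}-client-${event.clientId}`,
    }));
    return { queueUrl: QueueUrl };
  }

  if (event.action === 'leave') {
    // Tear the queue down again when the client leaves
    await sqs.send(new DeleteQueueCommand({ QueueUrl: event.queueUrl }));
    return { deleted: true };
  }

  throw new Error(`Unknown action: ${event.action}`);
};
```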

Related

Real-time email feature using AWS services

I am trying to use AWS services to implement a real-time email-sending feature for one of my projects. For example, someone uses my app to schedule a reminder, and then the email is sent to them at (or near) the time they scheduled.
I know about AWS services such as CloudWatch rules (cron schedules) and DynamoDB Streams (TTL based), but those don't seem ideal for such a feature. Can anyone please suggest a better way to implement it?
Any type of guidance is acceptable.
-- Thanks in advance.
Imagine your service at huge scale. At such a scale, there are likely to be multiple reminders going out every minute. Therefore, you could create:
A database that stores the reminder times (this could be DynamoDB or an Amazon RDS database)
An AWS Lambda function that is configured to trigger every minute
When the Lambda function is triggered, it should check the database to retrieve all reminders that should be sent for this particular minute. It can then use Amazon Simple Email Service (SES) to send emails.
If the number of emails to be sent is really big, then rather than having the Lambda function call SES in series, it could put a message into an Amazon SQS queue for each email to be sent. The SQS queue could then trigger another Lambda function that sends the email via SES. This allows the emails to be sent in parallel.
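A minimal sketch of the per-minute Lambda could look like the following. The DynamoDB table name, key schema, item attributes, and the EMAIL_QUEUE_URL environment variable are all hypothetical, and the second Lambda that actually calls SES is omitted:

```javascript
// Sketch only: assumes a hypothetical "Reminders" table partitioned by a
// "dueMinute" attribute and an SQS queue used to fan out the email sends.
const { DynamoDBClient, QueryCommand } = require('@aws-sdk/client-dynamodb');
const { SQSClient, SendMessageCommand } = require('@aws-sdk/client-sqs');

const ddb = new DynamoDBClient({});
const sqs = new SQSClient({});

exports.handler = async () => {
  // Truncate "now" to the minute so it matches how reminders were stored
  const dueMinute = new Date().toISOString().slice(0, 16);

  const { Items = [] } = await ddb.send(new QueryCommand({
    TableName: 'Reminders',
    KeyConditionExpression: 'dueMinute = :m',
    ExpressionAttributeValues: { ':m': { S: dueMinute } },
  }));

  // One SQS message per email; a second Lambda subscribed to the queue
  // calls SES, so the sends happen in parallel.
  await Promise.all(Items.map((item) =>
    sqs.send(new SendMessageCommand({
      QueueUrl: process.env.EMAIL_QUEUE_URL,
      MessageBody: JSON.stringify({
        to: item.email.S,
        subject: item.subject.S,
      }),
    }))
  ));
};
```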

Subscribe non-exposed NodeJS Application to SNS

In my current architecture, I have a NodeJS application that posts a message to SQS, which triggers a Lambda function that (finally) puts the result in MongoDB. While the Lambda is running, the NodeJS app polls MongoDB until the status field changes to SUCCESS or FAILED.
I would like to change this architecture to be event-driven rather than relying on polling. To achieve that, I considered having the NodeJS app subscribe to an SNS topic and the Lambda function post the result to that topic.
However, I faced a challenge when attempting to subscribe to the SNS topic. The subscribe method demands an Endpoint to confirm the subscription, and the NodeJS app in question is not exposed (it's not an API). So how could the subscription be confirmed?
I understand that AWS might want to avoid spam by implementing subscription confirmation for SMS and email, but in this case, the subscriber is a simple application...
Is there any way to subscribe to the topic without exposing the NodeJS application? Or is SNS not the appropriate solution here?
I have used RabbitMQ for this in the past, but I would rather not deploy an instance and instead leverage a platform-as-a-service type of product.
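For reference, the Subscribe call being discussed looks roughly like this (a sketch with the AWS SDK v3 SNS client; the topic ARN and callback URL are placeholders). An 'https' subscription of this kind is exactly what requires a publicly reachable endpoint so SNS can deliver the confirmation request:

```javascript
// Sketch only: illustrates why a non-exposed app can't confirm the subscription.
const { SNSClient, SubscribeCommand } = require('@aws-sdk/client-sns');

const sns = new SNSClient({});

async function subscribeApp() {
  await sns.send(new SubscribeCommand({
    TopicArn: 'arn:aws:sns:us-east-1:123456789012:job-results', // placeholder ARN
    Protocol: 'https',
    Endpoint: 'https://my-app.example.com/sns-callback', // must be reachable by SNS
  }));
}
```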

Using Pub/Sub for Google Cloud Storage with GKE

I have a GKE application that is currently driven by notifications from a Google Cloud Storage bucket. I want to convert this Node.js application to be triggered by Pub/Sub notifications instead. I've been crawling through Google documentation pages most of the day and do not have a clear answer. I see some Python code that might do it, but it's not helping much.
The code as it is currently written is working - an image landing in my GCS bucket triggers a notification to my GKE pod(s), and my function runs. I'm trying to understand what I need to do inside my function to subscribe to a Pub/Sub topic to trigger the processing. Any and all suggestions welcome.
Firstly, thanks - I didn't know about the notification capability of GCS!
The principle is similar, but you use Pub/Sub as an intermediary. Instead of notifying your application directly with a watchbucket command, you notify a Pub/Sub topic.
From there, the notifications arrive in the Pub/Sub topic, and you have to create a subscription. Two types are possible:
Push: you specify an HTTP URL that is called with a POST request, and the body contains the notification message.
Pull: your application needs to open a connection to the Pub/Sub subscription and read the messages.
Pros and cons
Push requires authentication from the Pub/Sub push subscription to your application. And if you use an internal IP, you can't use this solution (the URL endpoint must be publicly accessible). The main advantages are the scalability and the simplicity of the model.
Pull requires authentication of the subscriber (here, your application), and thus, even if your application is privately deployed, you can use a pull subscription. Pull is recommended for high throughput but requires more skill in message processing and concurrency/multi-threaded programming. You don't scale on request rate (as with the push model) but according to the number of messages that you read. And you need to acknowledge the messages manually.
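If you go the pull route, a subscriber in Node.js could look roughly like this. This is a sketch assuming the @google-cloud/pubsub client library and a placeholder subscription name; the processing step is left as a comment:

```javascript
// Sketch only: a pull subscriber for GCS notifications delivered via Pub/Sub.
const { PubSub } = require('@google-cloud/pubsub');

const pubsub = new PubSub();
const subscription = pubsub.subscription('gcs-image-events-sub'); // placeholder name

subscription.on('message', (message) => {
  // The client library hands you the payload as a Buffer (it takes care of
  // the base64 wire encoding); the GCS notification body is JSON.
  const notification = JSON.parse(message.data.toString());
  console.log('Object finalized:', notification.bucket, notification.name);

  // ... run your existing image-processing function here ...

  // Pull subscribers must acknowledge each message themselves.
  message.ack();
});

subscription.on('error', (err) => console.error('Subscription error:', err));
```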
The data model is described here. Your Pub/Sub message looks like this:
{
  "data": string,
  "attributes": {
    string: string,
    ...
  },
  "messageId": string,
  "publishTime": string,
  "orderingKey": string
}
The attributes are described in the documentation, and the payload (base64 encoded, be careful) has this format. It is very similar to what you get today.
So, why the attributes? Because you can use the filter feature in Pub/Sub to create a subscription that receives only a subset of messages.
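For example, a filtered subscription could be created like this (a sketch with the @google-cloud/pubsub client; the topic and subscription names are placeholders, and the filter keeps only the eventType attribute that GCS sets for newly finalized objects):

```javascript
// Sketch only: create a subscription that only receives OBJECT_FINALIZE events.
const { PubSub } = require('@google-cloud/pubsub');

const pubsub = new PubSub();

async function createFilteredSubscription() {
  await pubsub.topic('gcs-image-events').createSubscription('finalized-only-sub', {
    filter: 'attributes.eventType = "OBJECT_FINALIZE"',
  });
}
```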
You can also shift gears and use CloudEvents (based on Knative events) if you use Cloud Run for Anthos in your GKE cluster. Here, the main advantage is the portability of the solution, because the messages are compliant with the CloudEvents format and not specific to GCP.

AWS Lambda event reprocess request

Sometimes my backend database goes offline, and the AWS Lambda execution that requires this backend fails. Can I ask AWS to reprocess the same event at a later time, hoping that the backend is back online by then? I'm using Node.js for my Lambda code.
Yes, you can use AWS Lambda's dead-letter queue (DLQ) feature. It sends the events from failed invocations to SNS/SQS, so you can review them and reprocess them from there.
Link to doc https://docs.aws.amazon.com/lambda/latest/dg/invocation-async.html (scroll to AWS Lambda function dead-letter queues)
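Attaching an SQS queue as the DLQ can also be done from code. Here is a minimal sketch with the AWS SDK v3, where the function name and queue ARN are placeholders and the function's execution role needs sqs:SendMessage on that queue:

```javascript
// Sketch only: point a function's dead-letter config at an existing SQS queue.
const { LambdaClient, UpdateFunctionConfigurationCommand } = require('@aws-sdk/client-lambda');

const lambda = new LambdaClient({});

async function attachDlq() {
  await lambda.send(new UpdateFunctionConfigurationCommand({
    FunctionName: 'my-db-writer', // placeholder function name
    DeadLetterConfig: {
      TargetArn: 'arn:aws:sqs:us-east-1:123456789012:my-db-writer-dlq', // placeholder ARN
    },
  }));
}
```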

How to read tags of SQS in AWS?

We are reading the subscriptions of an SNS topic using the listSubscriptionsByTopic() method, which returns all the subscriptions for that topic. The subscribers include various SQS queues, and we would like to take different actions based on the tags defined on each queue.
Is it possible to read the tags associated with a particular SQS queue? We are using the JavaScript SDK for AWS with Node.js.
The ListSubscriptionsByTopic() method returns a protocol and an endpoint for each subscription, which tell you which Amazon SQS queue is subscribed to the Amazon SNS topic.
You could then use the SQS ListQueueTags() call to retrieve tags associated with a particular queue.
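Put together, that could look roughly like the sketch below (AWS SDK v3; the topic ARN is a placeholder, and the ARN parsing assumes standard queue ARNs of the form arn:aws:sqs:<region>:<account>:<name>):

```javascript
// Sketch only: list a topic's SQS subscribers and fetch each queue's tags.
const { SNSClient, ListSubscriptionsByTopicCommand } = require('@aws-sdk/client-sns');
const { SQSClient, GetQueueUrlCommand, ListQueueTagsCommand } = require('@aws-sdk/client-sqs');

const sns = new SNSClient({});
const sqs = new SQSClient({});

async function tagsForSubscribedQueues(topicArn) {
  const { Subscriptions = [] } = await sns.send(
    new ListSubscriptionsByTopicCommand({ TopicArn: topicArn })
  );

  const results = {};
  for (const sub of Subscriptions.filter((s) => s.Protocol === 'sqs')) {
    // For an SQS subscription, the endpoint is the queue ARN
    const [, , , , accountId, queueName] = sub.Endpoint.split(':');
    const { QueueUrl } = await sqs.send(new GetQueueUrlCommand({
      QueueName: queueName,
      QueueOwnerAWSAccountId: accountId,
    }));
    const { Tags = {} } = await sqs.send(new ListQueueTagsCommand({ QueueUrl }));
    results[queueName] = Tags;
  }
  return results;
}

// Example usage (hypothetical topic ARN):
// tagsForSubscribedQueues('arn:aws:sns:us-east-1:123456789012:my-topic').then(console.log);
```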
