Is it possible to subscribe to an SNS Topic indefinitely? - node.js

I'm interested in publishing SES analytics data to an SNS topic and then delivering the SNS messages somewhere (S3?) to store the data permanently.
Our control flow is as follows:
Create SES config set
Create SNS topic
Set config set destination to SNS topic
Publish topic to S3?
The SNS subscribe() documentation says that, after confirming the subscription, it will last for 3 days. I'd like to make it last indefinitely so we can gather email analytics for longer than 3 days.
If this is a reasonable approach, how would someone remove that expiration?
If this is the wrong approach, how should I approach storing SES analytics data permanently?
Thank you!

Going through SNS adds an unnecessary round trip.
SES (Events) --> Firehose --> (S3 / Redshift / Elasticsearch)
You can configure SES to send analytics data to Kinesis Firehose, and configure Kinesis Firehose to deliver to S3, Redshift, or Elasticsearch, depending on your needs.
SES Events to Firehose:
http://docs.aws.amazon.com/ses/latest/DeveloperGuide/event-publishing-retrieving-firehose-contents.html
Event Data Transformation with Lambda:
With an intermediate data-transformation step in Lambda, you can manipulate the data before it is sent to the desired destination.
http://docs.aws.amazon.com/firehose/latest/dev/data-transformation.html
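A minimal Node.js sketch of attaching a Firehose event destination to an SES configuration set, assuming aws-sdk v2 and that the configuration set, delivery stream, and IAM role (all names below are placeholders) already exist:

```js
// Sketch: attach a Kinesis Firehose event destination to an SES configuration set.
// The configuration set name, delivery stream ARN, and role ARN are placeholders.
const AWS = require('aws-sdk');
const ses = new AWS.SES({ region: 'us-east-1' });

ses.createConfigurationSetEventDestination({
  ConfigurationSetName: 'my-config-set', // placeholder
  EventDestination: {
    Name: 'firehose-analytics',
    Enabled: true,
    // Event types to publish; adjust to the analytics you need.
    MatchingEventTypes: ['send', 'delivery', 'open', 'click', 'bounce', 'complaint'],
    KinesisFirehoseDestination: {
      DeliveryStreamARN: 'arn:aws:firehose:us-east-1:123456789012:deliverystream/ses-events', // placeholder
      IAMRoleARN: 'arn:aws:iam::123456789012:role/ses-firehose-role' // placeholder
    }
  }
}).promise()
  .then(() => console.log('Event destination created'))
  .catch(console.error);
```

From there, the Firehose delivery stream's own configuration (not shown) decides whether the events land in S3, Redshift, or Elasticsearch.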
Hope it helps.

Related

Real time email feature using AWS services

I am trying to use AWS services to implement a real-time email-sending feature for one of my projects. For example, someone uses my app to schedule a reminder, and the email is then sent to them at (or near) the time they scheduled.
I know of AWS services such as CloudWatch rules (cron) and DynamoDB Streams (TTL-based), but those are not a perfect fit for such a feature. Can anyone suggest a better way to implement it?
Any type of guidance is welcome.
-- Thanks in advance.
Imagine your service at huge scale. At such scale, there are likely to be multiple messages going off every minute. Therefore, you could create:
A database that stores the reminder times (this could be DynamoDB or an Amazon RDS database)
An AWS Lambda function that is configured to trigger every minute
When the Lambda function is triggered, it should check the database to retrieve all reminders that should be sent for this particular minute. It can then use Amazon Simple Email Service (SES) to send emails.
If the number of emails to be sent is really big, then rather than having the Lambda function call SES in series, it could put a message into an Amazon SQS queue for each email to be sent. The SQS queue could then trigger another Lambda function that sends the email via SES. This allows the emails to be sent in parallel.
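A minimal sketch of the per-minute Lambda handler, assuming a hypothetical DynamoDB table named reminders whose partition key is a minute-resolution dueAt string, and an SQS queue for fan-out (the table, key schema, and queue URL are all placeholders):

```js
// Sketch: runs every minute (e.g. via a CloudWatch Events schedule),
// finds reminders due this minute, and enqueues one SQS message per email.
// Table name, key schema, and queue URL are hypothetical.
const AWS = require('aws-sdk');
const dynamo = new AWS.DynamoDB.DocumentClient();
const sqs = new AWS.SQS();

exports.handler = async () => {
  // Truncate the current time to the minute, e.g. "2024-01-01T12:34".
  const dueAt = new Date().toISOString().slice(0, 16);

  // Assumes a table whose partition key is the due minute.
  const { Items } = await dynamo.query({
    TableName: 'reminders',
    KeyConditionExpression: 'dueAt = :m',
    ExpressionAttributeValues: { ':m': dueAt }
  }).promise();

  // One queue message per reminder; a second Lambda function subscribed
  // to the queue calls SES, so the sends happen in parallel.
  await Promise.all((Items || []).map(item =>
    sqs.sendMessage({
      QueueUrl: process.env.QUEUE_URL, // placeholder
      MessageBody: JSON.stringify(item)
    }).promise()
  ));
};
```

Keying the table by the due minute keeps the per-invocation query cheap; a design that scans the whole table every minute would not scale.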

How to read the tags of an SQS queue in AWS?

We are reading the subscriptions of an SNS topic using the listSubscriptionsByTopic() method, which returns all the subscriptions for that topic. The subscribers include various SQS queues, and we would like to take different actions based on the tags defined on each queue.
Is it possible to read the tags associated with a particular SQS queue? We are using the AWS SDK for JavaScript on Node.js.
The listSubscriptionsByTopic() method returns a protocol and an endpoint for each subscription, which tell you which Amazon SQS queue is subscribed to the Amazon SNS topic.
You can then use the SQS listQueueTags() call to retrieve the tags associated with a particular queue. For example:
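A minimal sketch with aws-sdk v2. One wrinkle worth noting: the subscription's Endpoint is the queue ARN, while listQueueTags() wants the queue URL, so the ARN has to be resolved to a URL first (this assumes same-account queues and ignores pagination of the subscription list):

```js
// Sketch: collect the tags of every SQS queue subscribed to a topic.
const AWS = require('aws-sdk');
const sns = new AWS.SNS();
const sqs = new AWS.SQS();

async function tagsForTopicSubscribers(topicArn) {
  const { Subscriptions } = await sns
    .listSubscriptionsByTopic({ TopicArn: topicArn })
    .promise();

  const results = {};
  for (const sub of Subscriptions.filter(s => s.Protocol === 'sqs')) {
    // Endpoint is the queue ARN, e.g. arn:aws:sqs:us-east-1:123456789012:my-queue.
    // The queue name is the last ARN segment; resolve it to a queue URL.
    const queueName = sub.Endpoint.split(':').pop();
    const { QueueUrl } = await sqs.getQueueUrl({ QueueName: queueName }).promise();
    const { Tags } = await sqs.listQueueTags({ QueueUrl }).promise();
    results[sub.Endpoint] = Tags || {};
  }
  return results;
}

tagsForTopicSubscribers('arn:aws:sns:us-east-1:123456789012:my-topic') // placeholder ARN
  .then(console.log)
  .catch(console.error);
```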

Sending notifications to Amazon SNS on AWS DMS task progress

Is it possible to pull an AWS DMS replication task's percentage progress and use it in an AWS Lambda function that sends notifications to Amazon SNS for, let's say, every 10% completion? I couldn't find any related metrics/event categories while browsing. Thanks.
You don't need a Lambda function here. You can monitor the progress using CloudWatch and then set up an SNS notification.
Monitoring: https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Monitoring.html
SNS for CloudWatch: https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/US_SetupSNS.html
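As a rough sketch of that wiring: percent-complete itself is not published as a CloudWatch metric (it is visible via the DMS describeReplicationTasks API), but DMS does publish task metrics under the AWS/DMS namespace that you can alarm on and route to SNS. All identifiers and the threshold below are placeholders:

```js
// Sketch: alarm on a DMS task metric and notify an SNS topic.
// CDCLatencySource is used here as an example AWS/DMS metric; the
// instance/task identifiers, threshold, and topic ARN are placeholders.
const AWS = require('aws-sdk');
const cloudwatch = new AWS.CloudWatch();

cloudwatch.putMetricAlarm({
  AlarmName: 'dms-task-source-latency',
  Namespace: 'AWS/DMS',
  MetricName: 'CDCLatencySource',
  Dimensions: [
    { Name: 'ReplicationInstanceIdentifier', Value: 'my-instance' }, // placeholder
    { Name: 'ReplicationTaskIdentifier', Value: 'my-task' }          // placeholder
  ],
  Statistic: 'Average',
  Period: 300,
  EvaluationPeriods: 1,
  Threshold: 60, // seconds of source latency; placeholder value
  ComparisonOperator: 'GreaterThanThreshold',
  AlarmActions: ['arn:aws:sns:us-east-1:123456789012:dms-alerts'] // placeholder topic
}).promise()
  .then(() => console.log('Alarm created'))
  .catch(console.error);
```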

AWS S3 events don't support multiple destinations for the same bucket

I want an S3 put on a bucket to notify multiple services, i.e. Lambda, SQS, and SNS, so that each can perform a different task. I managed to have a single service triggered on a bucket put, but now I want the same put event to reach multiple services.
How can I achieve this? I googled it with no luck.
Any help will be appreciated.
Thank you in advance.
Your best bet is the fanout scenario.
Create an SNS topic and have your bucket publish a message to this topic on the put event.
Lambda and SQS can both subscribe to this SNS topic. Whenever the topic gets a message, it is received by all subscribers: the Lambda function is invoked and the queue receives the message, along with any other subscribers. A sketch of the wiring follows.
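A minimal Node.js sketch of the fanout setup, assuming the topic, queue, and function already exist (all ARNs and the bucket name are placeholders). The required access policies (an SNS topic policy allowing S3 to publish, an SQS queue policy allowing SNS to send, and lambda.addPermission for SNS) are omitted for brevity:

```js
// Sketch: fan out S3 ObjectCreated:Put events through SNS to SQS and Lambda.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();
const sns = new AWS.SNS();

const topicArn = 'arn:aws:sns:us-east-1:123456789012:s3-put-fanout'; // placeholder

async function wireFanout() {
  // 1. The bucket publishes put events to the SNS topic.
  await s3.putBucketNotificationConfiguration({
    Bucket: 'my-bucket', // placeholder
    NotificationConfiguration: {
      TopicConfigurations: [
        { TopicArn: topicArn, Events: ['s3:ObjectCreated:Put'] }
      ]
    }
  }).promise();

  // 2. A queue and a function both subscribe to the same topic,
  //    so every put event reaches both.
  await sns.subscribe({
    TopicArn: topicArn,
    Protocol: 'sqs',
    Endpoint: 'arn:aws:sqs:us-east-1:123456789012:my-queue' // placeholder
  }).promise();
  await sns.subscribe({
    TopicArn: topicArn,
    Protocol: 'lambda',
    Endpoint: 'arn:aws:lambda:us-east-1:123456789012:function:my-fn' // placeholder
  }).promise();
}

wireFanout().catch(console.error);
```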

Designing SNS receiver in python script

Hi, I am new to Python as well as the AWS SNS service.
I want to develop a Python utility that acts as both a subscriber and a publisher for SNS, to automate some simulations.
I am stuck where I need to design the endpoint that acts as the SNS message receiver.
Could you please guide me on this topic?
AWS Lambda is a nice service for such scenarios, and it is also quite cheap, so it will keep your AWS bill low.
You need a publisher and a subscriber for SNS in Python, so use AWS Lambda, which gives you straightforward connectivity with SNS; there are a few samples for subscribers that will help you configure this.
So now your events get into SNS, and through the Lambda subscription you can invoke the function and take action accordingly.
Event ---> SNS ---> Lambda Subscriber ---> Python Functions
Here is a nice reference for the same.
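A minimal sketch of the subscriber handler (shown in Node.js to match the rest of this page; the SNS event structure delivered to the handler is identical in Python):

```js
// Sketch: Lambda handler invoked by an SNS subscription.
// SNS delivers a batch of Records, each carrying the published message.
exports.handler = async (event) => {
  for (const record of event.Records) {
    const subject = record.Sns.Subject;
    const message = record.Sns.Message;
    console.log('Received from SNS:', subject, message);
    // ...dispatch to your simulation logic here...
  }
};
```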
For the publisher, you can use a scheduled Lambda function that keeps polling for some event to occur and then, depending on the event, sends the notification to the same SNS endpoint or a different one.
Event <--(Polling)- Lambda Publisher (Cron Based) --(Event Occurred)-> SNS
Here is the AWS Tutorial for the same.
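And a matching sketch of the scheduled publisher, where the topic ARN and the polling check (checkForEvent) are hypothetical placeholders:

```js
// Sketch: cron-based Lambda that polls for a condition and, when it
// occurs, publishes a notification to an SNS topic.
const AWS = require('aws-sdk');
const sns = new AWS.SNS();

exports.handler = async () => {
  const eventOccurred = await checkForEvent(); // hypothetical polling check
  if (eventOccurred) {
    await sns.publish({
      TopicArn: 'arn:aws:sns:us-east-1:123456789012:simulation-events', // placeholder
      Subject: 'Simulation event',
      Message: JSON.stringify({ occurredAt: new Date().toISOString() })
    }).promise();
  }
};

async function checkForEvent() {
  // Placeholder: query your data source / simulation state here.
  return false;
}
```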
I hope this helps.
PS: Just keep in mind that a Lambda function's runtime cannot exceed a certain limit, which is currently 5 minutes.
