Sometimes my backend database goes offline, and the AWS Lambda execution that depends on it fails. Can I ask AWS to reprocess the same event at a later time, hoping the backend is back online by then? I'm using Node.js for my Lambda code.
Yes, you can use AWS Lambda's dead-letter queue (DLQ) feature. It sends the events from failed asynchronous invocations to an SNS topic or SQS queue, where you can review and reprocess them from there.
Link to the docs: https://docs.aws.amazon.com/lambda/latest/dg/invocation-async.html (scroll to "AWS Lambda function dead-letter queues")
Related
In the case of an event-driven serverless stack with an HTTPS call in a REST API and the trigger of another Lambda to answer over a WebSocket API, what is the better solution in an asynchronous event architecture: an SNS Lambda trigger, or an AWS SDK invoke?
I can clearly see reasons for choosing the AWS SDK invoke:
No need to implement another service, so it's easier
The invocation is made directly from the first Lambda's code (fast, no duplicates, ...)
So I can't see the usefulness of adding SNS in between, instead of which there is a lot more configuration.
Just for more information: the second Lambda is like a microservice that answers over the WebSocket.
I am trying to use AWS services to implement a real-time email-sending feature for one of my projects. Someone uses my app to schedule a reminder, and the email is then sent to them at (or close to) the time they scheduled.
I know about AWS services such as CloudWatch rules (crons) and DynamoDB Streams (TTL-based), but those are not a perfect fit for such a feature. Can anyone suggest a better way to implement it?
Any type of guidance is acceptable.
-- Thanks in advance.
Imagine your service at huge scale. At such scale, there are likely to be multiple messages going off every minute. Therefore, you could create:
A database that stores the reminder times (this could be DynamoDB or an Amazon RDS database)
An AWS Lambda function that is configured to trigger every minute
When the Lambda function is triggered, it should check the database to retrieve all reminders that should be sent for this particular minute. It can then use Amazon Simple Email Service (SES) to send emails.
If the number of emails to be sent is really big, then rather than having the Lambda function call SES in series, it could put a message into an Amazon SQS queue for each email to be sent. The SQS queue could then trigger another Lambda function that sends the email via SES. This allows the emails to be sent in parallel.
I do not want to wait for 14 seconds to read CloudWatch Lambda logs.
Is there any way to read them in real time from the invoking shell on premises (outside the VPC)?
Invoke async:

import json
import boto3

lam = boto3.client('lambda')
result = lam.invoke(
    InvocationType='Event',
    FunctionName='my-lambda-func',
    Payload=json.dumps(dict(test='test'))
)

Lambda writes its logs to CloudWatch Logs (the service in question).
Loop/wait for real-time results in the same or a separate shell.
A common way to use async invocation is to have your Lambda publish its results to an SQS queue or SNS topic. You can then poll the SQS queue on premises for results, or set up an HTTP endpoint subscription that SNS invokes automatically when it gets a message from the Lambda.
If you only want to focus on CloudWatch logs, you can set up a subscription filter on the log group for real-time processing of the incoming log events from the Lambda. Depending on your exact setup, you can use Kinesis Data Streams, Kinesis Data Firehose, or another Lambda to receive the log messages.
My requirement is to send data received in Lambda from DynamoDB to Azure Queue in node.js.
Steps taken - AWS Side
1. Created DDB Table
2. Added Stream and Trigger
3. Wrote Lambda
Steps taken - Azure Side
1. Created an Azure Queue (Service Bus)
So far so good. I can see the DDB events making their way to Lambda.
My question: now I want to send these events to the Azure queue, but I could not find anything about this online. Is it possible to put elements in an Azure queue from AWS Lambda?
You can use the Azure Service Bus REST API to send your messages:
POST http{s}://{serviceNamespace}.servicebus.windows.net/{queuePath|topicPath}/messages
https://learn.microsoft.com/en-us/rest/api/servicebus/send-message-to-queue
I'm working on a collaborative document project (basically a clone of Google Docs), where client programs post their actions to an API on Amazon's API Gateway, then receive messages about other clients' actions via an SQS queue. The API calls trigger Node.js Lambda functions that create a message and publish it to an SNS topic, which then notifies each client's SQS queue.
My current hurdle is in dynamically creating/destroying SQS queues for these clients as they join/leave a document, however my googlefu is weak and I have failed to find anything that could help me. I'd like to keep the queue management server-side and ideally in lambda, but if that's impossible I will accept other solutions.
You can simply use the AWS SDK for JavaScript in your AWS Lambda function (it's already preinstalled there) to manage almost any kind of AWS resource, including the creation and deletion of SQS queues that you need.