Is it possible to call Azure Queue from AWS Lambda - node.js

My requirement is to send data that a Lambda function receives from DynamoDB to an Azure queue, using node.js.
Steps taken - AWS Side
1. Created DDB Table
2. Added Stream and Trigger
3. Wrote Lambda
Steps taken - Azure Side
1. Created an Azure Queue (Service Bus)
So far so good. I can see DDB events making their way to Lambda.
Now I want to send these events to an Azure queue, but I could not find anything about this online. Is it possible to put elements in an Azure queue from AWS Lambda?

You can use the Azure Service Bus REST API to send your messages:
POST http{s}://{serviceNamespace}.servicebus.windows.net/{queuePath|topicPath}/messages
https://learn.microsoft.com/en-us/rest/api/servicebus/send-message-to-queue

Related

How to read ServiceBus Queue message in Azure function using Logic App?

I'm trying to implement a PoC with the following scenario:
Trigger the Logic App whenever a message arrives in the Service Bus queue. The message is then read by an Azure Function, which is the next action within the Logic App. After performing some business processing, the Azure Function returns its response to the Logic App. Based on that response, the Logic App triggers a few more functions, which then send a response back to the Service Bus queue.
I'm able to trigger the Logic App when a message arrives in the Service Bus queue. Since I'm a newbie to Azure and Logic Apps, I'm not sure how I can pass the message to an Azure Function within the Logic App so it can read it and perform the business validation.
Thanks in advance!
Firstly, you need to connect Service Bus to the Azure Logic App. Then, in the next step, click the (+) symbol, type "Azure Function", and follow the process below:
1. Click on your Function App.
2. Click on the Azure Function you have in your Function App.
3. Click on the request body and then on "Service Bus Message".
This way you can call an Azure Function from an Azure Logic App.
References:
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-azure-functions?tabs=consumption
Expose APIs from functions using Azure API Management | Microsoft Docs

Send slack message on Firebase Cloud Functions deployment event

How can I send a Slack message whenever a deployment happens on Firebase Cloud Functions?
I need to send the following data in the message: project id, functions deployed, and the deployment message, if any.
I know how to send Slack messages via its API, but I'm not aware of any event that gets triggered on a Firebase deployment.
There is no built-in, automatic way to accomplish this within Firebase, but you can implement it yourself:
1. Create a logging sink that matches the audit log produced when a function is deployed (google.cloud.functions.v1.CloudFunctionsService.CreateFunction) and when a function is updated (google.cloud.functions.v1.CloudFunctionsService.UpdateFunction).
2. Use the logging sink to publish to a Pub/Sub topic, as described in this document.
3. Create a Cloud Function triggered by that Pub/Sub topic, as described in this document, so the Cloud Function runs on every new deployment.
The entire log entry is passed into the Cloud Function, which includes the project id, the functions deployed, etc. From within the Cloud Function you can then call the Slack API to send the message.
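The Pub/Sub-triggered function could look roughly like this. This is a sketch: the field names follow the Cloud Audit Log entry format (`protoPayload.methodName`, `protoPayload.resourceName`), and the actual Slack webhook call is left as a placeholder:

```javascript
// Extract the pieces we care about from one audit LogEntry.
function summarizeDeployment(logEntry) {
  const payload = logEntry.protoPayload || {};
  // resourceName looks like: projects/{project}/locations/{region}/functions/{name}
  const parts = (payload.resourceName || '').split('/');
  return {
    projectId: parts[1],
    functionName: parts[5],
    action: payload.methodName, // ...CreateFunction or ...UpdateFunction
  };
}

// Entry point for the Pub/Sub-triggered Cloud Function (export this as the
// function's handler when you deploy it). The log entry arrives base64-encoded
// in the Pub/Sub message data.
async function notifySlack(pubsubMessage) {
  const entry = JSON.parse(Buffer.from(pubsubMessage.data, 'base64').toString());
  const info = summarizeDeployment(entry);
  const text = `Deployed ${info.functionName} in ${info.projectId} (${info.action})`;
  // POST `text` to your Slack incoming-webhook URL here.
}
```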

AWS lambda event reprocess request

Sometimes my backend database goes offline, and the AWS Lambda execution that requires this backend fails. Can I ask AWS to reprocess the same event at a later time, hoping the backend is back online by then? I'm using node.js for my Lambda code.
Yes, you can use AWS Lambda's dead-letter queue (DLQ) feature. It sends all failed asynchronous invocations to SNS/SQS; you can review them and reprocess them from there.
Link to the docs: https://docs.aws.amazon.com/lambda/latest/dg/invocation-async.html (scroll to "AWS Lambda function dead-letter queues")
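A dead-letter queue is attached by pointing the function's `DeadLetterConfig` at an SQS queue (or SNS topic) ARN. A minimal sketch, with placeholder names; the Lambda client is injected so the helper is easy to exercise with a stub, and in a real script you would pass `new AWS.Lambda()` from the aws-sdk:

```javascript
// Attach an SQS queue as the function's DLQ via UpdateFunctionConfiguration.
// `lambda` is an AWS SDK Lambda client (pre-installed in the Lambda runtime).
async function attachDlq(lambda, functionName, queueArn) {
  return lambda.updateFunctionConfiguration({
    FunctionName: functionName,
    DeadLetterConfig: { TargetArn: queueArn },
  }).promise();
}
```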

Azure Service Bus Topic as Google PubSub Push Subscriber

The requirement is that we have an Azure Service Bus topic which we want to set up as a push subscriber on a Google Pub/Sub topic. That way, any message published to the Google Pub/Sub topic is pushed to the Azure SB topic without any intermediate layer involved.
On paper this should work, because messages can be published to an Azure SB topic via its REST API, and Google Pub/Sub can be configured with an HTTPS endpoint as a push subscriber.
I have gone through the following articles but couldn't make this linking work:
Azure SB as API: https://learn.microsoft.com/en-us/rest/api/servicebus/send-message-to-queue
Google PubSub Push Subscriptions: https://cloud.google.com/pubsub/docs/push
Has anyone done this kind of linking before?
Thanks in Advance
It would be nice to create a Pub/Sub push subscription that pushes straight to Azure Event Hubs; I ran into the same problem when configuring this setup.
Pub/Sub push subscriptions currently do not support custom Authorization headers, which are needed per the Event Hubs documentation:
POST https://your-namespace.servicebus.windows.net/your-event-hub/messages?timeout=60&api-version=2014-01 HTTP/1.1
Authorization: SharedAccessSignature sr=your-namespace.servicebus.windows.net&sig=your-sas-key&se=1403736877&skn=RootManageSharedAccessKey
Content-Type: application/atom+xml;type=entry;charset=utf-8
Host: your-namespace.servicebus.windows.net
{ "DeviceId":"dev-01", "Temperature":"37.0" }
So, the only two options I see here are:
Push setup: Create a cloud function or dataflow job in Google Cloud. Push pub/sub events to this endpoint and then pass on the event to Azure Event hub with the appropriate headers.
Pull setup: Poll a pub/sub pull subscription from the Azure side with an Azure function or WebJob.
Both options require extra compute resources, so this is definitely not the preferred way of doing it. I would always try a push setup first, since then you don't have to continuously run a polling job in the background.
I hope that Pub/Sub push subscriptions will support custom headers at some point in the future. Lately some other useful features have been added, like dead lettering, message ordering, and partitioning (Lite topics). Hopefully they will add custom headers as well.
I have a workaround for this problem:
1. You can create a Cloud Function on Google Cloud that pushes the data to Azure Service Bus for you.
2. You can develop a WebJob on Azure that runs continuously and checks the Google Pub/Sub topic, using the provided connection string and the Google Pub/Sub client libraries.
With either of the solutions above, you can get data from Google Cloud and push it to your Azure Service Bus.

Creating SQS Queues with lambda

I'm working on a collaborative document project (basically a clone of Google Docs), where client programs post their actions to an API on Amazon's API Gateway, then receive messages about other clients' actions via an SQS queue. The API calls trigger Node.js Lambda functions that create a message and publish it to an SNS topic, which then notifies each client's SQS queue.
My current hurdle is dynamically creating/destroying SQS queues for these clients as they join/leave a document; my google-fu is weak and I have failed to find anything that could help. I'd like to keep the queue management server-side, and ideally in Lambda, but if that's impossible I will accept other solutions.
You can simply use the AWS SDK for JavaScript in your AWS Lambda function (it's pre-installed there) to manage any kind of AWS resource, including the requested creation and deletion of SQS queues.
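A minimal sketch of per-client queue management inside a Lambda handler. The queue-naming scheme is an invented example, and the SQS client is passed in so the helpers can be unit-tested with a stub; in the Lambda itself you would create it with `const sqs = new (require('aws-sdk').SQS)();`:

```javascript
// Derive a deterministic per-client queue name (example scheme).
function queueNameFor(docId, clientId) {
  return `doc-${docId}-client-${clientId}`;
}

// Create the client's queue when it joins a document; the returned URL
// is what you would then subscribe to the document's SNS topic.
async function createClientQueue(sqs, docId, clientId) {
  const { QueueUrl } = await sqs
    .createQueue({ QueueName: queueNameFor(docId, clientId) })
    .promise();
  return QueueUrl;
}

// Tear the queue down when the client leaves.
async function deleteClientQueue(sqs, queueUrl) {
  await sqs.deleteQueue({ QueueUrl: queueUrl }).promise();
}
```

Deterministic names make the create call idempotent: `createQueue` with an existing name and identical attributes simply returns the existing queue's URL.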
