Send Slack message on Firebase Cloud Functions deployment event - node.js

How can I send a Slack message whenever a deployment happens on Firebase Cloud Functions?
I need to send the following data in the message: project ID, functions deployed, and deployment message if any.
I know how to send Slack messages via its API, but I'm not aware of any event that gets triggered on a Firebase deployment.

There is no simple, automatic way to accomplish this within Firebase, but there is a way to implement it.
Create a logging sink that matches the audit logs produced when a function is deployed (google.cloud.functions.v1.CloudFunctionsService.CreateFunction) and when a function is updated (google.cloud.functions.v1.CloudFunctionsService.UpdateFunction). Route that sink to a Pub/Sub topic as mentioned in this document, then create a Cloud Function triggered by that Pub/Sub topic as described in this document, so the Cloud Function is called on every new deployment. The entire log entry is passed into the Cloud Function, which includes the project ID, the functions deployed, and so on; within the Cloud Function you can then call the Slack API to send the message.
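As a rough sketch of that last step, the Pub/Sub-triggered Cloud Function could look like this in Node.js. It assumes a Slack incoming-webhook URL in a SLACK_WEBHOOK_URL environment variable and a Node.js 18+ runtime (for the global fetch); the audit-log field paths shown are the usual ones for Cloud Audit Logs, but verify them against a real entry:

// Pub/Sub-triggered Cloud Function: parses the audit-log entry routed by the
// logging sink and posts a summary to Slack.
exports.notifySlackOnDeploy = async (message) => {
  // The sink delivers the full LogEntry, base64-encoded in message.data.
  const entry = JSON.parse(Buffer.from(message.data, "base64").toString());

  const projectId = entry.resource?.labels?.project_id;
  const method = entry.protoPayload?.methodName;   // CreateFunction or UpdateFunction
  const fnName = entry.protoPayload?.resourceName; // projects/.../functions/NAME

  await fetch(process.env.SLACK_WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      text: `Deployment in ${projectId}: ${method} on ${fnName}`,
    }),
  });
};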

Related

How to read ServiceBus Queue message in Azure function using Logic App?

I'm trying to implement a PoC with the following scenario:
Trigger the Logic App whenever a message arrives in a Service Bus queue. The message will be sent to and read by an Azure Function, which is the next action within the Logic App. After performing some business processing, the Azure Function returns its response to the Logic App. Based on that response, the Logic App triggers a few more functions, and they send the response back to the Service Bus queue.
I'm able to trigger the Logic App when a message arrives in the Service Bus queue. Since I'm a newbie to Azure and Logic Apps, I'm not sure how to pass the message to an Azure Function within the Logic App so it can read it and perform business validation.
Thanks in advance!
Firstly, you need to connect Service Bus to the Azure Logic App. Then, in the next step, click the (+) symbol, type "Azure Functions", and follow this process:
1. Click on your Function App.
2. Click on the Azure Function you have in that Function App.
3. Click on Request Body and then on Service Bus Message.
This way you can call an Azure Function from an Azure Logic App.
References:
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-azure-functions?tabs=consumption
Expose APIs from functions using Azure API Management | Microsoft Docs
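On the function side, here is a minimal sketch of an HTTP-triggered Azure Function (Node.js v3 programming model) that reads the Service Bus message content the Logic App passes in and returns a response for the next actions; the orderId check is a hypothetical placeholder for your business validation:

// index.js -- HTTP-triggered Azure Function. The Logic App's Azure Functions
// action posts the Service Bus message content as the request body.
module.exports = async function (context, req) {
  const message = req.body; // Service Bus message content from the Logic App
  context.log("Received message:", message);

  // Hypothetical business validation -- replace with real rules.
  const isValid = message && message.orderId !== undefined;

  // This response body flows back into the Logic App for the next actions.
  context.res = {
    status: 200,
    body: { valid: isValid },
  };
};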

Using Pub/Sub for Google Cloud Storage with GKE

I have a GKE application that is currently driven by notifications from a Google Cloud Storage bucket. I want to convert this Node.js application to be triggered by Pub/Sub notifications instead. I've been crawling through Google's documentation pages most of the day and don't have a clear answer. I see some Python code that might do it, but it's not helping much.
The code as currently written works: an image landing in my GCS bucket triggers a notification to my GKE pod(s), and my function runs. I'm trying to understand what I need to do inside my function to subscribe to a Pub/Sub topic and trigger the processing. Any and all suggestions welcome.
Firstly, thanks, I didn't know about the notification capability of GCS!
The principle is similar, but you use Pub/Sub as an intermediary. Instead of notifying your application directly with a watchbucket command, you notify a Pub/Sub topic.
From there, the notifications arrive in the Pub/Sub topic; now you have to create a subscription. Two types are possible:
Push: you specify an HTTP URL that is called with a POST request, and the body contains the notification message.
Pull: your application needs to open a connection to the Pub/Sub subscription and read the messages.
Pros and cons
Push requires authentication from the Pub/Sub push subscription to your application, and if you use an internal IP you can't use this solution (the URL endpoint must be publicly accessible). The main advantages are the scalability and the simplicity of the model.
Pull requires authentication of the subscriber (here, your application), so even if your application is deployed privately you can use a pull subscription. Pull is recommended for high throughput but requires more skill in concurrent/multi-threaded processing. You don't scale on request rate (as with the push model) but on the number of messages you read, and you need to acknowledge the messages manually.
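As a minimal sketch of the pull model with the Node.js client (@google-cloud/pubsub), assuming a subscription named gcs-events-sub already exists and the pod's service account can read it:

const { PubSub } = require("@google-cloud/pubsub");

// Open a streaming pull connection to an existing subscription.
const subscription = new PubSub().subscription("gcs-events-sub");

subscription.on("message", (message) => {
  // message.data is a Buffer; for GCS notifications it holds JSON (see below).
  console.log(`Received ${message.id}:`, message.data.toString());
  message.ack(); // pull subscribers must acknowledge explicitly
});

subscription.on("error", (err) => {
  console.error("Subscription error:", err);
});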
The data model is described here. Your Pub/Sub message looks like this:
{
  "data": string,
  "attributes": {
    string: string,
    ...
  },
  "messageId": string,
  "publishTime": string,
  "orderingKey": string
}
The attributes are described in the documentation, and the payload (base64-encoded, be careful) has this format, very similar to what you get today.
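For example, in a push endpoint the data field arrives base64-encoded inside the request body; a sketch with Express (the endpoint path and port are illustrative):

const express = require("express");
const app = express();
app.use(express.json());

app.post("/pubsub", (req, res) => {
  // Push delivery wraps the Pub/Sub message; data is base64-encoded.
  const payload = JSON.parse(
    Buffer.from(req.body.message.data, "base64").toString("utf8")
  );
  // For GCS notifications this is the object metadata (bucket, name, ...).
  console.log(payload.bucket, payload.name);
  res.status(204).send(); // a 2xx response acknowledges the message
});

app.listen(8080);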
So, why the attributes? Because you can use Pub/Sub's filter feature to create a subscription that receives only a subset of the messages.
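A sketch of creating such a filtered subscription with the Node.js client (the topic and subscription names are placeholders; the eventType attribute comes from the GCS notification format):

const { PubSub } = require("@google-cloud/pubsub");

// Only messages whose attributes match the filter are delivered.
async function createFilteredSubscription() {
  await new PubSub()
    .topic("gcs-notifications")
    .createSubscription("finalize-only-sub", {
      filter: 'attributes.eventType = "OBJECT_FINALIZE"',
    });
}

createFilteredSubscription().catch(console.error);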
You can also shift gears and use CloudEvents (based on Knative Eventing) if you use Cloud Run for Anthos in your GKE cluster. The main advantage here is the portability of the solution, because the messages comply with the CloudEvents format and are not specific to GCP.

Accessing and editing the Pub/Sub topic on the Device Access Console

I'm building automation for a Nest thermostat. The automation infrastructure is attempting to use GCP's Pub/Sub and Cloud Functions services.
When creating a new project on Google's Device Access Console I don't see a way to update the Pub/Sub topic. Technically, there is an edit button, but the topic text field is greyed out and the string can't be changed. Nor do I see a way to access the auto-populated topic from inside my GCP project. As a result, I don't see how I can build a Function subscriber to the topic from my GCP project.
Interestingly, there is a way to create a Pub/Sub subscriber because that interface provides a method to manually enter the Pub/Sub topic that is shown on the Device Access Console. I've done this and verified that device data flows correctly.
There is no manual entry option when creating a Function that subscribes to a topic.
When I click on the topic listed on the Pub/Sub subscriber, I'm presented with an error message:
You do not have sufficient permissions to view this page
How can I build a Cloud Function that responds to Pub/Sub events for the device?

GCP: Is it possible for a Cloud Function to trigger when an alert fires?

I am using Google Cloud Monitoring on Google Cloud Platform.
I have created some alert policies for the objects I monitor. However, when an alert fires, some pieces of information that I want are not included in the email. So I am thinking of using a Cloud Function that triggers on one of the policies I have created, if that is possible in this case.
If it is possible, please advise on this issue.
Cloud Monitoring supports using Pub/Sub as a notification channel: https://cloud.google.com/monitoring/support/notification-options#pubsub
You should be able to write a Cloud Function with a Pub/Sub trigger that responds to these events: https://cloud.google.com/functions/docs/calling/pubsub
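As a rough sketch, that Cloud Function could look like this; the incident fields shown (policy_name, state, summary) follow the Monitoring Pub/Sub notification payload, but log the raw payload first to confirm the exact shape:

// Pub/Sub-triggered Cloud Function receiving Cloud Monitoring notifications.
exports.onAlert = async (message) => {
  const payload = JSON.parse(Buffer.from(message.data, "base64").toString());
  const incident = payload.incident || {};

  console.log(
    `Alert "${incident.policy_name}" is ${incident.state}: ${incident.summary}`
  );
  // From here, enrich the alert and send it wherever the email falls short.
};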

Is it possible to call Azure Queue from AWS Lambda

My requirement is to send data received in Lambda from DynamoDB to an Azure queue, using node.js.
Steps taken - AWS Side
1. Created DDB Table
2. Added Stream and Trigger
3. Wrote Lambda
Steps taken - Azure Side
1. Created an Azure Queue (Service Bus)
So far so good. I can see DDB events making their way to Lambda.
My question is: now I want to send these events to the Azure queue, but I could not find anything about this online. Is it possible to put elements into an Azure queue from AWS Lambda?
You can use the Azure Service Bus REST API to send your messages:
POST http{s}://{serviceNamespace}.servicebus.windows.net/{queuePath|topicPath}/messages
https://learn.microsoft.com/en-us/rest/api/servicebus/send-message-to-queue
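A sketch of a Lambda handler doing exactly that, assuming a Node.js 18+ runtime (for the global fetch); the namespace, queue, and key names are placeholders, and the SAS token format follows the Service Bus documentation:

const crypto = require("crypto");

// Build a Service Bus SAS token: HMAC-SHA256 over "<encoded-uri>\n<expiry>".
function createSasToken(resourceUri, keyName, key) {
  const expiry = Math.floor(Date.now() / 1000) + 3600; // valid for one hour
  const stringToSign = `${encodeURIComponent(resourceUri)}\n${expiry}`;
  const sig = crypto.createHmac("sha256", key).update(stringToSign).digest("base64");
  return (
    `SharedAccessSignature sr=${encodeURIComponent(resourceUri)}` +
    `&sig=${encodeURIComponent(sig)}&se=${expiry}&skn=${keyName}`
  );
}

exports.handler = async (event) => {
  const queueUri = "https://mynamespace.servicebus.windows.net/myqueue";
  const token = createSasToken(queueUri, "RootManageSharedAccessKey", process.env.SB_KEY);

  // Forward each DynamoDB stream record to the Service Bus queue.
  for (const record of event.Records) {
    await fetch(`${queueUri}/messages`, {
      method: "POST",
      headers: { Authorization: token, "Content-Type": "application/json" },
      body: JSON.stringify(record.dynamodb),
    });
  }
};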
