Accessing and editing the Pub/Sub topic on the Device Access Console - nest-api

I'm building automation for a Nest thermostat. The automation infrastructure is attempting to use GCP's Pub/Sub and Cloud Functions services.
When creating a new project on Google's Device Access Console, I don't see a way to update the Pub/Sub topic. Technically, there is an edit button, but the topic text field is greyed out and the string can't be changed. Nor do I see a way to access the auto-populated topic from inside my GCP project. As a result, I don't see how I can build a Cloud Function subscriber to the topic from my GCP project.
Interestingly, there is a way to create a Pub/Sub subscription, because that interface provides a field to manually enter the Pub/Sub topic that is shown on the Device Access Console. I've done this and verified that device data flows correctly.
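For reference, the same manual wiring can be scripted with the Node.js client library; a minimal sketch, assuming the topic string is the one copied from the Device Access Console (both names below are placeholders):

const {PubSub} = require('@google-cloud/pubsub');

// Runs with your GCP project's credentials; the topic lives in Google's project.
const pubsub = new PubSub({projectId: 'your-gcp-project'});
const sdmTopic = 'projects/sdm-prod/topics/enterprise-<your-device-access-project-id>';

async function main() {
  // The subscription is created in *your* project against the external topic.
  const [subscription] = await pubsub.createSubscription(sdmTopic, 'nest-device-events');
  console.log(`Created ${subscription.name}`);
}

main().catch(console.error);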
There is no manual entry option when creating a Function that subscribes to a topic.
When I click on the topic listed on the Pub/Sub subscription, I'm presented with an error message:
You do not have sufficient permissions to view this page
How can I build a Cloud Function that responds to Pub/Sub events for the device?

Related

Using Pub/Sub for Google Cloud Storage with GKE

I have a GKE application that currently is driven by Notifications from a Google Cloud Storage bucket. I want to convert this node.js application to be triggered instead by PubSub notifications. I've been crawling through Google documentation pages most of the day, and do not have a clear answer. I see some python code that might do it, but it's not helping much.
The code as currently written is working: an image landing in my GCS bucket triggers a notification to my GKE pod(s), and my function runs. I'm trying to understand what I need to do inside my function to subscribe to a Pub/Sub topic to trigger the processing. Any and all suggestions welcome.
Firstly, thanks! I didn't know about the notification capability of GCS!
The principle is close, but you use Pub/Sub as an intermediary. Instead of notifying your application directly with a watchbucket command, you notify a Pub/Sub topic.
From there, the notifications arrive in the Pub/Sub topic, and you then have to create a subscription. Two types are possible:
Push: you specify an HTTP URL that is called with a POST request, and the body contains the notification message.
Pull: your application needs to open a connection to the Pub/Sub subscription and read the messages.
Pros and cons
Push requires authentication from the Pub/Sub push subscription to your application. And if you use an internal IP, you can't use this solution (the URL endpoint must be publicly accessible). The main advantages are the scalability and the simplicity of the model.
Pull requires authentication of the subscriber (here, your application), and thus you can use a Pull subscription even if your application is deployed privately. Pull is recommended for high throughput, but it requires more skill in message processing and concurrency/multi-threaded programming. You don't scale on request rate (as with the Push model) but according to the number of messages that you read. And you need to acknowledge the messages manually.
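For illustration, a minimal Pull subscriber sketch in Node.js, assuming a subscription named gcs-notifications-sub is already attached to the notification topic (the name is an assumption):

const {PubSub} = require('@google-cloud/pubsub');

const subscription = new PubSub().subscription('gcs-notifications-sub');

subscription.on('message', message => {
  // The client library base64-decodes for you: message.data is a Buffer
  // holding the GCS object resource as JSON.
  const notification = JSON.parse(message.data.toString());
  console.log(`${message.attributes.eventType} on ${notification.name}`);
  message.ack(); // with Pull you must acknowledge each message explicitly
});
subscription.on('error', console.error);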
The data model is mentioned here. Your Pub/Sub message looks like this:
{
  "data": string,
  "attributes": {
    string: string,
    ...
  },
  "messageId": string,
  "publishTime": string,
  "orderingKey": string
}
The attributes are described in the documentation, and the payload (base64 encoded, be careful) has this format. It's very similar to what you get today.
So, why the attributes? Because you can use Pub/Sub's filter feature to create a subscription that receives only a subset of the messages.
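As a hedged sketch, creating such a filtered subscription with the Node.js client could look like this (topic and subscription names are assumptions; GCS sets an eventType attribute on each notification):

const {PubSub} = require('@google-cloud/pubsub');

async function createFilteredSubscription() {
  // Only OBJECT_FINALIZE (new object) notifications will be delivered.
  await new PubSub().createSubscription('gcs-notifications', 'finalize-only', {
    filter: 'attributes.eventType = "OBJECT_FINALIZE"',
  });
}

createFilteredSubscription().catch(console.error);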
You can also shift gears and use CloudEvents (based on Knative Eventing) if you use Cloud Run for Anthos in your GKE cluster. Here, the main advantage is the portability of the solution, because the messages comply with the CloudEvents format and are not specific to GCP.

Azure Service Bus Topic as Google PubSub Push Subscriber

The requirement is that we have an Azure Service Bus topic which we want to set up as a push subscriber to a Google Pub/Sub topic. This way, any messages published to the Google Pub/Sub topic will be pushed to the Azure SB topic without any intermediate layer involved.
On paper this should work, because messages can be published to an Azure SB topic through its REST API, and Google Pub/Sub can also be configured with an API endpoint as a push subscriber.
I have gone through the following articles but couldn't make this linkage work.
Azure SB as API: https://learn.microsoft.com/en-us/rest/api/servicebus/send-message-to-queue
Google PubSub Push Subscriptions: https://cloud.google.com/pubsub/docs/push
Has anyone done this kind of linking before?
Thanks in advance.
It would be nice to create a Pub/Sub push subscription which pushes to Azure Event Hub. I ran into the same problem when configuring this setup.
Pub/Sub push subscriptions currently do not support custom Authorization headers, which are needed as per the Event Hub documentation:
POST https://your-namespace.servicebus.windows.net/your-event-hub/messages?timeout=60&api-version=2014-01 HTTP/1.1
Authorization: SharedAccessSignature sr=your-namespace.servicebus.windows.net&sig=your-sas-key&se=1403736877&skn=RootManageSharedAccessKey
Content-Type: application/atom+xml;type=entry;charset=utf-8
Host: your-namespace.servicebus.windows.net
{ "DeviceId":"dev-01", "Temperature":"37.0" }
So these are the only two options I see here:
Push setup: create a Cloud Function or Dataflow job in Google Cloud, push Pub/Sub events to this endpoint, and then pass the event on to Azure Event Hub with the appropriate headers (see the sketch after this list).
Pull setup: poll a Pub/Sub pull subscription from the Azure side with an Azure Function or WebJob.
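For the push setup, a rough Cloud Functions sketch in Node.js that forwards each Pub/Sub event to the Event Hub REST endpoint shown above; the URL and the SAS token (read from an environment variable) are placeholders, not a definitive implementation:

const https = require('https');

const EVENT_HUB_URL = 'https://your-namespace.servicebus.windows.net/your-event-hub/messages';

// Background function with a Pub/Sub trigger: message.data is base64-encoded.
exports.forwardToEventHub = (message) => {
  const body = Buffer.from(message.data, 'base64').toString();
  return new Promise((resolve, reject) => {
    const req = https.request(EVENT_HUB_URL, {
      method: 'POST',
      headers: {
        'Authorization': process.env.SAS_TOKEN, // pre-generated SharedAccessSignature
        'Content-Type': 'application/atom+xml;type=entry;charset=utf-8',
      },
    }, res => (res.statusCode < 300 ? resolve() : reject(new Error(`HTTP ${res.statusCode}`))));
    req.on('error', reject);
    req.end(body);
  });
};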
Both options require extra compute resources, so this is definitely not the preferred way of doing it. I would always try a push setup first, since then you don't have to keep a polling job running continuously in the background.
I have hopes that Pub/Sub push subscriptions will support custom headers somewhere in the future. Lately some other useful features have been added, like dead lettering, message ordering, and partitioning (Lite topics). Hopefully they will add custom headers as well.
I have a workaround for this problem.
1. You can create a Cloud Function on Google Cloud that pushes data to Azure Service Bus for you.
2. You can develop a WebJob on Azure that runs continuously and checks the Google Pub/Sub topic, using the provided connection string and the Google Pub/Sub client libraries.
With either of the above solutions, you can get data from Google Cloud and push it to your Azure Service Bus.

How to consume events delivered by Azure Event Grid to GCP

Basically, what I understood from a few Azure topics is as below:
Azure Event Hub - where data is received initially and converted into events
Service Bus- acting as a queue
Azure Event Grid - where the events converted in the hub are transferred.
so the connection is like below:
Hub -> Service Bus -> Event Grid -> Pub Sub -> Storage
I understood this concept. My problem is that I want data to be pushed from Event Grid to GCP (subscriptions/topics). My questions are:
How can I establish this using the PUSH method?
What do I need to develop exactly?
How can I push things from the grid to Pub/Sub topics/subscriptions?
I found this link where data is getting published into Event Grid, but I want to push data from Event Grid to GCP. Can anybody explain to me where I am going wrong, or what exactly I should start with? I am new to this and it's very confusing, so I just need a little bit of guidance here.
I have the below doubts:
Is there any direct subscriber option available with the Event Grid listener? I mean, can I directly link my Google storage account with this listener so that whenever an event is triggered it is pushed straight to my GCP account? (I don't have an Azure account with me right now, since an access issue is in progress, so I can't check it myself; that's why I am asking here.)
Suppose I have 20 columns in my data but want only 16 of them pushed to GCP; is there any customization possible while sending data from Event Grid/Event Hub to Pub/Sub?
If I write custom connector code as per the links provided in the answers below, how can I run it? I mean, where can I deploy those scripts in the cloud so that they are triggered automatically whenever an event fires?
Can I implement webhooks in this scenario (as an alternative to connectors)? If yes, how can I do it, and on which side do I need to create them?
Also, I read some articles and came to know from a few people that they experienced data loss in this entire process. So, how likely is that here, and how can it be avoided?
Can anybody explain to me where I am going wrong, or what exactly I should start with?
It's right here:
so the connection is like below:
Hub -> Service Bus -> Event Grid -> Pub Sub -> Storage
Although this might be the case, it sounds very much as if you're looking at one (very) specific scenario where data flows in this exact way.
Azure Event Hub, Azure Service Bus and Azure Event Grid can work together, but can also be used completely separate from each other.
Event Grid
The purpose of Event Grid is to enable Reactive programming. Use this when you want to react to (status) changes.
Event Hubs
Event Hubs facilitate a big data pipeline. Use this when you need telemetry and distributed data streaming.
Service Bus
The purpose of Service Bus is to enable High-value enterprise messaging. Use this when you want to do something like Order processing and financial transactions.
In some cases, you use the services side by side to fulfill distinct roles. For example, an ecommerce site can use Service Bus to process the order, Event Hubs to capture site telemetry, and Event Grid to respond to events like an item was shipped.
In other cases, you link them together to form an event and data pipeline. You use Event Grid to respond to events in the other services. For an example of using Event Grid with Event Hubs to migrate data to a data warehouse, see Stream big data into a data warehouse.
Taken from the very interesting and important documentation article Choose between Azure messaging services - Event Grid, Event Hubs, and Service Bus
EDIT
My problem is that I want data to be pushed from Event Grid to GCP (subscriptions/topics). So how can I establish this using the PUSH method?
Possibly the simplest solution is to have an Event Grid Event trigger a webhook (which might run an Azure Function or a Google Cloud Function) which in turn puts the event/message on the GCP Topic.
Publishing messages is quite well documented. There are examples of how to do so with a REST call, the command line, C#, Go, Java, Node.js, PHP, Python, and Ruby.
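For instance, a minimal Node.js sketch of such a webhook handler, assuming a topic named from-eventgrid; note that a real Event Grid webhook endpoint must also answer the subscription validation handshake, which is omitted here:

const {PubSub} = require('@google-cloud/pubsub');

const topic = new PubSub().topic('from-eventgrid'); // assumed topic name

// E.g. deployed as an HTTP-triggered Cloud Function.
exports.eventGridWebhook = async (req, res) => {
  // Event Grid delivers an array of events per request.
  for (const event of req.body) {
    await topic.publishMessage({data: Buffer.from(JSON.stringify(event))});
  }
  res.status(204).send();
};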
EDIT 2
What you need to do is create an Event Grid Subscription to listen to and handle Event Grid Events.
Here's an example of how to listen for events from a specific Storage Account and call a WebHook whenever such an event occurs:
Pay attention to the "Endpoint Details": that's where you can specify, for instance, a webhook to call every time an event is triggered.
The easiest way to transfer the Event Hub-generated events would probably be to create an Event Hub event receiver in Node.js (which you mentioned in your comments) as described here, which receives events and publishes them to Cloud Pub/Sub directly, as described in the Cloud Pub/Sub publisher documentation for Node.js.
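A rough sketch of that receiver, assuming the @azure/event-hubs and @google-cloud/pubsub client libraries; the connection string, hub name, and topic name are placeholders:

const {EventHubConsumerClient} = require('@azure/event-hubs');
const {PubSub} = require('@google-cloud/pubsub');

const consumer = new EventHubConsumerClient(
  '$Default',                              // consumer group
  process.env.EVENTHUB_CONNECTION_STRING,  // placeholder
  'your-event-hub'                         // placeholder hub name
);
const topic = new PubSub().topic('your-topic'); // placeholder topic

consumer.subscribe({
  // Republish each Event Hub event body to Cloud Pub/Sub.
  processEvents: async (events) => {
    for (const event of events) {
      await topic.publishMessage({data: Buffer.from(JSON.stringify(event.body))});
    }
  },
  processError: async (err) => console.error(err),
});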

Can we subscribe an email ID or Cell number as subscriber to Azure event hubs/notification hubs?

In my Python application, if any bad/good event happens, I want to send the event details as a notification message to the email addresses or phone #s of users who have subscribed to this application. So I am looking for a publisher-subscriber model in the Azure cloud.
It looks like multiple Azure services achieve a similar goal, with only a thin line of difference between them. Event Hubs and Notification Hubs seem promising. So my questions are as follows:
Can an email ID/phone # be subscribed to an Azure Event Hub so that it receives the messages being sent/produced to that Event Hub?
If not Event Hubs, what is the correct option? Can I achieve this with Service Bus or Notification Hubs?
In AWS, there is a service called SNS (Simple Notification Service) where one can subscribe email/phone number and opt for receiving event messages about that application. I am looking for equivalent to that in Azure.
You can use Azure Logic Apps / Azure Functions with Event Hubs to achieve this easily.
Using Logic Apps, you can do it as simply as in the image below.
Logic Apps has many in-built connectors for almost all Azure services; you can use Event Hubs, Service Bus, SQL, etc.
You can find all the list of available connectors here
Update 1
Once you have connected Event Hubs to the Send an Email connector, you automatically get all the available source data from Event Hubs in the email task. See below.
You can achieve this by using Azure Application Insights. With it, you will be able to monitor your application and receive alerts during application unavailability, failures, or even performance issues.
Check this https://learn.microsoft.com/en-us/azure/application-insights/app-insights-tutorial-alert

How to forward a message from Azure estate to NodeJS app?

I'm creating an IoT solution. On my web app, I want a table on the dashboard that updates alert events in real time.
I currently have an API, written in NodeJS, that receives the JSON and invokes socket.io to update the table. This solution seems a bit clunky.
I'm wondering whether there is a more seamless way to do this, similar to how the different components within Azure link together.
I've looked into Azure Queues and having NodeJS subscribe to and consume from the queue, but as far as I could find there's no way to persist a queue connection from NodeJS, so I would have had to poll continually.
I've looked into Power BI as well but I need to be able to completely change and modify every aspect of the design too.
The following is what I have so far:
Devices send data to IoT Hub. This is then processed by an Azure Stream Analytics job, and if a certain criterion is hit, it sends the message to a DocumentDB for storage and also to a Service Bus queue. I have a Logic App which is triggered by a message arriving on the Service Bus queue and then POSTs the data to my API.
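For context, the API piece of this setup might look roughly like the sketch below (Express + socket.io; the route and event names are assumptions):

const express = require('express');
const http = require('http');
const {Server} = require('socket.io');

const app = express();
app.use(express.json());
const server = http.createServer(app);
const io = new Server(server);

// The Logic App POSTs each alert here; relay it to every connected dashboard.
app.post('/alerts', (req, res) => {
  io.emit('alert', req.body);
  res.sendStatus(204);
});

server.listen(3000);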
