We have a small in-house application that uses an Azure Easy Table holding a list of orders.
I would like a notification to be sent to the other PCs running the application whenever one person changes anything in this table, so they can update that item.
Is there any way to monitor the Easy Table for changes and then send an update notification, or to change the Easy Table's .js file to do something on update?
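For context, an Easy Table is backed by a Node.js table script, so one way to act on updates is to hook the table's update operation in that .js file and fire a push from there. A minimal sketch, assuming an azure-mobile-apps backend with a notification hub configured (the file name orders.js and the 'order-updated' payload are made up for illustration):

```js
// orders.js - Easy Table script for the orders table.
var azureMobileApps = require('azure-mobile-apps');

var table = azureMobileApps.table();

table.update(function (context) {
    // Let the update hit the database first, then notify clients.
    return context.execute().then(function (result) {
        // context.push is the Notification Hubs client; it is only
        // available when a hub is configured for the mobile app.
        if (context.push) {
            context.push.send(null, { message: 'order-updated', id: context.item.id }, function (error) {
                if (error) {
                    console.error('Push failed:', error);
                }
            });
        }
        return result;
    });
});

module.exports = table;
```

Each PC running the application would register with the hub and, on receiving the notification, re-fetch the affected item.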
Is there a way to trigger a Logic App on a deletion of a record in an Azure SQL table?
I've checked the SQL connector, and there are only When an item is created and When an item is modified triggers, which give me the C and U in CRUD; sadly, there isn't an out-of-the-box trigger for the D.
I can think of some awful polling-based ways to get record deletions, but I'm hoping some bright person has come up with a cleaner solution; so far I've had no joy with Google searching.
I would look at Azure Event Grid. Event Grid lets you easily build applications with event-based architectures: first select the Azure resource you would like to subscribe to, then give the event handler or WebHook endpoint to send the event to. Event Grid has built-in support for events coming from Azure services, like storage blobs and resource groups, and it also supports your own events via custom topics.
I would suggest monitoring the resource group and triggering off the deletion event from the RG. There is a tutorial that shows this same concept with a VM, but you should be able to modify it to meet your needs with an Azure SQL DB.
https://learn.microsoft.com/en-us/azure/event-grid/monitor-virtual-machine-changes-event-grid-logic-app
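Note that resource events fire when the database resource itself is deleted, not when a row inside it is deleted. For illustration, the event the Logic App would receive is roughly shaped like this (ids and names are placeholders), so you can filter on eventType or operationName inside the Logic App:

```json
{
  "topic": "/subscriptions/{subscription-id}/resourceGroups/my-rg",
  "subject": "/subscriptions/{subscription-id}/resourceGroups/my-rg/providers/Microsoft.Sql/servers/my-server/databases/my-db",
  "eventType": "Microsoft.Resources.ResourceDeleteSuccess",
  "eventTime": "2018-01-01T00:00:00Z",
  "id": "00000000-0000-0000-0000-000000000000",
  "data": {
    "operationName": "Microsoft.Sql/servers/databases/delete",
    "status": "Succeeded"
  },
  "dataVersion": "2",
  "metadataVersion": "1"
}
```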
I added an ON DELETE trigger that inserts the id of the deleted record into a secondary table, and I have the Logic App look for changes on that secondary table.
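For anyone replicating this, a minimal sketch of such a trigger; the table and column names are illustrative, and the secondary table needs an identity column of its own so the SQL connector can spot new rows:

```sql
-- Copy the id of every deleted Orders row into DeletedOrders,
-- which the Logic App then watches instead of Orders itself.
CREATE TRIGGER trgOrdersAfterDelete
ON dbo.Orders
AFTER DELETE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.DeletedOrders (OrderId, DeletedAtUtc)
    SELECT Id, SYSUTCDATETIME()
    FROM deleted;
END;
```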
I am new to Azure, so forgive me if my question sounds unclear, but I will try to explain as best I can.
I think it is common nowadays to need some kind of push notification rather than pulling data at a timed interval.
So, if I have a mobile app, a web app, and a desktop app all talking to Azure, and one of these apps updates something in the Azure SQL database, I would like to avoid having to poll for that change in my other apps and instead have the change pushed to them automatically.
I think there should be some mechanism in Azure for notifying the applications (whether web, mobile, or desktop) about these changes.
Is there something like that? What should I look into?
UPDATE 1
Assume I have a web app (Angular or whatever) on Azure talking to an Azure SQL database storing car information. The app lets me do CRUD operations, so I can add, update, delete, and read cars from the database.
1. The database currently has info about BMW and Toyota only.
2. A user logs into my web app and sees info about BMW and Toyota, i.e. the info existing in my Azure SQL database.
3. A user logs into my mobile app, which connects to Azure, pulls info from the database, and shows BMW and Toyota on screen.
4. The user logged into the web app adds new info about Honda (or deletes an existing car, or updates an existing car); the info is stored in the database, and Honda shows up in the web app.
5. The user logged into the mobile app would now have to tap a refresh button to pull the latest data from the database.
How can this data be pushed to the mobile app automatically and immediately, instead of having the mobile app pull it at some interval or when a Refresh button is tapped?
Absolutely! Azure has several messaging solutions for this type of problem, but without fully understanding your problem space, the semantics of your data, or your overall architecture, it's hard to give precise and tailored guidance. Given that you've mentioned some of your clients will be mobile devices, you should opt for something lightweight such as Azure Notification Hubs. You should also review Service Bus and Event Hubs.
Whichever service you choose, I wouldn't recommend sending the actual data itself in these push notifications. Keep notifications extremely lightweight; a client, upon receiving one, can react by fetching the actual data change from your backend.
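To make that concrete, here is a sketch of what the backend side might look like with the Notification Hubs Node client from the azure-sb package; the hub name, connection string variable, and the tiny { change: 'cars' } payload are all assumptions:

```js
// After committing a change to the cars table, broadcast a tiny
// "something changed" signal; clients then fetch the actual rows.
var azure = require('azure-sb');

var hub = azure.createNotificationHubService(
    'cars-hub',                               // assumed hub name
    process.env.NOTIFICATION_HUB_CONN_STRING  // assumed connection string
);

function notifyCarsChanged() {
    // null tags = broadcast to every registered device.
    hub.send(null, { change: 'cars' }, function (error) {
        if (error) {
            console.error('Notification failed:', error);
        }
    });
}
```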
I have a bunch of sensors (currently 350) that send a total of around 500,000 messages a day to an Azure IoT Hub. The sensors are grouped into differently sized studies, and I need to report on those studies each month.
I've tried to use Stream Analytics but couldn't find a way to dynamically route the messages to their respective locations; I don't want to have to add an individual output for each study.
Can anyone suggest a way to get these messages into an Azure Data Lake so that each study's messages are put in their own folder, e.g. \{StudyID}\{Year}\{Month}\[Message Data]?
Sorry for the inconvenience. At this time, you cannot add a custom variable to the output path structure when exporting data from ASA to Azure Data Lake.
We're keeping this request in our backlog for future updates of the product.
You can also add suggestions in our User Voice portal here: https://feedback.azure.com/forums/270577-azure-stream-analytics
Thanks,
JS
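In the meantime, one workaround (a sketch of an alternative, not ASA itself) is to do the routing in your own code, e.g. a worker or Azure Function that consumes the IoT Hub messages and writes each one to Data Lake Storage Gen2, building the {StudyID}/{Year}/{Month} path per message. The studies file system name, connection string variable, and message shape are assumptions:

```js
const { DataLakeServiceClient } = require('@azure/storage-file-datalake');

const service = DataLakeServiceClient.fromConnectionString(
    process.env.DATALAKE_CONN_STRING // assumed connection string
);
const fileSystem = service.getFileSystemClient('studies'); // assumed name

// Write one message under {StudyID}/{Year}/{Month}/.
async function storeMessage(msg) {
    const now = new Date();
    const month = String(now.getUTCMonth() + 1).padStart(2, '0');
    // Timestamp-based file name; good enough for a sketch.
    const path = `${msg.studyId}/${now.getUTCFullYear()}/${month}/${Date.now()}.json`;

    const file = fileSystem.getFileClient(path);
    const body = JSON.stringify(msg);
    await file.create();
    await file.append(body, 0, Buffer.byteLength(body));
    await file.flush(Buffer.byteLength(body));
}
```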
I have a Service Fabric cluster on Azure with a very simple app running on it. The app is from this tutorial.
When running the app locally, the Visual Studio Diagnostic Events viewer shows 3 events:
CRM
MasterCRM
ServiceMessage
I believe the CRM and MasterCRM events are related to the cluster manager, and ServiceMessage shows events from my app, in this case just a message giving the current value of a counter.
This data is also saved to a table storage. Is there any way for me to control what gets saved to the table storage? Right now my table consists of pages and pages of CRM and MasterCRM messages, and I've yet to see messages from my app; I'm sure if I keep going I might eventually find them, but so far no luck.
I'd like to save only the events from my app to the table storage and ignore the rest. I've looked around and found no way to do it.
The events you refer to come through ETW from the Service Fabric runtime (CRM, MasterCRM) and from your application (ServiceMessage), as you mentioned. The diagnostics viewer in Visual Studio gets these events directly from ETW, not from Azure Table Storage. If you want to filter the events showing up in the diagnostics viewer, you can click the gear icon and edit the sources listed.
The *CRM events come from Microsoft-ServiceFabric:5:0x4000000000000000.
Controlling which events get uploaded to Azure Table Storage in an Azure-hosted cluster requires editing the ARM template's diagnostics section in a similar way.
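For illustration, the relevant part of that diagnostics (WAD) section has roughly this shape; the provider and table names below are placeholders for your own EventSource. The default Service Fabric template also lists an EtwManifestProviderConfiguration entry for the Microsoft-ServiceFabric system provider, and removing that entry is what stops the CRM/MasterCRM rows from being uploaded:

```json
"WadCfg": {
  "DiagnosticMonitorConfiguration": {
    "overallQuotaInMB": "50000",
    "EtwProviders": {
      "EtwEventSourceProviderConfiguration": [
        {
          "provider": "MyCompany-MyApp-MyService",
          "scheduledTransferPeriod": "PT5M",
          "DefaultEvents": {
            "eventDestination": "ServiceFabricReliableServiceEventTable"
          }
        }
      ]
    }
  }
}
```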
I am trying to work out the best implementation/approach to the following problem.
Customers use our WinForms application, which has a plugin that connects to an Azure queue at preconfigured intervals to check whether there are invoices awaiting the connecting customer. If there are, the plugin downloads the invoices into the customer's local DB. There are lots of customers using this application, so all of them will connect to the queue, and each needs to download only its own invoices.
The way I thought of implementing this was to have a named queue for each customer (the customer GUID identifies the queue), with all customers using the same account key/name to connect. The problem with this is that each customer has the account key/name in the DLL, which they can retrieve by reflection (smart customers). So is there a way I can encrypt the key/name, or is there a better solution somebody can suggest?
I think the only secure option is to stand up a web service somewhere that acts as a front-end to the queues. Otherwise, as you said, you're leaking the account key to the client, which would allow any customer to read/change/delete any data in the account.
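A sketch of that front-end idea, keeping the account key server-side so clients never see it; Express, the placeholder authenticate middleware, and the queue-per-customer-GUID naming are assumptions:

```js
const express = require('express');
const { QueueServiceClient } = require('@azure/storage-queue');

// The storage account key lives only on the server, never in the client DLL.
const queueService = QueueServiceClient.fromConnectionString(
    process.env.STORAGE_CONN_STRING
);

// Placeholder: a real implementation would validate the caller's token
// and resolve their customer GUID from it.
function authenticate(req, res, next) {
    req.customerId = 'hypothetical-customer-guid';
    next();
}

const app = express();

app.get('/invoices', authenticate, async (req, res) => {
    // Each customer's queue is named by their GUID, so an authenticated
    // caller can only ever read their own messages.
    const queue = queueService.getQueueClient(req.customerId);
    const batch = await queue.receiveMessages({ numberOfMessages: 32 });
    // Deleting messages after a confirmed download is omitted for brevity.
    res.json(batch.receivedMessageItems.map(function (m) { return m.messageText; }));
});

app.listen(3000);
```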