AFAIK, Azure Logic Apps can be triggered by events (e.g. creation of a record in Dynamics CRM).
If a Logic App is triggered by an event, why do we need a 'frequency' field?
I understand that as the frequency increases (check every minute instead of checking every hour), the costs increase. Is that right?
If I am not mistaken, this is still not a true 'event' but rather a polling trigger on Azure's side: the Logic App looks for new items at each check. The frequency field simply sets how often the trigger polls, and you pay for every trigger execution, which is why polling more often (every minute instead of every hour) costs more.
I have a Logic App running every minute that checks the time that data was last received in a table. If enough time has passed since the data was last updated, I want to receive an alert. I really like the Action Groups used by the Alerts in Azure. They are clean and have lots of options like email, SMS, and phone. How can I trigger an Action Group from my Logic App?
I know I can recreate the email, SMS, and Phone connections in the Logic App, but then it's harder to maintain. I'm already using the same Action Group for other Alerts. It would be easier to maintain if I could reuse this Action Group.
There is a ton online about triggering a Logic App from an Action Group. This is NOT what I'm trying to do. I want the reverse: to trigger an Action Group from a Logic App.
How can I trigger an Action Group from my Logic App?
Currently, as per the documentation, you can trigger a particular Logic App using an action group, but there is no way to trigger a particular action group from a Logic App.
It would be easier to maintain if I could reuse this Action Group.
Yes, you can use the same action group in multiple alert mechanisms.
I would suggest raising a feature request using this Azure support link.
You should be able to send data to a custom log in Log Analytics from your Logic App using the Azure Log Analytics Data Collector connector.
Then you can use a Log Analytics query to evaluate resource logs at a set frequency and fire an alert based on the results. Rules can trigger one or more actions using Action Groups - see Create, view, and manage log alerts using Azure Monitor.
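In case it helps to see what that connector does under the hood, here is a rough C# sketch of the HTTP Data Collector API it wraps; the workspace ID, shared key and the "HeartbeatCheck" log name are placeholders, not anything from the question.

using System;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;

// Row to push into the custom log (ends up queryable as HeartbeatCheck_CL).
var json = "{\"LastReceivedUtc\":\"" + DateTime.UtcNow.ToString("o") + "\"}";
var date = DateTime.UtcNow.ToString("r");

// Sign the request with the workspace shared key, as the Data Collector API requires.
var stringToSign = $"POST\n{Encoding.UTF8.GetByteCount(json)}\napplication/json\nx-ms-date:{date}\n/api/logs";
using var hmac = new HMACSHA256(Convert.FromBase64String(workspaceSharedKey));
var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

using var http = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Post,
    $"https://{workspaceId}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01");
request.Headers.TryAddWithoutValidation("Authorization", $"SharedKey {workspaceId}:{signature}");
request.Headers.Add("Log-Type", "HeartbeatCheck");  // name of the custom log table
request.Headers.Add("x-ms-date", date);
request.Content = new StringContent(json, Encoding.UTF8, "application/json");
await http.SendAsync(request);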
I have an Azure Logic App that dynamically gets blob contents from my Azure Storage account and sends an email with the attachment. I want to set a schedule for my Logic App to run once a week.
Any idea how I can achieve this?
Here's my current workflow:
It depends on what you are trying to do. If you want to get an email every time your blob is updated, your current Logic App is the way to go. If you change the trigger to a Recurrence trigger as Rob Ert stated, then you could potentially lose updates (the blob could have many updates in a week). If you don't care about the individual updates, then Recurrence is the proper trigger.
I think you are looking for the Recurrence trigger.
It lets you set something similar to the timer triggers you may know from regular Azure Functions.
Here are instructions on how to create one in your Logic App:
https://learn.microsoft.com/en-us/azure/connectors/connectors-native-recurrence
I'm working on an Azure Logic app that should trigger when a new resource is created.
However, if I trigger the app via a webhook using Monitor alerts or an event subscription, I run into the problem that each resource creation produces two events that are completely identical in their output, so I can't filter one of them out, and the Logic App is triggered twice.
Is there a different route to get the app to trigger only once?
I believe the multiple events occur because, as documented, the event type for create and update is the same.
One way to work around this would be to keep track of the resource IDs already touched by your Logic App, OR to add a tag to the resource that signals it has been handled. That way you wouldn't need an extra store for this metadata.
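A minimal sketch of the tagging idea, using the Azure.ResourceManager SDK from code the Logic App could call (the "logicapp-processed" tag name is just an illustration):

using System;
using Azure.Core;
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Resources;

var arm = new ArmClient(new DefaultAzureCredential());
var resource = arm.GetGenericResource(new ResourceIdentifier(resourceId));
var current = (await resource.GetAsync()).Value;

// Only handle the resource the first time we see it; the tag marks it as processed.
if (!current.Data.Tags.ContainsKey("logicapp-processed"))
{
    // ... do the real work here ...
    await resource.AddTagAsync("logicapp-processed", DateTime.UtcNow.ToString("o"));
}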
I have looked through documentation for WebJobs, Functions and Logic Apps in Azure but I cannot find a way to schedule a one-time execution of a process through code. My users need to be able to schedule notifications to go out at a specific time in the future (usually within a few hours or a day from being scheduled). Everything I am reading on those processes uses CRON expressions, which are not designed for one-time execution. I realize that I could schedule the job to run on intervals and check the database to see if the rest of the job needs to run, but I would like to avoid running the jobs unnecessarily if possible. Any help is appreciated.
If it is relevant, I am using C#, ASP.NET MVC Core, App Services and a SQL database all hosted in Azure. My plan was to use Logic apps to check the database for a scheduled event and send notifications through Twilio, SendGrid, and iOS/Android push notifications.
One option is to create Azure Service Bus messages in your app using the ScheduledEnqueueTimeUtc property. This puts the message in the queue right away, but it only becomes consumable at the scheduled time.
Then a Logic App could be listening to that Service Bus Queue and doing the further processing, e.g. SendGrid, Twilio, etc...
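A minimal sketch with the Microsoft.Azure.ServiceBus SDK (the queue name and payload object are illustrative):

using System;
using System.Text;
using Microsoft.Azure.ServiceBus;
using Newtonsoft.Json;

var queueClient = new QueueClient(serviceBusConnectionString, "scheduled-notifications");
var message = new Message(Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(notification)))
{
    ScheduledEnqueueTimeUtc = notifyAtUtc  // the one-time moment the message becomes visible to consumers
};
await queueClient.SendAsync(message);
await queueClient.CloseAsync();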
HTH
You could use an Azure Queue trigger with deferred visibility. This keeps the message invisible for a specified timeout, which conveniently acts as a timer.
// requires Microsoft.Azure.Storage.Queue and Newtonsoft.Json
CloudQueue queueOutput; // same queue as the trigger listens on
var strjson = JsonConvert.SerializeObject(message); // message is your payload
var cloudMsg = new CloudQueueMessage(strjson);
var delay = TimeSpan.FromHours(1); // how long the message stays invisible before the trigger fires
queueOutput.AddMessage(cloudMsg, initialVisibilityDelay: delay);
See https://learn.microsoft.com/en-us/dotnet/api/microsoft.azure.storage.queue.cloudqueue.addmessage?view=azure-dotnet for more details on this overload of AddMessage.
You can use Azure Automation to schedule tasks programmatically using its REST API. Learn about it here.
You can also use Azure Event Grid. Based on this article, you can "Extend existing workflows by triggering a Logic App once there is a new record in your database".
Hope this helps.
The other answers are all valid options, but there are some others as well.
For Logic Apps you can build this behavior into the app as described in the Scheduler migration guide. The solution described there is to create a Logic App with an HTTP trigger and pass the desired execution time to that trigger (in the POST body or query parameters). The 'Delay until' action can then be used to postpone the execution of the following steps until the time passed to the trigger.
You'd have to change the logic app to support this, but depending on the use case that may not be an issue.
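For illustration, a rough sketch of the caller side; the trigger URL variable and the "runAt" property are inventions here, and the workflow's 'Delay until' action would read that value from the trigger body (e.g. triggerBody()?['runAt']):

using System;
using System.Net.Http;
using System.Text;
using Newtonsoft.Json;

// Desired one-time execution moment, sent to the Logic App's HTTP trigger.
var payload = new { runAt = DateTime.UtcNow.AddHours(4).ToString("o") };

using var http = new HttpClient();
await http.PostAsync(
    logicAppHttpTriggerUrl,  // callback URL of the 'When a HTTP request is received' trigger
    new StringContent(JsonConvert.SerializeObject(payload), Encoding.UTF8, "application/json"));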
For Azure Functions a similar pattern can be achieved using Durable Functions, which have support for durable timers.
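A minimal sketch of such an orchestration; the "SendNotification" activity name is an assumption standing in for the actual Twilio/SendGrid/push work:

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class ScheduledNotification
{
    [FunctionName("ScheduleNotification")]
    public static async Task Run([OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var notifyAtUtc = context.GetInput<DateTime>();                  // one-time execution moment
        await context.CreateTimer(notifyAtUtc, CancellationToken.None); // durable timer, survives restarts
        await context.CallActivityAsync("SendNotification", notifyAtUtc);
    }
}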
In our Azure-hosted app we produce invoices, and these have to be injected into on-premises accounting software. It is not possible to host an on-premises API that would be reachable from Azure to post the invoices to.
Is it possible to create an exe that runs on-premises and gets triggered by Azure queue messages, like WebJobs can? Once triggered, it would retrieve the invoice from a blob storage object.
Other suggestions are also welcome.
One important thing I want to mention is that even WebJobs poll the queue at a predetermined interval (I believe the default is 30 seconds). Azure Queues don't support a push-based triggering mechanism like you might think.
What you want to do is entirely possible, though. You could write a Windows Service that essentially wakes up at a predetermined interval and checks for messages in the queue. If it finds messages, it processes them; otherwise it goes back to sleep.
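A rough sketch of the polling loop such a service could run (the queue name, the ProcessInvoice helper and the cancellation token are illustrative, not from the question):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Queue;

var queue = CloudStorageAccount.Parse(connectionString)
    .CreateCloudQueueClient()
    .GetQueueReference("invoices");

while (!cancellationToken.IsCancellationRequested)
{
    CloudQueueMessage msg = await queue.GetMessageAsync();
    if (msg != null)
    {
        ProcessInvoice(msg.AsString);                // e.g. fetch the invoice blob and push it to the accounting software
        await queue.DeleteMessageAsync(msg);         // delete only after successful processing
    }
    else
    {
        await Task.Delay(TimeSpan.FromSeconds(30));  // queue empty: sleep before polling again
    }
}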