I'm fetching data from an API, but I only want to fetch the data from the last time the Logic App ran up to the current time (to reduce redundancy). The API supports passing a time range, but where in an Azure Logic App can I store the date-time of the last run so I can use it in the API call?
Current Logic App design (screenshot from the Logic App Designer):
Try this, it was helpful to me:
azure-logic-apps-storing-variables-between-runs
For now, a Logic App doesn't support variables that are shared between different runs.
So if you want to implement this feature, you could use an Azure Storage queue or Service Bus: create a message that stores the value you want to use in the next run. After reading the value on each run, remember to delete the original message, and then use a put-message action to store the new value for the following run.
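If you eventually move this logic into code (for example an Azure Function that calls the API for you), the same get-delete-put pattern would look roughly like the sketch below using the Azure.Storage.Queues SDK; the queue name, connection string and first-run fallback window are assumptions, not part of the original setup:

using System;
using System.Threading.Tasks;
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;

static async Task<DateTimeOffset> GetAndRollLastRunAsync(string connectionString)
{
    var queue = new QueueClient(connectionString, "last-run-time"); // hypothetical queue name
    await queue.CreateIfNotExistsAsync();

    // Read the timestamp left behind by the previous run (if any).
    QueueMessage[] messages = await queue.ReceiveMessagesAsync(maxMessages: 1);
    DateTimeOffset lastRun = messages.Length > 0
        ? DateTimeOffset.Parse(messages[0].MessageText)
        : DateTimeOffset.UtcNow.AddDays(-1); // assumed fallback window for the very first run

    // Delete the old message so the queue only ever holds the latest value...
    if (messages.Length > 0)
        await queue.DeleteMessageAsync(messages[0].MessageId, messages[0].PopReceipt);

    // ...and store the current time for the next run.
    await queue.SendMessageAsync(DateTimeOffset.UtcNow.ToString("o"));
    return lastRun;
}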
Hope this could help you. If you still have any other problems, please feel free to let me know.
I want to add functionality to my ADF pipeline that will send me an email notification in case of failure. While searching the internet, I learned that Azure Logic Apps can help with this. I am trying to follow the link below to achieve it.
https://microsoft-bitools.blogspot.com/2018/03/add-email-notification-in-azure-data.html
I have tried many tutorials and guides, as well as the official docs. However, all of them start from templates that are already present in the Logic Apps Designer. I cannot find the templates, and the 'When a HTTP request is received' trigger is not available in the drop-down either.
Please let me know how to proceed.
EDIT:
If you start with a blank Logic App, search for 'HTTP' or 'Request' and select Request.
On the next screen under triggers, select "When a HTTP request is received" and you should be good to go.
EDIT:
It looks like you created a Standard Logic App, which works in a slightly different way. For instance, it can contain multiple workflows, which means you create the workflows yourself. In the Consumption model there's one workflow within a Logic App, so you can open the editor for that one directly. If there's no explicit reason for you to use Standard, a Consumption Logic App will be easier to work with.
If you really need a Standard Logic App, go to 'Workflows' and create a new workflow:
Then click the newly created workflow to edit it, go to 'Designer' and search for 'HTTP' to add an HTTP trigger:
Here's some information on the Consumption model for Logic Apps:
Resource type: Logic App (Consumption). Host environment: multi-tenant Azure Logic Apps.
Benefits: easiest to get started, pay-for-what-you-use, fully managed.
Resource sharing and usage: a single logic app can have only one workflow. Logic apps created by customers across multiple tenants share the same processing (compute), storage, network, and so on.
Limits management: Azure Logic Apps manages the default values for these limits, but you can change some of these values, if that option exists for a specific limit.
See Resource type and host environment differences for a comparison with the other hosting options.
I was able to solve this. I wasn't able to see some of the functionality because of another error: Functions runtime error Microsoft.WindowsAzure.Storage: Value cannot be null. (Parameter 'connectionString').
The AzureWebJobsStorage app setting was missing, which caused the error. I added it and now I can see the triggers and everything else.
Thanks #rickvdbosch
I'm currently working on my first Azure Function, which acts as an intermediary between Defender for Endpoint and Azure Sentinel. It runs every 5 minutes and collects data matching specific filters from the Defender API, then forwards it as custom logs to Azure Sentinel. Due to the authentication measures in place on Defender, I've set my script up to use ADAL to do a device-code logon the first time, then use the refresh tokens for its scheduled runs.
This is where I've come across the problem; since Azure Functions are serverless by design, holding this refresh token somewhere for the next invocation has proven troublesome. I'm trying to use Durable Functions, but the documentation for such a use case seems non-existent.
Are there other appropriate methods to store a singular variable across invocations of an Azure Function?
There is more than one way to solve the problem of holding the refresh token across Azure Function invocations.
One way is to use an Azure Function timer trigger to request new access tokens and Azure Key Vault to store those tokens securely. Saving them in Key Vault means that the next time the function runs to refresh the tokens it will use the updated values, and each subsequent invocation can read the value stored by the previous one.
Check this document to access secrets from key vault.
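As a rough sketch of that first approach (the vault URL and secret name are made up, and the function's identity is assumed to have get/set permission on secrets), reading and updating the token with the Azure.Security.KeyVault.Secrets SDK could look like this:

using System;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

static async Task RefreshTokensAsync()
{
    // Hypothetical vault URL; DefaultAzureCredential picks up the function's managed identity.
    var client = new SecretClient(new Uri("https://my-vault.vault.azure.net/"), new DefaultAzureCredential());

    // Read the refresh token stored by the previous invocation (hypothetical secret name).
    KeyVaultSecret stored = await client.GetSecretAsync("defender-refresh-token");

    // ... use stored.Value to acquire a new access/refresh token pair from AAD here ...
    string newRefreshToken = "<token returned by the token endpoint>"; // placeholder

    // Persist the new refresh token so the next invocation can pick it up.
    await client.SetSecretAsync("defender-refresh-token", newRefreshToken);
}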
Another way is to enable the Token Store for the Azure Function and store the tokens in Blob Storage. Check this document for more information.
I have an Azure Logic App that dynamically gets blob contents from my Azure storage account and sends an email with the attachment. I want to set a schedule for my Logic App to run once a week.
Any idea how I can achieve this?
Here's my current workflow:
It depends on what you are trying to do. If you want to get an email every time your blob is updated, your current Logic App is the way to go. If you change the trigger to a Recurrence trigger as Rob Ert suggested, then you could potentially lose updates (the blob could have many updates in a week). If you don't care about the individual updates, then Recurrence is the proper trigger.
I think you are looking for the Recurrence trigger.
It lets you set something like the timer triggers you may know from regular Azure Functions.
Here are instructions on how to create one in your Logic App.
https://learn.microsoft.com/en-us/azure/connectors/connectors-native-recurrence
I've been searching for some time now for a way to interact with our Salesforce org easily through Azure Functions and have been coming up dry. Where I'm confused is that through Azure Logic Apps I can pretty simply connect to Salesforce and post data through them, so I assume that on the backend there must be some sort of built-in connector to Salesforce.
Is this a package somewhere that I can utilize in Azure functions? This would simplify so much of what we are trying to accomplish with some of our integrations.
There isn't a built-in Salesforce binding for Azure Functions, but one option you do have is to invoke your Logic Apps workflow from Azure Functions with the relevant payload, which would allow you to leverage all the built-in connectors they have.
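For example, here's a minimal sketch that posts a payload to a Logic App that starts with a 'When a HTTP request is received' trigger; the callback URL is a placeholder you would copy from that trigger:

using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public static class LogicAppForwarder
{
    // Placeholder: copy the real callback URL from the Logic App's HTTP trigger.
    const string LogicAppUrl = "https://prod-00.westeurope.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?...";
    static readonly HttpClient Http = new HttpClient();

    public static async Task ForwardAsync(object record)
    {
        var content = new StringContent(JsonSerializer.Serialize(record), Encoding.UTF8, "application/json");
        var response = await Http.PostAsync(LogicAppUrl, content);
        response.EnsureSuccessStatusCode(); // the trigger returns 200/202 once the run is accepted
    }
}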
You don't really need any fancy connector.
For most use cases, you can use the Salesforce REST API:
https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/intro_rest.htm
You can use Postman to test this API and get code samples which could easily be adapted to your Function App:
https://www.postman.com/salesforce-developers/workspace/salesforce-developers/collection/12721794-67cb9baa-e0da-4986-957e-88d8734647e2?ctx=documentation
But be careful!
Salesforce limits your API calls and it is very easy to blow these limits.
The naïve approach would be to post one record at a time. Instead, try to collect as many records as possible and post them all at once.
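As a rough illustration of posting them all at once (not an official sample), you could call the REST API's Composite sObject Collections resource from your Function App; the instance URL, API version, access token and the Account/Name field are assumptions:

using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

static async Task PostAccountsInOneCallAsync(HttpClient http, string instanceUrl, string accessToken, IEnumerable<string> accountNames)
{
    http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

    // One request for a whole batch of records instead of one request per record.
    var body = new
    {
        allOrNone = false,
        records = accountNames.Select(name => new Dictionary<string, object>
        {
            ["attributes"] = new { type = "Account" },
            ["Name"] = name
        })
    };

    var content = new StringContent(JsonSerializer.Serialize(body), Encoding.UTF8, "application/json");
    var response = await http.PostAsync($"{instanceUrl}/services/data/v57.0/composite/sobjects", content);
    response.EnsureSuccessStatusCode();
}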
If Salesforce does not need the data immediately, you can build a cache that will store data somewhere else (e.g. a MongoDb) and then periodically forward the data to Salesforce after it has been allowed to accumulate to some threshold quantity and/or a certain amount of time has elapsed.
I have looked through the documentation for WebJobs, Functions and Logic Apps in Azure, but I cannot find a way to schedule a one-time execution of a process through code. My users need to be able to schedule notifications to go out at a specific time in the future (usually within a few hours or a day of being scheduled). Everything I am reading about those services uses CRON expressions, which are not designed for one-time executions. I realize that I could schedule the job to run on an interval and check the database to see if the rest of the job needs to run, but I would like to avoid running the jobs unnecessarily if possible. Any help is appreciated.
If it is relevant, I am using C#, ASP.NET MVC Core, App Services and a SQL database all hosted in Azure. My plan was to use Logic apps to check the database for a scheduled event and send notifications through Twilio, SendGrid, and iOS/Android push notifications.
One option is to create Azure Service Bus messages from your app using the ScheduledEnqueueTimeUtc property. This puts the message in the queue straight away, but it only becomes consumable at the scheduled time.
Then a Logic App could be listening to that Service Bus Queue and doing the further processing, e.g. SendGrid, Twilio, etc...
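The sending side could look roughly like this minimal sketch with the Microsoft.Azure.ServiceBus client; the queue name and connection string are placeholders:

using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

static async Task ScheduleNotificationAsync(string connectionString, string json, DateTime sendAtUtc)
{
    var client = new QueueClient(connectionString, "notifications"); // hypothetical queue name

    var message = new Message(Encoding.UTF8.GetBytes(json))
    {
        // Enqueued now, but invisible to receivers until this UTC time.
        ScheduledEnqueueTimeUtc = sendAtUtc
    };

    await client.SendAsync(message);
    await client.CloseAsync();
}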
HTH
You could use an Azure Queue trigger with deferred visibility. This keeps the message invisible for a specified timeout, which conveniently acts as a timer.
using System;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Queue;
using Newtonsoft.Json;

// Same queue the trigger listens on ("notifications" and connectionString are placeholders).
CloudQueue queueOutput = CloudStorageAccount.Parse(connectionString).CreateCloudQueueClient().GetQueueReference("notifications");
var strjson = JsonConvert.SerializeObject(message); // message is your payload
var cloudMsg = new CloudQueueMessage(strjson);
var delay = TimeSpan.FromHours(1);
queueOutput.AddMessage(cloudMsg, initialVisibilityDelay: delay); // invisible to the trigger until the delay elapses
See https://learn.microsoft.com/en-us/dotnet/api/microsoft.azure.storage.queue.cloudqueue.addmessage?view=azure-dotnet for more details on this overload of AddMessage.
You can use Azure Automation to schedule tasks programmatically using REST API. Learn about it here.
You can also use Azure Event Grid. Based on this article, you can "extend existing workflows by triggering a Logic App once there is a new record in your database".
Hope this helps.
The other answers are all valid options, but there are some others as well.
For Logic Apps you can build this behavior into the app, as described in the Scheduler migration guide. The solution described there is to create a logic app with an HTTP trigger and pass the desired execution time to that trigger (in the POST data or query parameters). The 'Delay Until' block can then be used to postpone execution of the following steps until the time passed to the trigger.
You'd have to change the logic app to support this, but depending on the use case that may not be an issue.
For Azure Functions, a similar pattern can be achieved using Durable Functions, which have support for durable timers.
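As a rough sketch (the orchestrator and activity names here are hypothetical), the orchestration would read the requested time from its input, sleep on a durable timer until then, and only then call the activity that sends the notification:

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class ScheduledNotification
{
    [FunctionName("ScheduledNotificationOrchestrator")]
    public static async Task Run([OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        // The desired send time (UTC) is passed in when the orchestration is started.
        DateTime sendAtUtc = context.GetInput<DateTime>();

        // Durable timer: the orchestration sleeps, without consuming compute, until sendAtUtc.
        await context.CreateTimer(sendAtUtc, CancellationToken.None);

        // "SendNotification" is a hypothetical activity function that does the actual sending.
        await context.CallActivityAsync("SendNotification", null);
    }
}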