I have an Azure Logic App that dynamically gets blob contents from my Azure Storage account and sends an email with the attachment. I want to set a schedule for my Logic App to run once a week.
Any idea how I can achieve this?
Here's my current workflow:
It depends on what you are trying to do. If you want to get an email every time your blob is updated, your current Logic App is the way to go. If you change the trigger to a Recurrence trigger as Rob Ert stated, then you could potentially lose updates (the blob could have many updates in a week). If you don't care about the individual updates, then Recurrence is the proper trigger.
I think you are looking for the Recurrence trigger.
It lets you set up something like the timer triggers you may know from regular Azure Functions.
Here are the instructions for creating one in your Logic App:
https://learn.microsoft.com/en-us/azure/connectors/connectors-native-recurrence
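In code view, a weekly Recurrence trigger looks roughly like this (a minimal sketch; the day, time and time zone values are placeholders to adapt):

```json
"triggers": {
    "Recurrence": {
        "type": "Recurrence",
        "recurrence": {
            "frequency": "Week",
            "interval": 1,
            "schedule": {
                "weekDays": [ "Monday" ],
                "hours": [ 8 ],
                "minutes": [ 0 ]
            },
            "timeZone": "UTC"
        }
    }
}
```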
I've created an Azure Synapse Analytics pipeline that must be triggered by the creation of a file within an Azure Gen2 storage account.
Somehow the blob creation event (i.e., when I upload the file to the corresponding container and folder) doesn't fire anything, and the pipeline does not start. I've registered the Microsoft.EventGrid and Microsoft.Synapse resource providers in the subscription, as suggested by the official Microsoft documentation.
Am I missing anything? As far as I know, and according to the Microsoft documentation and the many tutorials I've read, I don't need any Event Topic/Event Subscription...
Can you please check the content type of the file?
Usually when that is blank, the event trigger is not initiated.
I tried to reproduce your scenario in my environment, and it works for me (i.e., the trigger fires when I upload the file to the corresponding container and folder). Let me share my implementation so that you can compare it with yours.
This is the setup for the trigger
The trigger is firing as expected.
Files uploaded date time
Trigger firing date time
I still didn't figure out what is not working, so I implemented a workaround: a simple ADF pipeline that loops over the files in the landing zone. The pipeline is associated with a normal schedule trigger (it runs 3 times a day), and it in turn calls the pipeline I originally wanted to be triggered by the file creation trigger.
What is the best way of using Azure technology (Azure Function Apps, Durable Functions, Azure WebJobs, ...?) to get a schedule from a database, depending on the specific task that needs to be performed?
The schedule read from the database would then be converted to a CRON expression. I have hourly, daily, and weekly triggers to run specific tasks.
The schedule within the database could possibly change, which is why it needs to be dynamic.
I could create individual, task-specific function apps (over 20), but I want this to be more flexible and dynamic.
How would I go about this? What Azure tech would I use?
Thanks
Two options I can think of, depending on the details of what you're after:
Use timer-triggered Azure Functions with the %appsetting% format for the timer expression. Then have a separate function, either timed to run regularly or triggered when the schedules update, that gets the current schedule and updates the function app settings with any changes to the CRON expressions (using a managed identity that gives it the necessary permissions); see the sketch after these two options.
If you're just running on the hour/day/week, you could have a function that runs every hour, reads the current schedule and triggers any functions that are scheduled for that time (basically your own simple timer).
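For option 1, a minimal sketch of such a timer-triggered function using the Python v2 programming model (the app setting name TaskASchedule and the function name are made up for illustration):

```python
import logging

import azure.functions as func

app = func.FunctionApp()

# The CRON expression is read from the app setting "TaskASchedule",
# so the separate "schedule updater" function can change when this
# task runs just by updating that app setting from the database.
@app.timer_trigger(schedule="%TaskASchedule%", arg_name="timer",
                   run_on_startup=False)
def run_task_a(timer: func.TimerRequest) -> None:
    logging.info("Running task A on its database-driven schedule")
    # ... do the actual work for task A here ...
```

Note that updating an app setting restarts the function app, which is how the new CRON expression gets picked up.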
I'm fetching data from an API, but I only want to fetch data from the last time the Logic App ran up to the current time (to reduce redundancy). So where can I store the last date-time so I can use it in the API call? The API supports passing a time, but where in an Azure Logic App can I store the last date information?
current logic app design
Logic App Designer
Try this; it was helpful to me:
azure-logic-apps-storing-variables-between-runs
For now, Logic Apps doesn't support variables that can be shared across different runs.
So if you want to implement this, you could use an Azure Storage queue or Service Bus: create a message that stores the value you want to use in the next run. Each time, after getting the value, remember to delete the original message, and then use a put-message action to store the new value back in the queue or Service Bus.
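Outside the designer, the same get-delete-put cycle looks roughly like this with the Storage Queue SDK for Python (the queue name and connection string are placeholders; in the Logic App itself you would use the corresponding queue connector actions):

```python
from datetime import datetime, timezone
from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string(
    conn_str="<storage-connection-string>",   # placeholder
    queue_name="last-run-time")               # placeholder

# 1. Get the value stored by the previous run (if any).
last_run = None
for msg in queue.receive_messages():
    last_run = msg.content
    # 2. Delete the old message so only the newest value is kept.
    queue.delete_message(msg)
    break

# ... call your API with last_run as the "from" timestamp ...

# 3. Put a new message with the current time for the next run.
queue.send_message(datetime.now(timezone.utc).isoformat())
```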
Hope this helps; if you still have any problems, please feel free to let me know.
Is Azure Functions a good alternative to Azure Data Factory to use as a scheduler? It has a blob trigger for monitoring and can use C# to trigger Databricks jobs via the API. But is it a viable alternative?
Edited to add more information: I want to trigger a Databricks job based on a trigger file, but I do not want to use Azure Data Factory or Databricks' own job scheduling.
I would probably use a simple Logic App with an Event Grid trigger on the blob storage "blob created" event. Based on the trigger data, I would call the Databricks Jobs REST API.
I got the entire demo below working in under 10 minutes, so it's fast to set up.
With this demo I used
And the Logic App set up as the trigger,
where I strongly suggest adding a prefix filter like
/blobServices/default/containers/<container_name>
so you don't fire too many Logic Apps from other containers, since Event Grid reacts to all events in the entire storage account.
And the HTTP call like so:
Of course, at this point simply swap the clusters-list call for the job submission REST call.
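For reference, the job submission that the HTTP action ends up doing is roughly the following (shown in Python only to make the request explicit; the workspace URL, token and job ID are placeholders, and in the Logic App you put the same URL, header and body into the HTTP action):

```python
import requests

# Placeholders: your workspace URL, a personal access token,
# and the ID of the Databricks job to run.
DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "<databricks-personal-access-token>"

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": 123},   # placeholder job ID
)
resp.raise_for_status()
print(resp.json())          # contains the run_id of the new run
```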
And see an execution like this:
Just make sure that the Microsoft.EventGrid resource provider is registered, or the Logic App will never fire.
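Registering it is a one-liner with the Azure CLI, assuming you have the necessary rights on the subscription:

```
az provider register --namespace Microsoft.EventGrid

# check the result
az provider show --namespace Microsoft.EventGrid --query registrationState
```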
AFAIK, an Azure Logic App can be triggered on events (e.g., creation of a record in Dynamics CRM).
If Logic Apps is triggered based on an event, why do we need a 'frequency' field?
I understand that as the frequency increases (check every minute instead of checking every hour), the costs increase. Is that right?
If I am not mistaken, this is still not a true "event" but more of a polling trigger on the Azure side. The Logic App will look for new items in this case. Hence you only set how often the trigger polls, and you pay for every poll that runs.
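For example, the polling interval you pick in the designer ends up as a recurrence block inside the trigger definition; a rough sketch (the connector name and interval are just illustrative):

```json
"triggers": {
    "When_a_record_is_created": {
        "type": "ApiConnection",
        "recurrence": {
            "frequency": "Minute",
            "interval": 3
        },
        "inputs": {
            "...": "connector-specific details"
        }
    }
}
```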