I am new to cloud and Azure, so I am not sure how sensible my idea is. Please tell me if there are better approaches to solving the issue.
There is a SharePoint list of jobs with their corresponding cron expressions. The jobs all execute the same tasks, but with different parameters and schedules.
What is already in place is a logic app scheduled to run every 5 minutes that scans the list and runs the jobs whose schedules match the current timestamp, but this approach does not feel right to me.
I was thinking of something like having a script which creates separate jobs for each list item (like dynamic DAG creation in Airflow based on a template); however, I cannot find any resources to learn how to do it.
Can I create separate logic apps by using JSON representations of the tasks for each item in the SharePoint list, with their corresponding parameters, and deploy them to Azure using a Python script?
Below is something you can follow to achieve your requirement. I have created the same flow as yours in Logic Apps, except that I have added an Azure Function (HTTP trigger) which creates a separate job for each list item. Below is the flow of my logic app.
I'm using the below sample code in my Function.
import logging
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    # Echo back the 'name' query parameter passed by the caller (e.g. the logic app).
    name = req.params.get('name')
    return func.HttpResponse(f"{name}")
However, you can write your custom code inside it.
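If the goal is really to create a separate logic app per SharePoint list item (as the question describes), the custom code could call the Azure management SDK and deploy a workflow from a template. Below is only a rough sketch of that idea, assuming the azure-identity and azure-mgmt-logic packages; the subscription, resource group, location, job endpoint and the shape of the incoming item are placeholders, and note that a Logic Apps Recurrence trigger takes a frequency/interval (or a schedule object) rather than a raw cron string, so the cron expression from the list would need translating.

import json
import logging

import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.mgmt.logic import LogicManagementClient
from azure.mgmt.logic.models import Workflow

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder: read from app settings in practice
RESOURCE_GROUP = "<resource-group>"     # placeholder
LOCATION = "westeurope"                 # placeholder

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Expected body (assumption): {"name": "job1", "intervalMinutes": 15, "params": {...}}
    item = req.get_json()

    # Workflow definition with a Recurrence trigger; the per-item parameters
    # are injected into the HTTP action that actually runs the job.
    definition = {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "contentVersion": "1.0.0.0",
        "triggers": {
            "Recurrence": {
                "type": "Recurrence",
                "recurrence": {"frequency": "Minute", "interval": item.get("intervalMinutes", 15)},
            }
        },
        "actions": {
            "RunJob": {
                "type": "Http",
                "inputs": {
                    "method": "POST",
                    "uri": "https://example.com/run-job",  # placeholder job endpoint
                    "body": item.get("params", {}),
                },
            }
        },
        "outputs": {},
    }

    # Create or update one logic app per list item.
    client = LogicManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    workflow = client.workflows.create_or_update(
        RESOURCE_GROUP,
        f"job-{item['name']}",
        Workflow(location=LOCATION, definition=definition),
    )
    return func.HttpResponse(json.dumps({"created": workflow.name}), mimetype="application/json")

The logic app in the flow above would call this function once per list item, for example inside a For each loop over the SharePoint items.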
I have a custom handler written in Go running as an Azure Function. It has an endpoint with two methods:
POST /entities
PUT /entities
It was easy to make my application run as an Azure function: I added "enableForwardingHttpRequest": true to host.json, and it just works.
What I need to achieve: life happened and now I need to enqueue a message when my entities change, so it will trigger another function that uses a queueTrigger to perform some async stuff.
What I tried: the only way I found so far was to disable enableForwardingHttpRequest, change all my endpoints to accept the Azure Functions raw JSON input and output, and then output a message in one of the fields (as documented here).
It sounds like a huge change to perform something simple... Is there a way I can enqueue a message without having to change the way my application handles requests altogether?
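For reference, this is roughly what the raw format involves (the binding and queue names below are only illustrative), which is why it feels like such a big change. The function.json would declare a queue output binding alongside the HTTP bindings:

{
  "bindings": [
    { "type": "httpTrigger", "direction": "in", "name": "req", "methods": [ "post" ] },
    { "type": "http", "direction": "out", "name": "res" },
    { "type": "queue", "direction": "out", "name": "msg", "queueName": "entity-changes", "connection": "AzureWebJobsStorage" }
  ]
}

and with enableForwardingHttpRequest disabled, the handler has to respond with the wrapper payload, placing the HTTP response and the queue message into Outputs by binding name:

{
  "Outputs": {
    "res": { "statusCode": 201, "body": "created" },
    "msg": "{\"entityId\": \"123\"}"
  },
  "Logs": [ "entity created, change message enqueued" ]
}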
As per this GitHub issue, custom handlers for Go in Azure Functions currently have a bug that needs to be fixed.
I'm guessing this has been asked before, but I haven't been able to find the answer... Is there a way to look at a trigger to see which functions are being started by it?
We are building functions that will be run on different timed triggers. We'd like to be able to see which ones are set to the hourly trigger and which to the daily trigger (as an example).
We were finally able to find an answer to this question... Inside your Data Factory:
Go to Manage
Select Triggers
View the code for the trigger
The associated functions are shown under the pipelines as pipelineReferences
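For context, the trigger code you see in step 3 looks roughly like the sketch below (the names are illustrative); the pipelines started by the trigger are listed under pipelines as pipelineReference entries:

{
  "name": "HourlyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": { "frequency": "Hour", "interval": 1, "startTime": "2023-01-01T00:00:00Z", "timeZone": "UTC" }
    },
    "pipelines": [
      { "pipelineReference": { "referenceName": "RunHourlyJobs", "type": "PipelineReference" } }
    ]
  }
}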
A Timer Trigger function is a non-HTTP-triggered function that lets you run a function on a schedule based on a CRON expression.
You can look at the CRON expression declared in the function.json file of a particular function.
You can also refer to the documentation below for a better understanding of CRON expressions.
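As a quick illustration (the binding name is arbitrary), the schedule sits in the function.json of each timer-triggered function; in the six-field NCRONTAB format, "0 0 * * * *" means hourly and "0 0 0 * * *" means daily:

{
  "bindings": [
    { "name": "mytimer", "type": "timerTrigger", "direction": "in", "schedule": "0 0 * * * *" }
  ]
}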
Alternatively, you can look at the logs of the function app; based on the logs, you can validate which functions run on an hourly basis and which run on a daily basis.
You can refer to the SO thread below for more information on how to fetch the logs of a particular function within a function app.
I have a job (script) written in Node.js.
I have another API which writes data (an id and a time, t1) to Cloud Spanner. As soon as the API is hit, I want to run the job at the given time t1 and pass the id as a parameter.
Can I write some code in my API which will trigger the job at the given time? (Note: for a single hit on the API, the job should run only once.) I tried searching on the net but could only find periodic schedulers.
To schedule a task for a specific, dynamically determined time, you can use Google Cloud Tasks together with Google Cloud Functions.
Read it here:
https://cloud.google.com/tasks/docs/tutorial-gcf
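A minimal sketch of the idea in Python (the project, region, queue name and Cloud Function URL are placeholders): after the API writes the id and t1 to Spanner, it creates a Cloud Task with a schedule_time of t1 that POSTs the id to an HTTP-triggered Cloud Function running the job.

import json
from datetime import datetime, timezone

from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2

def schedule_job(entity_id: str, run_at: datetime) -> None:
    client = tasks_v2.CloudTasksClient()
    parent = client.queue_path("my-project", "us-central1", "job-queue")  # placeholders

    # Cloud Tasks expects a protobuf Timestamp for the one-off schedule time.
    schedule_time = timestamp_pb2.Timestamp()
    schedule_time.FromDatetime(run_at.astimezone(timezone.utc))

    task = {
        "http_request": {
            "http_method": tasks_v2.HttpMethod.POST,
            "url": "https://us-central1-my-project.cloudfunctions.net/run-job",  # placeholder
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"id": entity_id}).encode(),
        },
        "schedule_time": schedule_time,
    }
    client.create_task(request={"parent": parent, "task": task})

Because each API hit creates exactly one task, the job runs exactly once per hit, at the requested time.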
I have created a function which is triggered by a timer. It runs fine when triggered from the Azure portal by clicking the Run option at the exact time stored in the database, but it does not fire automatically for the scheduled date taken from the database. The timer schedule here is not static: the value is taken from the database and converted to a CRON expression. I used INameResolver to resolve the name and overwrite it with the database date and time, converted to a CRON expression.
It runs fine when a constant CRON expression is passed to the TimerTrigger attribute.
Any help on this matter would be appreciated.
There is a doc about this: Dynamic update of Azure Web Job time schedule.
It uses a NameResolver to get dynamic bindings from app settings; for more details you can refer to this: Custom binding expressions.
So you could set a dynamic %TriggerSchedule% value in the app settings with code like in this answer, and then combine it with the WebJob/Function code.
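As a sketch of what the binding looks like once the schedule lives in an app setting (the setting name TriggerSchedule is just an example): in C# the attribute becomes [TimerTrigger("%TriggerSchedule%")], and for script-based functions the equivalent function.json entry is:

{
  "bindings": [
    { "name": "dbTimer", "type": "timerTrigger", "direction": "in", "schedule": "%TriggerSchedule%" }
  ]
}

Your code then only has to write the CRON expression built from the database value into that app setting; keep in mind that updating an app setting typically restarts the app, which is how the new schedule gets picked up.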
Is it possible to use dynamic content in the POST body for a scheduled job in Azure Scheduler?
I am writing a logic app that I would like to be able to pass a start time and a look-back minute count to, so that a failed invocation can be re-run across the same underlying data by passing the same parameters. I'm wondering whether there are functions or operations similar to what can be found in Logic Apps for values such as utcNow().
We do not support dynamic content in Scheduler; you may, however, find a timestamp in the request headers of the calls Scheduler makes.
Why are you not using Logic Apps when it can perform what you need?
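For the specific values mentioned above, the Logic Apps workflow definition language does have those functions. A small illustrative snippet (the property names are made up) of an action input that takes the start time and look-back minutes from the trigger body and falls back to defaults:

{
  "queryStartTime": "@{coalesce(triggerBody()?['startTime'], utcNow())}",
  "lookbackMinutes": "@{coalesce(triggerBody()?['lookbackMinutes'], 30)}"
}

A failed invocation can then be replayed by calling the logic app again with the same startTime and lookbackMinutes values.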