I'm having problems designing recurrence triggers with Logic Apps. As far as I know, Logic Apps does not support CRON expressions, and running a daily trigger with conditions does not seem to be enough, so I am at a loss.
Edit:
To be more precise about my problem: the logic app is for moving files from one server to another, and outside constraints dictate that this move must be completed once every month, on the third business day (Monday-Friday) of the month.
I'm currently pondering either saving a global variable that tells me whether the app has successfully run this month and using conditions to check every day whether it should run that day, or running a script that determines whether the current date is the third business day of the current month and using that to decide whether the logic app should execute or terminate.
You can use a recurrence trigger in the logic app to trigger the workflow, for example every three weeks on Monday.
For more information about the recurrence trigger, you can refer to this documentation.
Updated Answer:
As per the requirement, we have created a logic app with a Recurrence trigger and a frequency of Day. This workflow fires every day and validates whether the current day of the month is 3, 4, or 5.
If the condition succeeds, the subsequent actions of the logic app are executed.
Here is the logic app that we have created:
Here is the code view of the logic app:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose_2": {
"inputs": "#formatDateTime(utcNow(), 'dd')",
"runAfter": {},
"type": "Compose"
},
"Condition": {
"actions": {
"Compose": {
"inputs": "#utcNow()",
"runAfter": {},
"type": "Compose"
}
},
"expression": {
"or": [
{
"equals": [
"#int(outputs('Compose_2'))",
3
]
},
{
"equals": [
"#int(outputs('Compose_2'))",
4
]
},
{
"equals": [
"#int(outputs('Compose_2'))",
5
]
}
]
},
"runAfter": {
"Compose_2": [
"Succeeded"
]
},
"type": "If"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {},
"triggers": {
"Recurrence": {
"evaluatedRecurrence": {
"frequency": "Day",
"interval": 1,
"schedule": {
"hours": [
"17"
],
"minutes": [
16
]
},
"startTime": "2021-12-28T17:14:00",
"timeZone": "India Standard Time"
},
"recurrence": {
"frequency": "Day",
"interval": 1,
"startTime": "2021-12-28T17:14:00",
"timeZone": "India Standard Time"
},
"type": "Recurrence"
}
}
},
"parameters": {}
}
Note:
Running the logic app every day may result in more billing, and Logic Apps doesn't support CRON expressions. In these scenarios it is suggested to use an Azure Functions timer trigger instead of a logic app.
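For example, an Azure Functions timer trigger accepts an NCRONTAB expression directly. A minimal function.json sketch (the binding name and schedule below are illustrative placeholders, not values from the question):
{
  "bindings": [
    {
      "name": "dailyTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 8 * * *"
    }
  ]
}
Here "0 0 8 * * *" means 08:00 UTC every day, in the {second} {minute} {hour} {day} {month} {day-of-week} format; the function body would still need to work out whether the current date is the third business day of the month.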
There are three major options I can see, two of which I'll entertain here. The main reason I've left out the Inline JavaScript action is cost: it requires an Integration Account, which will incur substantially more cost than either of the options below.
Prerequisites
The trigger should be a recurrence every day. It will still need to run, it's just whether or not it does anything. There's no getting away from that.
Initialise a variable of type String with the current date. I've included a formula that takes UTC and converts it to your local timezone; you'd just need to change it accordingly.
convertTimeZone(utcNow(), 'UTC', 'AUS Eastern Standard Time')
Option 1 - Azure Function
An Azure Function is by far the easiest way. Create a .NET function app (or whatever language you're comfortable with, though then you won't be able to use the code below) and, from there, create a function called GetBusinessDayOfMonthForDate with the following code ...
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    // Date to evaluate, passed as a query string parameter, e.g. ?Date=2021-12-03
    DateTime dateToProcess = DateTime.Parse(req.Query["Date"]);
    int businessDayCount = 0;
    var tempDate = dateToProcess;
    // Walk backwards to the start of the month, counting every day whose name
    // doesn't start with "S" (i.e. skipping Saturday and Sunday).
    while (tempDate.Month == dateToProcess.Month)
    {
        businessDayCount += (tempDate.DayOfWeek.ToString().Substring(0, 1) != "S") ? 1 : 0;
        tempDate = tempDate.AddDays(-1);
    }
    // The count is the business-day ordinal of the supplied date within its month.
    return new OkObjectResult(businessDayCount.ToString());
}
Now call that from Logic Apps directly after the previous action ...
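As a rough sketch, one way to call it is with a plain HTTP action; the action name, function URL, runAfter reference and variable name below are illustrative placeholders (and any function key/authentication is omitted):
"Get_Business_Day_Of_Month": {
  "type": "Http",
  "inputs": {
    "method": "GET",
    "uri": "https://<your-function-app>.azurewebsites.net/api/GetBusinessDayOfMonthForDate",
    "queries": {
      "Date": "@variables('Current Date')"
    }
  },
  "runAfter": {
    "Initialize_Current_Date": [
      "Succeeded"
    ]
  }
}
A condition after this action can then compare the returned body to 3 and either carry on or terminate.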
Option 2 - Standard Actions
The basic premise of this approach is the same as the Azure Function but naturally, it's a lot more long winded.
This is the JSON definition for that approach, it includes everything you need to test it in your own environment.
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Condition": {
"actions": {},
"else": {
"actions": {
"Terminate": {
"inputs": {
"runStatus": "Cancelled"
},
"runAfter": {},
"type": "Terminate"
}
}
},
"expression": {
"and": [
{
"equals": [
"#variables('Business Days Up to Today')",
3
]
}
]
},
"runAfter": {
"Until_Day_Index_=_-5": [
"Succeeded"
]
},
"type": "If"
},
"Initialize_Business_Days_Up_to_Today": {
"inputs": {
"variables": [
{
"name": "Business Days Up to Today",
"type": "integer",
"value": 0
}
]
},
"runAfter": {
"Initialize_Temp_Date": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Initialize_Current_Date": {
"inputs": {
"variables": [
{
"name": "Current Date",
"type": "string",
"value": "#{convertTimeZone(utcNow(), 'UTC', 'AUS Eastern Standard Time')}"
}
]
},
"runAfter": {},
"type": "InitializeVariable"
},
"Initialize_Day_Index": {
"inputs": {
"variables": [
{
"name": "Day Index",
"type": "integer",
"value": 0
}
]
},
"runAfter": {
"Initialize_Business_Days_Up_to_Today": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Initialize_Temp_Date": {
"inputs": {
"variables": [
{
"name": "Temp Date",
"type": "string",
"value": "#variables('Current Date')"
}
]
},
"runAfter": {
"Initialize_Current_Date": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Until_Day_Index_=_-5": {
"actions": {
"Decrement_Current_Date": {
"inputs": {
"name": "Temp Date",
"value": "#{addDays(variables('Current Date'), variables('Day Index'))}"
},
"runAfter": {},
"type": "SetVariable"
},
"Decrement_Day_Index": {
"inputs": {
"name": "Day Index",
"value": 1
},
"runAfter": {
"Increment_variable": [
"Succeeded"
]
},
"type": "DecrementVariable"
},
"Increment_variable": {
"inputs": {
"name": "Business Days Up to Today",
"value": "#if(and(greaterOrEquals(dayOfWeek(variables('Temp Date')), 1), lessOrEquals(dayOfWeek(variables('Temp Date')), 6)), 1, 0)"
},
"runAfter": {
"Decrement_Current_Date": [
"Succeeded"
]
},
"type": "IncrementVariable"
}
},
"expression": "#equals(variables('Day Index'), -5)",
"limit": {
"count": 60,
"timeout": "PT1H"
},
"runAfter": {
"Initialize_Day_Index": [
"Succeeded"
]
},
"type": "Until"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {},
"triggers": {
"Recurrence": {
"evaluatedRecurrence": {
"frequency": "Day",
"interval": 1,
"timeZone": "AUS Eastern Standard Time"
},
"recurrence": {
"frequency": "Day",
"interval": 1,
"timeZone": "AUS Eastern Standard Time"
},
"type": "Recurrence"
}
}
},
"parameters": {}
}
At the base of all that, throw in a condition to check whether the current business day variable equals 3; if it does, run your logic.
That's included in the JSON definition above.
One of those approaches is what I'd be doing, so I hope that answers your question.
Related
I've got a Logic App that reads in an email from a form submission. Unfortunately, I only have access to the email and not where the form results are stored, so I currently have the Logic App set up to pull in the email and convert it to text.
If it was just one field, I assume I could just grab it all, throw it in a variable and split it to get what I need, but I'm not sure how to do this with multiple lines.
The emails always come in with this format:
---
Date: 08/18/2022
Time: 09:30:00
Requestor Name: Robert Bobson
Requestor Email: bob@companyname.com
Requestor Phone Number: 800-867-5309
Site Name: CompanyName
Site Number: 123456789
---
Am I able to set a line index and then grab everything on that line and then split it before assigning it to a variable? Is this going to require RegEx or is there a workaround or expression in Logic Apps that will handle this?
Thank you in advance!
You can achieve your requirement using expressions.
First, try removing the extra blank lines from the given email. Below is the expression I'm using to achieve this:
split(outputs('Compose'),'\n\n')
The above expression results in:
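For the sample email above, that split should produce an array along these lines:
[
  "Date: 08/18/2022",
  "Time: 09:30:00",
  "Requestor Name: Robert Bobson",
  "Requestor Email: bob@companyname.com",
  "Requestor Phone Number: 800-867-5309",
  "Site Name: CompanyName",
  "Site Number: 123456789"
]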
Am I able to set a line index and then grab everything on that line and then split it before assigning it to a variable?
Yes, this is possible using the below expressions:
outputs('Compose_2')?[0] gives the Date
outputs('Compose_2')?[1] gives the Time
...
RESULTS:
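If you only want the value portion of a line, one option (assuming the value itself never contains a colon followed by a space) is to split each element again on ': ' and take the last part, for example:
last(split(outputs('Compose_2')?[0], ': ')) gives 08/18/2022
last(split(outputs('Compose_2')?[1], ': ')) gives 09:30:00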
Alternatively, you can convert the string into parsable JSON and then parse it through the Parse JSON action. Below is the flow of my logic app.
In the above step I'm extracting the key (left side) and value (right side) of each line and storing them in a variable. You can use either the slice or the substring function to extract them, as sketched below.
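Concretely, for one element of the split array (written as line here, standing in for outputs('Compose_2')?[variables('length')] in the code view below):
substring(line, 0, indexOf(line, ':')) gives the key (the text before the colon)
slice(line, add(indexOf(line, ':'), 2), length(line)) gives the value (the text after ': ')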
In the next step I'm parsing the values through the Parse JSON action.
RESULTS:
You can test the same using the below code view in your logic app:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose": {
"inputs": "Date: 08/18/2022\n\nTime: 09:30:00\n\nRequestor Name: Robert Bobson\n\nRequestor Email: bob#companyname.com\n\nRequestor Phone Number: 800-867-5309\n\nSite Name: CompanyName\n\nSite Number: 123456789",
"runAfter": {},
"type": "Compose"
},
"Compose_2": {
"inputs": "#split(outputs('Compose'),'\n\n')",
"runAfter": {
"Compose": [
"Succeeded"
]
},
"type": "Compose"
},
"Compose_3": {
"inputs": "#body('Parse_JSON')?['Site Name']",
"runAfter": {
"Parse_JSON": [
"Succeeded"
]
},
"type": "Compose"
},
"Initialize_variable": {
"inputs": {
"variables": [
{
"name": "length",
"type": "integer"
}
]
},
"runAfter": {
"Compose_2": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Initialize_variable_2": {
"inputs": {
"variables": [
{
"name": "Json",
"type": "string"
}
]
},
"runAfter": {
"Initialize_variable": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Parse_JSON": {
"inputs": {
"content": "#concat('{',substring(variables('Json'),0,sub(length(variables('Json')),1)),'}')",
"schema": {
"properties": {
"Date": {
"type": "string"
},
"Requestor Email": {
"type": "string"
},
"Requestor Name": {
"type": "string"
},
"Requestor Phone Number": {
"type": "string"
},
"Site Name": {
"type": "string"
},
"Site Number": {
"type": "string"
},
"Time": {
"type": "string"
}
},
"type": "object"
}
},
"runAfter": {
"Until": [
"Succeeded"
]
},
"type": "ParseJson"
},
"Until": {
"actions": {
"Append_to_string_variable": {
"inputs": {
"name": "Json",
"value": "#{outputs('Current_Object')},"
},
"runAfter": {
"Current_Object": [
"Succeeded"
]
},
"type": "AppendToStringVariable"
},
"Current_Object": {
"inputs": "\"#{substring(outputs('Compose_2')?[variables('length')],0,indexOf(outputs('Compose_2')?[variables('length')],':'))}\":\"#{slice(outputs('Compose_2')?[variables('length')],add(indexOf(outputs('Compose_2')?[variables('length')],':'),2),length(outputs('Compose_2')?[variables('length')]))}\"",
"runAfter": {},
"type": "Compose"
},
"Increment_variable": {
"inputs": {
"name": "length",
"value": 1
},
"runAfter": {
"Append_to_string_variable": [
"Succeeded"
]
},
"type": "IncrementVariable"
}
},
"expression": "#equals(variables('length'), length(outputs('Compose_2')))",
"limit": {
"count": 60,
"timeout": "PT1H"
},
"runAfter": {
"Initialize_variable_2": [
"Succeeded"
]
},
"type": "Until"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {},
"triggers": {
"manual": {
"inputs": {},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {}
}
You can convert the email body using the "Html to text" action and then assign it to an array variable by using the split expression with whatever delimiter suits the converted text.
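As a sketch, assuming the conversion action is named Html_to_text, splitting its output on newlines could look like:
split(body('Html_to_text'), decodeUriComponent('%0A'))
decodeUriComponent('%0A') is a common way to write a newline character inside a Logic Apps expression; adjust the delimiter to whatever the converted text actually contains.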
I have an Azure Logic App; I want to get the first date of the month that the date from two days ago belongs to.
For example:
1. If today is '2022-08-01' then two days ago was '2022-07-29', I want to get '2022-07-01' as result
2. If today is '2022-08-02' then two days ago was '2022-07-31', I want to get '2022-07-01' as result
3. If today is '2022-08-03' then two days ago was '2022-08-01', I want to get '2022-08-01' as result
I already know I can get the date from two days ago using "@addDays(utcNow(), -2, 'yyyy-MM-dd')", but I basically need the first date of the month this date belongs to.
You can use the startOfMonth() function after calculating the date from 2 days ago in order to achieve your requirement. Below is my logic app flow.
RESULTS:
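If you prefer a single expression instead of two separate Compose actions, the same result can be written as:
startOfMonth(addDays(utcNow(), -2), 'yyyy-MM-dd')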
You can use the below code-view to reproduce the same in your environment.
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose": {
"inputs": "#startOfMonth(outputs('Compose_2'),'yyyy-MM-dd')",
"runAfter": {
"Compose_2": [
"Succeeded"
]
},
"type": "Compose"
},
"Compose_2": {
"inputs": "#addDays(variables('Date'), -2, 'yyyy-MM-dd')",
"runAfter": {
"Initialize_variable": [
"Succeeded"
]
},
"type": "Compose"
},
"Initialize_variable": {
"inputs": {
"variables": [
{
"name": "Date",
"type": "string",
"value": "2022-08-03"
}
]
},
"runAfter": {},
"type": "InitializeVariable"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {},
"triggers": {
"manual": {
"inputs": {},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {}
}
I have an Azure logic app with a recurrence trigger that calls an API endpoint every midnight. I'm passing two date properties in my request body that should contain the start of day and the end of day.
Logic Apps has the expression startOfDay(), however there is no endOfDay(). How can I dynamically get the end of day in UTC format, like startOfDay() does?
Thanks
This is what my request body looks like, but it's also complaining about @startOfDay():
{
"organizationId": 'f41186b0-7f09-42c5-8a9d-81a2ad2b0e61',
"attemptedDeliveries": true,
"cancelDateStart": #{startOfDay()},
"cancelDateEnd": ""
}
There is no direct expression for endOfDay, but one workaround is to use addToTime and subtractFromTime on startOfDay to get the end of day. Taking the timestamp to be utcNow(), I'm using the below expression to calculate the end of day.
subtractFromTime(addToTime(outputs('startOfDay'),1,'Day','o'),1,'Second','yyyy-MM-ddTHH:mm:ss')
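Equivalently (just a variation on the same idea, not from the original workflow), you could take the start of the next day and subtract one second in a single expression:
subtractFromTime(startOfDay(addDays(utcNow(), 1)), 1, 'Second', 'yyyy-MM-ddTHH:mm:ss')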
I'm using Parse JSON in order to retrieve the inner JSON details for later use. By doing this, you can have a custom JSON created using the Compose connector.
Result:-
Codeview
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose": {
"inputs": {
"attemptedDeliveries": "#body('Parse_JSON')?['attemptedDeliveries']",
"cancelDateEnd": "#{outputs('endOfDay')}",
"cancelDateStart": "#{outputs('startOfDay')}",
"organizationId": "#{body('Parse_JSON')?['organizationId']}"
},
"runAfter": {
"endOfDay": [
"Succeeded"
]
},
"type": "Compose"
},
"Parse_JSON": {
"inputs": {
"content": "#triggerBody()",
"schema": {
"properties": {
"attemptedDeliveries": {
"type": "boolean"
},
"cancelDateEnd": {
"type": "string"
},
"cancelDateStart": {
"type": "string"
},
"organizationId": {
"type": "string"
}
},
"type": "object"
}
},
"runAfter": {},
"type": "ParseJson"
},
"endOfDay": {
"inputs": "#subtractFromTime(addToTime(startOfDay(utcNow()),1,'Day','o'),1,'Second','yyyy-MM-ddTHH:mm:ss')",
"runAfter": {
"startOfDay": [
"Succeeded"
]
},
"type": "Compose"
},
"startOfDay": {
"inputs": "#startOfDay(utcNow(),'yyyy-MM-ddTHH:mm:ss')",
"runAfter": {
"Parse_JSON": [
"Succeeded"
]
},
"type": "Compose"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {}
}
Here is the JSON that I'm receiving in the body of my HTTP trigger:
{
"organizationId": "f41186b0-7f09-42c5-8a9d-81a2ad2b0e61",
"attemptedDeliveries": true,
"cancelDateStart": "",
"cancelDateEnd": ""
}
Updated Answer
After the follow-up, here is my logic app:
Result:
Codeview
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"endOfDay": {
"inputs": "#subtractFromTime(addToTime(startOfDay(utcNow()),1,'Day','o'),1,'Second','yyyy-MM-ddTHH:mm:ss')",
"runAfter": {
"startOfDay": [
"Succeeded"
]
},
"type": "Compose"
},
"startOfDay": {
"inputs": "#startOfDay(utcNow(),'yyyy-MM-ddTHH:mm:ss')",
"runAfter": {},
"type": "Compose"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {},
"triggers": {
"Recurrence": {
"evaluatedRecurrence": {
"frequency": "Second",
"interval": 30
},
"recurrence": {
"frequency": "Second",
"interval": 30
},
"type": "Recurrence"
}
}
},
"parameters": {}
}
After the endOfDay action you can add 4 parallel HTTP actions and add your logic to them.
REFERENCES: Reference guide to workflow expression functions in Azure Logic Apps and Power Automate
I am trying to check the file count in an array and perform the below actions:
If Count is 0, No action
If Count is 1, perform a specific action
If Count is greater than 1, perform another set of actions.
I am using the length expression to check the file count of the array: length(body('Filter_blobs_which_added_in_last_15min'))
Currently I am following the below logic (a 2-step condition: first check whether the value is zero, then check whether the value is 1 or greater than 1). Is there any way I can combine this into a single condition?
You can achieve this by using a Switch in your workflow, as shown below.
We have created a logic app which calculates the number of blobs present in the storage account using a Compose action and the length function; based on the blob count, the associated actions of the switch statement are executed.
We have also tested this in our environment and it is working fine.
Here is the logic app Designer Image:
Here is the code view of the logic app:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose": {
"inputs": "#length(body('Lists_blobs_(V2)')?['value'])",
"runAfter": {
"Lists_blobs_(V2)": [
"Succeeded"
]
},
"type": "Compose"
},
"Lists_blobs_(V2)": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/foldersV2/#{encodeURIComponent(encodeURIComponent('JTJmY29udDE='))}",
"queries": {
"nextPageMarker": "",
"useFlatListing": false
}
},
"metadata": {
"JTJmY29udDE=": "/cont1"
},
"runAfter": {},
"type": "ApiConnection"
},
"Switch": {
"cases": {
"Case": {
"actions": {
"Compose_4": {
"inputs": "length of blob are #{outputs('Compose')}",
"runAfter": {},
"type": "Compose"
}
},
"case": 0
},
"Case_2": {
"actions": {
"Compose_3": {
"inputs": "length of blobs are #{outputs('Compose')}",
"runAfter": {},
"type": "Compose"
}
},
"case": 1
}
},
"default": {
"actions": {
"Compose_2": {
"inputs": "length of blobs are #{outputs('Compose')}",
"runAfter": {},
"type": "Compose"
}
}
},
"expression": "#outputs('Compose')",
"runAfter": {
"Compose": [
"Succeeded"
]
},
"type": "Switch"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"Recurrence": {
"evaluatedRecurrence": {
"frequency": "Minute",
"interval": 3
},
"recurrence": {
"frequency": "Minute",
"interval": 3
},
"type": "Recurrence"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/<subId>/resourceGroups/<rgName>/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/<subId>/providers/Microsoft.Web/locations/eastus/managedApis/azureblob"
}
}
}
}
}
Here is the sample output for reference:
WHAT I am trying to do:
I am trying to retrieve the sign-in logs of all users for the last 24 hours and save them in blob storage. After the first result set creates the blob, the next result sets would update the blob file with the remaining results.
I thought of using blob storage and MS Graph because the Graph output contains all the details that I want without having to jump through various hoops in PowerShell to expand certain properties, and because the result size is huge (over 1 GB via Export-CSV in PowerShell).
HOW I'm trying to do it
A scheduled run does an HTTP request with the Graph query filtered to the last 24h, which creates a blob with the HTTP body as content. After creation of the blob, I added a (Do) Until control that runs until the HTTP body no longer contains @odata.nextLink and updates the blob file.
ISSUES:
The first issue is that the Until loop finishes in 6 seconds.
The second issue is that the blob file only contains the first result set and is usually 9.3 MB in size, which means the next result sets are not accessed and appended to the existing blob file.
My research
I tried with pagination enabled and disabled, various pagination thresholds, and custom functions, but nothing that would make sense (to me at least), and I'm trying to follow the KISS model.
I looked over and tried to apply, in one shape or another, the answers from the below S.O. questions:
Graph Pagination in Logic Apps | Pagination with oauth azure data factory | Microsoft graph, batch request's nextLink | https://learn.microsoft.com/en-us/graph/paging;
Code I am trying
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Create_blob": {
"inputs": {
"body": "#body('fRequest')",
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "post",
"path": "/datasets/default/files",
"queries": {
"folderPath": "/graph",
"name": "DoUntil",
"queryParametersSingleEncoded": true
}
},
"runAfter": {
"fRequest": [
"Succeeded"
]
},
"runtimeConfiguration": {
"contentTransfer": {
"transferMode": "Chunked"
}
},
"type": "ApiConnection"
},
"Until": {
"actions": {
"Update_blob": {
"inputs": {
"body": "#body('fRequest')",
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "put",
"path": "/datasets/default/files/#{encodeURIComponent(encodeURIComponent('/graph/DoUntil'))}"
},
"runAfter": {},
"type": "ApiConnection"
}
},
"expression": "#not(contains(body('fRequest'), '#odata.nextLink'))",
"limit": {
"count": 60,
"timeout": "PT1H"
},
"runAfter": {
"Create_blob": [
"Succeeded"
]
},
"type": "Until"
},
"fRequest": {
"inputs": {
"authentication": {
"audience": "https://graph.microsoft.com",
"clientId": "registered_app",
"secret": "app_secret",
"tenant": "tenant_id",
"type": "ActiveDirectoryOAuth"
},
"method": "GET",
"uri": "https://graph.microsoft.com/beta/auditLogs/signIns?$filter=createdDateTime gt #{addDays(utcNow(),-1)}"
},
"runAfter": {},
"runtimeConfiguration": {
"paginationPolicy": {
"minimumItemCount": 500
}
},
"type": "Http"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"Recurrence": {
"recurrence": {
"frequency": "Week",
"interval": 7,
"schedule": {
"hours": [
"7"
],
"minutes": [
0
]
}
},
"type": "Recurrence"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/subscription_id/resourceGroups/Apps/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/subscription_id/providers/Microsoft.Web/locations/eastus/managedApis/azureblob"
}
}
}
}
}
What am I doing wrong or missing?
Thanks in advance!
I managed to increase the pagination threshold to 20000, and now my files are no longer 9 MB; they reach 200 MB in size. I also removed the "Do Until" loop. Now I only need to create a break to avoid the threshold and resume collecting the remaining pages of results.
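For collecting the remaining pages, a common pattern is to store @odata.nextLink in a variable and keep requesting it until it is empty. The fragment below is only a sketch: the action and variable names are illustrative, the blob update inside the loop is omitted, and it assumes the built-in pagination policy on fRequest has been removed.
"Initialize_NextLink": {
  "type": "InitializeVariable",
  "inputs": {
    "variables": [
      {
        "name": "NextLink",
        "type": "string",
        "value": "@{coalesce(body('fRequest')?['@odata.nextLink'], '')}"
      }
    ]
  },
  "runAfter": {
    "Create_blob": [
      "Succeeded"
    ]
  }
},
"Until_No_More_Pages": {
  "type": "Until",
  "expression": "@empty(variables('NextLink'))",
  "limit": {
    "count": 500,
    "timeout": "PT2H"
  },
  "actions": {
    "Get_Next_Page": {
      "type": "Http",
      "inputs": {
        "method": "GET",
        "uri": "@variables('NextLink')",
        "authentication": {
          "type": "ActiveDirectoryOAuth",
          "audience": "https://graph.microsoft.com",
          "clientId": "registered_app",
          "secret": "app_secret",
          "tenant": "tenant_id"
        }
      },
      "runAfter": {}
    },
    "Set_NextLink": {
      "type": "SetVariable",
      "inputs": {
        "name": "NextLink",
        "value": "@{coalesce(body('Get_Next_Page')?['@odata.nextLink'], '')}"
      },
      "runAfter": {
        "Get_Next_Page": [
          "Succeeded"
        ]
      }
    }
  },
  "runAfter": {
    "Initialize_NextLink": [
      "Succeeded"
    ]
  }
}
Bear in mind that an Until loop always runs at least once, so you may want to guard it with a condition that checks the variable isn't empty, and each iteration would also need to append that page's value array to the blob.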