Azure Logic App for getting data from SQL to FTP

I have a task of taking data from SQL and uploading it as a CSV file to an FTP server.
I've done this for a single SQL row just fine. The problem I'm having is looping over all rows (a foreach loop) and inserting those rows as the content of the CSV file. I've tried an FTP Create File action inside a foreach loop, but I can only access a single row at a time to set as the file's content - I need all the rows!
Also keep in mind that these files will have 200k+ rows.
I could of course just write a C# console app for this, but the ease with which I got this far without writing any code makes it seem like a worthwhile endeavor.

We recently added a "Table" primitive for this scenario. Support in the designer is still a work in progress, but you can use it in code view.
In the scenario below, I'm getting rows from a table in SQL Azure, producing a CSV with two columns (First Name, Last Name) from the SQL query data, then sending it via e-mail.
"Get_rows": {
"inputs": {
"host": {
"api": {
"runtimeUrl": "https://logic-apis-southcentralus.azure-apim.net/apim/sql"
},
"connection": {
"name": "#parameters('$connections')['sql']['connectionId']"
}
},
"method": "get",
"path": "/datasets/default/tables/#{encodeURIComponent(encodeURIComponent('[SalesLT].[Customer]'))}/items",
"queries": {
"$top": 10
}
},
"runAfter": {},
"type": "ApiConnection"
},
"tableCsv0": {
"inputs": {
"columns": [
{
"header": "First Name",
"value": "#item()?['FirstName']"
},
{
"header": "Last Name",
"value": "#item()?['LastName']"
}
],
"format": "csv",
"from": "#body('Get_rows')?['value']"
},
"runAfter": {
"Get_rows": [
"Succeeded"
]
},
"type": "Table"
},
"Send_an_email": {
"inputs": {
"body": {
"Body": "#body('tableCsv0')",
"Subject": "Subject",
"To": "deli#microsoft.com"
},
"host": {
"api": {
"runtimeUrl": "https://logic-apis-southcentralus.azure-apim.net/apim/office365"
},
"connection": {
"name": "#parameters('$connections')['office365']['connectionId']"
}
},
"method": "post",
"path": "/Mail"
},
"runAfter": {
"tableCsv0": [
"Succeeded"
]
},
"type": "ApiConnection"
}

Just following up to show how Derek's answer helped me with my problem of getting a large number of rows up to a file on an FTP server. I ended up using the output body of the Execute stored procedure action, as the Get rows action was limited to 512 rows.
NOTE: As the Table action is not available in the designer yet, do everything in the code view; opening the designer caused issues and at one point deleted all my code.
"actions": {
"Create_file": {
"inputs": {
"body": "#body('tableCsv0')",
"host": {
"api": {
"runtimeUrl": "https://logic-apis-northeurope.azure-apim.net/apim/ftp"
},
"connection": {
"name": "#parameters('$connections')['ftp']['connectionId']"
}
},
"method": "post",
"path": "/datasets/default/files",
"queries": {
"folderPath": "transactions/ready/ecommerce/tickets_test/",
"name": "grma_tickets_#{formatDateTime(utcNow(),'yyyyMMdd_hhmmss')}.csv"
}
},
"runAfter": {
"tableCsv0": [
"Succeeded"
]
},
"type": "ApiConnection"
},
"Execute_stored_procedure": {
"inputs": {
"host": {
"api": {
"runtimeUrl": "https://logic-apis-northeurope.azure-apim.net/apim/sql"
},
"connection": {
"name": "#parameters('$connections')['sql']['connectionId']"
}
},
"method": "post",
"path": "/datasets/default/procedures/#{encodeURIComponent(encodeURIComponent('[Scheduledjob].[GetBArcodesForGRMA]'))}"
},
"runAfter": {},
"type": "ApiConnection"
},
"tableCsv0": {
"inputs": {
"columns": [
{
"header": "EventDateTime",
"value": "#item()?['EventDateTime']"
},
{
"header": "EventName",
"value": "#item()?['EventName']"
}
],
"format": "csv",
"from": "#body('Execute_stored_procedure')['ResultSets']['Table1']"
},
"runAfter": {
"Execute_stored_procedure": [
"Succeeded"
]
},
"type": "Table"
}

Related

How to grab a column value from an ADLS Gen2 CSV file and use it in the body of the email, and also send the blob data as an attachment to Outlook mail

Here is my scenario:
A CSV file is dropped into blob storage every day; it is processed by my dataflow in ADF, which generates a CSV in an output folder.
Now, using Logic Apps, I need to send that CSV file (less than 10 MB) as an attachment to the customer via the Outlook connector.
Besides that, the body of the email must contain a dynamic value coming from that blob CSV.
For example, 'AppWorks' is the value in the column 'Works/not'; sometimes it may be 'AppNotWorks'. How do I handle this scenario in Azure Logic Apps?
You can use a combination of Data Factory and Logic Apps to do this. Use a Lookup activity to get the first row of the file (since the column value is the same for every row, we can get the required value from one row).
Now use a Web activity to trigger the logic app. Pass the logic app's HTTP request URL to the Web activity. In the body, pass the following dynamic content:
@activity('Lookup1').output.firstRow
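For illustration only, here is a minimal sketch of how that Web activity could be serialized in the ADF pipeline JSON. The activity names (Lookup1, Web1), the headers, and the placeholder URL are assumptions, not values taken from the original pipeline:
{
    "name": "Web1",
    "type": "WebActivity",
    "dependsOn": [
        { "activity": "Lookup1", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "url": "<logic-app-HTTP-POST-URL>",
        "method": "POST",
        "headers": { "Content-Type": "application/json" },
        "body": {
            "value": "@activity('Lookup1').output.firstRow",
            "type": "Expression"
        }
    }
}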
When you debug the pipeline, the logic app is triggered successfully. I have provided the Request Body JSON schema so the values can be used individually. For the sample I have taken, it looks as shown below:
{
"properties": {
"customer": {
"type": "string"
},
"id": {
"type": "string"
}
},
"type": "object"
}
Create a connection to the storage account to link the required file.
Now, using the Outlook connector, send the Email.
The following is the entire Logic app JSON:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Get_blob_content_(V2)": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files/#{encodeURIComponent(encodeURIComponent('JTJmZGF0YSUyZnNhbXBsZTEuY3N2'))}/content",
"queries": {
"inferContentType": true
}
},
"metadata": {
"JTJmZGF0YSUyZnNhbXBsZTEuY3N2": "/data/sample1.csv"
},
"runAfter": {},
"type": "ApiConnection"
},
"Send_an_email_(V2)": {
"inputs": {
"body": {
"Attachments": [
{
"ContentBytes": "#{base64(body('Get_blob_content_(V2)'))}",
"Name": "sample1.csv"
}
],
"Body": "<p>Hi #{triggerBody()?['customer']},<br>\n<br>\nRandom description</p>",
"Importance": "Normal",
"Subject": "sample data",
"To": "<to_email>"
},
"host": {
"connection": {
"name": "#parameters('$connections')['office365']['connectionId']"
}
},
"method": "post",
"path": "/v2/Mail"
},
"runAfter": {
"Get_blob_content_(V2)": [
"Succeeded"
]
},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {
"properties": {
"customer": {
"type": "string"
},
"id": {
"type": "string"
}
},
"type": "object"
}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/xxx/providers/Microsoft.Web/locations/westus2/managedApis/azureblob"
},
"office365": {
"connectionId": "/subscriptions/xxx/resourceGroups/v-sarikontha-Mindtree/providers/Microsoft.Web/connections/office365",
"connectionName": "office365",
"id": "/subscriptions/xxx/providers/Microsoft.Web/locations/westus2/managedApis/office365"
}
}
}
}
}
The following is the resulting Mail image for reference:

How do I parse a JSON file from Azure Blob Storage into Azure SQL using an Azure Logic App?

I have multiple JSON files dropping into blob storage weekly. I want to use an Azure Logic App to parse the JSON files and copy the data into Azure SQL. Please help.
To achieve your requirement, below is the flow that you can follow:
Blob trigger (When a blob is added or modified (properties only) (V2)) >> Get blob content using path (V2) >> Parse JSON >> SQL-related action (for instance, I'm using Insert row (V2)).
Below is the sample JSON that I'm uploading to my container.
{
"employees": {
"emp_name": "abc",
"hire_date": "2022-10-23",
"salary": 10000
}
}
I'm using the trigger's path to get the content of the blob. While inserting the row, I'm using the Parse JSON values. Below is my logic app.
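In code view, the Get blob content using path (V2) action of that trigger-based flow could look roughly like the following sketch; the Path property of the blob trigger's output is an assumption, and the rest mirrors the updated code further down:
"Get_blob_content_using_path_(V2)": {
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['azureblob']['connectionId']"
            }
        },
        "method": "get",
        "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/GetFileContentByPath",
        "queries": {
            "inferContentType": true,
            "path": "@triggerBody()?['Path']",
            "queryParametersSingleEncoded": true
        }
    },
    "runAfter": {},
    "type": "ApiConnection"
}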
Result:
UPDATED ANSWER
As per your requirement, you can either trigger the flow manually or set a recurrence trigger so the flow runs on a schedule, and then list all the files in that particular container of the storage account. Here is how the flow looks:
detailed flow
RESULTS:
code view of my logic app
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"For_each": {
"actions": {
"Get_blob_content_using_path_(V2)": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/GetFileContentByPath",
"queries": {
"inferContentType": true,
"path": "#items('For_each')?['Path']",
"queryParametersSingleEncoded": true
}
},
"runAfter": {},
"type": "ApiConnection"
},
"Insert_row_(V2)": {
"inputs": {
"body": {
"emp_id": "#body('Parse_JSON')?['employees']?['employee_id']",
"emp_name": "#body('Parse_JSON')?['employees']?['emp_name']",
"hire_date": "#body('Parse_JSON')?['employees']?['hire_date']",
"salary": "#body('Parse_JSON')?['employees']?['salary']"
},
"host": {
"connection": {
"name": "#parameters('$connections')['sql']['connectionId']"
}
},
"method": "post",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('default'))},#{encodeURIComponent(encodeURIComponent('default'))}/tables/#{encodeURIComponent(encodeURIComponent('[dbo].[employees]'))}/items"
},
"runAfter": {
"Parse_JSON": [
"Succeeded"
]
},
"type": "ApiConnection"
},
"Parse_JSON": {
"inputs": {
"content": "#json(body('Get_blob_content_using_path_(V2)'))",
"schema": {
"properties": {
"employees": {
"properties": {
"emp_name": {
"type": "string"
},
"employee_id": {
"type": "integer"
},
"hire_date": {
"type": "string"
},
"salary": {
"type": "integer"
}
},
"type": "object"
}
},
"type": "object"
}
},
"runAfter": {
"Get_blob_content_using_path_(V2)": [
"Succeeded"
]
},
"type": "ParseJson"
}
},
"foreach": "#body('Lists_blobs_(V2)')?['value']",
"runAfter": {
"Lists_blobs_(V2)": [
"Succeeded"
]
},
"type": "Foreach"
},
"Lists_blobs_(V2)": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/foldersV2/#{encodeURIComponent(encodeURIComponent('JTJmY29udGFpbmVyMQ=='))}",
"queries": {
"nextPageMarker": "",
"useFlatListing": false
}
},
"metadata": {
"JTJmY29udGFpbmVyMQ==": "/container1"
},
"runAfter": {},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/<SUBSCRIPTION ID>/resourceGroups/<RESOURCE GROUP NAME>/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/<SUBSCRIPTION ID>/providers/Microsoft.Web/locations/centralus/managedApis/azureblob"
},
"sql": {
"connectionId": "/subscriptions/<SUBSCRIPTION ID>/resourceGroups/<RESOURCE GROUP NAME>/providers/Microsoft.Web/connections/sql",
"connectionName": "sql",
"id": "/subscriptions/<SUBSCRIPTION ID>/providers/Microsoft.Web/locations/centralus/managedApis/sql"
}
}
}
}
}

Azure Logic App: how to append listed blobs to an array variable?

Hi, I am working on an Azure logic app. I have the array variable below. These are the main folder paths, and inside them I have files.
["/mycontainer/PreDesign/1e36d504-7876-41b1-89b3-83d2132fa7c4/AdditionalDocumets","/mycontainer/PreDesign/1e36d504-7876-41b1-89b3-83d2132fa7c4/TowerCalcOutPut","/mycontainer/PreDesign/1e36d504-7876-41b1-89b3-83d2132fa7c4/TowerDataSheet"]
Then I have the method below to list the blobs.
Here the blob list is the array variable defined above. I then added List blobs, which loops through each path defined in the array variable and lists the blobs, and I want to append those blobs to an array variable.
In Append to array variable I am only able to append the blobs inside the first array element, that is /mycontainer/PreDesign/1e36d504-7876-41b1-89b3-83d2132fa7c4/AdditionalDocumets.
I am not able to append the blobs inside /mycontainer/PreDesign/1e36d504-7876-41b1-89b3-83d2132fa7c4/TowerCalcOutPut and /mycontainer/PreDesign/1e36d504-7876-41b1-89b3-83d2132fa7c4/TowerDataSheet; only the first element's blobs get appended. I am struggling to append all the blobs to the array variable. Can someone help me see where exactly I am going wrong? Any help would be appreciated. Thank you.
I am struggling to append all the blobs to array variable. Can someone help me where exactly I am missing?
It's because the flow goes into only the first folder, not all the folders.
Here is one workaround that you can try. As mentioned in How to attach multiple blobs in logic app from different folder, just add Append to array variable at the end of For each 2 and you will have the paths for all the blobs in your variable. Below is a screenshot of my logic app for your reference.
output
Here is the code view of my logic app
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose": {
"inputs": "#variables('BlobList')",
"runAfter": {
"For_each": [
"Succeeded"
]
},
"type": "Compose"
},
"For_each": {
"actions": {
"For_each_2": {
"actions": {
"For_each_3": {
"actions": {
"Append_to_array_variable": {
"inputs": {
"name": "BlobList",
"value": "#items('For_each_3')?['Path']"
},
"runAfter": {},
"type": "AppendToArrayVariable"
}
},
"foreach": "#body('Lists_blobs_in_Directory')?['value']",
"runAfter": {
"Lists_blobs_in_Directory": [
"Succeeded"
]
},
"type": "Foreach"
},
"Lists_blobs_in_Directory": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/foldersV2/#{encodeURIComponent(encodeURIComponent(items('For_each_2')?['Path']))}",
"queries": {
"nextPageMarker": ""
}
},
"runAfter": {},
"type": "ApiConnection"
}
},
"foreach": "#body('Lists_Directories_inside_Container')?['value']",
"runAfter": {
"Lists_Directories_inside_Container": [
"Succeeded"
]
},
"type": "Foreach"
},
"Lists_Directories_inside_Container": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/foldersV2/#{encodeURIComponent(encodeURIComponent(items('For_each')?['Path']))}",
"queries": {
"nextPageMarker": ""
}
},
"runAfter": {},
"type": "ApiConnection"
}
},
"foreach": "#body('Lists_Containers_in_root_folder')?['value']",
"runAfter": {
"Lists_Containers_in_root_folder": [
"Succeeded"
]
},
"type": "Foreach"
},
"Initialize_BlobList_variable": {
"inputs": {
"variables": [
{
"name": "BlobList",
"type": "array"
}
]
},
"runAfter": {},
"type": "InitializeVariable"
},
"Lists_Containers_in_root_folder": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/foldersV2",
"queries": {
"nextPageMarker": "",
"useFlatListing": false
}
},
"runAfter": {
"Initialize_BlobList_variable": [
"Succeeded"
]
},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/<YOUR_SUBSCRIPTION_ID>/resourceGroups/<YOUR_RESOURCE_GROUP>/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/<YOUR_SUBSCRIPTION_ID>/providers/Microsoft.Web/locations/northcentralus/managedApis/azureblob"
}
}
}
}
}

Send email with multiple attachments using an Azure Logic App

I need to send the blobs uploaded to my Azure storage container as attachments. The number of files uploaded to the container will change, so I need a dynamic method for the attachments.
I have reviewed this related question.
I am using the logic below:
Append to array variable value:
{
"Name": items('For_each')?['DisplayName'],
"ContentBytes": body('Get_blob_content')
}
When I try to save the logic app, I get the error below:
Save logic app failed
Failed to save logic app testing. The template validation failed: 'The action(s) 'Get_blob_content' referenced by 'inputs' in action 'Append_to_array_variable' are not defined in the template.'.
How can I solve this?
Based on the error message shared above, instead of saving the entire workflow at once, I would suggest saving the logic app at each stage (or at least before the Append to array variable stage), and only then appending the values to the attachments variable using the previous stage's outputs.
Based on the above requirement, we created the logic app below in our environment and tested it; it is working fine.
In our workflow, we used a For each to loop over the blobs from the List blobs action. Within the For each you can use Get blob content to get each blob's content, and then use Append to array variable to append the attachments.
The expressions Name and ContentBytes are as follows:
"ContentBytes": "#base64(body('Get_blob_content_(V2)'))",
"Name": "#items('For_each')?['DisplayName']"
Here is the code view of the logic app that we created:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"For_each": {
"actions": {
"Append_to_array_variable": {
"inputs": {
"name": "attachments",
"value": {
"ContentBytes": "#base64(body('Get_blob_content_(V2)'))",
"Name": "#items('For_each')?['DisplayName']"
}
},
"runAfter": {
"Get_blob_content_(V2)": [
"Succeeded"
]
},
"type": "AppendToArrayVariable"
},
"Get_blob_content_(V2)": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files/#{encodeURIComponent(encodeURIComponent(items('For_each')?['Path']))}/content",
"queries": {
"inferContentType": true
}
},
"runAfter": {},
"type": "ApiConnection"
}
},
"foreach": "#body('Lists_blobs_(V2)')?['value']",
"runAfter": {
"Initialize_variable": [
"Succeeded"
]
},
"type": "Foreach"
},
"Initialize_variable": {
"inputs": {
"variables": [
{
"name": "attachments",
"type": "array"
}
]
},
"runAfter": {
"Lists_blobs_(V2)": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Lists_blobs_(V2)": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/foldersV2/#{encodeURIComponent(encodeURIComponent('JTJmcmVwb3J0cw=='))}",
"queries": {
"nextPageMarker": "",
"useFlatListing": false
}
},
"metadata": {
"JTJmcmVwb3J0cw==": "/reports"
},
"runAfter": {},
"type": "ApiConnection"
},
"Send_an_email_(V2)": {
"inputs": {
"body": {
"Attachments": "#variables('attachments')",
"Body": "<p>tested logic app flow successfully</p>",
"Subject": "blob test",
"To": "<username>#microsoft.com"
},
"host": {
"connection": {
"name": "#parameters('$connections')['office365']['connectionId']"
}
},
"method": "post",
"path": "/v2/Mail"
},
"runAfter": {
"For_each": [
"Succeeded"
]
},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"When_a_blob_is_added_or_modified_(properties_only)_(V2)": {
"evaluatedRecurrence": {
"frequency": "Minute",
"interval": 1
},
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/triggers/batch/onupdatedfile",
"queries": {
"checkBothCreatedAndModifiedDateTime": false,
"folderId": "JTJmcmVwb3J0cw==",
"maxFileCount": 10
}
},
"metadata": {
"JTJmcmVwb3J0cw==": "/reports"
},
"recurrence": {
"frequency": "Minute",
"interval": 1
},
"splitOn": "#triggerBody()",
"type": "ApiConnection"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/<sub-ID>/resourceGroups/<resourceGroup>/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/<sub-id>/providers/Microsoft.Web/locations/eastus/managedApis/azureblob"
},
"office365": {
"connectionId": "/subscriptions/<sub-id>/resourceGroups/<resroucegroup>/providers/Microsoft.Web/connections/office365",
"connectionName": "office365",
"id": "/subscriptions/<sub-id>/providers/Microsoft.Web/locations/eastus/managedApis/office365"
}
}
}
}
}
Here is the sample Output for reference:

MS Graph pagination in Logic App with result export in Azure Blob storage

WHAT I am trying to do:
I am trying to retrieve the sign-in logs of all users for the last 24 hours and save them in blob storage. After the first result set creates the blob, the next result sets would update the blob file with the remaining results.
I thought of using blob storage and MS Graph because the Graph output contains all the details I want without having to jump through various hoops in PowerShell to expand certain properties, and because the result size is huge (over 1 GB via Export-CSV in PowerShell).
HOW I'm trying to do it
A scheduled run does an HTTP request with the Graph query filtered to the last 24 hours and creates a blob with the HTTP body as its content. After creation of the blob, I added a (Do) Until control that runs until the HTTP body no longer contains @odata.nextLink, updating the blob file on each iteration.
ISSUES:
The first issue is that the Until loop finishes in 6 seconds.
The second issue is that the blob file only contains the first result set and is usually 9.3 MB in size, which means the next result sets are never fetched and appended to the existing blob file.
My research
I tried with pagination enabled and disabled, various pagination thresholds, and custom functions, but nothing that would make sense (to me at least); I'm trying to follow the KISS model.
I looked over and tried to apply, in one form or another, the answers from the S.O. questions below:
Graph Pagination in Logic Apps | Pagination with oauth azure data factory | Microsoft graph, batch request's nextLink | https://learn.microsoft.com/en-us/graph/paging
Code I am trying
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Create_blob": {
"inputs": {
"body": "#body('fRequest')",
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "post",
"path": "/datasets/default/files",
"queries": {
"folderPath": "/graph",
"name": "DoUntil",
"queryParametersSingleEncoded": true
}
},
"runAfter": {
"fRequest": [
"Succeeded"
]
},
"runtimeConfiguration": {
"contentTransfer": {
"transferMode": "Chunked"
}
},
"type": "ApiConnection"
},
"Until": {
"actions": {
"Update_blob": {
"inputs": {
"body": "#body('fRequest')",
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "put",
"path": "/datasets/default/files/#{encodeURIComponent(encodeURIComponent('/graph/DoUntil'))}"
},
"runAfter": {},
"type": "ApiConnection"
}
},
"expression": "#not(contains(body('fRequest'), '#odata.nextLink'))",
"limit": {
"count": 60,
"timeout": "PT1H"
},
"runAfter": {
"Create_blob": [
"Succeeded"
]
},
"type": "Until"
},
"fRequest": {
"inputs": {
"authentication": {
"audience": "https://graph.microsoft.com",
"clientId": "registered_app",
"secret": "app_secret",
"tenant": "tenant_id",
"type": "ActiveDirectoryOAuth"
},
"method": "GET",
"uri": "https://graph.microsoft.com/beta/auditLogs/signIns?$filter=createdDateTime gt #{addDays(utcNow(),-1)}"
},
"runAfter": {},
"runtimeConfiguration": {
"paginationPolicy": {
"minimumItemCount": 500
}
},
"type": "Http"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"Recurrence": {
"recurrence": {
"frequency": "Week",
"interval": 7,
"schedule": {
"hours": [
"7"
],
"minutes": [
0
]
}
},
"type": "Recurrence"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/subscription_id/resourceGroups/Apps/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/subscription_id/providers/Microsoft.Web/locations/eastus/managedApis/azureblob"
}
}
}
}
}
What am I doing wrong or missing?
Thanks in advance!
I managed to increase the pagination threshold to 20000, and now my files are no longer 9 MB; they reach 200 MB in size. I also removed the "Do" Until loop. Now I only need to add a break to avoid the threshold and resume collecting the remaining pages of results.
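For clarity, raising the threshold only changes the HTTP action's runtimeConfiguration. A sketch of the fRequest action from above with the 20000 value mentioned (everything else unchanged):
"fRequest": {
    "inputs": {
        "authentication": {
            "audience": "https://graph.microsoft.com",
            "clientId": "registered_app",
            "secret": "app_secret",
            "tenant": "tenant_id",
            "type": "ActiveDirectoryOAuth"
        },
        "method": "GET",
        "uri": "https://graph.microsoft.com/beta/auditLogs/signIns?$filter=createdDateTime gt @{addDays(utcNow(),-1)}"
    },
    "runAfter": {},
    "runtimeConfiguration": {
        "paginationPolicy": {
            "minimumItemCount": 20000
        }
    },
    "type": "Http"
}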
