How to add to Azure Blob Storage from a Logic App

I have an Azure Stream Analytics job which adds JSON data to an Azure storage container as its output. The data is organized in a date-based folder structure, as specified in the Stream Analytics output here:
ASA Output Details
I also have a Logic App from which I want to add data to the same place. I am looking at the Logic App Blob Storage actions and can't figure out how to do this. The Update blob action seems to point at a single blob file, rather than integrating the new data into the date-based structure.
Is there a way to do this with the Logic Apps actions? Or is there a better way to structure my data so that I can add events from both Stream Analytics and Logic Apps?
Thanks!

To add blobs to the same folder structure, you need to specify the path. To achieve your requirement, I used the Create blob action and added blobs to the required folder.
If you want to automate this process to run every day and write files to that day's date path, try the flow below.
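The key piece is the folderPath query on the Create blob (V2) action, which rebuilds the same yyyy/MM/dd hierarchy that Stream Analytics writes to (assuming your ASA path pattern uses the date tokens in that order):

    "queries": {
        "folderPath": "/container1/@{formatDateTime(utcNow(),'yyyy')}/@{formatDateTime(utcNow(),'MM')}/@{formatDateTime(utcNow(),'dd')}",
        "name": "LAEvent.json",
        "queryParametersSingleEncoded": true
    }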
You can reproduce the same in your environment using the code view below:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Compose": {
"inputs": {
"Sample": "Sample Json from Logic Apps"
},
"runAfter": {},
"type": "Compose"
},
"Create_blob_(V2)": {
"inputs": {
"body": "#outputs('Compose')",
"headers": {
"ReadFileMetadataFromServer": true
},
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "post",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files",
"queries": {
"folderPath": "/container1/#{formatDateTime(utcNow(),'yyyy')}/#{formatDateTime(utcNow(),'MM')}/#{formatDateTime(utcNow(),'dd')}",
"name": "LAEvent.json",
"queryParametersSingleEncoded": true
}
},
"runAfter": {
"Compose": [
"Succeeded"
]
},
"runtimeConfiguration": {
"contentTransfer": {
"transferMode": "Chunked"
}
},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/<SUB_ID>/resourceGroups/<RG>/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/<SUB_ID>/providers/Microsoft.Web/locations/centralus/managedApis/azureblob"
}
}
}
}
}
In the storage account:
Updated Answer
If you want to append data to an existing blob, you can combine the data already present in the blob with the new data from the logic app using the concat function:
concat(body('Get_blob_content_(V2)'),outputs('Compose'))
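As a sketch, the update action in code view could look roughly like this, assuming a Get blob content (V2) action has already read the existing blob and a Compose action holds the new event; the path expression here is illustrative and targets the date-based file created above:

    "Update_blob_(V2)": {
        "inputs": {
            "body": "@concat(body('Get_blob_content_(V2)'), outputs('Compose'))",
            "host": {
                "connection": {
                    "name": "@parameters('$connections')['azureblob']['connectionId']"
                }
            },
            "method": "put",
            "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files/@{encodeURIComponent(encodeURIComponent(concat('/container1/', formatDateTime(utcNow(),'yyyy/MM/dd'), '/LAEvent.json')))}"
        },
        "runAfter": {
            "Get_blob_content_(V2)": [
                "Succeeded"
            ]
        },
        "type": "ApiConnection"
    }

Note that this rewrites the whole blob with the concatenated content rather than appending in place.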
RESULTS:

Related

How to grab a column value from an ADLS Gen2 CSV file and use it in the body of an email, and also send the blob data as an attachment to Outlook mail

Here is my scenario:
A CSV file is dropped into blob storage every day; it is processed by my data flow in ADF, which generates a CSV in an output folder.
Now, using Logic Apps, I need to send that CSV file (less than 10 MB) as an attachment to the customer via the Outlook connector.
In addition, the body of the email must contain a dynamic value coming from that blob CSV.
For example, 'AppWorks' is the column value in the column 'Works/not'; sometimes it may be 'AppNotWorks'. How do I handle this scenario in Azure Logic Apps?
You can use a combination of Data Factory and Logic Apps to do this. Use a Lookup activity to get the first row of the file (since the entire column has the same value, we can get the required value from one row).
Now use a Web activity to trigger the logic app. Pass the logic app's HTTP request URL to the Web activity and, in the body, pass the following dynamic content:
@activity('Lookup1').output.firstRow
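As a rough sketch (the activity names Lookup1 and Web1 and the URL placeholder are illustrative, not taken from the original pipeline), the Web activity in the pipeline JSON might look like this:

    {
        "name": "Web1",
        "type": "WebActivity",
        "dependsOn": [
            {
                "activity": "Lookup1",
                "dependencyConditions": [ "Succeeded" ]
            }
        ],
        "typeProperties": {
            "url": "<logic-app-HTTP-POST-url>",
            "method": "POST",
            "body": {
                "value": "@activity('Lookup1').output.firstRow",
                "type": "Expression"
            }
        }
    }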
When you debug the pipeline, the logic app is triggered successfully. I have provided a Request Body JSON schema so the values can be accessed individually. For the sample I have taken, it looks as shown below:
{
"properties": {
"customer": {
"type": "string"
},
"id": {
"type": "string"
}
},
"type": "object"
}
Create a connection to the storage account to link the required file.
Now, using the Outlook connector, send the email.
The following is the entire Logic app JSON:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Get_blob_content_(V2)": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files/#{encodeURIComponent(encodeURIComponent('JTJmZGF0YSUyZnNhbXBsZTEuY3N2'))}/content",
"queries": {
"inferContentType": true
}
},
"metadata": {
"JTJmZGF0YSUyZnNhbXBsZTEuY3N2": "/data/sample1.csv"
},
"runAfter": {},
"type": "ApiConnection"
},
"Send_an_email_(V2)": {
"inputs": {
"body": {
"Attachments": [
{
"ContentBytes": "#{base64(body('Get_blob_content_(V2)'))}",
"Name": "sample1.csv"
}
],
"Body": "<p>Hi #{triggerBody()?['customer']},<br>\n<br>\nRandom description</p>",
"Importance": "Normal",
"Subject": "sample data",
"To": "<to_email>"
},
"host": {
"connection": {
"name": "#parameters('$connections')['office365']['connectionId']"
}
},
"method": "post",
"path": "/v2/Mail"
},
"runAfter": {
"Get_blob_content_(V2)": [
"Succeeded"
]
},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {
"properties": {
"customer": {
"type": "string"
},
"id": {
"type": "string"
}
},
"type": "object"
}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/xxx/providers/Microsoft.Web/locations/westus2/managedApis/azureblob"
},
"office365": {
"connectionId": "/subscriptions/xxx/resourceGroups/v-sarikontha-Mindtree/providers/Microsoft.Web/connections/office365",
"connectionName": "office365",
"id": "/subscriptions/xxx/providers/Microsoft.Web/locations/westus2/managedApis/office365"
}
}
}
}
}
The following is the resulting Mail image for reference:

Using Event Grid, is it possible to trigger a logic app when a resource is created/deleted in Azure?

I would like to know whether, using Event Grid, it is possible to have a logic app triggered when any resource is deployed in an Azure subscription.
The use case is:
Somebody creates/deletes a resource in a particular Azure subscription
This sends an event to Event Grid (not sure about that?)
A logic app is then triggered when such an event occurs; this logic app will send a notification to a Teams channel.
The goal here is to have a simple and basic helicopter view on what's happening on this sub.
For testing purposes, I've created a logic app and added a "When a resource event occurs" trigger with Microsoft.Resources.ResourceGroups and these event types:
Microsoft.Resources.ResourceActionSuccess
Microsoft.Resources.ResourceDeleteSuccess
Microsoft.Resources.ResourceWriteSuccess
Not sure I'm exploring the right thing here.
Then I deployed a storage account, but I got notifications even when "Reviewing" the deployment, just before the resource was actually deployed.
Once it was deployed, I also got random notifications (even though the storage account is not used; some kind of background activity, I guess?).
As per this official documentation:
Resource events are created for PUT, PATCH, POST, and DELETE operations that are sent to management.azure.com. GET operations don't create events.
Hence you are receiving multiple triggers. To keep them to a minimum, you can add filters for the deployment. Below is the flow I'm using.
Below is the complete JSON of my logic app:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Send_an_email_(V2)": {
"inputs": {
"body": {
"Body": "<p>#{triggerBody()?['subject']} has been created</p>",
"Importance": "Normal",
"Subject": "xxx",
"To": "xxx"
},
"host": {
"connection": {
"name": "#parameters('$connections')['office365']['connectionId']"
}
},
"method": "post",
"path": "/v2/Mail"
},
"runAfter": {},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"When_a_resource_event_occurs": {
"conditions": [],
"inputs": {
"body": {
"properties": {
"destination": {
"endpointType": "webhook",
"properties": {
"endpointUrl": "#{listCallbackUrl()}"
}
},
"filter": {
"includedEventTypes": [
"Microsoft.Resources.ResourceDeleteSuccess",
"Microsoft.Resources.ResourceActionSuccess"
],
"subjectBeginsWith": "/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Resources/deployments"
},
"topic": "/subscriptions/xxx/resourceGroups/xxx"
}
},
"host": {
"connection": {
"name": "#parameters('$connections')['azureeventgrid']['connectionId']"
}
},
"path": "/subscriptions/#{encodeURIComponent('b83c1ed3-c5b6-44fb-b5ba-2b83a074c23f')}/providers/#{encodeURIComponent('Microsoft.Resources.ResourceGroups')}/resource/eventSubscriptions",
"queries": {
"x-ms-api-version": "2017-09-15-preview"
}
},
"splitOn": "#triggerBody()",
"type": "ApiConnectionWebhook"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureeventgrid": {
"connectionId": "/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Web/connections/azureeventgrid",
"connectionName": "azureeventgrid",
"id": "/subscriptions/xxx/providers/Microsoft.Web/locations/eastus/managedApis/azureeventgrid"
},
"office365": {
"connectionId": "/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Web/connections/office365",
"connectionName": "office365",
"id": "/subscriptions/xxx/providers/Microsoft.Web/locations/eastus/managedApis/office365"
}
}
}
}
}
RESULTS:

No Workflow Resource found in the template in Logic App

I am trying to copy the logic app code from code view and paste it in the logic app project that I created locally.
Once I have pasted the code, when I right-click and open the file with the Logic App designer, I get the error "No workflow resource found in the template".
I would appreciate some help if someone has faced this problem before.
I would suggest copying each array, such as triggers and actions, separately and pasting it into the local project. I have done the same thing and I am able to open the designer. Below are the steps I followed:
Created a standard logic app in the portal.
Created a workflow as shown below.
Created a local project in VS Code by following the documentation. The default workflow.json looks like this:
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {},
"contentVersion": "1.0.0.0",
"outputs": {},
"triggers": {}
},
"kind": "Stateful"
}
Copied the actions JSON from the portal to VS Code:
"actions": {
"Initialize_variable": {
"inputs": {
"variables": [
{
"name": "str",
"type": "string"
}
]
},
"runAfter": {},
"type": "InitializeVariable"
},
"Lists_blobs_(V2)": {
"inputs": {
"host": {
"connection": {
"referenceName": "azureblob"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/foldersV2/#{encodeURIComponent(encodeURIComponent('xxxx'))}",
"queries": {
"nextPageMarker": "",
"useFlatListing": false
}
},
"metadata": {
"xxxx": "/xxxx"
},
"runAfter": {
"Initialize_variable": [
"Succeeded"
]
},
"type": "ApiConnection"
}
}
Then copied the triggers code from the portal to VS Code:
"triggers": {
"manual": {
"inputs": {},
"kind": "Http",
"type": "Request"
}
}
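Putting the pieces together, the assembled workflow.json would look roughly like this (a sketch showing only the Initialize_variable action for brevity; the Lists_blobs_(V2) action is pasted into "actions" in the same way):

    {
        "definition": {
            "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
            "actions": {
                "Initialize_variable": {
                    "inputs": {
                        "variables": [
                            {
                                "name": "str",
                                "type": "string"
                            }
                        ]
                    },
                    "runAfter": {},
                    "type": "InitializeVariable"
                }
            },
            "contentVersion": "1.0.0.0",
            "outputs": {},
            "triggers": {
                "manual": {
                    "inputs": {},
                    "kind": "Http",
                    "type": "Request"
                }
            }
        },
        "kind": "Stateful"
    }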
Saved the workflow.json file in VS Code.
After this, I was able to open the designer from the local project.

SharePoint online storage full

Our SharePoint Online storage is full, and it's expensive to buy extra storage for SharePoint Online. I want to copy the folders from SharePoint Online to Azure Blob Storage or Azure file shares (cool access tier) as an archive. I tried using Power Automate and Logic Apps, but they only create rules for newly created files; there is no option to move the already existing SharePoint Online folders. Please advise me on any steps or recommendations.
Below is the flow I followed to achieve your requirement.
First, I listed the files inside the desired folder using the List folder action of SharePoint, which gives me the option to retrieve the File Identifier of each file.
Next, I loop over the files returned by the previous step, retrieving the metadata of each file using the Get file metadata action and its content using the Get file content action of SharePoint.
Then I used the Create blob (V2) action of Azure Blob Storage to create blobs in the storage account from the file content and metadata retrieved from SharePoint.
This step is not required, but if you want to automate deleting all the files from your SharePoint, you can use the Delete file action of SharePoint to delete each file that has been copied to blob storage, as sketched below.
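As a sketch (this action is not part of my flow or the code view below), a Delete file action inside the same For each might look like this, running only after Create blob (V2) succeeds:

    "Delete_file": {
        "inputs": {
            "host": {
                "connection": {
                    "name": "@parameters('$connections')['sharepointonline']['connectionId']"
                }
            },
            "method": "delete",
            "path": "/datasets/@{encodeURIComponent(encodeURIComponent('<SITE_URL>'))}/files/@{encodeURIComponent(items('For_each')?['Id'])}"
        },
        "runAfter": {
            "Create_blob_(V2)": [
                "Succeeded"
            ]
        },
        "type": "ApiConnection"
    }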
Below is the whole flow of my logic app:
RESULTS:
Below are the files that are already present in my SharePoint site:
In my storage account:
NOTE: If you are trying to automate deletion of files from SharePoint, make sure the files are moved from SharePoint to blob storage first (i.e., run the flow without the Delete file action); then, after the transfer is complete, create a new flow with the Delete file action to delete the files.
To reproduce the same in your logic app, you can use the code view below.
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"For_each": {
"actions": {
"Create_blob_(V2)": {
"inputs": {
"body": "#body('Get_file_content')",
"headers": {
"ReadFileMetadataFromServer": true
},
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "post",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files",
"queries": {
"folderPath": "/container",
"name": "#body('Get_file_metadata')?['DisplayName']",
"queryParametersSingleEncoded": true
}
},
"runAfter": {
"Get_file_content": [
"Succeeded"
]
},
"runtimeConfiguration": {
"contentTransfer": {
"transferMode": "Chunked"
}
},
"type": "ApiConnection"
},
"Get_file_content": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['sharepointonline']['connectionId']"
}
},
"method": "get",
"path": "/datasets/#{encodeURIComponent(encodeURIComponent('<SITE_URL>'))}/files/#{encodeURIComponent(items('For_each')?['Id'])}/content",
"queries": {
"inferContentType": true
}
},
"runAfter": {
"Get_file_metadata": [
"Succeeded"
]
},
"type": "ApiConnection"
},
"Get_file_metadata": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['sharepointonline']['connectionId']"
}
},
"method": "get",
"path": "/datasets/#{encodeURIComponent(encodeURIComponent('<SITE_URL>'))}/files/#{encodeURIComponent(items('For_each')?['Id'])}"
},
"runAfter": {},
"type": "ApiConnection"
}
},
"foreach": "#body('List_folder')",
"runAfter": {
"List_folder": [
"Succeeded"
]
},
"type": "Foreach"
},
"List_folder": {
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['sharepointonline']['connectionId']"
}
},
"method": "get",
"path": "/datasets/#{encodeURIComponent(encodeURIComponent('<SITE_URL>'))}/folders/#{encodeURIComponent('%252fShared%2bDocuments')}"
},
"metadata": {
"%252fShared%2bDocuments": "/Shared Documents"
},
"runAfter": {},
"type": "ApiConnection"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"manual": {
"inputs": {
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/<SUB_ID>/resourceGroups/<RG>/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/<SUB_ID>/providers/Microsoft.Web/locations/centralus/managedApis/azureblob"
},
"sharepointonline": {
"connectionId": "/subscriptions/<SUB_ID>/resourceGroups/<RG>/providers/Microsoft.Web/connections/sharepointonline",
"connectionName": "sharepointonline",
"id": "/subscriptions/<SUB_ID>/providers/Microsoft.Web/locations/centralus/managedApis/sharepointonline"
}
}
}
}
}

MS Graph pagination in Logic App with result export in Azure Blob storage

WHAT I am trying to do:
I am trying to retrieve the sign-in logs of all users for the last 24 hours and save them in blob storage. After the first result set creates the blob, the next result sets would update the blob file with the remaining results.
I thought of using blob storage and MS Graph because the Graph output contains all the details I want without having to jump through various hoops in PowerShell to expand certain properties, and because the result size is huge (over 1 GB via Export-CSV in PowerShell).
HOW I'm trying to do it
A scheduled run does an HTTP request with the Graph query filtered to the last 24 hours and creates a blob with the HTTP body as content. After creation of the blob, I added a (Do) Until control that runs until the HTTP body no longer contains @odata.nextLink and updates the blob file.
ISSUES:
The first issue is that the Until loop finishes in 6 seconds.
The second issue is that the blob file only contains the first result set and is usually 9.3 MB in size, which means the next result sets are never retrieved and appended to the existing blob file.
My research
I tried pagination enabled and disabled, various pagination thresholds, and custom functions, but nothing made sense (to me at least), and I'm trying to follow the KISS model.
I looked over, and tried to apply in one shape or form, the answers from the S.O. questions below:
Graph Pagination in Logic Apps | Pagination with oauth azure data factory | Microsoft graph, batch request's nextLink | https://learn.microsoft.com/en-us/graph/paging;
Code I am trying
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"Create_blob": {
"inputs": {
"body": "#body('fRequest')",
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "post",
"path": "/datasets/default/files",
"queries": {
"folderPath": "/graph",
"name": "DoUntil",
"queryParametersSingleEncoded": true
}
},
"runAfter": {
"fRequest": [
"Succeeded"
]
},
"runtimeConfiguration": {
"contentTransfer": {
"transferMode": "Chunked"
}
},
"type": "ApiConnection"
},
"Until": {
"actions": {
"Update_blob": {
"inputs": {
"body": "#body('fRequest')",
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "put",
"path": "/datasets/default/files/#{encodeURIComponent(encodeURIComponent('/graph/DoUntil'))}"
},
"runAfter": {},
"type": "ApiConnection"
}
},
"expression": "#not(contains(body('fRequest'), '#odata.nextLink'))",
"limit": {
"count": 60,
"timeout": "PT1H"
},
"runAfter": {
"Create_blob": [
"Succeeded"
]
},
"type": "Until"
},
"fRequest": {
"inputs": {
"authentication": {
"audience": "https://graph.microsoft.com",
"clientId": "registered_app",
"secret": "app_secret",
"tenant": "tenant_id",
"type": "ActiveDirectoryOAuth"
},
"method": "GET",
"uri": "https://graph.microsoft.com/beta/auditLogs/signIns?$filter=createdDateTime gt #{addDays(utcNow(),-1)}"
},
"runAfter": {},
"runtimeConfiguration": {
"paginationPolicy": {
"minimumItemCount": 500
}
},
"type": "Http"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"$connections": {
"defaultValue": {},
"type": "Object"
}
},
"triggers": {
"Recurrence": {
"recurrence": {
"frequency": "Week",
"interval": 7,
"schedule": {
"hours": [
"7"
],
"minutes": [
0
]
}
},
"type": "Recurrence"
}
}
},
"parameters": {
"$connections": {
"value": {
"azureblob": {
"connectionId": "/subscriptions/subscription_id/resourceGroups/Apps/providers/Microsoft.Web/connections/azureblob",
"connectionName": "azureblob",
"id": "/subscriptions/subscription_id/providers/Microsoft.Web/locations/eastus/managedApis/azureblob"
}
}
}
}
}
What am I doing wrong or missing?
Thanks in advance!
I managed to increase the pagination threshold to 20000, and now my files are no longer 9 MB; they reach 200 MB in size. I also removed the "Do" Until loop. Now I only need to create a break to avoid the threshold and resume collecting the remaining pages of results.
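For reference, the only change was the paginationPolicy on the fRequest HTTP action (a sketch of just that fragment; the rest of the action stays as in the code above):

    "runtimeConfiguration": {
        "paginationPolicy": {
            "minimumItemCount": 20000
        }
    }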
