No Workflow Resource found in the template in Logic App - Azure

I am trying to copy the logic app code from the code view in the portal and paste it into the logic app project that I created locally.
After pasting the code, when I right-click the file and open it with the Logic App designer, I get the error "No workflow resource found in the template in Logic App".
Has anyone faced this problem before? Any help is appreciated.

I would suggest copying each section separately, such as triggers and actions, and pasting them into the local project. I did the same thing and was able to open the designer. Below are the steps I followed.
Created a standard logic app in the portal.
Created a workflow as shown below.
Created a local project in VS Code by following the documentation, which starts with this skeleton workflow.json:
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {},
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "triggers": {}
  },
  "kind": "Stateful"
}
Copied the actions JSON from the portal to VS Code.
"actions": {
"Initialize_variable": {
"inputs": {
"variables": [
{
"name": "str",
"type": "string"
}
]
},
"runAfter": {},
"type": "InitializeVariable"
},
"Lists_blobs_(V2)": {
"inputs": {
"host": {
"connection": {
"referenceName": "azureblob"
}
},
"method": "get",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/foldersV2/#{encodeURIComponent(encodeURIComponent('xxxx'))}",
"queries": {
"nextPageMarker": "",
"useFlatListing": false
}
},
"metadata": {
"xxxx": "/xxxx"
},
"runAfter": {
"Initialize_variable": [
"Succeeded"
]
},
"type": "ApiConnection"
}
}
Then copied the triggers JSON from the portal to VS Code.
"triggers": {
"manual": {
"inputs": {},
"kind": "Http",
"type": "Request"
}
}
Saved the workflow.json file in VS Code.
I was then able to open the designer from the local project.
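For reference, this is the assembled workflow.json after pasting both of the sections above into the skeleton (no new content, just the snippets merged):
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Initialize_variable": {
        "inputs": {
          "variables": [
            {
              "name": "str",
              "type": "string"
            }
          ]
        },
        "runAfter": {},
        "type": "InitializeVariable"
      },
      "Lists_blobs_(V2)": {
        "inputs": {
          "host": {
            "connection": {
              "referenceName": "azureblob"
            }
          },
          "method": "get",
          "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/foldersV2/@{encodeURIComponent(encodeURIComponent('xxxx'))}",
          "queries": {
            "nextPageMarker": "",
            "useFlatListing": false
          }
        },
        "metadata": {
          "xxxx": "/xxxx"
        },
        "runAfter": {
          "Initialize_variable": [
            "Succeeded"
          ]
        },
        "type": "ApiConnection"
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "triggers": {
      "manual": {
        "inputs": {},
        "kind": "Http",
        "type": "Request"
      }
    }
  },
  "kind": "Stateful"
}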

Related

Using Event Grid, is it possible to trigger a logic app when a resource is created/deleted in Azure?

I would like to know whether, using Event Grid, it is possible to have a logic app triggered when any resource is deployed in an Azure subscription.
The use case is:
Somebody creates/deletes a resource in a particular Azure subscription.
This sends an event to Event Grid (not sure about that?).
A logic app is then triggered when such an event occurs; this logic app will send a notification to a Teams channel.
The goal here is to have a simple, basic helicopter view of what's happening on this subscription.
For testing purposes, I've created a logic app and added a "When a resource event occurs" trigger with Microsoft.Resources.ResourceGroups and these event types:
Microsoft.Resources.ResourceActionSuccess
Microsoft.Resources.ResourceDeleteSuccess
Microsoft.Resources.ResourceWriteSuccess
I'm not sure I'm exploring this the right way.
Then I deployed a storage account, but I get notifications even when just "Reviewing" the deployment, before the resource is actually deployed.
Once deployed, I also get random notifications (even though the storage account is not used; some kind of background activity, I guess?).
As per this official documentation:
Resource events are created for PUT, PATCH, POST, and DELETE operations that are sent to management.azure.com. GET operations don't create events.
Hence you are receiving multiple triggers. To keep them to a minimum, you can add filters for the deployment. Below is the flow I'm using.
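The key part is the trigger's filter, which restricts events to deployment operations in a given resource group (excerpted from the complete JSON below):
"filter": {
  "includedEventTypes": [
    "Microsoft.Resources.ResourceDeleteSuccess",
    "Microsoft.Resources.ResourceActionSuccess"
  ],
  "subjectBeginsWith": "/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Resources/deployments"
}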
Below is the complete JSON of my logic app
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Send_an_email_(V2)": {
        "inputs": {
          "body": {
            "Body": "<p>@{triggerBody()?['subject']} has been created</p>",
            "Importance": "Normal",
            "Subject": "xxx",
            "To": "xxx"
          },
          "host": {
            "connection": {
              "name": "@parameters('$connections')['office365']['connectionId']"
            }
          },
          "method": "post",
          "path": "/v2/Mail"
        },
        "runAfter": {},
        "type": "ApiConnection"
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "parameters": {
      "$connections": {
        "defaultValue": {},
        "type": "Object"
      }
    },
    "triggers": {
      "When_a_resource_event_occurs": {
        "conditions": [],
        "inputs": {
          "body": {
            "properties": {
              "destination": {
                "endpointType": "webhook",
                "properties": {
                  "endpointUrl": "@{listCallbackUrl()}"
                }
              },
              "filter": {
                "includedEventTypes": [
                  "Microsoft.Resources.ResourceDeleteSuccess",
                  "Microsoft.Resources.ResourceActionSuccess"
                ],
                "subjectBeginsWith": "/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Resources/deployments"
              },
              "topic": "/subscriptions/xxx/resourceGroups/xxx"
            }
          },
          "host": {
            "connection": {
              "name": "@parameters('$connections')['azureeventgrid']['connectionId']"
            }
          },
          "path": "/subscriptions/@{encodeURIComponent('b83c1ed3-c5b6-44fb-b5ba-2b83a074c23f')}/providers/@{encodeURIComponent('Microsoft.Resources.ResourceGroups')}/resource/eventSubscriptions",
          "queries": {
            "x-ms-api-version": "2017-09-15-preview"
          }
        },
        "splitOn": "@triggerBody()",
        "type": "ApiConnectionWebhook"
      }
    }
  },
  "parameters": {
    "$connections": {
      "value": {
        "azureeventgrid": {
          "connectionId": "/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Web/connections/azureeventgrid",
          "connectionName": "azureeventgrid",
          "id": "/subscriptions/xxx/providers/Microsoft.Web/locations/eastus/managedApis/azureeventgrid"
        },
        "office365": {
          "connectionId": "/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Web/connections/office365",
          "connectionName": "office365",
          "id": "/subscriptions/xxx/providers/Microsoft.Web/locations/eastus/managedApis/office365"
        }
      }
    }
  }
}
RESULTS:

Update Assignee when Jira ticket is created using Azure logic app

I have a task to automate ticket assignment in Jira using an Azure Logic App. When a new ticket is created, the Logic App is triggered and assigns the ticket to a user.
I tried using the HTTP connector to update the ticket assignee, but I got a Bad Request error.
URL:
https://company.atlassian.net/rest/api/2/{issue_Key}
Body:
"fields": {
"assignee": {
"name": "employee name"
}
}
}
I don't know how it is done via the API.
I used JSM automations to solve the problem.
I created an automation within JSM which will update the ticket when it is moved to active and also syncs the assignee.
Maybe you can try something like that :)
After reproducing from my end, I could get this to work only after including the accountId along with the emailAddress in the request body. Below is the complete request body in my logic app flow.
{
  "fields": {
    "assignee": {
      "accountId": "63f32c...",
      "emailAddress": "<YOUR_EMAIL_ADDRESS>"
    }
  }
}
Results:
In Logic App run:
In Jira dashboard:
Below is the complete code of my logic app
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "HTTP": {
        "inputs": {
          "authentication": {
            "password": "yyy",
            "type": "Basic",
            "username": "yyy"
          },
          "body": {
            "fields": {
              "assignee": {
                "accountId": "63f32c...",
                "emailAddress": "yyy"
              }
            }
          },
          "method": "PUT",
          "uri": "https://yyy.atlassian.net/rest/api/2/issue/10004"
        },
        "runAfter": {},
        "type": "Http"
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "parameters": {},
    "triggers": {
      "manual": {
        "inputs": {
          "schema": {}
        },
        "kind": "Http",
        "type": "Request"
      }
    }
  },
  "parameters": {}
}
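As a side note, Jira Cloud also exposes a dedicated assignee endpoint, PUT /rest/api/2/issue/{issueKey}/assignee, which accepts just the accountId. An HTTP action along these lines should work as well (a sketch; the site name, issue key, and credentials are placeholders, and the action name is illustrative):
"HTTP_Assign": {
  "inputs": {
    "authentication": {
      "password": "<API_TOKEN>",
      "type": "Basic",
      "username": "<EMAIL>"
    },
    "body": {
      "accountId": "63f32c..."
    },
    "method": "PUT",
    "uri": "https://<SITE>.atlassian.net/rest/api/2/issue/<ISSUE_KEY>/assignee"
  },
  "runAfter": {},
  "type": "Http"
}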

Trigger pipeline upon approval in Azure

I have a report which is generated after running an ADF pipeline, and I need to append these records to a history table upon client approval. So I need to use either Power BI or SharePoint to show this report to the client and get approval. This is my plan; can someone please tell me if it is doable and, if yes, how to achieve it? If not, please suggest changes.
Show the report in either Power BI or SharePoint, with Approve/Reject buttons.
If the client clicks Approve, it should trigger a pipeline via a Logic App, with the necessary parameters passed.
If this is doable, can you please share references? If not, please let me know how I can achieve this functionality another way.
One workaround to achieve your requirement is to use Logic Apps and send the Power BI or SharePoint link by email using the Send approval email action of the Outlook connector, where the email is sent with Approve and Reject user options.
The flow is kept on hold until the response is received.
In my Outlook:
Run after the response is received:
Now, to continue the flow only when you receive an Approve response, you can add a Condition action to check whether the response is Approve, and then continue the flow.
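For reference, the condition's expression (excerpted from the code view below) compares the SelectedOption output of the approval email:
"expression": {
  "and": [
    {
      "equals": [
        "@body('Send_approval_email')?['SelectedOption']",
        "Approve"
      ]
    }
  ]
}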
RESULTS:
Below is the code view of my Logic App
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Condition": {
        "actions": {
          "Compose": {
            "inputs": "Your response have been Accepted.",
            "runAfter": {},
            "type": "Compose"
          }
        },
        "expression": {
          "and": [
            {
              "equals": [
                "@body('Send_approval_email')?['SelectedOption']",
                "Approve"
              ]
            }
          ]
        },
        "runAfter": {
          "Send_approval_email": [
            "Succeeded"
          ]
        },
        "type": "If"
      },
      "Send_approval_email": {
        "inputs": {
          "body": {
            "Message": {
              "Body": "https://microsoftapc.sharepoint.com/teams/Sample2408/Lists/SampleList/AllItems.aspx",
              "HideHTMLMessage": false,
              "Importance": "Normal",
              "Options": "Approve, Reject",
              "ShowHTMLConfirmationDialog": false,
              "Subject": "Approval Request",
              "To": "<Email_ID>"
            },
            "NotificationUrl": "@{listCallbackUrl()}"
          },
          "host": {
            "connection": {
              "name": "@parameters('$connections')['office365']['connectionId']"
            }
          },
          "path": "/approvalmail/$subscriptions"
        },
        "runAfter": {},
        "type": "ApiConnectionWebhook"
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "parameters": {
      "$connections": {
        "defaultValue": {},
        "type": "Object"
      }
    },
    "triggers": {
      "manual": {
        "inputs": {
          "schema": {}
        },
        "kind": "Http",
        "type": "Request"
      }
    }
  },
  "parameters": {
    "$connections": {
      "value": {
        "office365": {
          "connectionId": "/subscriptions/<Sub_Id>/resourceGroups/<RG>/providers/Microsoft.Web/connections/office365",
          "connectionName": "office365",
          "id": "/subscriptions/<Sub_Id>/providers/Microsoft.Web/locations/centralus/managedApis/office365"
        }
      }
    }
  }
}

SharePoint online storage full

Our SharePoint Online storage is full, and it's expensive to buy extra storage for SharePoint Online. I want to copy the folders from SharePoint Online to an Azure Blob or Azure Files cool access tier as an archive. I tried using Power Automate and Logic Apps, but they only create rules for newly created files; there is no option to move the already existing SharePoint Online folders. Please advise me on any steps or recommendations.
Below is the flow I followed to achieve your requirement.
First I listed the files inside the desired folder using the List folder action of SharePoint, which provides the File Identifier of each file.
Next, I looped over the contents of that folder, retrieving each file's metadata using the Get file metadata action and its content using the Get file content action of SharePoint.
Then I used the Create blob (V2) action of Azure Blob Storage to create blobs inside the storage account from the file content and metadata coming from SharePoint.
This step is not required, but if you want to automate deleting all the files from your SharePoint, you can use the Delete file action of SharePoint to delete the files that have already been copied to blob storage, as sketched below.
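A minimal sketch of that optional Delete file action, assuming the same sharepointonline connection and For_each loop as in the code view that follows (the action name and its runAfter placement after Create blob (V2) are illustrative):
"Delete_file": {
  "inputs": {
    "host": {
      "connection": {
        "name": "@parameters('$connections')['sharepointonline']['connectionId']"
      }
    },
    "method": "delete",
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent('<SITE_URL>'))}/files/@{encodeURIComponent(items('For_each')?['Id'])}"
  },
  "runAfter": {
    "Create_blob_(V2)": [
      "Succeeded"
    ]
  },
  "type": "ApiConnection"
}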
Below is the whole flow of my logic app
RESULTS:
Below are the files that are already present in my SharePoint site:
In my storage account
NOTE: If you are trying to automate deletion of files from SharePoint, first make sure the files actually move from SharePoint to blob storage (i.e., run the flow without the Delete file action), and then, after the transfer is completed, create a new flow with the Delete file action to delete the files from SharePoint.
To reproduce the same in your logic apps, you can use the below code view.
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "For_each": {
        "actions": {
          "Create_blob_(V2)": {
            "inputs": {
              "body": "@body('Get_file_content')",
              "headers": {
                "ReadFileMetadataFromServer": true
              },
              "host": {
                "connection": {
                  "name": "@parameters('$connections')['azureblob']['connectionId']"
                }
              },
              "method": "post",
              "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files",
              "queries": {
                "folderPath": "/container",
                "name": "@body('Get_file_metadata')?['DisplayName']",
                "queryParametersSingleEncoded": true
              }
            },
            "runAfter": {
              "Get_file_content": [
                "Succeeded"
              ]
            },
            "runtimeConfiguration": {
              "contentTransfer": {
                "transferMode": "Chunked"
              }
            },
            "type": "ApiConnection"
          },
          "Get_file_content": {
            "inputs": {
              "host": {
                "connection": {
                  "name": "@parameters('$connections')['sharepointonline']['connectionId']"
                }
              },
              "method": "get",
              "path": "/datasets/@{encodeURIComponent(encodeURIComponent('<SITE_URL>'))}/files/@{encodeURIComponent(items('For_each')?['Id'])}/content",
              "queries": {
                "inferContentType": true
              }
            },
            "runAfter": {
              "Get_file_metadata": [
                "Succeeded"
              ]
            },
            "type": "ApiConnection"
          },
          "Get_file_metadata": {
            "inputs": {
              "host": {
                "connection": {
                  "name": "@parameters('$connections')['sharepointonline']['connectionId']"
                }
              },
              "method": "get",
              "path": "/datasets/@{encodeURIComponent(encodeURIComponent('<SITE_URL>'))}/files/@{encodeURIComponent(items('For_each')?['Id'])}"
            },
            "runAfter": {},
            "type": "ApiConnection"
          }
        },
        "foreach": "@body('List_folder')",
        "runAfter": {
          "List_folder": [
            "Succeeded"
          ]
        },
        "type": "Foreach"
      },
      "List_folder": {
        "inputs": {
          "host": {
            "connection": {
              "name": "@parameters('$connections')['sharepointonline']['connectionId']"
            }
          },
          "method": "get",
          "path": "/datasets/@{encodeURIComponent(encodeURIComponent('<SITE_URL>'))}/folders/@{encodeURIComponent('%252fShared%2bDocuments')}"
        },
        "metadata": {
          "%252fShared%2bDocuments": "/Shared Documents"
        },
        "runAfter": {},
        "type": "ApiConnection"
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "parameters": {
      "$connections": {
        "defaultValue": {},
        "type": "Object"
      }
    },
    "triggers": {
      "manual": {
        "inputs": {
          "schema": {}
        },
        "kind": "Http",
        "type": "Request"
      }
    }
  },
  "parameters": {
    "$connections": {
      "value": {
        "azureblob": {
          "connectionId": "/subscriptions/<SUB_ID>/resourceGroups/<RG>/providers/Microsoft.Web/connections/azureblob",
          "connectionName": "azureblob",
          "id": "/subscriptions/<SUB_ID>/providers/Microsoft.Web/locations/centralus/managedApis/azureblob"
        },
        "sharepointonline": {
          "connectionId": "/subscriptions/<SUB_ID>/resourceGroups/<RG>/providers/Microsoft.Web/connections/sharepointonline",
          "connectionName": "sharepointonline",
          "id": "/subscriptions/<SUB_ID>/providers/Microsoft.Web/locations/centralus/managedApis/sharepointonline"
        }
      }
    }
  }
}

How to add to Azure Blob Storage from Logic App

I have an Azure Stream Analytics job which adds JSON data to an Azure storage container as an output. The data is organized in a folder structure based on the date, as specified in the stream analytics output here:
ASA Output Details
I also have a Logic App from which I want to add data to the same place. I am looking at the Logic App blob storage actions and can't figure out how to do this. The Update blob action seems to want to point to a single blob file, rather than integrating the new data into the date-based structure.
Is there a way to do this with the Logic Apps actions? Or maybe there is a better way to structure my data so that I can add events both from Stream Analytics and from Logic Apps?
Thanks!
To add blobs to the same folder structure, you need to specify the path. To achieve your requirement, I used the Create blob action and added blobs to the required folder.
If you are trying to automate this process to run every day and add files to that day's date path, then try following the below flow.
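The key piece is the dynamic folderPath query on the Create blob (V2) action (excerpted from the code view below), which resolves to /container1/yyyy/MM/dd for the current date:
"folderPath": "/container1/@{formatDateTime(utcNow(),'yyyy')}/@{formatDateTime(utcNow(),'MM')}/@{formatDateTime(utcNow(),'dd')}"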
You can reproduce the same in your environment using the below code view
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Compose": {
        "inputs": {
          "Sample": "Sample Json from Logic Apps"
        },
        "runAfter": {},
        "type": "Compose"
      },
      "Create_blob_(V2)": {
        "inputs": {
          "body": "@outputs('Compose')",
          "headers": {
            "ReadFileMetadataFromServer": true
          },
          "host": {
            "connection": {
              "name": "@parameters('$connections')['azureblob']['connectionId']"
            }
          },
          "method": "post",
          "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files",
          "queries": {
            "folderPath": "/container1/@{formatDateTime(utcNow(),'yyyy')}/@{formatDateTime(utcNow(),'MM')}/@{formatDateTime(utcNow(),'dd')}",
            "name": "LAEvent.json",
            "queryParametersSingleEncoded": true
          }
        },
        "runAfter": {
          "Compose": [
            "Succeeded"
          ]
        },
        "runtimeConfiguration": {
          "contentTransfer": {
            "transferMode": "Chunked"
          }
        },
        "type": "ApiConnection"
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "parameters": {
      "$connections": {
        "defaultValue": {},
        "type": "Object"
      }
    },
    "triggers": {
      "manual": {
        "inputs": {
          "schema": {}
        },
        "kind": "Http",
        "type": "Request"
      }
    }
  },
  "parameters": {
    "$connections": {
      "value": {
        "azureblob": {
          "connectionId": "/subscriptions/<SUB_ID>/resourceGroups/<RG>/providers/Microsoft.Web/connections/azureblob",
          "connectionName": "azureblob",
          "id": "/subscriptions/<SUB_ID>/providers/Microsoft.Web/locations/centralus/managedApis/azureblob"
        }
      }
    }
  }
}
In Storage account:
Updated Answer
In case you want to add data to an existing blob, you can concatenate the data already present in the blob with the new Logic App data using the concat function:
concat(body('Get_blob_content_(V2)'),outputs('Compose'))
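A minimal sketch of that read-and-append pattern as actions, assuming the same azureblob connection as above; the action names, the <BLOB_PATH> placeholder, and the use of Update blob (V2) to write the result back follow the connector's /v2/datasets pattern shown earlier, but treat them as illustrative rather than exact:
"Get_blob_content_(V2)": {
  "inputs": {
    "host": {
      "connection": {
        "name": "@parameters('$connections')['azureblob']['connectionId']"
      }
    },
    "method": "get",
    "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files/@{encodeURIComponent(encodeURIComponent('<BLOB_PATH>'))}/content"
  },
  "runAfter": {},
  "type": "ApiConnection"
},
"Update_blob_(V2)": {
  "inputs": {
    "body": "@concat(body('Get_blob_content_(V2)'),outputs('Compose'))",
    "host": {
      "connection": {
        "name": "@parameters('$connections')['azureblob']['connectionId']"
      }
    },
    "method": "put",
    "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files/@{encodeURIComponent(encodeURIComponent('<BLOB_PATH>'))}"
  },
  "runAfter": {
    "Get_blob_content_(V2)": [
      "Succeeded"
    ]
  },
  "type": "ApiConnection"
}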
RESULTS:
