Deploy LogicApps from PowerShell - azure

In Visual Studio 2017, I created a new Azure Resource Group project and selected Logic Apps. The project contains Deploy-AzureResourceGroup.ps1.
I want to change the parameters in the parameters file to match the parameters in LogicApp.json.
If I run it, it seems to work, but nothing is created in Azure. I changed the parameters file:
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "value": "LogicApps-Test-Deploy"
    },
    "logicAppLocation": {
      "value": "northeurope"
    }
  }
}
And nothing happened. I also tried generating an Automation Script in the Azure portal: that gives me a script and a template for all my resources (a very long and complicated file).
Basically I want to create different scripts for different environments. What is the right process for that?

Don't "create different scripts for different environments". Instead, aim to have a single script (Deploy-AzureResourceGroup.ps1), a single template to deploy your logic app (LogicApp.json), and different versions of your parameters file to parameterize the template. For example:
LogicApp.parameters.dev.json
LogicApp.parameters.test.json
LogicApp.parameters.prod.json
This will enable you to deploy the same infrastructure consistently and reliably across multiple subscriptions (i.e., separate subscriptions for dev, test, and prod environments).
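As a sketch, a per-environment deployment from PowerShell could then look like the following. The resource group name and file paths are assumptions for illustration, and depending on your module version the cmdlet may be New-AzureRmResourceGroupDeployment rather than New-AzResourceGroupDeployment:

```powershell
# Deploy to the dev environment using the dev parameters file.
# Requires the Az PowerShell module and an authenticated session (Connect-AzAccount).
New-AzResourceGroupDeployment `
    -ResourceGroupName 'rg-logicapps-dev' `
    -TemplateFile '.\LogicApp.json' `
    -TemplateParameterFile '.\LogicApp.parameters.dev.json'
```

Switching environments then means swapping only the resource group name and the parameters file; the template itself never changes.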
Also, use a PowerShell console window or PowerShell ISE to test your work. Part of your problem could simply be executing the template from Visual Studio. I've had intermittent issues in the past (mostly token cache issues) executing ARM template deployments from Visual Studio, and finally just got in the practice of testing the code from a PowerShell environment, which is what most users of the scripts and templates will be doing anyway.

If you have already created an ARM template for your logic app, you can also use the Template deployment feature in the Azure portal to fix any issues with your ARM template. I normally use it while developing any new ARM template. It also gives a detailed error report, and in general I find it a better and quicker way of identifying issues.
Have a look at the link below:
https://azure.microsoft.com/en-gb/updates/deploy-custom-templates-from-the-preview-portal/

Related

Azure SQL Server Backup: Need to Unregister Containers and Rediscover DBs in Azure SQL Server Backup using ARM templates

I have performed discovery operations for listing protectable items in Azure Backup: 'SQL in Azure VM'.
I am able to perform 'Discovery' using the following template:
"resources": [
  {
    "type": "Microsoft.RecoveryServices/vaults/backupFabrics/protectionContainers",
    "apiVersion": "2016-12-01",
    "name": "[concat(parameters('vaultName'), '/', parameters('fabricName'), '/', parameters('protectionContainers')[copyIndex()])]",
    "properties": {
      "backupManagementType": "[parameters('backupManagementType')]",
      "workloadType": "[parameters('workloadType')]",
      "containerType": "[parameters('protectionContainerTypes')[copyIndex()]]",
      "sourceResourceId": "[parameters('sourceResourceIds')[copyIndex()]]",
      "operationType": "Register"
    },
    "copy": {
      "name": "protectionContainersCopy",
      "count": "[length(parameters('protectionContainers'))]"
    }
  }
]
I similarly tried the following operation types:
"Reregister": works as expected.
"Invalid": did not perform any operation.
Could someone guide me with unregistering of containers using the ARM template?
(I already have the API to do it, but I need it with an ARM template).
Similarly is there any way to rediscover DBs within a registered container using an ARM template?
Any help is much appreciated.
Looking at the Register API for Protection Containers, it looks like the supported values for OperationType are Invalid, Register, and Reregister. The Unregister API issues an HTTP DELETE request, which is not straightforward to simulate with an ARM template. ARM templates are primarily meant for creating and managing your Azure resources as an IaC solution.
That said, if you have ARM templates as your only option, you could try deploying it in Complete mode. In complete mode, Resource Manager deletes resources that exist in the resource group but aren't specified in the template.
To deploy a template in Complete mode, you'd have to set it explicitly using the Mode parameter since the default mode is incremental. Be sure to use the what-if operation before deploying a template in complete mode to avoid unintentionally deleting resources.
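As a sketch of that approach, a Complete-mode deployment with a what-if preview first might look like the following; the resource group and template file names are placeholders, not taken from the question:

```powershell
# Preview what Complete mode would delete before committing to it.
New-AzResourceGroupDeployment `
    -ResourceGroupName 'rg-backup-test' `
    -TemplateFile '.\template.json' `
    -Mode Complete `
    -WhatIf

# Run the actual deployment once the what-if output looks safe.
New-AzResourceGroupDeployment `
    -ResourceGroupName 'rg-backup-test' `
    -TemplateFile '.\template.json' `
    -Mode Complete
```

The template passed here would simply omit the protection container resources you want removed; Complete mode then deletes everything in the resource group that the template no longer mentions, so the what-if preview is essential.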

How to create standard type Logic Apps using ARM templates

I can create consumption-type logic apps with a sample workflow using ARM templates. I want to create standard-type logic apps with sample workflows using ARM templates.
But I'm unable to find any reference documentation for that.
So, can anyone help me out with this one?
Sorry, in my earlier answer I misunderstood what you were actually asking. Now I believe I've got you. But unfortunately what you want to achieve is not possible, and that is by design:
Standard logic apps are fundamentally different from consumption logic apps.
The old logic apps (now called consumption or multi-tenant) make no distinction between the workflow that you execute within a logic app and the logic app as an Azure resource. Your logic app really IS your workflow, and it runs on a shared runtime that you cannot configure. That is why you will find all that workflow information in the ARM template.
The new logic apps (now called standard or single-tenant) are built upon the same platform as function apps. Now your logic app is an Azure resource that provides the runtime for one or more workflows, analogous to a function app that can run one or more functions. There is therefore a clear separation between the logic app resource that is described in the ARM template and the "application code" (your workflows) that runs within this Azure resource.
Like function apps, you can only create the Azure infrastructure resources with ARM templates; Azure Resource Manager has no means to deploy application code.
Your workflow definition will be a separate JSON file from the ARM template that defines your logic app infrastructure, and deploying the workflow is a step that happens after the provisioning of the infrastructure.
See this project for an example of how this can be setup in a CI/CD pipeline: https://github.com/Azure/logicapps/tree/master/azure-devops-sample
To add to Manuel's answer, additional useful CI/CD info can be found here - https://learn.microsoft.com/en-us/azure/logic-apps/set-up-devops-deployment-single-tenant-azure-logic-apps?tabs=github
And we had to use a different App Service SKU in the App Service plan section. I haven't had time to deep-dive into the SKU topic, but for us only Workflow Standard plans (WS1, for example) were available.
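For reference, a minimal sketch of the infrastructure part of such a template might look like the following. This assumes the documented resource shapes for single-tenant logic apps (a Microsoft.Web/serverfarms plan with a WorkflowStandard SKU, and a Microsoft.Web/sites resource with kind "functionapp,workflowapp"); the names and API versions here are illustrative, so check them against the current schema:

```json
"resources": [
  {
    "type": "Microsoft.Web/serverfarms",
    "apiVersion": "2021-02-01",
    "name": "[parameters('planName')]",
    "location": "[parameters('location')]",
    "sku": {
      "name": "WS1",
      "tier": "WorkflowStandard"
    }
  },
  {
    "type": "Microsoft.Web/sites",
    "apiVersion": "2021-02-01",
    "name": "[parameters('logicAppName')]",
    "location": "[parameters('location')]",
    "kind": "functionapp,workflowapp",
    "dependsOn": [
      "[resourceId('Microsoft.Web/serverfarms', parameters('planName'))]"
    ],
    "properties": {
      "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', parameters('planName'))]"
    }
  }
]
```

Note that, as Manuel describes above, this only provisions the runtime; the workflow definitions themselves are deployed afterwards as application content, not through this template.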
If you need to parameterize your connections.json, just refer the values to the app settings like this:
{
  "managedApiConnections": {
    "documentdb": {
      "api": {
        "id": "/subscriptions/@appsetting('WORKFLOWS_SUBSCRIPTION_ID')/providers/Microsoft.Web/locations/norwayeast/managedApis/documentdb"
      },
      "authentication": {
        "type": "ManagedServiceIdentity"
      },
      "connection": {
        "id": "/subscriptions/@appsetting('WORKFLOWS_SUBSCRIPTION_ID')/resourceGroups/INT010/providers/Microsoft.Web/connections/documentdb-test10A"
      },
      "connectionRuntimeUrl": "@appsetting('connection_runtimeUrl')"
    }
  }
}
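The app settings referenced by @appsetting(...) then need to exist on the logic app resource (or, for local development, in local.settings.json). A minimal sketch with placeholder values, not taken from the original post:

```json
{
  "IsEncrypted": false,
  "Values": {
    "WORKFLOWS_SUBSCRIPTION_ID": "<subscription-guid>",
    "connection_runtimeUrl": "<connection-runtime-url>"
  }
}
```

In the deployed app the same keys go into the site's application settings, which is what makes the connections.json above environment-independent.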

What is the best approach to update and test Logic App ARM templates after making changes from designer?

We have a CI/CD setup for the deployment of Logic Apps using ARM templates. There are around 20 logic apps we have parameterized and checked in as templates in our code (both template.json and parameters.json) files, and then we can deploy them using our pipelines to any environment we want.
The question I want to ask is how to check/verify the ARM templates after making changes to the Logic Apps. For now, what we are doing is editing the logic apps using the designer in the Azure portal in one of the dev environments, and then manually copying and pasting the updated part into the template.json file.
I was wondering if there is a better approach we can take to verify the manually updated template.json files, since we only come to know about an error in the template when we try to deploy it. Also it would be better to have a way to visualize the logic app ARM template using some designer if possible to check all the steps.
I know we have an extension for Visual Studio that allows you to design and deploy logic app ARM templates in the editor itself, but I am unable to find a way to open already-existing templates in the Visual Studio logic app designer, since we only want to update and verify the template in the designer; the deployment is handled by pipelines.
Please let me know if there is a better approach to locally verify/test the logic app ARM templates.
For now what we are doing is editing the logic apps using the designer in azure portal in one of the dev environments and then manually copy and pasting the updated part in the template.json file.
You can use the LogicAppTemplate module in PowerShell to get the template from your subscription. That way you don't have to copy-paste the changed components.
but I am unable to find a way to open already existing templates in the visual studio logic app designer, since we only want to update and verify the template in the designer, the deployment is handled by pipelines.
As I understand, there is no way of importing a template.json for a logic app into Visual Studio or Visual Studio Code, and the only way to view it in the designer would be to publish it first. However, if you convert the template.json into a Bicep file, Visual Studio Code can validate the converted .bicep file.
To generate the template.json for the existing logic app:
$parameters = @{
    Token = (az account get-access-token | ConvertFrom-Json).accessToken
    LogicApp = '<logicapp_name>'
    ResourceGroup = '<resourceGroupName>'
    SubscriptionId = '<Subscription_ID>'
    Verbose = $true
}
Get-LogicAppTemplate @parameters | Out-File temp.json
and then you convert it with the following command:
bicep decompile temp.json
Now we can load the generated Bicep file in Visual Studio Code for validation.

Create resource group and deploy resources using Arm template and deploy from Visual Studio

I want to create a resource group and deploy resources using an ARM template from Visual Studio.
I tried the following example by copying the script into Visual Studio, but when I run it, the deployment template comes out blank.
https://learn.microsoft.com/en-us/azure/azure-resource-manager/deploy-to-subscription#create-resource-group-and-deploy-resources
How do I run this ARM template?
The schema used in your example is the subscription-level schema:
https://schema.management.azure.com/schemas/2018-05-01/subscriptionDeploymentTemplate.json#
As per https://learn.microsoft.com/en-us/azure/azure-resource-manager/deploy-to-subscription
To deploy templates at the subscription level, you use Azure CLI and
Azure PowerShell.
I used the default schema instead: "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#". With that schema, Visual Studio detects the file as a deployment template.
Currently VS doesn't handle subscription-level deployment templates: it won't deploy them via the UI, and if the schema is set properly it won't even recognize the file as an ARM template.
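Since Visual Studio can't deploy them, a subscription-level template has to go out through PowerShell (or the Azure CLI). A sketch, with placeholder names; note that subscription-level deployments use a different cmdlet than resource-group deployments:

```powershell
# Subscription-level deployments use New-AzDeployment (not
# New-AzResourceGroupDeployment) and require a -Location for the
# deployment metadata. The file name and location are placeholders.
New-AzDeployment `
    -Location 'northeurope' `
    -TemplateFile '.\subscription-template.json'
```

The template itself can then contain a Microsoft.Resources/resourceGroups resource plus a nested deployment for the resources inside it, as in the linked documentation.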

Azure Data Factory V2 Copy activity Mapping deployment issue

Consider the following test mapping for a Data Factory Copy activity:
"translator": {
  "columnMappings": "@json('{\"from\":\"to\"}')",
  "type": "TabularTranslator"
}
After deploying the pipeline with the Set-AzureRmDataFactoryV2Pipeline PowerShell cmdlet, we get a normally deployed pipeline with the exact columnMappings value specified in the source code. But if you try to be more dynamic:
"translator": {
  "columnMappings": "@json(pipeline().parameters.Mapping)",
  "type": "TabularTranslator"
}
then after deployment you'll find that the translator element is completely missing from the pipeline. A workaround is to set the translator in the Azure portal Data Factory pipeline editing UI (either in Designer or JSON mode; both work). But if, after these manipulations, you save the pipeline JSON to a file and attempt to deploy it via the Set-AzureRmDataFactoryV2Pipeline PowerShell cmdlet: bang, the translator goes missing again. The expected result is that deployment preserves the translator element, because the portal JSON editor preserves it.
We are doing automated deployment of pipelines (as you already figured out - with the help of Set-AzureRmDataFactoryV2Pipeline) and this bug breaks our automated deployment because it requires manual postdeployment pipeline editing on Azure Portal UI.
What may be the reason for such buggy behavior? Can you suggest a way to work around this bug in an automated fashion, or how to fix the code so it can be properly deployed with Set-AzureRmDataFactoryV2Pipeline?
You could try "Update-Module -Name AzureRm.DataFactoryV2" and see whether that helps. The issue might be caused by your PowerShell module being out of date.
