Adding new parameters to ARM template parameters file - Azure

I have an existing Azure deployment that was created with an original template parameters file.
I changed some logic in the template file and therefore needed to add some new parameters to the parameters file.
But when I tried to redeploy it, I encountered the following error:
New-AzureRmResourceGroupDeployment : 9:43:19 AM - Error: Code=InvalidTemplate; Message=Deployment template validation
failed: 'The template parameters 'SecondaryServiceFabricClusterName, shouldDeployNewCluster' in the parameters file
are not valid; they are not present in the original template and can therefore not be provided at deployment time. The
only supported parameters for this template are .....
Does anyone have an idea what the problem is and how I should do this properly?
P.S. This procedure is done via the Azure command line.

Looks like the same issue as here: https://github.com/Azure/azure-quickstart-templates/issues/3551
This happens when the parameters specified in azuredeploy.parameters.json (the ARM template parameters file) and in the azuredeploy.json ARM template file have a mismatch. (P.S. Make sure you also double-check each parameter for spelling errors, types, etc.)
'SecondaryServiceFabricClusterName, shouldDeployNewCluster' in the parameters file are not valid << check whether these two parameters are declared inside the parameters section of your ARM template.
If you aren't using those parameters in your template, remove them from your azuredeploy.parameters.json parameters file and that should fix the issue.
Alternatively, if you want to keep them, add them to the parameters section of your azuredeploy.json ARM template file.
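For illustration, a minimal sketch of how the two files have to line up, using the parameter names from the error message (the types and values here are assumptions, not from the original post):
// azuredeploy.json -- declare the parameters in the template first
"parameters": {
    "SecondaryServiceFabricClusterName": {
        "type": "string"
    },
    "shouldDeployNewCluster": {
        "type": "bool",
        "defaultValue": false
    }
}
// azuredeploy.parameters.json -- only then may the parameters file supply values
"parameters": {
    "SecondaryServiceFabricClusterName": {
        "value": "my-secondary-cluster"
    },
    "shouldDeployNewCluster": {
        "value": true
    }
}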

Related

Azure Template spec caching

I have an Azure template spec, version "ado", which has been working great. I recently changed a parameter name (i.e., "location" to "aslocation") and updated the template spec using PowerShell's Set-AzTemplateSpec command with the same version name "ado".
But when I call the template spec using the new parameter name, "aslocation", it throws:
Error BCP037: The property "aslocation" is not allowed on objects of type "params". Permissible properties include "location"
Even if I try using the old parameter name, "location", it throws:
New-AzResourceGroupDeployment: Cannot retrieve the dynamic parameters for the cmdlet. D:\git\IaC\Azure\main.bicep(4,5) : Error BCP035: The specified "object" declaration is missing the following required properties: "location". D:\git\IaC\Azure\main.bicep(5,7) : Error BCP089: The property "aslocation" is not allowed on objects of type "params". Did you mean "location"?
So it seems something is being cached. Any ideas on how to resolve or avoid this problem?
I have confirmed:
Occurs using Azure CLI or PowerShell commands
Occurs using VS Code's integrated terminal or standalone PowerShell/CMD terminals
Template spec is indeed updated (verified via portal)
Issue persists through multiple days/reboots
I managed to resolve the issue. Leaving this here in case anyone else runs into this problem.
Bicep maintains a local cache for template specs at %USERPROFILE%\.bicep
This is NOT automatically updated when template spec versions are updated. You must use the Bicep CLI's restore command with the --force flag to refresh it (or remove the cache directory).
This seems counterintuitive IMHO, but I digress.
Reference: https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-cli#restore
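A minimal sketch of the refresh, assuming the Bicep CLI is on your PATH and main.bicep is the file that references the template spec:
# Re-download all external modules/template specs referenced by main.bicep,
# overwriting whatever is already in the local cache
bicep restore main.bicep --force
# Alternatively, wipe the cache entirely and let the next build repopulate it
Remove-Item -Recurse -Force "$env:USERPROFILE\.bicep"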

Unable to reference Logic App Action output in inline JavaScript

I am trying to access an action output in a Logic App inline JavaScript action as follows:
However, this leads to an error while saving the workflow:
Failed to save logic app <>. The template validation failed: 'The action(s) '#outputs('Get_current_loop_object')' referenced by 'inputs' in action 'Wrap_Work_Relationship_data_to_array_if_needed' are not defined in the template.'.
Any ideas what I may be missing?
As per my repro, it works for me by using the following steps with inline JavaScript:
Click on the Workflow settings menu bar.
After that, select the integration account which you created:
Add files in Get content:
Then initialize a variable with the Object type that is needed (the variable "json" has to be initialized and declared before it is used in the JavaScript).
Coming to the action item part, you need to add the action referenced in the JavaScript as an explicit action item on the JavaScript connector in the logic app.
Saved it successfully:
NOTE: This is not supported inside a loop, as mentioned in this document.
For further details, refer to this link.
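For reference, inside the Execute JavaScript Code action you reach other actions through the workflowContext object rather than the outputs() expression. A minimal sketch, assuming an action named Get_current_loop_object that sits outside any loop and is listed as an explicit action dependency of the JavaScript action:
// Read the referenced action's output body via workflowContext
var result = workflowContext.actions.Get_current_loop_object.outputs.body;
// Wrap it in an array if it isn't one already
return Array.isArray(result) ? result : [result];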

ARM Deployment Error - The request content was invalid and could not be deserialized: 'Cannot deserialize the current JSON array

I have gone through similar previous posts and was not able to find any solution for my situation, so I am asking again. Please consider.
I am trying to deploy an Azure Policy using ARM templates. So, I have created:
1- Policy definition file
2- Policy parameter file
3- PowerShell script, run with both the policy and parameter files as input.
But when I try to deploy, I am getting the error as attached. The "policyParameters" are being passed as an Object type; it seems like the problem resides there. It would be great if you could look at the attached screenshot and advise.
Also, the PowerShell script output shows the expected values, I think, but "ProvisioningState : Failed".
Thanks,
You have to create a variable for policyParameters:
"variables": {
"policyParameters": {
"policyDefinitionId": {
"defaultValue": "[parameters('policyDefinitionId')]",
"type": "String"
},
...
This variable has to be passed to your parameters:
"parameters": "[variables('policyParameters')]",
You can find a sample here:
Configure Azure Diagnostic Settings with Azure Policies

Substitute Service Fabric application parameters during deployment

I'm setting up my production environment and would like to secure my environment-related variables.
For the moment, every environment has its own application parameters file, which works well, but I don't want every dev in my team knowing the production connection strings and other sensitive values that could appear in there.
So I'm looking for every possibility available.
I've seen that in Azure DevOps, which I'm using at the moment for my CI/CD, there is some support for variable substitution (XML transformation). Is it usable in an SF project?
I've seen in another project something similar through Octopus.
Are there any other tools that would help me manage my variables by environment safely (and easily)?
Can I do that with my KeyVault eventually?
Any recommendations?
Thanks
EDIT: an example of how I'd like to manage those values; this is a screenshot from Octopus:
So something similar to this, which separates and injects the values, is what I'm looking for.
You can do an XML transformation on the ApplicationParameters file to update the values in there before you deploy it.
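A minimal PowerShell sketch of that idea, assuming a standard ApplicationParameters file; the path, parameter name, and environment variable are placeholders:
# Load the environment's ApplicationParameters file
[xml]$doc = Get-Content ".\ApplicationParameters\Cloud.xml"
# Find the parameter to override (name is a placeholder)
$param = $doc.Application.Parameters.Parameter | Where-Object { $_.Name -eq "ConnectionString" }
# Inject the secret from the release pipeline's environment
$param.Value = $env:PROD_CONNECTION_STRING
# Write the transformed file back before packaging/deployment
$doc.Save((Resolve-Path ".\ApplicationParameters\Cloud.xml"))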
The other option is to use PowerShell to update the application and pass the parameters as arguments to the script.
The Start-ServiceFabricApplicationUpgrade command accepts a hashtable of parameters; technically, the built-in task in VSTS\DevOps transforms the application parameters into a hashtable. The script would be something like this:
#Get the existing parameters
$app = Get-ServiceFabricApplication -ApplicationName "fabric:/AzureFilesVolumePlugin"
#Create a temp hashtable and populate with existing values
$parameters = @{ }
$app.ApplicationParameters | ForEach-Object { $parameters.Add($_.Name, $_.Value) }
#Replace the desired parameters
$parameters["test"] = "123test" #Here you would replace with your variable, like $env:username
#Upgrade the application
Start-ServiceFabricApplicationUpgrade -ApplicationName "fabric:/AzureFilesVolumePlugin" -ApplicationParameter $parameters -ApplicationTypeVersion "6.4.617.9590" -UnmonitoredAuto
Keep in mind that the existing VSTS task also performs other operations, like copying the package to SF and registering the application version in the image store; you will need to replicate those. You can copy the full script from the Deploy-FabricApplication.ps1 file in the Service Fabric project and apply your changes to it. The other approach is to get the source for the VSTS task here and add your changes.
If you are planning to use Key Vault, I would recommend that the application access the values directly in Key Vault instead of passing them to SF; this way, you can change the values in Key Vault without redeploying the application. In the deployment, you would only pass the Key Vault credentials/configuration.
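For completeness, a minimal sketch of reading such a value with the current Az PowerShell module; the vault and secret names are placeholders, and since the original era used AzureRM, treat this as illustrative:
# Fetch a secret's plain-text value from Key Vault (requires Az.KeyVault and an authenticated session)
$conn = Get-AzKeyVaultSecret -VaultName "my-prod-vault" -Name "ProdConnectionString" -AsPlainText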

Wrong publishsettings xml file structure for Import-AzurePublishSettingsFile command

I have a fresh publishsettings file obtained with the Get-AzurePublishSettingsFile command.
And when I run following command:
Import-AzurePublishSettingsFile -SubscriptionDataFile "path to publishsettings"
I'm getting this error:
Import-AzurePublishSettingsFile : Error in line 1 position 14. Expecting element 'ProfileData' from namespace 'http://schemas.datacontract.org/2004/07/Microsoft.WindowsAzure.Commands.Utilities.Common'.. Encountered 'Element' with name 'PublishData', namespace ''.
At line:1 char:1
+ Import-AzurePublishSettingsFile -SubscriptionDataFile "path to publishsettings ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Import-AzurePublishSettingsFile], SerializationException
+ FullyQualifiedErrorId : Microsoft.WindowsAzure.Commands.Profile.ImportAzurePublishSettingsCommand
Looks like the PowerShell cmdlet expects to see a file with a different structure, but I have no idea where I can get one.
Am I doing something wrong here, or is it an issue with Azure PowerShell?
Azure module version is 0.8.2
I don't have an answer to your specific question. But I may have a better option, and something you can try to resolve the original issue.
See ArgumentNullException - Get-AzureService.
That post describes the following two options:
Instead of using publish settings files for management API authentication, you can use your normal management portal login credentials. This is generally a better option for using the Azure PowerShell cmdlets.
If that doesn't work for you, then see the link above for how to clear out the cached subscription configuration files, and check whether that fixes the Import-AzurePublishSettingsFile issue you are seeing.
I ran into the same issue. It turned out that I had used the incorrect parameter switch to specify the settings file.
Double-check that you are using the -PublishSettingsFile argument and not -SubscriptionDataFile, as shown in your example.
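For example, the corrected call would look like this (the path is a placeholder for wherever you saved the downloaded file):
Import-AzurePublishSettingsFile -PublishSettingsFile "C:\Downloads\MySubscription.publishsettings"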
See the excerpt from the PowerShell help below for an explanation of each parameter.
Parameters
-PublishSettingsFile <String>
Specifies the full path and filename for the .publishsettings file for the Windows Azure account.
Required? true
Position? 1
Default value
Accept pipeline input? false
Accept wildcard characters? false
-SubscriptionDataFile <String>
Specifies the path to a file where the subscription data is stored. This parameter is optional. If it is not provided, the subscription data is imported into a default file in the user's profile.
Required? false
Position? named
Default value
Accept pipeline input? false
Accept wildcard characters? false
I was able to make this work after several tries.
You need to start PowerShell as Administrator and write out the full path to the file name you gave the downloaded file.
Once I did this, it worked well.
