Retrieve Variable Name and Value from az pipeline variable-group? - azure

I have got the variables from a variable-group with:
$be_common=az pipelines variable-group variable list --group-id <some_id> --output json
and the result is something like this:
{
"key1": {
"isSecret": null,
"value": "value1"
},
"key2": {
"isSecret": null,
"value": "value2"
}
}
How can I use the values of the variables in the group later in the pipeline?
For example, I later want to execute
docker build --build-arg arg1=value1
(the value of variable key1 from the group)?
I see here
Retrieve Variable Name and Value from az pipeline variable-group
how they are retrieved, but there they are enumerated, and I want to point at exactly the variable value I want to use.
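One way to point at a single variable is to parse the JSON output and index it by name. A minimal sketch (in Python for illustration; the JSON shape is assumed to match the output above):

```python
import json

# Sample output of: az pipelines variable-group variable list --group-id <some_id> --output json
raw = '''
{
  "key1": {"isSecret": null, "value": "value1"},
  "key2": {"isSecret": null, "value": "value2"}
}
'''

variables = json.loads(raw)

# Index the group by variable name to pick exactly the value you want.
arg1 = variables["key1"]["value"]
print(arg1)  # value1
```

In a PowerShell step the equivalent indexing would be ($be_common | ConvertFrom-Json).key1.value, which you could then feed to docker build --build-arg arg1=....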


Unable to query Azure Table Storage using Azure CLI

I wanted to filter the entries in my Azure Storage Table, and the structure looks like the following. I wanted to filter the entries based on a given Id, for example JD.98755. How can we achieve this?
{
"items": [
{
"selectionId": {
"Id": "JD.98755",
"status": 0
},
"Consortium": "xxxxxx",
"CreatedTime": "2019-09-06T09:34:07.551260+00:00",
"RowKey": "yyyyyy",
"PartitionKey": "zzzzzz-zzzzz-zz-zzzzz-zz",
"Timestamp": "2019-09-06T09:41:34.660306+00:00",
"etag": "W/\"datetime'2019-09-06T09%3A41%3A34.6603060Z'\""
}
],
"nextMarker": {}
}
I can filter other elements like the Consortium using the below query but not the Id
az storage entity query -t test --account-name zuhdefault --filter "Consortium eq 'test'"
I tried something like the following to filter based on the given ID but it has not returned any results.
az storage entity query -t test --account-name zuhdefault --filter "Id eq 'JD.98755'"
{
"items": [],
"nextMarker": {}
}
I do agree with @Gaurav Mantri, and another approach you can use is the following.
I have reproduced this in my environment and got the expected results as below.
Firstly, you need to store the output of the command in a variable; I have stored the output in $x:
$x = az storage entity query -t test --account-name zuhdefault
Then you can convert the output from JSON:
$r = $x | ConvertFrom-Json
Then you can filter the items on the nested Id value; with the structure shown above, this returns the item with Id JD.98755:
$r.items | Where-Object { $_.selectionId.Id -eq 'JD.98755' }
If you have more data, store the first output in a variable, divide it into objects with ConvertFrom-Json, and then repeat the steps above.
The reason you are not getting any data back is because Azure Table Storage is a simple key/value pair store and you are storing a JSON there (in all likelihood, the SDK serialized JSON data and stored it as string in Table Storage).
Considering there is no key named Id, you will not be able to search for that.
If you need to store JSON document, one option is to make use of Cosmos DB (with SQL API) instead of Table Storage. Other option would be to flatten your JSON so that you store them as key/value pair. In this scenario, your data would look something like:
{
"selectionId_Id": "JD.98755",
"selectionId_status": 0,
"Consortium": "xxxxxx",
"CreatedTime": "2019-09-06T09:34:07.551260+00:00",
"RowKey": "yyyyyy",
"PartitionKey": "zzzzzz-zzzzz-zz-zzzzz-zz",
"Timestamp": "2019-09-06T09:41:34.660306+00:00",
"etag": "W/\"datetime'2019-09-06T09%3A41%3A34.6603060Z'\""
}
then you should be able to filter by selectionId_Id.
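The flattening described above can be sketched like this (illustrative Python; the underscore separator matches the example, and the exact storage code is up to your SDK):

```python
def flatten(obj, parent_key="", sep="_"):
    """Flatten a nested dict into single-level key/value pairs,
    so each leaf becomes a queryable Table Storage property."""
    items = {}
    for key, value in obj.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

entity = {
    "selectionId": {"Id": "JD.98755", "status": 0},
    "Consortium": "xxxxxx",
}
print(flatten(entity))
# {'selectionId_Id': 'JD.98755', 'selectionId_status': 0, 'Consortium': 'xxxxxx'}
```

With the entity stored in this shape, a filter like --filter "selectionId_Id eq 'JD.98755'" should return the row.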

How to pass Json variable as inputs in Azure DevOps pipeline task

I am forming a JSON dynamically during the pipeline run based on few pipeline parameters and pre-defined environment variables and trying to pass this JSON as an input in subsequent pipeline task.
jobs:
- job: PayloadCreation
  pool: linux-agent (or windows)
  steps:
  - ${{ each app in apps }}:
    - bash: |
        payload=$(jq '.artifacts += [{"name": "${{ app.name }}", "version": "$(Build.BuildId)"}]' artifact.json)
        echo $payload > artifact.json
        echo "##vso[task.setvariable variable=payload]$payload"
I am getting the output of artifact.json as well as variable $payload as follows -
"artifacts": [
{
"name":"service-a",
"version":"1.0.0"
},
{
"name":"service-b",
"version": "1.0.1"
}
]
}
Subsequently, I am trying to pass this JSON variable as an input in the following job, but am unable to do so.
- job: JobB
  steps:
  - task: ServiceNow-DevOps-Agent-Artifact-Registration@1
    inputs:
      connectedServiceName: 'test-SC'
      artifactsPayload: $(payload)
It is unable to read the JSON as input variable. I get the below error -
Artifact Registration could not be sent due to the exception: Unexpected token $ in JSON at position 0
Is there any other way a JSON could be passed as input variable?
By default, variables are not available between jobs. In JobB, the $(payload) variable is not defined.
When setting the variable, you need to provide isOutput, and the step that sets it needs a name (setPayload here is an example): echo "##vso[task.setvariable variable=payload;isOutput=true]$payload"
When referencing the variable, JobB needs dependsOn: PayloadCreation and the appropriate runtime expression:
variables:
  payload: $[ dependencies.PayloadCreation.outputs['setPayload.payload'] ]
Ref: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#share-variables-across-pipelines
https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=bash#setvariable-initialize-or-modify-the-value-of-a-variable
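Putting the pieces together, the two jobs might look like this (a sketch; the step name setPayload and the dependsOn line are assumptions not shown in the original snippets):

```yaml
jobs:
- job: PayloadCreation
  steps:
  - bash: |
      payload=$(jq -c . artifact.json)
      echo "##vso[task.setvariable variable=payload;isOutput=true]$payload"
    name: setPayload

- job: JobB
  dependsOn: PayloadCreation
  variables:
    payload: $[ dependencies.PayloadCreation.outputs['setPayload.payload'] ]
  steps:
  - task: ServiceNow-DevOps-Agent-Artifact-Registration@1
    inputs:
      connectedServiceName: 'test-SC'
      artifactsPayload: $(payload)
```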
Is there any other way a JSON could be passed as input variable?
Strictly, no. Variables in a DevOps pipeline don't support JSON objects.
Why not?
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#variables
Variables are always strings.
But this doesn't mean you can't pass the JSON information; passing it as a string is the only way.
Is the task designed by yourself?
Converting a string to a JSON object is not difficult:
//convert string object to json object
var str = `
{
"artifacts": [
{
"name":"service-a",
"version":"1.0.0"
},
{
"name":"service-b",
"version": "1.0.1"
}
]
}
`;
var obj = JSON.parse(str);
console.log(obj.artifacts[0].name);
console.log(obj.artifacts[0].version);
I'm not sure how your task is designed, but Daniel's method of passing the variable is correct.
You can operate on the data in your extension task code after converting the string to a JSON object.
Here is some other relevant documentation on the logging command:
Set Variables
Variables Level
By the way, in your question the JSON is
"artifacts": [
{
"name":"service-a",
"version":"1.0.0"
},
{
"name":"service-b",
"version": "1.0.1"
}
]
}
Shouldn't it be like this?
{
"artifacts": [
{
"name":"service-a",
"version":"1.0.0"
},
{
"name":"service-b",
"version": "1.0.1"
}
]
}

Add properties to an object in a variable in Azure Logic Apps

How can I manage to add, update or delete a property of an object in a variable in Azure Logic Apps?
Example of my object before:
{
"prop1": "value1"
}
Example of my object after:
{
"prop1": "value1",
"prop2": "value2",
}
I would like to use the Set variable operation to add a new property to the variable (I used the union function, but with a temporary variable, because assigning a self-referencing value to a variable isn't allowed).
Thank you for your help!
You can use an Initialize variable action to create the prop2 variable, or you can source it from wherever you need to. Then you can use a Compose step to combine prop1 and prop2.
To illustrate, I used an HTTP-triggered logic app with a JSON body of "prop1" and value "value1". I then initialize the variable "prop2" to "value2". For my Compose block I used the input
{
"prop1": "@{triggerBody()?['prop1']}",
"prop2": "@{variables('prop2')}"
}
My output is a webhook which will receive the combined JSON as
{
"prop1": "value1",
"prop2": "value2"
}
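The union approach the question mentions would look something like this in a Set variable action (a sketch; tempObject is an assumed second variable, needed because a variable can't reference itself in its own assignment):

```json
{
  "name": "tempObject",
  "value": "@union(variables('myObject'), json('{\"prop2\": \"value2\"}'))"
}
```

A second Set variable action can then copy tempObject back into myObject.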

How can I pass pipeline variable to parameters file for blueprint assignment

I'm trying to create an Azure DevOps pipeline for deploying Azure Blueprint. There are some fields in the parameters file(JSON) which I want to be configurable. How can I pass these values as pipeline variables and use them in the parameters file?
I tried defining a pipeline variable and reference it in the parameter file like this "$(var-name)", but it didn't work. Is there a way to solve this?
Below is my pipeline definition, I'm using AzureBlueprint extension for creating and assigning blueprint:
steps:
- task: CreateBlueprint@1
  inputs:
    azureSubscription: $(serviceConnection)
    BlueprintName: $(blueprintName)
    BlueprintPath: '$(blueprintPath)'
    AlternateLocation: false
    PublishBlueprint: true
- task: AssignBlueprint@1
  inputs:
    azureSubscription: $(serviceConnection)
    AssignmentName: '$(blueprintName)-assignment'
    BlueprintName: $(blueprintName)
    ParametersFile: '$(blueprintPath)/assign.json'
    SubscriptionID: $(subscriptionId)
    Wait: true
    Timeout: 500
and my parameters file:
"parameters":{
"organization" : {
"value": "xxxx"
},
"active-directory-domain-services_ad-domain-admin-password" : {
"reference": {
"keyVault": {
"id": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.KeyVault/vaults/xxxx"
},
"secretName": "xxxx"
}
},
"jumpbox_jumpbox-local-admin-password" : {
"reference": {
"keyVault": {
"id": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.KeyVault/vaults/xxxx"
},
"secretName": "xxxx"
}
},
"keyvault_ad-domain-admin-user-password" : {
"value" : "xxxx"
},
"keyvault_deployment-user-object-id" : {
"value" : "xxxx"
},
"keyvault_jumpbox-local-admin-user-password" : {
"value" : "xxxx"
}
}
Since the tasks you are using (CreateBlueprint and AssignBlueprint) don't support overriding parameters, you have two options:
Use the Azure CLI az blueprint command to directly create and assign blueprints.
Change the parameters file by either using JSON variable substitution or by using a small PowerShell script (see below):
Sample:
$paramFile = Get-Content ./azuredeploy.parameters.json | ConvertFrom-Json
$paramFile.parameters.organization.value = "your-org-name"
$paramFile | ConvertTo-Json | Set-Content ./azuredeploy.parameters.json
Be aware that the Task you are using hasn't received an update within the last 17 months (here is the GitHub repository).
AssignBlueprint@1 doesn't natively support this. However, you can modify assign.json using JSON variable substitution.
It comes down to having Azure Pipelines variables whose names match the path to the leaf in the JSON file that you want to replace.
Here is an example:
variables:
  Data.DebugMode: disabled
  Data.DefaultConnection.ConnectionString: 'Data Source=(prodDB)\MSDB;AttachDbFilename=prod.mdf;'
  Data.DBAccess.Users.0: Admin-3
  Data.FeatureFlags.Preview.1.NewWelcomeMessage: AllAccounts

# Update appsettings.json via FileTransform task.
- task: FileTransform@1
  displayName: 'File transformation: appsettings.json'
  inputs:
    folderPath: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    targetFiles: '**/appsettings.json'
    fileType: json
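Conceptually, the substitution maps each dotted variable name onto a path in the JSON file, with numeric segments indexing into arrays. An illustrative Python sketch of that mapping (not the FileTransform task's actual implementation):

```python
import json

def set_by_path(doc, dotted_name, value):
    """Walk a dotted variable name (e.g. 'Data.DBAccess.Users.0')
    through the document and overwrite the leaf it points at."""
    keys = dotted_name.split(".")
    node = doc
    for key in keys[:-1]:
        node = node[int(key)] if isinstance(node, list) else node[key]
    last = keys[-1]
    if isinstance(node, list):
        node[int(last)] = value
    else:
        node[last] = value

settings = {"Data": {"DebugMode": "enabled", "DBAccess": {"Users": ["Admin-1"]}}}
set_by_path(settings, "Data.DebugMode", "disabled")
set_by_path(settings, "Data.DBAccess.Users.0", "Admin-3")
print(json.dumps(settings))
```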

Properly using Terraform External Data Source

I am using Terraform from the bash cloud shell in Azure. I am trying to add an external data source to my Terraform configuration file that will use az cli to query for the virtualip object on a Microsoft.Web/hostingEnvironment the template deploys.
AZ CLI command line:
az resource show --ids /subscriptions/<subscription Id>/resourceGroups/my-ilbase-rg/providers/Microsoft.Web/hostingEnvironments/my-ilbase/capacities/virtualip
Output when run from command line:
{
"additionalProperties": {
"internalIpAddress": "10.10.1.11",
"outboundIpAddresses": [
"52.224.70.119"
],
"serviceIpAddress": "52.224.70.119",
"vipMappings": []
},
"id": null,
"identity": null,
"kind": null,
"location": null,
"managedBy": null,
"name": null,
"plan": null,
"properties": null,
"sku": null,
"tags": null,
"type": null
}
In my Terraform config I create a variable for the --ids value:
variable "ilbase_resourceId" {
  default = "/subscriptions/<subscription Id>/resourceGroups/my-ilbase-rg/providers/Microsoft.Web/hostingEnvironments/my-ilbase/capacities/virtualip"
}
I then have the data source structured this way:
data "external" "aseVip" {
  program = ["az", "resource", "show", "--ids", "${var.ilbase_resourceId}"]
}
When I execute my configuration, I get the error below:
data.external.aseVip: data.external.aseVip: command "az" produced invalid JSON: json: cannot unmarshal object into Go value of type string
Any ideas what I am doing wrong?
I discovered that the problem was that the Terraform external data source cannot yet handle the complex structure returned by the command. I was able to get around this by adding an az CLI command at the beginning of the script I use to deploy the Application Gateway; it grabs the IP address and passes it into the Terraform config as a variable. Below is the script block I am using:
ilbase_virtual_ip=$(
az resource show \
--ids "/subscriptions/$subscription_id/resourceGroups/$ilbase_rg_name/providers/Microsoft.Web/hostingEnvironments/$ilbase_name/capacities/virtualip" \
--query "additionalProperties.internalIpAddress"
)
That command is successful when you run it in an interactive session; presumably you have already done az login in your shell. When Terraform executes the command, it is not using your existing session. You would need to create a script where you are prompted to log in, or where you provide your credentials, so the request can succeed.
Whatever your choice, take into account that the ONLY output of that script should be JSON. If any other command adds something to the output (for example, logging in prints information about your subscription), you will get the same error, because the combined output is not valid JSON. You will need to pipe that kind of extra output to Out-Null to silence it, and write only the JSON from your request to the output.
I hope this can help.
The accepted response is a good workaround, but here is the actual cause.
The error occurs because the external data source expects a one-level JSON map of strings, like { "a": "b", "c": "d" }, and your az command returns a multi-level map (a map of maps).
You can improve your az command so it returns only one map by adding --query:
data "external" "aseVip" {
  program = ["az", "resource", "show", "--ids", "${var.ilbase_resourceId}", "--query", "additionalProperties"]
}
output "internalIpAddress" {
  value = data.external.aseVip.result.internalIpAddress
}
output "outboundIpAddresses" {
  value = data.external.aseVip.result.outboundIpAddresses
}
I hope this may help other people.
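Since the external data source requires a flat JSON object whose values are all strings, another common pattern is a small wrapper script that reshapes the az output before handing it to Terraform. An illustrative Python sketch of the reshaping step (the field names come from the az output shown above):

```python
import json

def to_flat_string_map(obj):
    """Reduce a nested object to the one-level map of string values
    that Terraform's external data source protocol requires."""
    flat = {}
    for key, value in obj.items():
        if isinstance(value, (dict, list)):
            # Serialize nested structures; Terraform can jsondecode() them later.
            flat[key] = json.dumps(value)
        elif value is not None:
            flat[key] = str(value)
    return flat

# The "additionalProperties" block from the az output above:
props = {
    "internalIpAddress": "10.10.1.11",
    "outboundIpAddresses": ["52.224.70.119"],
    "serviceIpAddress": "52.224.70.119",
    "vipMappings": [],
}
print(json.dumps(to_flat_string_map(props)))
```

A wrapper script would run az, pipe its output through a flattener like this, and print the result; on the Terraform side the values are then read from data.external.aseVip.result, with jsondecode() applied to the serialized list fields.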
