I'm trying to clean up old images in my ACR. It has 8 repositories, so first I want to test it on only one of them. The complicated part is that I need to keep the last 4 images created. So I have this script:
$acrName = ACRttestt
$repo = az acr repository list --name $acrName --top 1
$repo | ConvertFrom-Json | ForEach-Object {
    $imageName = $_
    (az acr repository show-tags -n $acrName --repository $_ |
        ConvertFrom-Json) | Select-Object -SkipLast 4 | ForEach-Object {
        az acr repository delete -n $acrName --image "$imageName:$_"
    }
}
But I'm receiving the following error:
Failed At line:9 char:58
+ ... az acr repository delete -n $acrName --image "$imageName:$_"
+                                                   ~~~~~~~~~~~
Variable reference is not valid. ':' was not followed by a valid variable name character.
Consider using ${} to delimit the name.
Any ideas?
Thanks in advance
You need to change the "$imageName:$_" into "${imageName}:$_". Then the script will look like below:
$acrName = "ACRttestt"
$repo = az acr repository list --name $acrName --top 1
$repo | ConvertFrom-Json | ForEach-Object {
    $imageName = $_
    (az acr repository show-tags -n $acrName --repository $_ |
        ConvertFrom-Json) | Select-Object -SkipLast 4 | ForEach-Object {
        az acr repository delete -n $acrName --image "${imageName}:$_"
    }
}
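One caveat worth adding, hedged because it depends on your CLI version: as far as I can tell, az acr repository show-tags returns tags in alphabetical order by default, so Select-Object -SkipLast 4 is not guaranteed to keep the four most recently created images. Passing --orderby makes the ordering explicit, and --yes skips the per-image confirmation prompt. A sketch of the inner loop with that applied:

```powershell
# Order tags oldest-first so -SkipLast 4 keeps the 4 newest images
$tags = az acr repository show-tags -n $acrName --repository $imageName --orderby time_asc |
    ConvertFrom-Json
$tags | Select-Object -SkipLast 4 | ForEach-Object {
    az acr repository delete -n $acrName --image "${imageName}:$_" --yes
}
```

Test it with --dry-run thinking first (e.g. replace the delete with Write-Host) before pointing it at a real repository.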
All,
I have the below Azure DevOps pipeline set up to copy Key Vault secrets from one KV to another. As you can see, I have two tasks: 1) one to read the secrets and 2) one to write them. I am having difficulty figuring out how to pass the "$secrets" variable (through "echo "##vso[task.setVariable variable=sourceSecrets]$json"") from the first task to the second task.
stages:
- stage: "Test1"
  displayName: "Test1 - Copy KV"
  jobs:
  - deployment: "Deploy"
    timeoutInMinutes: 120
    variables:
      sourceSecrets: ""
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureCLI@2
            inputs:
              azureSubscription: $(ServiceConnection1)
              scriptType: 'pscore'
              scriptLocation: 'inlineScript'
              inlineScript: |
                if ("$(mysubscription1)"){
                  az account set --subscription "mysubscription1"
                }
                $secNames = az keyvault secret list --vault-name "kvName1" -o json --query "[].name" | ConvertFrom-Json
                Write-Host 'Reading secrets...'
                $secrets = $secNames | % {
                  $secret = az keyvault secret show --name $_ --vault-name "kvName1" -o json | ConvertFrom-Json
                  [PSCustomObject]@{
                    name = $_;
                    value = $secret.value;
                  }
                }
                $json = $($secrets | ConvertTo-Json)
                echo "##vso[task.setVariable variable=sourceSecrets]$json"
          - task: AzureCLI@2
            inputs:
              azureSubscription: $(ServiceConnection2)
              scriptType: 'pscore'
              scriptLocation: 'inlineScript'
              inlineScript: |
                if ("$(mysubscription2)"){
                  az account set --subscription $(mysubscription2)
                }
                $secrets = "$(sourceSecrets)" | ConvertFrom-Json
                $secrets.foreach{
                  Write-Host 'Writing secrets:'
                  az keyvault secret set --vault-name $(kvName2) --name $_.name --value $_.value --output none
                  Write-Host '---->' $_.name
                }
When the pipeline executes, task one runs fine. However, the 2nd task errors out with the following:
ConvertFrom-Json : Conversion from JSON failed with error: Error reading JArray from JsonReader. Path '', line 1, position 1.
At /home/vsts/work/_temp/azureclitaskscript1620360635888_inlinescript.ps1:4 char:18
+ $secrets = "[" | ConvertFrom-Json
+ ~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [ConvertFrom-Json], ArgumentException
+ FullyQualifiedErrorId : System.ArgumentException,Microsoft.PowerShell.Commands.ConvertFromJsonCommand
I did some checking, and it appears the $(sourceSecrets) variable contains only "[" instead of the entire JSON content. This means the "echo "##vso[task.setVariable variable=sourceSecrets]$json" line in the first task is dropping everything after "[". I can't figure out why it is doing that. Ideas?
Thanks in advance.
Generally, a pipeline variable only supports string values, and the value should be a single-line string. If you pass multi-line content to a pipeline variable, normally only the first line will be received as the value of the variable.
In your case, the value you pass to the variable is a JSON object that contains multi-line content.
To avoid the issue you are facing, you should convert the content of the JSON object to a single-line string before passing it to the pipeline variable.
To convert a multi-line string to a single-line string, you can try the command lines below:
. . .
# escape '%', '\n' and '\r'
json="${json//'%'/'%25'}"
json="${json//$'\n'/'%0A'}"
json="${json//$'\r'/'%0D'}"
echo "##vso[task.setVariable variable=sourceSecrets]$json"
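Since the inline scripts in the pipeline above run under pscore, here is the equivalent in PowerShell. Note that ConvertTo-Json -Compress already emits single-line JSON, and that '%' must be escaped first, before the replacements introduce '%0A'/'%0D':

```powershell
# Sample object standing in for the $secrets array built in the first task
$secrets = @([PSCustomObject]@{ name = 'demo'; value = "multi`nline%value" })
# Compress to one line, then escape the logging-command special characters ('%' first)
$json = $secrets | ConvertTo-Json -Compress
$json = $json -replace '%', '%25' -replace "`r", '%0D' -replace "`n", '%0A'
echo "##vso[task.setVariable variable=sourceSecrets]$json"
```

The second task can then ConvertFrom-Json the variable as the pipeline above already does.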
I want to list all the VMs that generate costs in a specific timeframe or billing period.
I managed to create this script to get me the desired output:
$file="C:\temp\GeneratedCost-short.csv"
(az consumption usage list `
--start-date "2020-07-01" --end-date "2020-07-31" | ConvertFrom-Json)`
| Where-Object {$_.product -Match "Virtual Machines"}`
| Sort-Object -Property instanceName -Descending | Select-Object instanceName, subscriptionName`
| Get-Unique -AsString | ConvertTo-Csv -NoTypeInformation | Set-Content $file
But this will give me the output only for the current subscription.
How can I run it on all the subscriptions that I have in the Azure tenant?
I tried using the below version but it doesn't seem to work:
$file="C:\temp\GeneratedCost-short.csv"
$VMs = @()
$Subscriptions = Get-AzSubscription
foreach ($sub in $Subscriptions) {
Get-AzSubscription -SubscriptionName $sub.Name | az account set -s $sub.Name
$VMs += (az consumption usage list --start-date "2020-07-01" --end-date "2020-07-03" | ConvertFrom-Json)
}
#
$VMs | Where-Object {$_.product -Match "Virtual Machines"}`
| Sort-Object -Property instanceName -Descending | Select-Object instanceName, subscriptionName`
| Get-Unique -AsString | ConvertTo-Csv -NoTypeInformation | Set-Content $file
Any suggestions?
Mixing the Azure PowerShell module and the Azure CLI could be causing issues with your code, since the two maintain separate login contexts. Verify that the az CLI sees the proper subscriptions:
az account list -o table
If you don't see the accounts be sure to re-run az login.
Here's your code using the Azure CLI only:
$file="C:\temp\GeneratedCost-short.csv"
$VMs = @()
az account list -o json | ConvertFrom-Json |
ForEach-Object {
Write-Host "Getting usage for account: " $_.Name
az account set -s $_.Name
$VMs += (az consumption usage list --start-date "2020-07-01" --end-date "2020-07-03" | ConvertFrom-Json)
}
$VMs | Where-Object {$_.product -Match "Virtual Machines"} |
Sort-Object -Property instanceName -Descending |
Select-Object instanceName, subscriptionName |
Get-Unique -AsString | ConvertTo-Csv -NoTypeInformation |
Set-Content $file
Never do += on an array; it copies the entire array on every iteration, which is one of the worst patterns for performance. Use a generic List instead:
[System.Collections.Generic.List[PSObject]]$VMs = @()
$subs = Get-AzSubscription # | Where-Object {$_.State -eq 'Enabled'}
foreach ($s in $subs) {
Set-AzContext -SubscriptionObject $s | Out-Null
$vm = # your search here ...
$VMs.Add($vm)
}
I am writing a Powershell script using Azure CLI for doing an Azure SQL Instance restore. This is my script so far:
az login
$AzureSubscription = "SubscriptionName"
az account set --subscription $AzureSubscription
$RGName = "ResourceGroupName"
$SrvName = "AzureSQLServerName"
$RestoreDateTime = (Get-Date).ToUniversalTime().AddHours(-1).ToString()
$RestoreDateTimeString = (Get-Date).ToUniversalTime().AddHours(-1).ToString("yyyy-MM-dd_HH:mm")
$RestoreName = $SrvName + "_" + $RestoreDateTimeString
az sql db restore --dest-name $RestoreName --resource-group $RGName --server $SrvName --name $SrvName --time = $RestoreDateTime
When I run this, I get the following error:
az: error: unrecognized arguments: 7/10/2019 10:39:21 AM
usage: az [-h] [--verbose] [--debug]
[--output {json,jsonc,table,tsv,yaml,none}] [--query JMESPATH]
{sql} ...
I have tried a variety of date-time formats, but, I can't seem to get any of them to work. Is there a specific format that is needed? Should I be passing a different value into time? Any help would be appreciated.
As far as I can tell, the --time parameter wants the datetime formatted as 'Sortable date/time pattern' (yyyy-MM-ddTHH:mm:ss).
This should do it:
$RestoreDateTime = (Get-Date).ToUniversalTime().AddHours(-1)
$RestoreDateTimeString = '{0:yyyy-MM-dd_HH:mm}' -f $RestoreDateTime
$RestoreName = '{0}_{1}' -f $SrvName, $RestoreDateTimeString
# format the datetime as Sortable date/time pattern 'yyyy-MM-ddTHH:mm:ss'
# see: https://learn.microsoft.com/en-us/dotnet/standard/base-types/standard-date-and-time-format-strings
$azRestoreTime = '{0:s}' -f $RestoreDateTime
az sql db restore --dest-name $RestoreName --resource-group $RGName --server $SrvName --name $SrvName --time $azRestoreTime
Hope that helps
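For reference, the 's' standard format specifier produces exactly the sortable pattern:

```powershell
$dt = Get-Date -Date '2019-07-10T10:39:21'
'{0:s}' -f $dt   # 2019-07-10T10:39:21
```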
I tried to find a simple way to copy a container from one storage account to another asynchronously via the Azure CLI, something like what azcopy does. I don't have azcopy installed on my machine, but the Azure CLI is.
Question: I understand I need to copy one blob after the other. How do I check that the copy operation is finished?
The following kind of works, but calling az storage blob show one blob at a time takes a very long time (minutes):
$backup = 'somecontainer'
$exists = (az storage container exists --name $backup --account-name an --account-key ak --output tsv) -match 'true'
if (!$exists) {
az storage container create --name $backup --account-name mt --account-key mk
}
$blobs = az storage blob list --container-name $backup --account-name an --account-key ak | ConvertFrom-Json
# copy one by one
$blobs.name | % {
$name = $_
az storage blob copy start --destination-blob $name --destination-container $backup --source-blob $name --source-container $backup --account-name mt --account-key mk --source-account-name an --source-account-key ak
}
# check operation status
$results = $blobs.name | % {
az storage blob show --container-name $backup --name $_ --account-name mt --account-key mk | ConvertFrom-Json
}
# still unfinished copy operations:
$results | ? { !($_.properties.copy.completiontime) } | % { $_.name }
@stej As @GeorgeChen mentioned, you can use the below:
az storage blob copy start-batch --account-key 00000000 --account-name MyAccount --destination-container MyDestinationContainer --source-account-key MySourceKey --source-account-name MySourceAccount --source-container MySourceContainer
Here is the documentation link:
https://learn.microsoft.com/en-us/cli/azure/storage/blob/copy?view=azure-cli-latest#az-storage-blob-copy-start-batch
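To answer the "how do I check it finished" part without one az storage blob show call per blob, a single az storage blob list on the destination can return the copy status for every blob at once. A sketch reusing the account variables from the question; the --include c flag (which adds copy properties to the listing) is an assumption worth verifying against your CLI version:

```powershell
# After az storage blob copy start-batch, poll the destination container in one call
$blobs = az storage blob list --container-name $backup --account-name mt --account-key mk --include c |
    ConvertFrom-Json
# copy.status is 'pending' while the async copy is still running
$pending = $blobs | Where-Object { $_.properties.copy.status -eq 'pending' }
$pending | ForEach-Object { $_.name }
```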
I have a couple of txt/csv files, and I am trying to feed each line of the files into my API commands in a loop.
eg: server.txt - the first file contains servers and resource groups:
server1, rg1
server2, rg2
server3, rg3 etc.
ips.txt - the other file contains rules & IPs:
rule1, startip1, endip1
rule2, startip2, endip2
rule3, startip3, endip3 etc.
The issue is: how do I set up PowerShell to read each line in server.txt and also ips.txt within the same loop?
I had something like this before, but it doesn't seem to work well. Any thoughts?
$list=Get-Content "servers.txt"
foreach ($data in $list) {
$server_name, $rg = $data -split ',' -replace '^\s*|\s*$'
Write-Host "Checking if SQL server belongs in subscription"
$check=$(az sql server list -g $rg --query "[?name == '$server_name'].name" -o tsv)
Write-Host $check
# Get current rules and redirect output to file
az sql server firewall-rule list --resource-group "$rg" --server "$server_name" --output table | Export-Csv -NoTypeInformation current-natrules.csv
$new_rules=Get-Content "ips.txt"
foreach ($data in $new_rules) {
$rule_name, $start_ip, $end_ip = $data -split ',' -replace '^\s*|\s*$'
# Create rule with new configs
Write-Host "Assigning new firewall rules..."
az sql server firewall-rule create --name "$rule_name" --server "$server_name" --resource-group "$rg" --start-ip-address "$start_ip" --end-ip-address "$end_ip" --output table
}
# Validating
Write-Host "Printing new NAT rules to file..."
az sql server firewall-rule list --resource-group "$rg" --server "$server_name" --output table | Export-Csv -NoTypeInformation new-natrules.csv
}