Retrieve Azure storage account key using Azure CLI

In my release pipeline I'm using an Azure CLI task to transfer my build files to an Azure Storage blob container:
call az storage blob upload-batch --source "$(System.DefaultWorkingDirectory)/_ClientWeb-Build-CI/ShellArtifact/out/build" --destination "$web" --account-key "****QxjclDGftOY/agnqUDzwNe/gOIAzsQ==" --account-name "*****estx"
This works, but I want to retrieve the account-key dynamically.
When I use:
az storage account keys list -g CustomersV2 -n ****estx
I get an array with 2 objects, each holding a key value:
[
  {
    "keyName": "key1",
    "permissions": "Full",
    "value": "f/eybpcl*****************Vm9uT1PwFC1D82QxjclDGftOY/agnqUDzwNe/gOIAzsQ=="
  },
  {
    "keyName": "key2",
    "permissions": "Full",
    "value": "bNM**********L6OxAemK1U2oudW5WGRQW++/bzD6jVw=="
  }
]
How do I use one of the two keys in my upload-batch command?

For your issue, if you just want one of the two keys, for example the first one, you can set a variable with the key as its value like this:
key=$(az storage account keys list -g CustomersV2 -n ****estx --query [0].value -o tsv)
And then use the variable key in the other command like this:
call az storage blob upload-batch --source "$(System.DefaultWorkingDirectory)/_ClientWeb-Build-CI/ShellArtifact/out/build" --destination "$web" --account-key $key --account-name "*****estx"
Or you can put the command that gets the key directly into the other command, like this:
call az storage blob upload-batch --source "$(System.DefaultWorkingDirectory)/_ClientWeb-Build-CI/ShellArtifact/out/build" --destination "$web" --account-key $(az storage account keys list -g CustomersV2 -n ****estx --query [0].value -o tsv) --account-name "*****estx"
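If you would rather select the key by its name instead of its position in the array, the same approach works with a JMESPath filter; this is just a variation of the query above, where key1 stands for whichever key you want:
key=$(az storage account keys list -g CustomersV2 -n ****estx --query "[?keyName=='key1'] | [0].value" -o tsv)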
Update
According to what you said, it seems you run the command in the Windows command prompt, which is different from the Linux shell and PowerShell. There you cannot set an environment variable directly from the output of a command, but you can do it like this:
az storage account keys list -g CustomersV2 -n ****estx --query [0].value -o tsv > key.txt
set /P key=<key.txt
az storage blob upload-batch --source "$(System.DefaultWorkingDirectory)/_ClientWeb-Build-CI/ShellArtifact/out/build" --destination "$web" --account-key %key% --account-name "*****estx"
Also, in the command prompt you can only reference an environment variable as %variable_name%, so it seems using "$web" that way in your command is wrong there.

I created an Azure PowerShell task (version 4) that does:
az login -u **** -p ****
# This first setvariable line only stores the literal command text, not its output:
Write-Host "##vso[task.setvariable variable=storageKey;]az storage account keys list -g ***** -n ***** --query [0].value -o tsv"
# Capture the actual key and expose it as the pipeline variable 'something':
$key = az storage account keys list -g ***** -n **** --query [0].value -o tsv
Write-Host "##vso[task.setvariable variable=something;]$key"
Then I can use the something variable in my Azure CLI task:
call az storage blob upload-batch --source "$(System.DefaultWorkingDirectory)/_ClientWeb-Build-CI/ShellArtifact/out/build" --destination "$web" --account-key $(something) --account-name "*****"
And this works. You'll probably need to put the -u and -p in a variable though.
@Charles thanks a lot for this line (az storage account keys list -g **** -n ****estx --query [0].value -o tsv)!
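For reference, the same thing can also be done from a single Azure CLI (Bash) task instead of a separate PowerShell one, using the same logging command; this is only a sketch, with the resource group and account name left as placeholders, and issecret=true is optional but keeps the key masked in the logs:
# Capture the first account key
key=$(az storage account keys list -g <resource group> -n <account name> --query [0].value -o tsv)
# Expose it to later tasks as the pipeline variable 'something'
echo "##vso[task.setvariable variable=something;issecret=true]$key"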

Related

Terraform: Module's output variable to azure-pipelines.yml file

I am using Terraform modules for the storage account, and once the storage account is created I want to use an output variable for the access key:
output "storage_account_primary_access_key" {
value = data.azurerm_storage_account.storage.primary_access_key
}
Further on, in my azure-pipelines.yml, I am using the az command as below:
az storage blob upload-batch -s drop -d \$web --account-name "" --account-key ""
How can I use Module's output variable in the YML file?
You can use the terraform output command. For example:
terraform output storage_account_primary_access_key
So you may do something like this:
az storage blob upload-batch -s drop -d $web \
--account-name "" \
--account-key "$(terraform output -raw storage_account_primary_access_key)"
You could also assign it to a variable that you can use throughout your pipeline, something along these lines:
- script: echo "##vso[task.setvariable variable=ACCESS_KEY]$(terraform output -raw storage_account_primary_access_key)"
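A later script step in the same pipeline can then consume that variable with the usual macro syntax, roughly like this (the account name is a placeholder, and $web is quoted so the shell does not try to expand it):
az storage blob upload-batch -s drop -d '$web' --account-name "<account name>" --account-key "$(ACCESS_KEY)"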

how to grab next marker (next_marker) in azure cli command az storage fs file list

I need to calculate the size of specific containers and folders in ADLS Gen2.
I started with the command az storage fs file list. However, I don't understand how to grab next_marker; it appears in stdout as a warning but not in the output of the command:
WARNING: Next Marker:
WARNING: VBbYvMrBhcCCqHEYSAAAA=
So how do I get this next_marker?
$files=$(az storage fs file list --file-system <container name>\
--auth-mode login --account-name <account name> \
--query "[*].[contentLength]" --num-results 1000 -o tsv)
$files.next_marker is empty.
UPDATE 1: Created issue https://github.com/Azure/azure-cli/issues/16893
If you're using the Azure CLI command az storage fs file list, the next_marker is not returned to the variable $files; it's always printed out in the console, so you would need to copy and paste it.
As a workaround, you can use the Azure CLI command az storage blob list (most of the Azure Blob Storage CLI commands also work for ADLS Gen2). This command has a parameter --show-next-marker that you can use to return next_marker to a variable.
I wrote an Azure CLI script and it works well for ADLS Gen2:
$next_token = ""
$blobs=""
$response = & az storage blob list --container-name your_file_system_in_ADLS_Gen2 --account-name your_ADLS_Gen2_account --account-key your_ADLS_Gen2_key --num-results 5 --show-next-marker | ConvertFrom-Json
$blobs += $response.properties.contentLength
$next_token = $response.nextMarker
while ($next_token -ne $null){
$response = & az storage blob list --container-name your_file_system_in_ADLS_Gen2 --account-name your_ADLS_Gen2_account --account-key your_ADLS_Gen2_key --num-results 5 --marker $next_token --show-next-marker | ConvertFrom-Json
$blobs = $blobs + " " + $response.properties.contentLength
$next_token = $response.nextMarker
}
$blobs
Please note that you should upgrade your Azure CLI to the latest version; the --show-next-marker parameter may not work in older versions, as per this issue.
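As a side note, if the end goal is only the total size of a container rather than iterating page by page, a plain Bash call with a JMESPath sum may already be enough; this is a sketch with placeholder names, and it only covers the blobs the listing actually returns (so raise --num-results for large containers):
# Total size in bytes of the blobs returned by the listing
az storage blob list --container-name <container name> --account-name <account name> --account-key <account key> --query "sum([].properties.contentLength)" -o tsv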

azure blob storage cli count

I'm testing some scripts and need a way to count how many blobs there are in a container.
I can list the blobs using:
az storage blob list --container-name "xxx" --account-key "xxx" --account-name "xxx" -o table
But how do I get a count?
I tried using --query length but it doesn't work.
If you want to get the number of blobs in one container with the Azure CLI, add --query "length(@)" to your command:
az storage blob list -c "xxx" --account-key "xxx" --account-name "xxx" --query "length(@)" -o table
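Keep in mind that az storage blob list returns at most 5000 blobs per call by default, so the count can be capped on large containers; on reasonably recent CLI versions (this is an assumption, check az storage blob list --help for your version) you can ask for everything:
# Count every blob in the container, not just the first page of results
az storage blob list -c "xxx" --account-key "xxx" --account-name "xxx" --num-results "*" --query "length(@)" -o tsv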

Can I do the same thing as Azure Powershell with Azure CLI? (Upload VHD)

So I have an original VHD file whose size is 90 MB.
Uploading the VHD with the Azure PowerShell cmdlet Add-AzureRmVhd results in the uploaded VHD being 2 GB in size.
Add-AzureRmVhd -LocalFilePath $sourceVHD -Destination $destinationVHD -ResourceGroupName $resourceGroupName -NumberOfUploaderThreads 5
Uploading the VHD with the Azure CLI results in the uploaded VHD being 90 MB in size.
az storage blob upload --account-name tstorage --container-name tcontainer --file /home/azure/images/test.vhd --name test.vhd --type page
I can use the 2 GB VHD to create an image, but I cannot use the 90 MB one.
Is there any way to perform the function of the PowerShell cmdlet with the Azure CLI?
I tried the commands below and they worked for me. Please follow along and see if it works for you using the Azure CLI:
#!/bin/bash
# Create a resource group
az group create -n myResourceGroup -l westus
# Create the storage account to upload the vhd
az storage account create -g myResourceGroup -n mystorageaccount -l westus --sku Premium_LRS
# Get a storage key for the storage account
STORAGE_KEY=$(az storage account keys list -g myResourceGroup -n mystorageaccount --query "[?keyName=='key1'] | [0].value" -o tsv)
# Create the container for the vhd
az storage container create -n vhds --account-name mystorageaccount --account-key ${STORAGE_KEY}
# Upload the vhd to a blob
az storage blob upload -c vhds -f ~/sample.vhd -n sample.vhd --account-name mystorageaccount --account-key ${STORAGE_KEY}
# Create the vm from the vhd
az vm create -g myResourceGroup -n myVM --image "https://mystorageaccount.blob.core.windows.net/vhds/sample.vhd" \
--os-type linux --admin-username deploy --generate-ssh-keys
# Update the deploy user with your ssh key
az vm user update --resource-group myResourceGroup -n myVM -u deploy --ssh-key-value "$(< ~/.ssh/id_rsa.pub)"
# Get public IP address for the VM
IP_ADDRESS=$(az vm list-ip-addresses -g myResourceGroup -n myVM \
--query "[0].virtualMachine.network.publicIpAddresses[0].ipAddress" -o tsv)
echo "You can now connect using 'ssh deploy@${IP_ADDRESS}'"
Hope it helps.

How to get the blob list of storage accounts with Azure CLI?

I'm using the Microsoft Azure CLI and I could not find a way to list the blob objects of a storage account.
I have the list displayed in the Azure web portal but I can't find any way to do it with the az storage command.
I've tried az storage blob list but it requires a container name, which I don't know how to find (with the Azure CLI).
Does someone have an idea?
Update: fetch the account key in the CLI:
Please try the code below, which can list all the blobs in all the containers of your storage account.
Note that there are no whitespaces around "=".
# list storage account names
az storage account list --query "[].{name:name}" --output tsv
# update with your storage account name
storage_account_name="your_storage_account_name"
key="$(az storage account keys list -n ${storage_account_name} --query "[0].{value:value}" --output tsv)"
containers="$(az storage container list --account-name ${storage_account_name} --account-key $key --query "[].{name:name}" --output tsv)"
for c in $containers
do
echo "==== Blobs in Container $c ===="
az storage blob list --container-name $c \
--account-name ${storage_account_name} \
--account-key $key \
--query "[].{name:name}" --output tsv
done
Command to get the list of containers in a storage account on the basis of the storage ACCOUNT_NAME and ACCESS_KEY:
az storage container list --account-key ACCESS_KEY --account-name ACCOUNT_NAME
On the basis of the container names received in the response, you can use the az storage blob list command to get the list of objects within that container, as shown in the sketch below.
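Putting the two steps together, once you have picked a container name from the first command the blob listing looks roughly like this (all three values are placeholders):
az storage blob list --container-name CONTAINER_NAME --account-name ACCOUNT_NAME --account-key ACCESS_KEY --query "[].{name:name}" --output tsv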
Thanks Ivan Yang for your answer!
The suggested edits queue was full for your answer, so I wanted to paste back this edited code:
get_blobs.sh
accounts="$(az storage account list --query "[].{name:name}" --output tsv)"
for storage_account_name in $accounts
do
echo "==== Storage Account ${storage_account_name} ===="
key="$(az storage account keys list -n ${storage_account_name} --query "[0].{value:value}" --output tsv)"
containers="$(az storage container list --account-name ${storage_account_name} --account-key $key --query "[].{name:name}" --output tsv)"
for c in $containers
do
echo "==== Blobs in Container $c ===="
az storage blob list --container-name $c \
--account-name ${storage_account_name} \
--account-key $key \
--query "[].{name:name}" --output tsv
done
done
I ran mine in the Azure Cloud Shell:
Launch Cloud Shell from the top navigation of the Azure portal. Choose Bash.
Save the script with Unix line endings. Drag the file onto the Cloud Shell window to upload it. If you get an error about carriage-return (^M) line endings, run dos2unix get_blobs.sh on the file to fix that.
chmod u+x get_blobs.sh to allow it to execute.
./get_blobs.sh >> output.txt to run and output to text file.
You can then use the Azure CLI Upload/Download Button to retrieve your output.txt file.
If I understand correctly, the solution can be divided into 3 parts:
1. Get the list of storage accounts and copy the "desired" account name from the output:
az storage account list --query "[].{name:name}" --output tsv
Then get "first key" of "desired" account and copy the string (dont copy double quotes :) )
az storage account keys list --account-name WHATEVER_ACCOUNTNAME --query "[0].value"
3. Finally, get the list of containers of your desired account (the catch here is that the "key" should be copied correctly from the output of the second command):
az storage container list --account-name WHATEVER_ACCOUNTNAME --account-key YOUR-VERY-VERY-LONG-KEY --query "[].{name:name}" --output tsv
Note: don't forget to log in to your Azure account first of all :).
az login
Here is a simple script to list all your blob containers. As a bonus, it lists all your file shares too!
# List all the storage accounts under a subscription
for actname in $( az storage account list --query "[].name" --output tsv );
do
# Get the storage key
key1=`az storage account keys list --account-name $actname --query "[0].value" --output tsv 2>/dev/null`
# Exclude the listing when you do not have permission to look into the storage account - like databricks managed storage for example
if [ $? -ne 0 ] ; then
continue
fi
echo -e "\n === Here are the storage containers for your storage account $actname === \n"
# List of containers and file shares for the account
az storage container list --account-name $actname --account-key $key1 --query "[].name" --output tsv
echo -e "\n --- Here are the storage file shares for your storage account $actname ---\n"
az storage share list --account-name $actname --account-key $key1 --query "[].name" --output tsv
done
