I want to get data about my files in Azure Storage using the CLI. I am using the --query parameter, but I want the size of a file based on its creation date. How can I achieve this? I have used the command below:
bytes=$(az storage blob list \
    --container-name mycontainer \
    --query "[*].[properties.contentLength]" \
    --output tsv | paste --serial --delimiters=+ | bc)

# Display total bytes
echo "Total bytes in container: $bytes"
but since I used *, it gives me the size of the whole container. I want a specific file's size, or the size of the files created today.
To get the size of the files created today, you can use the JMESPath function contains, like this:
bytes=$(az storage blob list --container-name terraform-container --account-name charlesstore --query "[?contains(@.properties.creationTime, '2020-10-27')==\`true\`].properties.contentLength" -o tsv)
Change the date as you need.
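If you want today's date computed automatically and the per-blob sizes summed into a single total, here is a minimal sketch that reuses the paste/bc pipeline from the question (same container and account names as above; adjust as needed):

# Sum the sizes of all blobs whose creationTime contains today's date
today=$(date +%F)   # e.g. 2020-10-27
bytes=$(az storage blob list --container-name terraform-container \
    --account-name charlesstore \
    --query "[?contains(@.properties.creationTime, '$today')].properties.contentLength" \
    --output tsv | paste --serial --delimiters=+ | bc)
echo "Total bytes created today: $bytes"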
I am using Terraform modules for a storage account, and once the storage account is created I want to use an output variable for the access key:
output "storage_account_primary_access_key" {
value = data.azurerm_storage_account.storage.primary_access_key
}
Further, in my azure-pipelines.yml I am using an az command as below:
az storage blob upload-batch -s drop -d \$web --account-name "" --account-key ""
How can I use Module's output variable in the YML file?
You can use the terraform output command. For example:
terraform output storage_account_primary_access_key
So you may do something like this (note that $web is single-quoted so the shell does not expand it as a variable):
az storage blob upload-batch -s drop -d '$web' \
    --account-name "" \
    --account-key "$(terraform output -raw storage_account_primary_access_key)"
You could also assign it to a variable that you can use throughout your pipeline, something along these lines:
- script: echo "##vso[task.setvariable variable=ACCESS_KEY]$(terraform output -raw storage_account_primary_access_key)"
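A later step can then consume that variable, for example (the storage account name here is a placeholder):

- script: |
    az storage blob upload-batch -s drop -d '$web' \
      --account-name "mystorageaccount" \
      --account-key "$(ACCESS_KEY)"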
I need to calculate the size of specific containers and folders in ADLS Gen2.
I started with the command az storage fs file list. However, I don't understand how to grab next_marker. It appears in stdout as a warning, but not in the output of the command:
WARNING: Next Marker:
WARNING: VBbYvMrBhcCCqHEYSAAAA=
So how do I get this next_marker?
$files=$(az storage fs file list --file-system <container name> `
    --auth-mode login --account-name <account name> `
    --query "[*].[contentLength]" --num-results 1000 -o tsv)
$files.next_marker is empty.
UPDATE 1: Created issue https://github.com/Azure/azure-cli/issues/16893
If you're using the Azure CLI command az storage fs file list, the next_marker is not returned to the variable $files; it's always printed to the console, so you would have to copy and paste it by hand.
As a workaround, you can use the Azure CLI command az storage blob list (most of the blob storage CLI commands also work with ADLS Gen2). This command has a parameter --show-next-marker, which you can use to return next_marker to a variable.
I wrote an Azure CLI script that works well for ADLS Gen2:
$next_token = ""
$blobs=""
$response = & az storage blob list --container-name your_file_system_in_ADLS_Gen2 --account-name your_ADLS_Gen2_account --account-key your_ADLS_Gen2_key --num-results 5 --show-next-marker | ConvertFrom-Json
$blobs += $response.properties.contentLength
$next_token = $response.nextMarker
while ($next_token -ne $null){
$response = & az storage blob list --container-name your_file_system_in_ADLS_Gen2 --account-name your_ADLS_Gen2_account --account-key your_ADLS_Gen2_key --num-results 5 --marker $next_token --show-next-marker | ConvertFrom-Json
$blobs = $blobs + " " + $response.properties.contentLength
$next_token = $response.nextMarker
}
$blobs
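If you prefer bash, here is a minimal equivalent sketch under the same assumptions (the marker object is appended to the JSON list, which is what the PowerShell above relies on; jq must be installed):

next=""
total=0
while : ; do
    resp=$(az storage blob list --container-name your_file_system_in_ADLS_Gen2 \
        --account-name your_ADLS_Gen2_account --account-key your_ADLS_Gen2_key \
        --num-results 5 --show-next-marker ${next:+--marker "$next"} -o json)
    # Sum contentLength over the blob entries (the marker entry contributes 0)
    total=$((total + $(echo "$resp" | jq '[.[] | .properties.contentLength // 0] | add // 0')))
    # Pull nextMarker from whichever entry carries it; empty when paging is done
    next=$(echo "$resp" | jq -r '.[].nextMarker // empty')
    [ -z "$next" ] && break
done
echo "Total bytes: $total"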
Please note that you should upgrade your Azure CLI to the latest version; the --show-next-marker parameter may not work in older versions, as per this issue.
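Recent CLI builds can report and update themselves (az upgrade was added in Azure CLI 2.11.0):

az version    # show the installed CLI and extension versions
az upgrade    # update to the latest release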
I have a user that is putting a lot of whitespace in their filenames, and this is causing a download script to break.
To get the names of the blobs I use this:
BLOBS=$(az storage blob list --container-name $c \
--account-name $AZURE_STORAGE_ACCOUNT --account-key $AZURE_STORAGE_KEY \
--query "[].{name:name}" --output tsv)
What is happening for a blob like blob with space.pdf is that it gets stored in the variable as [blob\twith\tspace.pdf], where \t is a tab. When I iterate over the names in an effort to download, obviously I can't get at the file.
How can I do this correctly?
You can use the command az storage blob download-batch.
I tested it in the Azure portal; all the blobs, including those whose names contain whitespace, were downloaded.
The command:
c=container_name
AZURE_STORAGE_ACCOUNT=xx
AZURE_STORAGE_KEY=xx

# download the blobs to clouddrive
cd clouddrive
az storage blob download-batch -d . -s $c --account-name $AZURE_STORAGE_ACCOUNT --account-key $AZURE_STORAGE_KEY
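If you do need to iterate over blob names yourself, a sketch that sidesteps the word-splitting problem is to read one name per line instead of expanding an unquoted $BLOBS (same variables as above):

az storage blob list --container-name $c \
    --account-name $AZURE_STORAGE_ACCOUNT --account-key $AZURE_STORAGE_KEY \
    --query "[].name" --output tsv |
while IFS= read -r name; do
    # Quote "$name" everywhere so spaces in blob names survive
    az storage blob download --container-name $c \
        --account-name $AZURE_STORAGE_ACCOUNT --account-key $AZURE_STORAGE_KEY \
        --name "$name" --file "./$name"
done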
I'm using the Microsoft Azure CLI and I could not find a way to list the blob objects of a storage account.
I have the list displayed in the Azure web portal, but I can't find any way to do it with the az storage command.
I've tried az storage blob list, but it requires a container name, which I don't know how to find (with the az CLI).
Does someone have an idea?
Update: fetch the account key in the CLI.
Please try the code below, which can list all the blobs in all the containers of your storage account.
Note that there are no whitespaces around "=" in the assignments.
# list storage account names
az storage account list --query "[].{name:name}" --output tsv
# update with your storage account name
storage_account_name="your_storage_account_name"
key="$(az storage account keys list -n ${storage_account_name} --query "[0].{value:value}" --output tsv)"
containers="$(az storage container list --account-name ${storage_account_name} --account-key $key --query "[].{name:name}" --output tsv)"
for c in $containers
do
echo "==== Blobs in Container $c ===="
az storage blob list --container-name $c \
--account-name ${storage_account_name} \
--account-key $key \
--query "[].{name:name}" --output tsv
done
Command to get the list of containers in a storage account, on the basis of the storage ACCOUNT_NAME and ACCESS_KEY:
az storage container list --account-key ACCESS_KEY --account-name ACCOUNT_NAME
Based on the container names received in its response, you can use the az storage blob list command to get the list of objects within each container, as sketched below.
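For example, a minimal loop combining the two commands (ACCOUNT_NAME and ACCESS_KEY are placeholders for your values):

for c in $(az storage container list --account-key ACCESS_KEY --account-name ACCOUNT_NAME --query "[].name" --output tsv); do
    echo "==== Blobs in container $c ===="
    az storage blob list --container-name $c --account-key ACCESS_KEY --account-name ACCOUNT_NAME --query "[].name" --output tsv
done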
Thanks Ivan Yang for your answer!
The suggested edits queue was full for your answer, so I wanted to paste back this edited code:
get_blobs.sh
accounts="$(az storage account list --query "[].{name:name}" --output tsv)"
for storage_account_name in $accounts
do
echo "==== Storage Account ${storage_account_name} ===="
key="$(az storage account keys list -n ${storage_account_name} --query "[0].{value:value}" --output tsv)"
containers="$(az storage container list --account-name ${storage_account_name} --account-key $key --query "[].{name:name}" --output tsv)"
for c in $containers
do
echo "==== Blobs in Container $c ===="
az storage blob list --container-name $c \
--account-name ${storage_account_name} \
--account-key $key \
--query "[].{name:name}" --output tsv
done
done
I ran mine in the Azure Cloud Shell:
Launch Cloud Shell from the top navigation of the Azure portal. Choose Bash.
Save the script with Unix line endings. Drag the file onto the Cloud Shell window to upload it. If you get an error about carriage-return (^M) line endings, run dos2unix get_blobs.sh on the file to fix that.
Run chmod u+x get_blobs.sh to make it executable.
Run ./get_blobs.sh >> output.txt to execute it and send the output to a text file.
You can then use the Cloud Shell upload/download button to retrieve your output.txt file.
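Alternatively, the Bash Cloud Shell has a download convenience command you can use straight from the prompt:

download output.txt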
If I understand correctly, the solution can be divided into 3 parts:
1. Get the list of storage accounts and copy the "desired" account name from the output:
az storage account list --query "[].{name:name}" --output tsv
2. Then get the "first key" of the desired account and copy the string (don't copy the double quotes :) ):
az storage account keys list --account-name WHATEVER_ACCOUNTNAME --query "[0].value"
3. Finally, get the list of containers of your desired account (the catch here is that the key must be copied correctly from the output of the second command; the sketch below the note chains the steps so nothing needs copying by hand):
az storage container list --account-name WHATEVER_ACCOUNTNAME --account-key YOUR-VERY-VERY-LONG-KEY --query "[].{name:name}" --output tsv
Note: don't forget to log in to your Azure account first of all :)
az login
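To avoid the manual copy-and-paste between the steps, the same three commands can be chained (using --output tsv so the key comes back without quotes; the account name is a placeholder):

ACCOUNT=WHATEVER_ACCOUNTNAME
KEY=$(az storage account keys list --account-name $ACCOUNT --query "[0].value" --output tsv)
az storage container list --account-name $ACCOUNT --account-key $KEY --query "[].{name:name}" --output tsv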
Here is a simple script to list all your blob containers. As a bonus, it lists all your file shares too!
# List all the storage accounts under a subscription
for actname in $( az storage account list --query "[].name" --output tsv );
do
    # Get the storage key
    key1=$(az storage account keys list --account-name $actname --query "[0].value" --output tsv 2>/dev/null)
    # Skip the listing when you do not have permission to look into the storage account - like Databricks-managed storage, for example
    if [ $? -ne 0 ] ; then
        continue
    fi
    echo -e "\n === Here are the storage containers for your storage account $actname === \n"
    # List the containers for the account
    az storage container list --account-name $actname --account-key $key1 --query "[].name" --output tsv
    echo -e "\n --- Here are the storage file shares for your storage account $actname ---\n"
    # List the file shares for the account
    az storage share list --account-name $actname --account-key $key1 --query "[].name" --output tsv
done
I'm trying to upload a sample file to Azure from my Ubuntu machine using AzCopy for Linux, but I keep getting the error below no matter what permissions/ownership I change to.
$ azcopy --source ../my_pub --destination https://account-name.blob.core.windows.net/mycontainer --dest-key account-key
Incomplete operation with same command line detected at the journal directory "/home/jmis/Microsoft/Azure/AzCopy", do you want to resume the operation? Choose Yes to resume, choose No to overwrite the journal to start a new operation. (Yes/No) Yes
[2017/11/18 22:06:24][ERROR] Error parsing source location "../my_pub": Failed to enumerate directory /home/jmis/my_pub/ with file pattern *. Cannot find the path '/home/jmis/my_pub/'.
I have dug around the internet for solutions; without any luck, I eventually ended up asking a question here.
Although AzCopy was having issues on Linux, I'm able to do the above operation seamlessly with the Azure CLI. The code below, listed in the Azure docs, helped me do it:
#!/bin/bash
# A simple Azure Storage example script
export AZURE_STORAGE_ACCOUNT=<storage_account_name>
export AZURE_STORAGE_ACCESS_KEY=<storage_account_key>
export container_name=<container_name>
export blob_name=<blob_name>
export file_to_upload=<file_to_upload>
export destination_file=<destination_file>
echo "Creating the container..."
az storage container create --name $container_name
echo "Uploading the file..."
az storage blob upload --container-name $container_name --file $file_to_upload --name $blob_name
echo "Listing the blobs..."
az storage blob list --container-name $container_name --output table
echo "Downloading the file..."
az storage blob download --container-name $container_name --name $blob_name --file $destination_file --output table
echo "Done"
Going forward I will be using the cool Azure CLI, which is Linux-compliant and simple too.
We can use this script to upload a single file with AzCopy (Linux):
azcopy \
--source /mnt/myfiles \
--destination https://myaccount.file.core.windows.net/myfileshare/ \
--dest-key <key> \
--include abc.txt
Use --include to specify which file you want to upload. Here is an example, please check it:
root@jasonubuntu:/jason# pwd
/jason
root@jasonubuntu:/jason# ls
test1
root@jasonubuntu:/jason# azcopy --source /jason/ --destination https://jasondisk3.blob.core.windows.net/jasonvm/ --dest-key m+kQwLuQZiI3LMoMTyAI8K40gkOD+ZaT9HUL3AgVr2KpOUdqTD/AG2j+TPHBpttq5hXRmTaQ== --recursive --include test1
Finished 1 of total 1 file(s).
[2017/11/20 07:45:57] Transfer summary:
-----------------
Total files transferred: 1
Transfer successfully: 1
Transfer skipped: 0
Transfer failed: 0
Elapsed time: 00.00:00:02
root@jasonubuntu:/jason#
For more information about AzCopy on Linux, please refer to this link.