I'm trying to copy from azure blob with this command:
az storage blob download-batch --destination / --source my-continer --pattern "my/pattern/here/*"
to the / path, yet it copies the entire blob path.
So in / I end up with the my/pattern/here structure instead of just the contents of the here folder.
Is there a way to override this behavior and make it copy just the path that matches the pattern that I entered?
It seems it's impossible to download the blobs without the my/pattern/here path through the CLI command az storage blob download-batch. See the screenshot of the blob:
Its name shows with the path, which means you download a blob named my/pattern/name to the current local directory. On Linux, that name means the directories are created for every / segment in it.
So I suggest using the CLI command az storage blob download in a loop to download the files, as in the sketch below.
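A minimal bash sketch of that loop (a sketch only: the prefix, container and destination are taken from the question, and the authentication flags are omitted; add --account-name/--account-key or set AZURE_STORAGE_CONNECTION_STRING):
prefix="my/pattern/here/"
container="my-continer"
dest="/"
# list every blob under the prefix, then download each one with the
# prefix stripped from the local path
az storage blob list --container-name "$container" --prefix "$prefix" --query "[].name" -o tsv |
while read -r blob; do
    target="$dest${blob#"$prefix"}"
    mkdir -p "$(dirname "$target")"   # create any remaining subfolders locally
    az storage blob download --container-name "$container" --name "$blob" --file "$target"
done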
Currently, I'm using az storage blob directory download but every time I run it, I get: This command is implicitly deprecated because command group 'storage blob directory' is deprecated and will be removed in a future release. Use 'az storage fs directory' instead.
I checked the docs and I can't seem to find what the value for --file-system should be. Can someone share an example of downloading the contents of a directory (a folder) inside a container in Azure Blob Storage to a Windows machine?
--file-system is your container name
az storage fs directory download -f myfilesystem --account-name mystorageaccount -s SourceDirectoryPath -d "<local-path>" --recursive
The above will download the entire directory.
Follow this command:
az storage fs directory download -f <container name> --account-name <storage account name> -s <the source dir you want to download> -d "./" --recursive --account-key <The access key of the storage account>
Before running the above command, please make sure your storage account has hierarchical namespace enabled; otherwise the file structure of the storage account will be flat on your side.
It works on my side:
This is the structure on my side:
I can download the specific directory to the current directory:
You can refer to this official document:
Examples of az storage fs directory
We somehow managed to create a file in an Azure File Share whose name ends in a . (dot) (the file name ends in . not the share name :) ).
We now cannot retrieve, remove, edit that file. Whenever we try to perform any action we get:
Extension: Microsoft_Azure_FileStorage
Content: FilePropertiesBladev2
Error code: 404
Is there any way we can remove this file using PowerShell, Azure CLI, etc.?
Thanks
I tried in my environment and got the results below:
Status Code: 404 - error
The above error usually indicates that the file share, or the file in the file share, was not found in the storage account.
According to this MS docs article, check whether the file is locked for editing by another user, and also check whether the file is open in another program.
Is there any way we can remove this file using PowerShell, Azure CLI, etc.?
I tried with the Azure CLI in PowerShell and removed the file ending in a dot successfully.
Initially I have some files in the file share.
Portal:
Commands:
$sharename = "fileshare1"
$foldername = "directory1"
$accountname = "storage326123"
$accountkey = "<storage account key>"
# delete the file "1..json" inside directory1 of the file share
az storage file delete --account-name $accountname --account-key $accountkey --share-name $sharename --path "$foldername/1..json"
Response:
Portal:
In the Azure CLI there is the command
az storage blob upload-batch --source <source folder> --destination-path <destination path> --destination <destination>
Is the same API available in the JavaScript SDK @azure/storage-blob or in another package?
Thank you!
AFAIK, this feature is not available in any SDK.
If you are using @azure/storage-blob, you would need to list the files in a folder yourself (using the fs module) and then upload each file using the uploadFile(filePath, BlockBlobParallelUploadOptions) method, which is essentially what az storage blob upload-batch does.
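A minimal sketch of that approach in TypeScript, assuming the @azure/storage-blob package and a connection string in an environment variable; the container name, local folder and destination path are placeholders:
import { BlobServiceClient } from "@azure/storage-blob";
import * as fs from "fs";
import * as path from "path";

// Placeholder credentials and names -- adjust for your own account/container.
const blobServiceClient = BlobServiceClient.fromConnectionString(
  process.env.AZURE_STORAGE_CONNECTION_STRING!
);
const containerClient = blobServiceClient.getContainerClient("<destination>");

// Walk a local folder recursively and upload each file, mirroring the
// folder structure under <destination path> in the container.
async function uploadFolder(localFolder: string, destinationPath: string): Promise<void> {
  for (const entry of fs.readdirSync(localFolder, { withFileTypes: true })) {
    const localPath = path.join(localFolder, entry.name);
    const blobName = path.posix.join(destinationPath, entry.name);
    if (entry.isDirectory()) {
      await uploadFolder(localPath, blobName);        // recurse into subfolders
    } else {
      const blockBlobClient = containerClient.getBlockBlobClient(blobName);
      await blockBlobClient.uploadFile(localPath);    // parallel upload of one local file
    }
  }
}

uploadFolder("<source folder>", "<destination path>").catch(console.error);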
I have a custom vhd which I have in my Azure VM. It's a linux machine. I need to use azure cli to create an Image from this vhd file. This image will then be used to create VMs which have a username and Password. I have successfully used the vhd to create an Image using the Portal and created a VM out of it. However, I am unable to create an image using cli commands. This is what I get when I run the command -
jenkins#Jenkins-vm:~/testFolder$ az image create -g myRG -n myImage --os-type Linux --source ./myCustom.vhd
usage: az image create [-h] [--verbose] [--debug] [--only-show-errors]
[--output {json,jsonc,yaml,yamlc,table,tsv,none}]
[--query JMESPATH] [--subscription _SUBSCRIPTION]
--resource-group RESOURCE_GROUP_NAME --name NAME
--source SOURCE [--os-type {Windows,Linux}]
[--data-disk-sources DATA_DISK_SOURCES [DATA_DISK_SOURCES ...]]
[--location LOCATION]
[--storage-sku {Standard_LRS,Premium_LRS,StandardSSD_LRS,UltraSSD_LRS}]
[--hyper-v-generation {V1,V2}]
[--os-disk-caching {None,ReadOnly,ReadWrite}]
[--data-disk-caching {None,ReadOnly,ReadWrite}]
[--tags [TAGS [TAGS ...]]]
[--zone-resilient [{true,false}]]
az image create: error: 'NoneType' object has no attribute 'os_disk'
I don't understand the meaning of the error displayed.
I'm very new to Azure so feeling a little lost.
You can see the description of the CLI command az image create:
Create a custom Virtual Machine Image from managed disks or snapshots.
And the description of the parameter --source:
OS disk source from the same region, including a virtual machine ID or
name, OS disk blob URI, managed OS disk ID or name, or OS snapshot ID
or name.
It means the CLI command can only create VM images from sources that already exist in Azure; whether it is an OS disk blob URI, a managed disk, or a snapshot, all of them must already be in Azure. So you need to upload the VHD file to Azure Blob Storage first, and then use the VHD's blob URI to create the VM image via the CLI command. One thing to watch out for is that the VHD file should be the OS disk. A rough sketch of the two steps is below.
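For example (a sketch only: the storage account name and the vhds container are placeholders, and the authentication flags for az storage blob upload are omitted; add --account-key or --auth-mode login as appropriate):
# upload the local VHD as a page blob (VHDs must be page blobs)
az storage blob upload --account-name mystorageaccount --container-name vhds --name myCustom.vhd --file ./myCustom.vhd --type page
# create the image from the blob URI of the uploaded VHD
az image create -g myRG -n myImage --os-type Linux --source https://mystorageaccount.blob.core.windows.net/vhds/myCustom.vhd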
Is there a way of downloading a container and all of its content from Azure Blob Storage?
I use CloudBerry Explorer for Azure Blob Storage to manage my containers, files and folders.
I have a container with over 100 GB of data which I would like to download, but cannot find a way of downloading the container, only individual files.
If you want, you can use the AzCopy tool to download an entire blob container. Assuming you have the latest version of the Azure SDK installed, you can find this tool in the C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy folder.
You can try the following command:
AzCopy /Source:"https://[accountname].blob.core.windows.net/[containername]/" /Dest:"[folder path, e.g. D:\temp]" /SourceKey:"[account key]" /S
Replace [accountname], [containername], [folder path], [account key] with the appropriate values.
To download the container (all files) from Azure Blob Storage:
az login
az account set --subscription <Sub ID>
az storage blob download-batch --account-name <storageaccountname> --source <containername> --destination <C:\Users\Downloads\***>
To delete all files at once:
az storage blob delete-batch --account-name <storageaccountname> --source <containername>