How to implement batch operations with Azure Storage

We use the Azure CLI in a script to delete a storage account container's contents and upload the contents of a directory to the container:
az storage blob delete-batch --account-name $ACCOUNT_NAME --source $web
az storage blob upload-batch --account-name $ACCOUNT_NAME -s $SOURCE_PATH -d $web
Now we want to host the script on Azure. According to our research, we cannot directly host an Azure CLI script on Azure and must migrate the script to PowerShell. But we cannot find equivalent PowerShell commands. Could someone help?

The current version of Azure PowerShell at the time of writing this answer is 3.1 (https://learn.microsoft.com/en-us/powershell/module/az.storage/?view=azps-3.1.0#storage), and unfortunately batch operations on blobs are not supported there.
Thus, as of version 3.1, performing batch operations on blobs with the Azure PowerShell cmdlets is not possible.
One option for you is to use the Azure Storage Blobs Batch client library for .NET. You can write a PowerShell script that makes use of this library to perform batch operations on blobs. You can find more information and code samples here: https://github.com/Azure/azure-sdk-for-net/blob/master/sdk/storage/Azure.Storage.Blobs.Batch/README.md.
Another option would be to consume the REST API. Again, you can write a PowerShell script that makes the HTTP requests to perform batch operations against your storage account. You can find more information about the REST API here: https://learn.microsoft.com/en-us/rest/api/storageservices/blob-batch.
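To make the REST option more concrete, here is a rough bash sketch that only builds the multipart/mixed body for a batch delete (the boundary name, container, and blob names are placeholders). Actually sending it requires a valid Authorization header, shown only as a comment:

```shell
#!/usr/bin/env bash
# Sketch: compose the multipart body for a Blob Batch request that deletes
# several blobs in one round trip. Each sub-request is a small HTTP request
# embedded in the body, separated by the chosen boundary.
set -euo pipefail

build_batch_delete_body() {
  local boundary="$1"; shift
  local container="$1"; shift
  local id=0
  for blob in "$@"; do
    printf -- '--%s\r\n' "$boundary"
    printf 'Content-Type: application/http\r\n'
    printf 'Content-Transfer-Encoding: binary\r\n'
    printf 'Content-ID: %d\r\n\r\n' "$id"
    printf 'DELETE /%s/%s HTTP/1.1\r\n\r\n' "$container" "$blob"
    id=$((id + 1))
  done
  printf -- '--%s--\r\n' "$boundary"
}

# Example (placeholders): body=$(build_batch_delete_body batch_b1 mycontainer a.txt b.txt)
# The body would then be POSTed to
#   https://<account>.blob.core.windows.net/?comp=batch
# with "Content-Type: multipart/mixed; boundary=batch_b1" and a valid
# Authorization header (e.g. an OAuth bearer token).
```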
P.S.: I would've included the code, but unfortunately my knowledge of PowerShell is very limited :).

Related

Upgrade Access Tier via Azure CLI or Python Code for Storage Account Gen2

We want to update the access tier for multiple paths in ADLS Gen2, and we want to use the Azure CLI or Python code as per our requirements.
In the Microsoft documentation, we only see portal and PowerShell instructions for doing this.
Can anyone let us know whether we can do it with the CLI or Python?
Are you looking for this command?
az storage account update -g <resource-group> -n <storage-account> --set kind=StorageV2 --access-tier=<Hot/Cool>
I was able to update the access tier of Azure Data Lake Storage Gen2 using this command.
I'm not sure whether you want to change the account access tier or the blob access tier.
If you want to change the blob access tier, you can try this command:
az storage blob set-tier --account-key 00000000 --account-name MyAccount --container-name MyContainer --name MyBlob --tier Hot
The --tier value can be Archive, Cool, or Hot.
I tested this command and it works.
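Since az storage blob set-tier operates on one blob at a time, updating the tier for multiple paths means looping over them. A bash sketch (account name, key, container, and blob names are all placeholders):

```shell
#!/usr/bin/env bash
# Sketch: apply the same access tier to a list of blobs, one set-tier
# call per blob, since the CLI has no batch variant of this command.
set -euo pipefail

set_tier_for_blobs() {
  local account="$1" key="$2" container="$3" tier="$4"
  shift 4
  for blob in "$@"; do
    az storage blob set-tier \
      --account-name "$account" --account-key "$key" \
      --container-name "$container" --name "$blob" --tier "$tier"
  done
}

# Example (placeholders):
# set_tier_for_blobs MyAccount 00000000 MyContainer Cool path1/file1.csv path2/file2.csv
```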

Delete content of Azure pipeline blob storage via Azure release

I’m trying to build a CI/CD pipeline. I want to copy my new static React build bundle to Azure Blob Storage. As part of this, the old content needs to be deleted. The only action I seem able to take, however, is copy. Is it possible to do this?
It's not possible to delete/clean up blob content using Azure File Copy.
You can use an Azure CLI task to invoke the az commands to clean up your container before running the Azure File Copy task:
az storage blob delete-batch --account-name <storage_account_name> --source <container_name>
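As a sketch, the cleanup step in an Azure Pipelines YAML definition might look like this (the service connection name is an assumption, and the angle-bracket placeholders must be filled in):

```yaml
# Hypothetical pipeline step: clean the container before the copy task runs.
- task: AzureCLI@2
  inputs:
    azureSubscription: 'my-service-connection'   # assumed service connection name
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az storage blob delete-batch \
        --account-name <storage_account_name> \
        --source <container_name>
```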

Copy file from Azure VM to Azure Blob Storage

I'm new to this and am trying to do something I think is relatively simple.
I download a file from a URL to my Azure VM using wget (it's a large file and I don't want to store it on my local machine). I now want to copy this file to an existing container in Blob Storage. This is completely defeating me.
It's a single-line command in the AWS universe:
aws s3 sync <file_name> s3://<bucket name>
Is there an equivalent in Azure?
There are a number of ways you can accomplish this, and you don't even have to download this large file to your local computer first and then upload it to blob storage.
For example, you can use the az storage blob copy command, which is part of the Azure CLI, to do so. Here's a sample command:
az storage blob copy start --account-key <your-azure-storage-account-key> --account-name <your-azure-storage-account-name> --destination-blob <name-of-the-blob> --destination-container <name-of-the-container> --source-uri <uri-of-the-file>
You can also accomplish the same using the azcopy utility or the Azure PowerShell storage cmdlets. The cmdlet you would want to use is Start-AzStorageBlobCopy.
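One possible shape for this in bash (account name, key, container, blob, and URL values are all placeholders): the copy runs server-side, so after starting it you can check its status on the destination blob.

```shell
#!/usr/bin/env bash
# Sketch: start a server-side copy from a URL into a container, then
# report the copy status (e.g. "pending" or "success").
set -euo pipefail

copy_url_to_blob() {
  local account="$1" key="$2" container="$3" blob="$4" url="$5"
  # Kick off the server-side copy; no data flows through this machine.
  az storage blob copy start \
    --account-name "$account" --account-key "$key" \
    --destination-container "$container" --destination-blob "$blob" \
    --source-uri "$url"
  # Check the copy status recorded on the destination blob's properties.
  az storage blob show \
    --account-name "$account" --account-key "$key" \
    --container-name "$container" --name "$blob" \
    --query "properties.copy.status" --output tsv
}

# Example (placeholders):
# copy_url_to_blob myaccount mykey mycontainer bigfile.bin "https://example.com/bigfile.bin"
```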

How to use Jenkins to upload to Azure Data Lake Storage Gen1?

I found the Windows Azure Storage Plugin for Jenkins. But it seems that it works only with blob storage.
Is there a way to upload files to Azure Data Lake Storage Gen1 from Jenkins?
Thanks for your help!
AFAIK there is currently no Jenkins plugin to upload files to Azure Data Lake Storage Gen1 from Jenkins.
You could use Azure PowerShell or Azure CLI to accomplish your requirement.
Sample Azure PowerShell command:
Import-AzDataLakeStoreItem -AccountName $dataLakeStorageGen1Name -Path "C:\sampledata\vehicle1_09142014.csv" -Destination $myrootdir\mynewdirectory\vehicle1_09142014.csv
Sample Azure CLI command:
az dls fs upload --account mydatalakestoragegen1 --source-path "C:\SampleData\AmbulanceData\vehicle1_09142014.csv" --destination-path "/mynewfolder/vehicle1_09142014.csv"
For more information, refer to the articles below:
https://learn.microsoft.com/en-us/azure/data-lake-store/data-lake-store-get-started-powershell#upload-data-to-your-data-lake-storage-gen1-account
https://learn.microsoft.com/en-us/azure/data-lake-store/data-lake-store-get-started-cli-2.0
https://learn.microsoft.com/en-us/powershell/module/az.datalakestore/import-azdatalakestoreitem?view=azps-1.5.0
https://learn.microsoft.com/en-us/cli/azure/dls/fs?view=azure-cli-latest#az-dls-fs-upload
A few prerequisites to make this work without any issues:
Have Azure PowerShell / the Azure CLI installed on the node where you run the command(s).
Add an Azure service principal to the Jenkins credentials as instructed here -> https://learn.microsoft.com/en-us/azure/jenkins/execute-cli-jenkins-pipeline#add-azure-service-principal-to-jenkins-credential
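Putting the prerequisites together, a Jenkins "Execute shell" build step might look like the sketch below. The environment variable names for the service principal credentials are assumptions (they depend on how you bind the Jenkins credential), and the account/paths mirror the CLI sample above:

```shell
#!/usr/bin/env bash
# Sketch of a Jenkins build step: log in with a service principal
# (credentials injected by Jenkins as environment variables; the variable
# names here are assumptions), then upload a file to ADLS Gen1.
set -euo pipefail

upload_to_adls_gen1() {
  local src="$1" dest="$2"
  az login --service-principal \
    --username "$AZURE_CLIENT_ID" --password "$AZURE_CLIENT_SECRET" \
    --tenant "$AZURE_TENANT_ID" >/dev/null
  az dls fs upload --account mydatalakestoragegen1 \
    --source-path "$src" --destination-path "$dest"
}

# Example (placeholders):
# upload_to_adls_gen1 "data/vehicle1_09142014.csv" "/mynewfolder/vehicle1_09142014.csv"
```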
Hope this helps!! Cheers!!

Install Powershell for Azure without an Azure Account

I want to use Get-AzureStorageBlob in a PowerShell script so my client can download files from my Azure Blob Storage (I use DevOps to put them there).
The keys are in the script, so the client does not need an Azure account.
He does, however, need to install Azure PowerShell, and the instructions ask him to log in to Azure.
Is there an alternative?
If you only operate on Azure Storage, then you can ignore the Connect-AzAccount cmdlet.
After installing the Azure PowerShell module, since you have the storage account's name and key, you can use them directly to download the blob.
But if you want to operate on other resources, such as VMs, then you need the Connect-AzAccount cmdlet.
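A minimal PowerShell sketch of that approach (account name, key, container, blob, and local path are all placeholders). Note the question mentions Get-AzureStorageBlob from the older Azure/AzureRM module; with the current Az module the equivalents are New-AzStorageContext and Get-AzStorageBlobContent:

```powershell
# Build a storage context from the account name and key -- no Connect-AzAccount needed.
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"

# Download a single blob to a local path using that context.
Get-AzStorageBlobContent -Context $ctx -Container "mycontainer" -Blob "file.zip" -Destination "C:\Downloads\file.zip"
```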
When I click Show Details in the Commands panel on the right, I get an error message:
"... cannot be loaded because running scripts is disabled on this system."
