I'm trying to build a CI/CD pipeline. I want to copy my new static React build bundle to Azure Blob Storage. As part of this, the old content needs to be deleted. The only action I seem able to take, however, is copy. Is it possible to do this?
It is not possible to delete/clean up blob content using the Azure File Copy task.
You can use an Azure CLI task to invoke az commands to clean up your container before running the Azure File Copy task.
az storage blob delete-batch --account-name <storage_account_name> --source <container_name>
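For example, the inline script of that Azure CLI task might look roughly like this; the $STORAGE_KEY variable and the --pattern filter are placeholders for whatever your pipeline injects, and the key could equally come from a secret pipeline variable:
# Clean out the previous build before the Azure File Copy task runs.
# <storage_account_name>, <container_name> and $STORAGE_KEY are placeholders.
az storage blob delete-batch \
  --account-name <storage_account_name> \
  --account-key "$STORAGE_KEY" \
  --source <container_name> \
  --pattern "static/*"   # optional: only delete blobs under the old bundle's folder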
I want to programmatically automate enabling the static website feature on the blob service. Which property should be set? I am unable to find anything similar to Microsoft.Storage/storageAccounts/blobService/staticWebsite.
Can anyone help me with this?
Indeed, you can do so with some Azure CLI commands.
# To query the current status of the property
az storage blob service-properties show --account-name <your-storage-account> --query 'staticWebsite.enabled'
To toggle the staticWebsite.enabled property, you can use the az storage blob service-properties update command as follows:
az storage blob service-properties update --account-name <your-storage-account> --static-website
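If you also want to set the index and error documents in the same call, it might look like the sketch below; the document names are placeholders, and passing --static-website false should likewise turn the feature off again:
# Enable static website hosting with index and error documents (names are placeholders)
az storage blob service-properties update --account-name <your-storage-account> --static-website --index-document index.html --404-document 404.html
# To turn it back off:
az storage blob service-properties update --account-name <your-storage-account> --static-website false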
The Azure PowerShell equivalents for the above would be the Enable-AzStorageStaticWebsite and Disable-AzStorageStaticWebsite cmdlets.
I'm new to this and am trying to do something which I think is relatively simple.
I download a file from a URL to my Azure VM using wget (it's a large file and I don't want to store it locally). I now want to copy this file to an existing container in blob storage. This is completely defeating me.
It's a single-line command in the AWS universe:
aws s3 sync <file_name> s3://<bucket name>
Is there an equivalent in Azure?
There are a bunch of ways you can accomplish this, and you don't even have to download this large file to your local computer first and then upload it to blob storage.
For example, you can use the az storage blob copy command, which is part of the Azure CLI, to do so. Here's a sample command:
az storage blob copy start --account-key <your-azure-storage-account-key> --account-name <your-azure-storage-account-name> --destination-blob <name-of-the-blob> --destination-container <name-of-the-container> --source-uri <uri-of-the-file>
You can also accomplish the same using azcopy utility or Azure PowerShell Storage Cmdlets. The Cmdlet you would want to use is Start-AzStorageBlobCopy.
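For completeness, if the file already sits on the VM after the wget, a direct upload is another route; a rough sketch follows, where the local path, container name and SAS token are placeholders:
# Upload with azcopy using a container SAS
azcopy copy "/home/azureuser/largefile.zip" "https://<your-storage-account>.blob.core.windows.net/<container>/largefile.zip?<sas-token>"
# Or upload with the Azure CLI
az storage blob upload --account-name <your-storage-account> --container-name <container> --name largefile.zip --file /home/azureuser/largefile.zip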
We use the Azure CLI in a script to delete a storage account container's contents and upload the contents of a directory to the container:
az storage blob delete-batch --account-name $ACCOUNT_NAME --source $web
az storage blob upload-batch --account-name $ACCOUNT_NAME -s $SOURCE_PATH -d $web
Now, we want to host the script on Azure. From what we have found, we cannot directly host an Azure CLI script on Azure and must migrate the script to PowerShell. But we cannot find equivalent PowerShell commands. Could someone help?
The current version of Azure PowerShell at the time of writing this answer is 3.1 (https://learn.microsoft.com/en-us/powershell/module/az.storage/?view=azps-3.1.0#storage), and unfortunately batch operations on blobs are not supported there.
Thus, as of version 3.1, it is not possible to perform batch operations on blobs using Azure PowerShell cmdlets.
One option is to use the Azure Storage Blobs Batch client library for .NET. You can write a PowerShell script that makes use of this library to perform batch operations on blobs. You can find more information and code samples here: https://github.com/Azure/azure-sdk-for-net/blob/master/sdk/storage/Azure.Storage.Blobs.Batch/README.md.
The other option would be to consume the REST API. Again, you can write a PowerShell script that makes the HTTP requests to perform batch operations against your storage account. You can find more information about the REST API here: https://learn.microsoft.com/en-us/rest/api/storageservices/blob-batch.
P.S: I would've included the code but unfortunately my knowledge in PowerShell is very limited :).
I am trying to copy all files from one container to another. I am using AzCopy to accomplish this task.
The AzCopy command is as follows:
azcopy copy "https://xxxxxxx.blob.core.windows.net/customers" "https://xxxxxxx.blob.core.windows.net/archive" --recursive
Error:
Alternatively, is it possible to move files between containers?
Please follow this doc to grant your user account the RBAC role Storage Blob Data Contributor on your storage account or your containers.
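For reference, the role assignment could also be done from the CLI; a rough sketch, where the assignee and scope are placeholders:
az role assignment create \
  --role "Storage Blob Data Contributor" \
  --assignee <user-or-service-principal-id> \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"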
Besides, there isn't a "move" operation for Azure Blob Storage; you need to delete the original container after copying it.
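So a "move" has to be emulated as copy-then-delete; assuming the role above is in place and you have authenticated (for example via azcopy login), a rough sketch would be:
# Copy everything from customers to archive, then delete the source blobs
azcopy copy "https://xxxxxxx.blob.core.windows.net/customers" "https://xxxxxxx.blob.core.windows.net/archive" --recursive
azcopy remove "https://xxxxxxx.blob.core.windows.net/customers" --recursive   # deletes the blobs under customers, not the container itself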
I found the Windows Azure Storage Plugin for Jenkins. But it seems that it works only with blob storage.
Is there a way to upload files to Azure Data Lake Storage Gen1 from Jenkins ?
Thanks for your help !
AFAIK there is currently no Jenkins plugin to upload files to Azure Data Lake Storage Gen1.
You could use Azure PowerShell or the Azure CLI to accomplish this.
Sample Azure PowerShell command:
Import-AzDataLakeStoreItem -AccountName $dataLakeStorageGen1Name -Path "C:\sampledata\vehicle1_09142014.csv" -Destination $myrootdir\mynewdirectory\vehicle1_09142014.csv
Sample Azure CLI command:
az dls fs upload --account mydatalakestoragegen1 --source-path "C:\SampleData\AmbulanceData\vehicle1_09142014.csv" --destination-path "/mynewfolder/vehicle1_09142014.csv"
For more information, refer below articles:
https://learn.microsoft.com/en-us/azure/data-lake-store/data-lake-store-get-started-powershell#upload-data-to-your-data-lake-storage-gen1-account
https://learn.microsoft.com/en-us/azure/data-lake-store/data-lake-store-get-started-cli-2.0
https://learn.microsoft.com/en-us/powershell/module/az.datalakestore/import-azdatalakestoreitem?view=azps-1.5.0
https://learn.microsoft.com/en-us/cli/azure/dls/fs?view=azure-cli-latest#az-dls-fs-upload
A few prerequisites to make this work without any issues:
Have Azure PowerShell / Azure CLI installed on the node where you run the command(s).
Add an Azure service principal to the Jenkins credentials as instructed here -> https://learn.microsoft.com/en-us/azure/jenkins/execute-cli-jenkins-pipeline#add-azure-service-principal-to-jenkins-credential (a rough sketch of the resulting shell step follows this list).
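Putting it together, the shell step Jenkins runs might look roughly like this; the environment variable names are just placeholders for however your credentials binding exposes the service principal:
# Log in with the service principal from the Jenkins credential, upload, then log out.
az login --service-principal -u "$AZURE_CLIENT_ID" -p "$AZURE_CLIENT_SECRET" --tenant "$AZURE_TENANT_ID"
az dls fs upload --account mydatalakestoragegen1 --source-path "./vehicle1_09142014.csv" --destination-path "/mynewfolder/vehicle1_09142014.csv"
az logout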
Hope this helps!! Cheers!!