Is there an Azure CLI equivalent for these AWS CLI commands?

I'm used to AWS and the AWS CLI (aws-shell), so if I want to upload the contents of a folder called images to a certain bucket, I would type this in the aws-shell:
s3 cp ./images/ s3://mybucket.com/images/ --recursive --exclude "*" --include "*.jpg" --acl public-read --storage-class STANDARD --content-type "image/jpeg"
I have to migrate to Azure blobs and I don't know how to do that. Moreover, what would be the equivalent of the aws-shell to do that? azure-cli?
Thanks!

To upload a folder in Azure Blob Storage, the AZ CLI command you would want to use is az storage blob upload-batch.
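For example, a rough equivalent of the aws-shell command above might look like the following sketch (the storage account and container names are placeholders; note that in Azure, public read access is set on the container, e.g. with az storage container set-permission, rather than per blob):
az storage blob upload-batch \
    --account-name mystorageaccount \
    --destination mycontainer \
    --destination-path images \
    --source ./images \
    --pattern "*.jpg" \
    --content-type "image/jpeg"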
Another option would be to use the azcopy tool, which is designed specifically for performing various blob operations. You can find examples of uploading files and folders with azcopy here: https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-blobs-upload?toc=%2Fazure%2Fstorage%2Fblobs%2Ftoc.json
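With azcopy (v10), a comparable upload might look like this sketch (the account name, container name, and SAS token are placeholders):
azcopy copy "./images" "https://<account>.blob.core.windows.net/<container>?<SAS>" --recursive --include-pattern "*.jpg"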
More over, what would be the equivalent to aws-shell to do that?
azure-cli?
That's correct. Another option would be to use the Azure PowerShell cmdlets.

Related

Azure container copy only changes

I would like to update static website assets from GitHub repos. The documentation suggests using an action based on
az storage blob upload-batch --account-name <STORAGE_ACCOUNT_NAME> -d '$web' -s .
If I see this correctly, this copies all files regardless of changes, even if only one file was altered. Is it possible to transfer only files that have been changed, like rsync does?
Otherwise I would try to determine the changed files from the git history and transfer only those. Please also answer if you know of an existing solution in this direction.
You can use azcopy sync to achieve that. That is a different tool, though.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-blobs-synchronize?toc=/azure/storage/blobs/toc.json
https://learn.microsoft.com/en-us/azure/storage/common/storage-ref-azcopy-sync
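For the static-website case above, a minimal azcopy sync sketch might look like this (the account name and SAS token are placeholders); sync only transfers files whose size or last-modified time differs at the destination:
azcopy sync . 'https://<account>.blob.core.windows.net/$web?<SAS>'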
Based on the suggestion by #4c74356b41, I discovered that the mentioned tool was recently integrated into the az tool.
It can be used the same way as az storage blob upload-batch. The base command is:
az storage blob sync
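A sketch of the synced variant of the upload-batch command from the question (same placeholders as above):
az storage blob sync \
    --account-name <STORAGE_ACCOUNT_NAME> \
    --container '$web' \
    --source .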

Fetch Azure storage dacpac file in ARM Template

Is there a way to get data from Azure Storage (such as a dacpac or zip file) and put it in the drop folder of a CI/CD pipeline?
Hm, for saving files to Azure Storage there is an Azure File Copy task. So you probably have to either use PowerShell (like the Set-AzStorageBlobContent cmdlet) or use the azcopy CLI (you might have to find an image that contains the binary).
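For the download direction, a minimal azcopy (v10) sketch; the blob name, SAS token, and target folder here are placeholders, not anything from an existing pipeline:
azcopy copy "https://<account>.blob.core.windows.net/<container>/package.dacpac?<SAS>" "./drop/package.dacpac"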

upload files from URL (FTP server) to Azure storage account

I am trying to download and save large weather forecast model output into an Azure storage account. The data is available from the NOAA/NCEP website: ftp://ftp.ncep.noaa.gov/pub/data/nccf/com/hrrr/prod/hrrr.20200220/conus/
Based on the documentation I have read, I could potentially use AzCopy, the Azure CLI, or the Python SDK. I started with the Azure CLI and tried to do it with
az storage blob upload \
    --container-name "hrrr" \
    --file "ftp://ftp.ncep.noaa.gov/pub/data/nccf/com/hrrr/prod/hrrr.20200220/conus/hrrr.t00z.wrfsfcf36.grib2" \
    --name "hrrr.t00z.wrfsfcf36.grib" \
    --account-name "MyStorageAccountName" \
    --account-key "AccountKey"
This does not work, and I could not find other documentation close to what I am trying to do. Any solutions? Ultimately, I am hoping to have a script that runs automatically and fetches data from NCEP/NOAA every hour to download the newest forecast into my Azure storage account.
You can use a fairly simple Logic App to do this. Make it a 'Recurrence' trigger set to the schedule you want.
Actions in Logic App:
FTP - List files in folder
For each file - 'Get file content' then 'Create blob' in storage account.
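If you would rather script it instead: az storage blob upload expects a local file path, which is likely why the FTP URL in the question fails. A minimal sketch is to download the file first and then upload it (the URL is from the question; the account name and key are placeholders):
# Download locally first; az storage blob upload cannot read from an FTP URL.
curl -o hrrr.t00z.wrfsfcf36.grib2 "ftp://ftp.ncep.noaa.gov/pub/data/nccf/com/hrrr/prod/hrrr.20200220/conus/hrrr.t00z.wrfsfcf36.grib2"
az storage blob upload \
    --container-name "hrrr" \
    --file "hrrr.t00z.wrfsfcf36.grib2" \
    --name "hrrr.t00z.wrfsfcf36.grib2" \
    --account-name "MyStorageAccountName" \
    --account-key "AccountKey"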

Upload multiple files in Azure Blob Storage from Linux

Is there a way to upload multiple files to Azure Blob Storage from a Linux machine, either using the terminal or an application (web based or not)?
Thank you for your interest. There are two options to upload files to Azure Blobs from Linux:
Set up and use XPlatCLI by following the steps below:
Install the OS X Installer from http://azure.microsoft.com/en-us/documentation/articles/xplat-cli/
Open a Terminal window and connect to your Azure subscription by either downloading and using a publish settings file or by logging in to Azure using an organizational account (find instructions here)
Create an environment variable AZURE_STORAGE_CONNECTION_STRING and set its value (you will need your account name and account key): "DefaultEndpointsProtocol=https;AccountName=enter_your_account;AccountKey=enter_your_key"
Upload a file into Azure blob storage by using the following command: azure storage blob upload [file] [container] [blob]
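Putting the last two steps together, a minimal session might look like this (the account details, container name, and file name are placeholders):
export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=https;AccountName=enter_your_account;AccountKey=enter_your_key"
azure storage blob upload ./images/photo.jpg mycontainer images/photo.jpg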
Use one of the third-party web-based Azure storage explorers like CloudPortam: http://www.cloudportam.com/.
You can find the full list of azure storage explorers here: http://blogs.msdn.com/b/windowsazurestorage/archive/2014/03/11/windows-azure-storage-explorers-2014.aspx.
You can use the find command with the -exec option to execute the upload command for each file, as described here:
find *.csv -exec az storage blob upload --file {} --container-name \
    CONTAINER_NAME --name {} --connection-string 'CONNECTION_STRING' \;
where CONNECTION_STRING is the connection string of your Azure Blob store container, available from portal.azure.com. This will upload all CSV files in your directory to the Azure Blob store associated with the connection string.
If you prefer the command line and have a recent Python interpreter, the Azure Batch and HPC team has released a code sample with some AzCopy-like functionality for Python called blobxfer. It allows full recursive directory ingress into Azure Storage as well as full container copy back out to local storage. [full disclosure: I'm a contributor to this code]

Cannot download BLOBs from Azure Storage using AzCopy

I am trying to use AzCopy to download blobs from a container in an Azure storage account. Every time I issue the command it immediately returns and says "Finished 0 of total 0 file(s)."
The container is private. I'm using Windows Azure Storage Command Line which is included in the Windows Azure Storage tools download that includes AzCopy.
I can successfully upload files using AzCopy with no problem. Here are examples of my commands.
Upload (Copy) To Azure Storage - This Works
AzCopy c:\temp https://<myaccount>.blob.core.windows.net/<mycontainer> /destkey:<mykey> /V:C:\temp\logs\azcopy.log
Download (Copy) From Azure Storage - This Does Not Work
AzCopy https://<myaccount>.blob.core.windows.net/<mycontainer> c:\temp\meb /sourceKey:<mykey> /V:C:\temp\logs\azcopy.log
I know my key is correct because upload works without a problem. It's like it thinks there are no files in the container, but if I log in to the Azure portal I can see files in mycontainer, which resides in myaccount.
I can't find any details online about anyone having a similar issue. What am I missing?
AzCopy Folder Files and Versions
AzCopy.exe (1.0.8698.584)
Microsoft.Data.Edm.dll (5.6.0.61587)
Microsoft.Data.OData.dll (5.6.0.61587)
Microsoft.Data.Services.Client.dll (5.6.0.61587)
Microsoft.WindowsAzure.Storage.DataMovement.dll (1.0.8698.584)
Microsoft.WindowsAzure.Storage.dll (3.0.3.0)
Try downloading the blob by specifying the /S parameter, so your download command becomes:
AzCopy https://<myaccount>.blob.core.windows.net/<mycontainer> c:\temp\meb /sourceKey:<mykey> /S /V:C:\temp\logs\azcopy.log
From the documentation:
/S    Recursive copy.
      In recursive copy mode the source and destination are treated as a
      directory (file-system) or as a prefix string (blob storage).
This should do the trick.
It's very simple with AzCopy. Download the latest version from https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/ and in AzCopy type:
Copy a blob within a storage account:
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer1 /Dest:https://myaccount.blob.core.windows.net/mycontainer2 /SourceKey:key /DestKey:key /Pattern:abc.txt
Copy a blob across storage accounts:
AzCopy /Source:https://sourceaccount.blob.core.windows.net/mycontainer1 /Dest:https://destaccount.blob.core.windows.net/mycontainer2 /SourceKey:key1 /DestKey:key2 /Pattern:abc.txt
Copy a blob from the secondary region
If your storage account has read-access geo-redundant storage enabled, then you can copy data from the secondary region.
Copy a blob to the primary account from the secondary:
AzCopy /Source:https://myaccount1-secondary.blob.core.windows.net/mynewcontainer1 /Dest:https://myaccount2.blob.core.windows.net/mynewcontainer2 /SourceKey:key1 /DestKey:key2 /Pattern:abc.txt
To resume any interrupted operation, specify the /Z option; for a recursive operation, specify /S.
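For example, a recursive cross-account copy that can be resumed from a journal might look like this sketch (the journal folder path is a placeholder):
AzCopy /Source:https://sourceaccount.blob.core.windows.net/mycontainer1 /Dest:https://destaccount.blob.core.windows.net/mycontainer2 /SourceKey:key1 /DestKey:key2 /S /Z:C:\temp\azcopyjournal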
