Cannot download blobs from Azure Storage using AzCopy

I am trying to use AzCopy to download blobs from a container in an Azure storage account. Every time I issue the command it immediately returns and says "Finished 0 of total 0 file(s)."
The container is private. I'm using the Windows Azure Storage Command Line, which is included in the Windows Azure Storage tools download along with AzCopy.
I can successfully upload files using AzCopy with no problem. Here are examples of my commands.
Upload (Copy) To Azure Storage - This Works
AzCopy c:\temp https://<myaccount>.blob.core.windows.net/<mycontainer> /destkey:<mykey> /V:C:\temp\logs\azcopy.log
Download (Copy) From Azure Storage - This Does Not Work
AzCopy https://<myaccount>.blob.core.windows.net/<mycontainer> c:\temp\meb /sourceKey:<mykey> /V:C:\temp\logs\azcopy.log
I know my key is correct because upload works without a problem. It's as if AzCopy thinks there are no files in the container, but if I log in to the Azure portal I can see files in -mycontainer- under -myaccount-.
I can't find any details online about anyone having a similar issue. What am I missing?
AzCopy Folder Files and Versions
AzCopy.exe (1.0.8698.584)
Microsoft.Data.Edm.dll (5.6.0.61587)
Microsoft.Data.OData.dll (5.6.0.61587)
Microsoft.Data.Services.Client.dll (5.6.0.61587)
Microsoft.WindowsAzure.Storage.DataMovement.dll (1.0.8698.584)
Microsoft.WindowsAzure.Storage.dll (3.0.3.0)

Try downloading the blob by specifying /S parameter. So your download command would be:
AzCopy https://<myaccount>.blob.core.windows.net/<mycontainer> c:\temp\meb /sourceKey:<mykey> /S /V:C:\temp\logs\azcopy.log
From the documentation:
/S Recursive copy.
In recursive copy mode the source and destination
are treated as a directory (file-system) or
as a prefix string (blob storage).
This should do the trick.
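Because the source is treated as a prefix string in recursive mode, you can also limit the download to a single virtual directory by appending it to the source URL. A sketch using the same placeholders as above (the <myfolder> virtual directory is hypothetical):
AzCopy https://<myaccount>.blob.core.windows.net/<mycontainer>/<myfolder> c:\temp\meb /sourceKey:<mykey> /S /V:C:\temp\logs\azcopy.log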

It's very simple with AzCopy. Download the latest version from https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/
and then run AzCopy as follows:
Copy a blob within a storage account:
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer1 /Dest:https://myaccount.blob.core.windows.net/mycontainer2 /SourceKey:key /DestKey:key /Pattern:abc.txt
Copy a blob across storage accounts:
AzCopy /Source:https://sourceaccount.blob.core.windows.net/mycontainer1 /Dest:https://destaccount.blob.core.windows.net/mycontainer2 /SourceKey:key1 /DestKey:key2 /Pattern:abc.txt
Copy a blob from the secondary region
If your storage account has read-access geo-redundant storage enabled, then you can copy data from the secondary region.
Copy a blob to the primary account from the secondary:
AzCopy /Source:https://myaccount1-secondary.blob.core.windows.net/mynewcontainer1 /Dest:https://myaccount2.blob.core.windows.net/mynewcontainer2 /SourceKey:key1 /DestKey:key2 /Pattern:abc.txt
To resume an interrupted operation, specify the /Z option; for a recursive copy, specify /S.
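For example, a sketch of a recursive, resumable download (the account name, key, and journal folder are placeholders in the same style as the commands above):
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer /Dest:C:\myfolder /SourceKey:key /S /Z:C:\journal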

Related

Failed to copy container from Azure to Local Emulator

Scenario
I would like to copy a whole container from my storage account in Azure to my local storage account in the Storage Emulator. I have SAS tokens generated for both accounts.
I tried this in the Windows 10 CMD:
azcopy copy "https://myazuresrg.blob.core.windows.net/mycontainer/?
sv=2020-08-04&ss=b&srt=co&sp=rltf&se=2021-09-10T15:14:05Z&st=2021-09-
10T07:14:05Z&spr=https&sig=Eb%2FsK9kmwVDJt2PPg2a6wocXkK7EDrj3fgY8uT5dI
IE%3D" "http://127.0.0.1:10000/devstoreaccount1/mycontainer?
sv=2019-07-07&sr=c&sig=XXXXXXXX&se=2021-09-11T07%3A29%3A46Z&sp=rwdl" -
-recursive=true --from-to=BlobLocal
Problem
In the logs I can see this error:
DOWNLOADFAILED: https://myazuresrg.blob.core.windows.net/mycontainer/website/footer.json?se=2021-09-10t15%3A14%3A05z&sig=-REDACTED-&sp=rltf&spr=https&srt=co&ss=b&st=2021-09-10t07%3A14%3A05z&sv=2020-08-04 : 000 : File Creation Error mkdir \\?\C:\AzCopy\http:\127.0.0.1:10000\devstoreaccount1\mycontainer?sv=2019-07-07&sr=c&sig=-REDACTED-&se=2021-09-11T07%3A29%3A46Z&sp=rwdl\mycontainer\website: The filename, directory name, or volume label syntax is incorrect.
Why does AzCopy prepend \C:\AzCopy to my local emulator account path?
There are two issues here:
Incorrect use of --from-to. Basically, you would use --from-to=BlobLocal when you want to download blobs from storage to your local computer. That's the reason azcopy is prepending \C:\AzCopy to your local emulator path.
You cannot use azcopy copy to copy blobs from a cloud storage account to your storage emulator. Copy blob is an asynchronous operation: once you initiate it, the Azure Storage service copies the blob from the source to the destination account in the background. For this to work, both the source and the target account must be in the cloud, because the Azure Storage service must be able to reach both of them. Since your target account is the storage emulator running on your local computer, the Azure Storage service cannot reach it, and the copy operation fails.
What you will need to do in this case is first download the blobs from your storage account to your local computer and then upload them to your storage emulator.
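A rough sketch of that two-step approach with azcopy v10 (the local folder is a placeholder, <source-SAS> and <emulator-SAS> stand for the SAS tokens from the question, and emulator support may depend on your azcopy version):
azcopy copy "https://myazuresrg.blob.core.windows.net/mycontainer/?<source-SAS>" "C:\temp\mycontainer" --recursive=true
azcopy copy "C:\temp\mycontainer" "http://127.0.0.1:10000/devstoreaccount1/mycontainer?<emulator-SAS>" --recursive=true --from-to=LocalBlob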

ADLS storage to ADLS storage file transfer

Trying to send files from one ADLS storage account to another using a remote on-premises server and azcopy (a weird requirement, but needed).
azcopy cp 'https://mysourceaccount.dfs.core.windows.net/mycontainer?sxxxxxx' 'https://mydestinationaccount.dfs.core.windows.net/mycontainer' --recursive
This throws an error.
I tested this in my environment and it works for me. Please use the command formatted as below (you were missing the SAS token for the second container and --recursive=true):
azcopy copy "https://tstadlsstorage1.dfs.core.windows.net/testcontainer?SAS_token_for_container_in_source_storageaccount" "https://tstadlsstorage2.dfs.core.windows.net/testcontainer2?SAS_token_for_container_in_destination_storageaccount"
--recursive=true

How to upload a file from azure blob storage to Linux VM created on azure

I have one large file in my Azure Blob Storage container. I want to move the file from blob storage to a Linux VM created on Azure. How can I do that using Data Factory, or with a PowerShell command?
The easiest way, without any extra tools, is to generate a SAS token for the blob and run curl.
Generate the SAS, and then run curl:
curl <blob_sas_url> -o output.txt
If you need this automated, you can generate the SAS URL from a script, or just use AzCopy.
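A minimal sketch of scripting that with the Azure CLI on the VM (the account, key, container, and blob names are placeholders, not values from the question):
end=$(date -u -d "1 hour" '+%Y-%m-%dT%H:%MZ')
sas=$(az storage blob generate-sas --account-name mystorageaccount --account-key "<account-key>" --container-name mycontainer --name myfile.bin --permissions r --expiry "$end" --https-only --output tsv)
curl "https://mystorageaccount.blob.core.windows.net/mycontainer/myfile.bin?$sas" -o myfile.bin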
Please see this blog: How to copy data to a VM from blob storage. It gives you a way to solve the problem with Data Factory:
"To anyone who might get into same problem in future, I solved my problem by using 'copy wizard' present in ADF.
We need to install Data Management Gateway on VM and register it before we use 'copy wizard'.
We need to specify blob storage as source and in destination we need to choose 'File Server Share' option. In 'File Server Share' option we need to specify user credentials which I suppose pipeline uses to login to VM, folder on VM where pipeline will copy the data."
From the Azure Blob Storage documentation, there is another way that can help you: Mount Blob storage as a file system with blobfuse on Linux.
Blobfuse is a virtual file system driver for Azure Blob storage. Blobfuse allows you to access your existing block blob data in your storage account through the Linux file system. Blobfuse uses the virtual directory scheme with the forward-slash '/' as a delimiter.
This guide shows you how to use blobfuse, and mount a Blob storage container on Linux and access data. To learn more about blobfuse, read the details in the blobfuse repository.
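A minimal sketch of such a mount with blobfuse v1, assuming the blobfuse package is already installed and using placeholder values. First put the credentials in a fuse_connection.cfg file:
accountName mystorageaccount
accountKey <account-key>
containerName mycontainer
Then create the mount and temp directories and mount the container:
mkdir -p /mnt/blobfuse /mnt/blobfusetmp
blobfuse /mnt/blobfuse --tmp-path=/mnt/blobfusetmp --config-file=./fuse_connection.cfg -o allow_other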
If you want to use AzCopy, you can reference the document Transfer data with AzCopy and Blob storage. You can download AzCopy for Linux; it provides commands to upload and download files.
For example, upload file:
azcopy copy "<local-file-path>" "https://<storage-account-name>.<blob or dfs>.core.windows.net/<container-name>/<blob-name>"
For PowerShell, you need Windows PowerShell 5.1 (Windows only) or PowerShell Core 6.x and later (all platforms), so it works with both Windows and Linux virtual machines.
You can find the PowerShell commands in this document: Quickstart: Upload, download, and list blobs by using Azure PowerShell.
Here is another link that talks about Copy Files to Azure VM using PowerShell Remoting 6 (Windows and Linux).
Hope this helps.
You have many options to copy content from the blob store to the disk on the VM:
1. Use AzCopy
2. Use Azure Pipelines - File copy task
3. Use Powershell cmdlets
A lot of content is available on these approaches on SO!
It seems this is not properly documented anywhere, so I am sharing the most basic approach, which is to use the azcopy tool that is available for both Windows and Linux. This approach doesn't need the complexity of creating credentials/tokens.
Download azcopy
It's a simple executable which can be run directly after extraction
Create a managed identity (system-assigned identity) for your virtual machine. Navigate to VM -> Identity -> turn the Status to 'On' -> Save
Now the VM can be assigned permission at the following levels:
Storage account
Container (file system)
Resource group
Subscription
For this case, navigate to storage account -> IAM -> Add role assignment -> Select role 'Storage Blob Data Contributor' -> Assign access to 'Virtual machine' -> Select the desired VM -> SAVE
NOTE: If you grant the VM access on the IAM properties of a resource group, the VM will be able to access all the storage accounts in that resource group.
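If you prefer the CLI to the portal for this step, a rough equivalent of that role assignment (the VM's principal ID and the scope values are placeholders):
az role assignment create --assignee "<vm-principal-id>" --role "Storage Blob Data Contributor" --scope "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account>"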
Log in to the VM and assume the identity (run the command from the same location where azcopy is located):
For windows : azcopy login --identity
For linux : ./azcopy login --identity
Upload or download the files now:
azcopy cp "source-file" "storageUri/blob-container/" --recursive=true
Example: azcopy cp "C:\test.txt" "https://mystorageaccount.blob.core.windows.net/backup/" --recursive=true
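To copy in the other direction (from blob storage down to the VM), the same pattern applies after azcopy login --identity succeeds; a sketch reusing the placeholder names above:
azcopy cp "https://mystorageaccount.blob.core.windows.net/backup/test.txt" "C:\test.txt"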
IAM permissions can take a few minutes to propagate. If you change or add permissions or access levels anywhere, run the azcopy login --identity command again to pick up the updated identity.
More info on AzCopy is available here.

AzCopy uploading local files to Azure Storage as files, not Blobs

I'm attempting to upload 550K files from my local hard drive to Azure Blob Storage using the following command (AzCopy 5.1.1) -
AzCopy /Source:d:\processed /Dest:https://ContainerX.file.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
It starts churning right away.
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
Also, is there any way to keep the date created (or similar) values of the old files?
Thanks,
But it's actually creating a new Azure File Storage folder called
fec-data/reports rather than creating new blobs in the Azure Blob
folder fec-data/reports I've already created.
What am I missing?
The reason you're seeing this behavior is that you're uploading to the File service instead of the Blob service. To upload the files to Blob storage, you need to specify the blob service endpoint (blob.core.windows.net). So your command would be:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Also, is there anyway to keep the date created (or similar) values of
the old files?
Assuming you want the blob's creation date to match that of the local file: it is not possible. A blob's Last Modified date/time is a system property that is assigned when the blob is created and is updated every time the blob is changed. You could, however, make use of the blob's metadata and store the file's creation date/time there.
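For example, a sketch of stamping the original timestamp into blob metadata with the Azure CLI (the account, container, blob, and timestamp values are placeholders):
az storage blob metadata update --account-name myaccount --account-key "<key>" --container-name mycontainer --name "Reports/report1.csv" --metadata sourceCreated=2017-01-15T10:00:00Z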
I think you have to get the instance of the blob where you want to deploy the file, like:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Blob: Upload
Upload single file
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:key /Pattern:"abc.txt"
If the specified destination container does not exist, AzCopy will create it and upload the file into it.
Upload single file to virtual directory
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer/vd /DestKey:key /Pattern:abc.txt
If the specified virtual directory does not exist, AzCopy will upload the file to include the virtual directory in its name (e.g., vd/abc.txt in the example above).
Please refer to the link: https://learn.microsoft.com/en-us/azure/storage/storage-use-azcopy

Upload multiple files in Azure Blob Storage from Linux

Is there a way to upload multiple files to Azure Blob Storage from a Linux machine, either using the terminal or an application (web based or not)?
Thank you for your interest. There are two options to upload files to Azure Blob storage from Linux:
Setup and use XPlatCLI by following the steps below:
Install the OS X Installer from http://azure.microsoft.com/en-us/documentation/articles/xplat-cli/
Open a Terminal window and connect to your Azure subscription by either downloading and using a publish settings file or by logging in to Azure using an organizational account (find instructions here)
Create an environment variable AZURE_STORAGE_CONNECTION_STRING and set its value (you will need your account name and account key): "DefaultEndpointsProtocol=https;AccountName=enter_your_account;AccountKey=enter_your_key"
Upload a file into Azure blob storage using the following command: azure storage blob upload [file] [container] [blob] (a concrete sketch follows below, after the list of options)
Use one of the third party web azure storage explorers like CloudPortam: http://www.cloudportam.com/.
You can find the full list of azure storage explorers here: http://blogs.msdn.com/b/windowsazurestorage/archive/2014/03/11/windows-azure-storage-explorers-2014.aspx.
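Putting option 1 together, a minimal sketch (the account name, key, and file/container names are placeholders):
export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=https;AccountName=enter_your_account;AccountKey=enter_your_key"
azure storage blob upload ./report.csv mycontainer report.csv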
You can use the find command with the -exec option to run the upload command on each file, as described here:
find *.csv -exec az storage blob upload --file {} --container-name \
CONTAINER_NAME --name {} --connection-string='CONNECTION_STRING' \;
where CONNECTION_STRING is the connection string of your Azure Blob store container, available from portal.azure.com. This will upload all CSV files in your directory to the Azure Blob store associated with the connection string.
If you prefer the command line and have a recent Python interpreter, the Azure Batch and HPC team has released a code sample with some AzCopy-like functionality, written in Python, called blobxfer. It allows full recursive directory ingress into Azure Storage as well as full container copy back out to local storage. [Full disclosure: I'm a contributor to this code.]

Resources