Could not verify the copy source within the specified time. RequestId: (blank) - azure

I am trying to copy some blob files from one storage account to another, using AzCopy.
The process works for copying files between containers within the same storage account, but not between different storage accounts.
The command I am issuing is:
AzCopy /Source:https://<storage_account1>.blob.core.windows.net/<container_name1>/<path_to_desired_blobs> /Dest:https://<storage_account2>.blob.core.windows.net/<container_name2>/<path_to_store>/ /SourceKey:<source_key> /DestKey:<dest_key> /Pattern:<some_pattern> /S
The error I am getting is the following:
The remote server returned an error: (400) Bad Request.
Could not verify the copy source within the specified time.
RequestId:
Time:2016-04-01T19:33:01.0527460Z
The only difference between the two storage accounts is that one is Standard, whereas the other one is Premium.
Any help will be appreciated!

From your description, you're trying to copy a block blob from the source account to a page blob in the destination account, which is not supported by the Azure Storage service or by AzCopy.
To work around it, you can first use AzCopy to download the block blobs from the source account to the local file system, and then upload them from the local file system to the destination account with the /BlobType:Page option (this option is only valid when uploading from local disk to blob storage).
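For example, something along these lines, reusing the placeholders from the question (the local path D:\temp\blobs is just a placeholder):
AzCopy /Source:https://<storage_account1>.blob.core.windows.net/<container_name1> /Dest:D:\temp\blobs /SourceKey:<source_key> /Pattern:<some_pattern> /S
AzCopy /Source:D:\temp\blobs /Dest:https://<storage_account2>.blob.core.windows.net/<container_name2> /DestKey:<dest_key> /Pattern:<some_pattern> /BlobType:Page /S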

Premium Storage only supports page blobs. Please confirm that you are copying page blobs from the standard to the premium storage account. Also, set the /BlobType parameter to page so the data is copied as page blobs into the destination premium storage account.

From the description, I am assuming your source blob is a block blob. Azure's asynchronous Copy Blob operation (which AzCopy uses by default) preserves the blob type; that is, you cannot convert a blob from block to page through an async copy.
Instead, can you try AzCopy again with the /SyncCopy option along with the /BlobType:page parameter? That might help change the destination blob type to page.
(If that doesn't work, the only other solution would be to first download the blob and then upload it with /BlobType:page.)
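A sketch of that attempt, reusing the placeholders from the question:
AzCopy /Source:https://<storage_account1>.blob.core.windows.net/<container_name1> /Dest:https://<storage_account2>.blob.core.windows.net/<container_name2> /SourceKey:<source_key> /DestKey:<dest_key> /Pattern:<some_pattern> /SyncCopy /BlobType:page /S
Note that /SyncCopy downloads and re-uploads the data through the machine running AzCopy, so it is slower than the default asynchronous copy.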

Related

Is there possibility to synchronize data between two Azure blob storages

I have a task to copy BLOB storage to another location and keep the two locations synchronized. Unfortunately, I didn't find a solution for it. Is there a simple way to do this?
You can use the AzCopy utility to synchronize files, or replicate a source location to a destination location.
The azcopy sync command identifies all files at the destination, and then compares file names and last modified timestamps before starting the sync operation. If you set the --delete-destination flag to true, AzCopy deletes files at the destination that no longer exist at the source, without providing a prompt. If you want a prompt to appear before AzCopy deletes a file, set the --delete-destination flag to prompt.
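For example, a minimal container-to-container sync (this requires a recent AzCopy v10 build; the account names and SAS tokens are placeholders):
azcopy sync "https://<source-account>.blob.core.windows.net/<container>?<SAS>" "https://<dest-account>.blob.core.windows.net/<container>?<SAS>" --delete-destination=prompt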
Also check the az storage blob sync command from the Azure CLI.
Also consider the newer Object Replication for Block Blobs:
Object replication asynchronously copies block blobs between a source storage account and a destination account.

Azure: Unable to copy Archive blobs from one storage account to another?

Whenever I try to copy archive blobs to a different storage account, changing their tier at the destination, I get the following error:
Copy source blob has been modified. ErrorCode: CannotVerifyCopySource
I have tried copying Hot/Cool blobs to Hot/Cool/Archive. I am facing the issue only while copying Archive to Hot/Cool/Archive. Also, there is no issue while copying within the same storage account.
I am using the Azure Python SDK:
# Build a SAS URL for the source (archived) blob
blob_url = source_block_blob_service.make_blob_url(copy_from_container, blob_name, sas_token=sas)
# Attempt a synchronous cross-account copy, rehydrating the destination blob to Hot
dest_blob_service.copy_blob(copy_to_container, blob_name, blob_url, requires_sync=True, standard_blob_tier='Hot')
The reason you're getting this error is that copying an archived blob is only supported within the same storage account, and you're trying it across different storage accounts.
From the REST API documentation page:
Copying Archived Blob (version 2018-11-09 and newer)
An archived blob can be copied to a new blob within the same storage account. This will still leave the initially archived blob as is. When copying an archived blob as source, the request must contain the header x-ms-access-tier indicating the tier of the destination blob. The data will eventually be copied to the destination blob.
While a blob is in the archive access tier, it's considered offline and can't be read or modified.
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-rehydration
To read the blob, you need to rehydrate it first. Alternatively, as described in the link above, you can use the Copy Blob operation within the same storage account. I am not sure if the Python SDK's copy_blob() operation uses that API behind the scenes; maybe not, if it did not work that way for you.
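One possible sequence with the Azure CLI, as a sketch (it assumes you rehydrate in the source account first and then copy across accounts; all account, container, and blob names are placeholders):
az storage blob set-tier --account-name <source-account> --container-name <container> --name <blob> --tier Hot
# wait for rehydration to finish (this can take hours), then copy across accounts:
az storage blob copy start --account-name <dest-account> --destination-container <container> --destination-blob <blob> --source-uri "https://<source-account>.blob.core.windows.net/<container>/<blob>?<SAS>"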

access a file from a directory in azure blob storage through Azure Logic App

I am using a Logic App to import a set of files which are inside a directory (/devcontainer/sample1/abc.csv).
The problem here is that I cannot even locate the Azure file from my Logic App; I am getting the following error:
verify that the path exists and does not contain the blob name. List Folder is not allowed on blobs.
Screenshots for reference
The problem here is that I cannot even locate the Azure file from my Logic App,
The file explorer shows all the containers and blobs when you choose a blob path, and it caches that data for a period of time to keep the operation smooth. If a blob was added to the container recently, it will not be seen or selectable in the file explorer. The workaround is to click the change connection link and use a new connection to retrieve the data.
Is your blob connection pointing to the correct storage account? One thing you can try: instead of providing the path directly, browse to it, so that you can see which containers and blobs are present in the storage account you are trying to access.

Azure blob copy in cloud

In AWS, the upload-part-copy command has an option for byte ranges. If I wanted to copy portions of two objects to a new object within the cloud, I could do so using upload-part-copy.
I could not find any such method or mechanism to copy portions of blobs to a new blob in Azure. I tried AzCopy, but it does not have any option to select a portion of a blob.
Is there any method like that?
Is there any method like that?
As of today, this feature is not available in Azure Blob Storage; a copy operation copies the entire source blob to the destination blob.
A workaround would be to download the byte ranges (blocks) from the source blobs to your local machine and then create a new blob by uploading those blocks.
If you were using the Blob Service REST API, these are the operations you would need to perform (see the sketch after this list):
1. Read Source Blob 1 by specifying the byte range you want in the Range or x-ms-range request header. Store the fetched data somewhere in your application.
2. Repeat the same for Source Blob 2.
3. Create a new blob by uploading the data fetched for the 1st source blob using Put Block.
4. Repeat the same for the 2nd source blob.
5. Create the destination blob by committing the block list (Put Block List).
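A rough curl sketch of those steps against the REST API, assuming SAS tokens for authentication (the byte ranges and the base64 block IDs MDAwMA==/MDAwMQ== are placeholders; block IDs must be base64-encoded, URL-encoded in the query string, and of equal length):
curl -H "x-ms-range: bytes=0-1048575" -o part1.bin "https://<account>.blob.core.windows.net/<container>/<source-blob-1>?<SAS>"
curl -H "x-ms-range: bytes=0-524287" -o part2.bin "https://<account>.blob.core.windows.net/<container>/<source-blob-2>?<SAS>"
curl -X PUT --data-binary @part1.bin "https://<account>.blob.core.windows.net/<container>/<dest-blob>?comp=block&blockid=MDAwMA%3D%3D&<SAS>"
curl -X PUT --data-binary @part2.bin "https://<account>.blob.core.windows.net/<container>/<dest-blob>?comp=block&blockid=MDAwMQ%3D%3D&<SAS>"
curl -X PUT --data-binary '<?xml version="1.0" encoding="utf-8"?><BlockList><Latest>MDAwMA==</Latest><Latest>MDAwMQ==</Latest></BlockList>' "https://<account>.blob.core.windows.net/<container>/<dest-blob>?comp=blocklist&<SAS>"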

AzCopy uploading local files to Azure Storage as files, not Blobs

I'm attempting to upload 550K files from my local hard drive to Azure Blob Storage using the following command (AzCopy 5.1.1):
AzCopy /Source:d:\processed /Dest:https://ContainerX.file.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
It starts churning right away.
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
Also, is there any way to keep the date created (or similar) values of the old files?
Thanks,
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
The reason you're seeing this behavior is that you're uploading to File storage instead of Blob storage. To upload the files to Blob storage, you need to specify the blob service endpoint (blob.core.windows.net). So your command would be:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Also, is there any way to keep the date created (or similar) values of the old files?
Assuming you want the blob's creation date to be the same as that of the local file: that is not possible. A blob's Last Modified date/time is a system property that gets assigned when the blob is created and is updated every time the blob changes. You could, however, make use of the blob's metadata and store the file's creation date/time there.
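For example, with the Azure CLI (a sketch; originalCreated is an arbitrary metadata key chosen here, not something AzCopy sets for you, and the blob name is a placeholder):
az storage blob metadata update --account-name <account> --container-name fec-data --name "Reports/abc.csv" --metadata originalCreated=<file-creation-time>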
I think you have to point to the blob service endpoint where you want to deploy the file, like:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Blob: Upload
Upload single file
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:key /Pattern:"abc.txt"
If the specified destination container does not exist, AzCopy will create it and upload the file into it.
Upload single file to virtual directory
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer/vd /DestKey:key /Pattern:abc.txt
If the specified virtual directory does not exist, AzCopy will upload the file to include the virtual directory in its name (e.g., vd/abc.txt in the example above).
Please refer to this link: https://learn.microsoft.com/en-us/azure/storage/storage-use-azcopy
