Is there a way to synchronize data between two Azure Blob Storage accounts? - azure

I have a task to copy a Blob Storage account to another location and keep the two synchronized. Unfortunately, I haven't found a solution for it. Is there a simpler way to do this?

You can use the AzCopy utility to synchronize files, or replicate a source location to a destination location.
The azcopy sync command identifies all files at the destination, and then compares file names and last-modified timestamps before starting the sync operation. If you set the --delete-destination flag to true, AzCopy deletes files without providing a prompt. If you want a prompt to appear before AzCopy deletes a file, set the --delete-destination flag to prompt.
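For example, a container-to-container sync might look like this (account names, container names, and SAS tokens are placeholders):
azcopy sync 'https://storage1.blob.core.windows.net/data?<SAS>' 'https://storage2.blob.core.windows.net/data?<SAS>' --recursive --delete-destination=prompt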
Also check the az storage blob sync from Azure CLI.

Also consider the newer: Object Replication for Block Blobs.
Object replication asynchronously copies block blobs between a source storage account and a destination account.
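Object replication is configured as a policy on the destination account, and requires blob versioning on both accounts (plus change feed on the source). A rough sketch with the Azure CLI, assuming the az storage account or-policy command group available in recent CLI versions; all names are placeholders:
az storage account or-policy create --resource-group myRG --account-name deststorage --source-account srcstorage --destination-account deststorage --source-container data --destination-container data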

Related

Azure storage table copy

I have a problem using Azure AzCopy. Here is my scenario.
I have 2 storage accounts, which I'll call storage1 and storage2.
storage1 contains some important data in multiple tables. What I want to do is copy all the tables in storage1 to storage2 (to have a backup).
I tried 2 different approaches:
AzCopy
Azure Data Factory
With Azure Data Factory I didn't have any particular problem making it work; I was able to move all the blobs from storage1 with Data Factory, but I couldn't move the tables, and I have no clue whether this is possible to do with Python.
With AzCopy I had zero luck. I gave myself the Storage Blob Data Contributor role in IAM, and when I run this command from the terminal:
azcopy cp 'https://storage1.table.core.windows.net/Table1' 'https://storage2[...]-Key'
I get a permission error.
In this specific scenario I would love to be able to use AzCopy, as it is way simpler than Data Factory; all I need is to move those tables from one storage account to the other.
Can anyone help me understand what I am doing wrong with AzCopy?
EDIT:
This is the error when I try to copy the table using azcopy
INFO: The parameters you supplied were Source: 'https://storage1.table.core.windows.net/[SAS]' of type Local, and Destination: 'https://storage2.table.core.windows.net/[SAS]' of type Local
INFO: Based on the parameters supplied, a valid source-destination combination could not automatically be found. Please check the parameters you supplied. If they are correct, please specify an exact source and destination type using the --from-to switch. Valid values are two-word phases of the form BlobLocal, LocalBlob etc. Use the word 'Blob' for Blob Storage, 'Local' for the local file system, 'File' for Azure Files, and 'BlobFS' for ADLS Gen2. If you need a combination that is not supported yet, please log an issue on the AzCopy GitHub issues list.
failed to parse user input due to error: the inferred source/destination combination could not be identified, or is currently not supported
If you want to copy all the tables present in the abc container to the xyz container, use a simple Copy activity, and while creating the dataset just give the folder path; that will copy all the content, i.e. all the tables, to your xyz container.
I would recommend watching the video below from the 30-minute mark. It will help in your scenario.
https://youtu.be/m6wyB-Hm3j0
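As a side note on the AzCopy error above: AzCopy v10 does not support Azure Table storage at all, which is why it cannot infer a valid source-destination combination for a table.core.windows.net URL. The legacy AzCopy (v7.3) did support tables by exporting to the local file system and re-importing. A sketch of that two-step approach (account names, table name, keys, folder, and manifest name are placeholders):
AzCopy /Source:https://storage1.table.core.windows.net/Table1/ /Dest:C:\backup\ /SourceKey:<key1> /Manifest:table1.manifest
AzCopy /Source:C:\backup\ /Dest:https://storage2.table.core.windows.net/Table1/ /DestKey:<key2> /Manifest:table1.manifest /EntityOperation:InsertOrReplace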

How to copy one storage account's container's blobs to another storage account's container's blobs

I have two storage accounts (storage1 and storage2), and both of them have a container called data.
Now, storage1's data container has a folder called database-files, which contains lots of folders recursively. I mean, it's kind of huge.
What I am trying to do is copy database-files and everything in it from storage1's data container to storage2's data container. Note: both storage accounts are in the same resource group and subscription.
Here is what I've tried:
az storage blob copy start-batch --source-account-name "storage1" --source-container "data" --account-name "storage2" --destination-container "data"
This worked fine, but the problem is that it takes a ridiculously long time, and I can't wait that long because I need to run this command as part of one of my releases. That means it needs to be as fast as possible so that my deployment happens quickly.
Is there any way to make it faster? Maybe zip it / copy it / unzip it? Even if I use AzCopy, I have no idea how it would help with timing; all I know is that it doesn't have a single point of failure. I also have no idea how to use it via the Azure CLI.
How can I proceed?
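For what it's worth, AzCopy v10 performs blob-to-blob copies server-side and in parallel, which is usually much faster than az storage blob copy start-batch for a large folder tree. A rough sketch (the SAS tokens are placeholders):
azcopy copy 'https://storage1.blob.core.windows.net/data/database-files?<SAS>' 'https://storage2.blob.core.windows.net/data?<SAS>' --recursive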

Can we copy Azure blobs from one storage account to other storage accounts in parallel from same machine?

In Microsoft Azure, I have a source storage account in one region and 3 destination storage accounts in 3 different regions. I want to copy blob data from the source storage account to all 3 destination storage accounts. Currently I am using the azcopy (version 6) command in a bash script to do it. First it completes for one region, then starts for another. It takes almost an hour every day due to the geographical distance between the regions. I wanted to know if azcopy has any option to copy blobs from a source to multiple destinations in parallel. Any other suggestions to reduce the time are also invited :)
Generalization of azcopy command being used in my bash script:
/usr/bin/azcopy --source https://[srcaccount].blob.core.windows.net/[container]/[path/to/blob] --source-key $SOURCE_KEY --destination https://[destaccount].blob.core.windows.net/[container]/[path/to/blob] --dest-key $DEST_KEY --recursive --quiet --exclude-older
AzCopy can only copy data from one source to one destination at a time. But since you mention that you need to do this every day, I would probably go for a scheduled pipeline in Azure Data Factory instead. There you can also set up the three copy jobs as parallel activities.
Just spawn a separate instance of your script for each destination. That way your copies will happen in parallel.
Here is a simple guide to doing this in bash: https://www.slashroot.in/how-run-multiple-commands-parallel-linux
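A minimal sketch of that approach, reusing the command from the question (destination account names are placeholders, and in practice each destination has its own key):
for dest in destaccount1 destaccount2 destaccount3; do
  # one background azcopy per destination region
  /usr/bin/azcopy --source https://srcaccount.blob.core.windows.net/container/path --source-key $SOURCE_KEY --destination https://$dest.blob.core.windows.net/container/path --dest-key $DEST_KEY --recursive --quiet --exclude-older &
done
wait  # returns once all three parallel copies have finished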

AzCopy uploading local files to Azure Storage as files, not Blobs

I'm attempting to upload 550K files from my local hard drive to Azure Blob Storage using the following command (AzCopy 5.1.1) -
AzCopy /Source:d:\processed /Dest:https://ContainerX.file.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
It starts churning right away.
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
Also, is there any way to keep the date created (or similar) values of the old files?
Thanks,
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
The reason you're seeing this behavior is that you're uploading to File storage instead of Blob storage. To upload the files to Blob storage, you need to specify the blob service endpoint (blob.core.windows.net). So your command would be:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Also, is there any way to keep the date created (or similar) values of the old files?
Assuming you want to keep the blob's creation date the same as that of the local file: it is not possible. A blob's Last Modified date/time is a system property that gets assigned when a blob is created and is updated every time the blob is changed. You could, however, make use of the blob's metadata and store the file's creation date/time there.
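A sketch of that metadata workaround with the Azure CLI (account, container, paths, and the metadata key are placeholders):
az storage blob upload --account-name myaccount --container-name fec-data --name Reports/report1.csv --file /data/processed/report1.csv --metadata originalcreated=2017-01-15T10:00:00Z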
I think you have to target the blob endpoint where you want to deploy the file, like:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Blob: Upload
Upload single file
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:key /Pattern:"abc.txt"
If the specified destination container does not exist, AzCopy will create it and upload the file into it.
Upload single file to virtual directory
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer/vd /DestKey:key /Pattern:abc.txt
If the specified virtual directory does not exist, AzCopy will upload the file to include the virtual directory in its name (e.g., vd/abc.txt in the example above).
Please refer to this link: https://learn.microsoft.com/en-us/azure/storage/storage-use-azcopy

Could not verify the copy source within the specified time. RequestId: (blank)

I am trying to copy some blob files from one storage account to another one. I am using AzCopy in order to fulfill this goal.
The process works for copying files between containers within the same storage account, but not between different storage accounts.
The command I am issuing is:
AzCopy /Source:https://<storage_account1>.blob.core.windows.net/<container_name1>/<path_to_desired_blobs> /Dest:https://<storage_account2>.blob.core.windows.net/<container_name2>/<path_to_store>/ /SourceKey:<source_key> /DestKey:<dest_key> /Pattern:<some_pattern> /S
The error I am getting is the following:
The remote server returned an error: (400) Bad Request.
Could not verify the copy source within the specified time.
RequestId:
Time:2016-04-01T19:33:01.0527460Z
The only difference between the two storage accounts is that one is Standard, whereas the other one is Premium.
Any help will be appreciated!
From your description, you're trying to copy a Block Blob in the source account to a Page Blob in the destination account, which is not supported by the Azure Storage service or AzCopy.
To work around it, you can firstly use AzCopy to download the Block Blobs from source account to local file system, and then upload them from local file system to destination account with option /BlobType:Page (this option is only valid when uploading from local to blob).
Premium Storage only supports page blobs, so please confirm the type of the blobs you are copying from the standard to the premium storage account. Also, set the /BlobType parameter to Page in order to copy the data as page blobs into the destination premium storage account.
From the description, I am assuming your source blob is a block blob. Azure's "Async Copy Blob" operation (which AzCopy uses as the default method) preserves the blob type; that is, you cannot convert a blob from Block to Page through an async copy.
Instead, can you try AzCopy again with the /SyncCopy option along with the /BlobType:Page parameter? That might help change the destination blob type to Page.
(If that doesn't work, the only other solution would be to first download the blob, and then upload it with /BlobType:Page.)
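Putting those flags together, the command would look roughly like this (a sketch only; accounts, containers, pattern, and keys are placeholders):
AzCopy /Source:https://<storage_account1>.blob.core.windows.net/<container_name1> /Dest:https://<storage_account2>.blob.core.windows.net/<container_name2> /SourceKey:<source_key> /DestKey:<dest_key> /Pattern:<some_pattern> /SyncCopy /BlobType:Page /S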
