Azure cross-account copy using AzCopy and a Shared Access Signature (SAS)

I want to use AzCopy to copy a blob from account A to account B. But instead of using the access key for the source, I only have a Shared Access Signature (SAS). I've tried appending the SAS to the source URL, but it throws a 404 error. This is the syntax I tried:
AzCopy "https://source-blob-object-url?sv=blah-blah-blah-source-sas" "https://dest-blob-object-url" /destkey:base64-dest-access-key
The error I got was
Error parsing source location "https://source-blob-object-url?sv=blah-blah-blah-source-sas":
The remote server returned an error: (404) Not Found.
How can I get AzCopy to use the SAS URL? Or does it simply not support SAS?
Update:
With the SourceSAS and FilePattern options, I'm still getting the 404 error. This is the command I use:
AzCopy [source-container-url] [destination-container-url] [file-pattern] /SourceSAS:"?sv=2013-08-15&sr=c&si=ReadOnlyPolicy&sig=[signature-removed]" /DestKey:[destination-access-key]
This will get me a 404 Not Found. If I change the signature to make it invalid, AzCopy will throw a 403 Forbidden instead.

You're correct. A copy operation using SAS on both the source and destination blobs is only supported when the source and destination blobs are in the same storage account. Copying across storage accounts using SAS is still not supported by Windows Azure Storage. This has been covered (though only in a one-liner) in this blog post from the storage team: http://blogs.msdn.com/b/windowsazurestorage/archive/2013/11/27/windows-azure-storage-release-introducing-cors-json-minute-metrics-and-more.aspx. From the post:
Copy blob now allows Shared Access Signature (SAS) to be used for the
destination blob if the copy is within the same storage account.
UPDATE
So I tried it, and one thing I realized is that it is meant for copying all blobs from one container to another. Based on my trial and error, a few things you need to keep in mind are:
The source SAS is for the source container, not the blob. Also ensure that the SAS grants both Read and List permissions on the blob container.
If you want to copy a single file, make sure it is specified via the "filepattern" parameter.
Based on these, can you please try the following:
AzCopy "https://<source account>.blob.core.windows.net/<source container>?<source container sas with read/list permission>" "https://<destination account>.blob.core.windows.net/<destination container>" "<source blob name to copy>" /DestKey:"destination account key"
UPDATE 2
Error parsing source location [container-location]: Object reference
not set to an instance of an object.
I was able to recreate the error. I believe the reason for this error is the version of the storage client library (and thus the REST API version) used to create the SAS token. If I try to list the contents of a blob container using a SAS token created with version 3.x of the library, this is the output I get:
<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ServiceEndpoint="https://cynapta.blob.core.windows.net/" ContainerName="vhds">
<Blobs>
<Blob>
<Name>test.vhd</Name>
<Properties>
<Last-Modified>Fri, 17 May 2013 15:23:39 GMT</Last-Modified>
<Etag>0x8D02129A4ACFFD7</Etag>
<Content-Length>10486272</Content-Length>
<Content-Type>application/octet-stream</Content-Type>
<Content-Encoding />
<Content-Language />
<Content-MD5>uflK5qFmBmek/zyqad7/WQ==</Content-MD5>
<Cache-Control />
<Content-Disposition />
<x-ms-blob-sequence-number>0</x-ms-blob-sequence-number>
<BlobType>PageBlob</BlobType>
<LeaseStatus>unlocked</LeaseStatus>
<LeaseState>available</LeaseState>
</Properties>
</Blob>
</Blobs>
<NextMarker />
</EnumerationResults>
However, if I try to list the contents of a blob container using a SAS token created with version 2.x of the library, this is the output I get:
<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ContainerName="https://cynapta.blob.core.windows.net/vhds">
<Blobs>
<Blob>
<Name>test.vhd</Name>
<Url>https://cynapta.blob.core.windows.net/vhds/test.vhd</Url>
<Properties>
<Last-Modified>Fri, 17 May 2013 15:23:39 GMT</Last-Modified>
<Etag>0x8D02129A4ACFFD7</Etag>
<Content-Length>10486272</Content-Length>
<Content-Type>application/octet-stream</Content-Type>
<Content-Encoding />
<Content-Language />
<Content-MD5>uflK5qFmBmek/zyqad7/WQ==</Content-MD5>
<Cache-Control />
<x-ms-blob-sequence-number>0</x-ms-blob-sequence-number>
<BlobType>PageBlob</BlobType>
<LeaseStatus>unlocked</LeaseStatus>
<LeaseState>available</LeaseState>
</Properties>
</Blob>
</Blobs>
<NextMarker />
</EnumerationResults>
Notice the difference in the <EnumerationResults> element.
Now, AzCopy uses version 2.1.0.4 of the storage client library. As part of the copy operation, it first lists the blobs in the source container using the SAS token. As we saw above, the XML returned differs between the two versions, so storage client library 2.1.0.4 fails to parse the XML returned by the storage service. Because it fails to parse the XML, it cannot create a Blob object, and you get the NullReferenceException.
Solution:
One possible solution to this problem is to create the SAS token using version 2.1.0.4 of the library. I tried doing that and was able to copy the blob successfully. Do give it a try; that should fix the problem you're facing.

Make sure you are using the latest version of AzCopy, and
check this post: http://blogs.msdn.com/b/windowsazurestorage/archive/2013/09/07/azcopy-transfer-data-with-re-startable-mode-and-sas-token.aspx
/DestSAS and /SourceSAS: These options allow access to storage containers and blobs with a SAS (Shared Access Signature) token. A SAS token, generated by the storage account owner, grants access to specific containers and blobs with specific permissions and for a specified period of time.
Example: Upload all files from a local directory to a container using a SAS token that grants List and Write permissions:
AzCopy C:\blobData https://xyzaccount.blob.core.windows.net/xyzcontainer /DestSAS:"?sr=c&si=mypolicy&sig=XXXXX" /s
/DestSAS here specifies the SAS token used to access the storage container; it should be enclosed in quotes.
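A source-side SAS is supplied analogously with /SourceSAS. As a hypothetical sketch mirroring the upload example above (placeholder account, container, and token values), downloading the contents of a container with a read/list SAS would look something like:
AzCopy https://xyzaccount.blob.core.windows.net/xyzcontainer C:\blobData /SourceSAS:"?sr=c&si=mypolicy&sig=XXXXX" /S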

You can use IaaS Management Studio to generate the PowerShell script for you. It is a commercial tool, but you can do that in the trial version. It does not use AzCopy, though; it uses the classic blob API from PowerShell.
Just "Share the VHD" to get the SAS link, then "Import from shared link" and paste the SAS link you got earlier. Check at the bottom; you'll see a script icon. Hover over it and the script shows up.
However, in the trial you can't copy the script, so you'll need to type it by hand, but it is not very long.
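For reference, a blob copy driven from PowerShell generally looks something like the sketch below. This is not the tool's actual output; the account name, key, container, and SAS link are placeholders, and it assumes the Az.Storage module is available:
$destCtx = New-AzStorageContext -StorageAccountName "destaccount" -StorageAccountKey "<destination account key>"
# Start a server-side copy from the shared SAS link into the destination container
Start-AzStorageBlobCopy -AbsoluteUri "<SAS link of the shared VHD>" -DestContainer "vhds" -DestBlob "copied.vhd" -DestContext $destCtx
# Optionally wait for the asynchronous copy to finish
Get-AzStorageBlobCopyState -Container "vhds" -Blob "copied.vhd" -Context $destCtx -WaitForComplete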

Related

How to fix 403 Forbidden error when using New-MailboxImportRequest with blob storage

The Project: Moving a GCC Office365 tenant to Commercial Office365
I'm attempting to use the New-MailboxImportRequest command to import PST files from blob storage. I'm doing it this way so I can specify the source folder within the PST file, since eDiscovery exports create a non-standard file structure within the PST. If I were to import the PST with Network Upload as Microsoft suggests, it would not merge the folders with the correct folders in the Top of Information Store.
Every time I attempt to use the import command, I receive the following output.
Unable to open PST file 'BLOB URL'.
Error details: The remote server returned an error: (403) Forbidden.
Currently the file is sitting in a self-created Azure Blob Storage container. I'm not sure if I have incorrectly created the blob storage, incorrectly set its permissions, or something else.
I've tried creating the blob storage a different way via PowerShell by following this Microsoft article, https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-powershell , but I can't find out how to get the URI and the SAS key needed for the command that I'm attempting to use. The full command is referenced below.
New-MailboxImportRequest -Name "importjob001" -Mailbox "mailboxname" -SourceRootFolder 'malboxname#domain.net (Primary)/Top of Information Store' -TargetRootFolder '/' -IncludeFolders "/*" -ConflictResolutionOption forcecopy -AzureBlobStorageAccountUri "Blob_SAS_URL" -AzureSharedAccessSignatureToken "Blob SAS URL"
Please let me know what I can provide to assist you in helping me with this issue.

Azcopy throws error while executing via Terraform

I am using the AzCopy tool to copy one storage account to another. When I run the command from a terminal it executes perfectly, but when I run the same command using Terraform's local-exec provisioner it throws an error. Please find the code and error below.
Code:
resource "null_resource" "backup" {
provisioner "local-exec" {
command= <<EOF
azcopy cp "https://${var.src_storage_acc_name}.blob.core.windows.net${var.src_sas}" "https://${var.dest_storage_acc_name}.blob.core.windows.net${var.dest_sas}"
EOF
}
}
Error:
Error running command ' azcopy cp "https://strsrc.blob.core.windows.net?[SAS]" "https://strdest.blob.core.windows.net?[SAS]"
': exit status 1. Output: INFO: The parameters you supplied were Source: '"https://strsrc.blob.core.windows.net?[SAS]-REDACTED- of type Local, and Destination: '"https://strdest.blob.core.windows.net?[SAS]-REDACTED- of type Local
INFO: Based on the parameters supplied, a valid source-destination combination could not automatically be found. Please check the parameters you supplied. If they are correct, please specify an exact source and destination type using the --from-to switch. Valid values are two-word phases of the form BlobLocal, LocalBlob etc. Use the word 'Blob' for Blob Storage, 'Local' for the local file system, 'File' for Azure Files, and 'BlobFS' for ADLS Gen2. If you need a combination that is not supported yet, please log an issue on the AzCopy GitHub issues list.
failed to parse user input due to error: the inferred source/destination combination could not be identified, or is currently not supported
Please provide your thoughts on this.
Today I needed to implement a similar task, and I used the azcopy cp command with the --recursive=true option, as given in the documentation.
It successfully copied all contents of the source container to the destination.
Copy all blob containers, directories, and blobs from one storage account to another by using a SAS token:
azcopy cp "https://[srcaccount].blob.core.windows.net?[SAS]" "https://[destaccount].blob.core.windows.net?[SAS]" --recursive=true
azcopy only supports certain combinations of source and destination types (Blob, Gen1, Gen2, S3, local file system, ...) for the copy sub-command.
azcopy tries to guess the source/destination types based on the URL and parameters.
This error means one of two things:
Either you're trying to use a combination that isn't supported. In that case there is nothing you can do except raise an issue as suggested (they'll probably just ignore it like this or this).
Or there is something wrong with your URL, e.g. you have blob.core.windows.net when you should've had dfs.core.windows.net or vice versa. This in turn causes mis-identification of the source and destination types.
If you're sure that the combination is supported, then you can tell azcopy the types using --from-to (see the example after the list of values below). Ironically, when you use a combination that isn't supported (e.g. BlobFSBlobFS), it gives the same error message instead of saying "source/destination combination not supported".
When dealing with Gen2, you could use blob instead of dfs in the URL to make it use older (blob/Gen1) APIs to interact with your Gen2 account. Though less performant, it still might work.
'Blob' for Blob Storage
'Local' for the local file system
'File' for Azure Files
'BlobFS' for ADLS Gen2
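For example, a blob-to-blob copy with the types stated explicitly might look like this (hypothetical account names, SAS placeholders):
azcopy cp "https://srcaccount.blob.core.windows.net/mycontainer?[SAS]" "https://destaccount.blob.core.windows.net/mycontainer?[SAS]" --recursive --from-to=BlobBlob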
As of now, the following combinations are supported per the documentation:
local <-> Azure Blob (SAS or OAuth authentication)
local <-> Azure Files (Share/directory SAS authentication)
local <-> Azure Data Lake Storage Gen 2 (SAS, OAuth, or shared key authentication)
Azure Blob (SAS or public) -> Azure Blob (SAS or OAuth authentication)
Azure Blob (SAS or public) -> Azure Files (SAS)
Azure Files (SAS) -> Azure Files (SAS)
Azure Files (SAS) -> Azure Blob (SAS or OAuth authentication)
Amazon Web Services (AWS) S3 (Access Key) -> Azure Block Blob (SAS or OAuth authentication)
Google Cloud Storage (Service Account Key) -> Azure Block Blob (SAS or OAuth authentication) [Preview]

azcopy in cloud shell fails attempting to copy source storage account contents to target storage account

The error is 'failed to perform copy command due to error: no SAS token or OAuth token is present and the resource is not public'.
The command is formed per the MS documentation page:
azcopy copy "http://source.blob.core.windows.net?sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&se=2020-02-15T02:43:29Z&st=2020-02-14T18:43:29Z&spr=https&sig=Jojd%2FWIfjCza7yNtkt%2FeaFOepyxaunBnjH6O3xCKOto%3D" "http://target.blob.core.windows.net" --recursive
The SAS token was generated within minutes of the attempted execution, and public access on the source storage account is set to container, not private. The command syntax is identical to that in the MS documentation.
I'm thinking that I have overlooked something basic, probably staring me in the face, anyone see anything obvious?
If you're following this section, then please specify a SAS token for the target storage URL as well.
Here is the test with the latest version of azcopy v10.3.4.
The command:
azcopy copy "https://yy7.blob.core.windows.net/?sasToken" "https://yy88.blob.core.windows.net/?sasToken" --recursive
The test result is shown in a screenshot (omitted here).

Referencing data in blob storage

I have set up a PostgreSQL database on a Linux VM in Azure, and I have a .csv file in blob storage that I'd like to upload to that database.
However, I can't find any documentation on how (or even whether it's possible) to reference files stored in blob storage as if they were part of the file system, or otherwise transfer files from blob storage to a server also running in Azure.
All the references I've found are about importing directly into pre-built SQL Server VMs, which is not my problem.
Any references or other help anyone can provide would be much appreciated.
As far as I know, PostgreSQL supports the PROGRAM keyword in its COPY command,
so I suggest you use this keyword to read the CSV file from blob storage.
Normally we use curl to fetch the file; you can download it here:
https://curl.haxx.se/download.html#Linux
For more details, you can refer to the following example:
COPY persons(first_name,last_name,dob,email)
FROM PROGRAM 'curl "https://yourstorageaccount.blob.core.windows.net/mycontainer/test2.csv?sv=2016-05-31&sr=c&sig=jtNRuzR7G98hHogHHZyKY9gYN0r%2FSgr2j78HGKihYlc%3D&st=2017-03-09T02%3A43%3A17Z&se=2017-03-11T02%3A43%3A17Z&sp=rl"'
DELIMITER ',' CSV HEADER;
The result of the query is shown in a screenshot (omitted here).
Here I used a SAS token to protect my blob file.
If you don't want to use a token, you can instead set the container's access level in the portal (screenshot omitted).
Then you can access the file directly by its URL, like this:
https://yourstorageaccount.blob.core.windows.net/mycontainer/test2.csv
If you do want to protect the blob file with a SAS token, you can generate the token in the portal (screenshots omitted) and append it to the blob URL.
For more details, you can refer to the following link:
https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-shared-access-signature-part-2
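If you prefer to generate the SAS token from a script rather than the portal, a hedged sketch using the Az.Storage PowerShell module would be (the account name, key, container, and blob below are placeholders):
$ctx = New-AzStorageContext -StorageAccountName "yourstorageaccount" -StorageAccountKey "<account key>"
# Create a read-only SAS valid for two days and return the full blob URL including the token
New-AzStorageBlobSASToken -Container "mycontainer" -Blob "test2.csv" -Permission r -ExpiryTime (Get-Date).AddDays(2) -FullUri -Context $ctx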

Could not verify the copy source within the specified time. RequestId: (blank)

I am trying to copy some blob files from one storage account to another one. I am using AzCopy in order to fulfill this goal.
The process works for copying files between containers within the same storage account, but not between different storage accounts.
The command I am issuing is:
AzCopy /Source:https://<storage_account1>.blob.core.windows.net/<container_name1>/<path_to_desired_blobs> /Dest:https://<storage_account2>.blob.core.windows.net/<container_name2>/<path_to_store>/ /SourceKey:<source_key> /DestKey:<dest_key> /Pattern:<some_pattern> /S
The error I am getting is the following:
The remote server returned an error: (400) Bad Request.
Could not verify the copy source within the specified time.
RequestId:
Time:2016-04-01T19:33:01.0527460Z
The only difference between the two storage accounts is that one is Standard, whereas the other one is Premium.
Any help will be appreciated!
From your description, you're trying to copy a Block Blob from the source account to a Page Blob in the destination account, which is not supported by the Azure Storage service or by AzCopy.
To work around it, you can first use AzCopy to download the Block Blobs from the source account to the local file system, and then upload them from the local file system to the destination account with the option /BlobType:Page (this option is only valid when uploading from local storage to blob storage).
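As a rough sketch of that two-step workaround (placeholder account, container, path, and key values), using the /Source:/Dest: syntax from the question:
AzCopy /Source:https://sourceaccount.blob.core.windows.net/sourcecontainer /Dest:C:\temp\blobs /SourceKey:<source key> /S
AzCopy /Source:C:\temp\blobs /Dest:https://destaccount.blob.core.windows.net/destcontainer /DestKey:<dest key> /BlobType:Page /S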
Premium Storage only supports page blobs. Please confirm that you are copying page blobs from the standard to the premium storage account. Also, set the /BlobType parameter to "Page" in order to copy the data as page blobs into the destination premium storage account.
From the description, I am assuming your source blob is a block blob. Azure's "Async Copy Blob" operation (which AzCopy uses by default) preserves the blob type; that is, you cannot convert a blob from Block to Page through an async copy.
Instead, can you try AzCopy again with the "/SyncCopy" option along with the "/BlobType:Page" parameter? That might help change the destination blob type to Page.
(If that doesn't work, the only other solution would be to first download the blob and then upload it with "/BlobType:Page".)
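For reference, the suggested invocation would look roughly like the line below (placeholder values mirroring the command in the question; whether /BlobType is honoured for a cloud-to-cloud /SyncCopy transfer is not verified here):
AzCopy /Source:https://sourceaccount.blob.core.windows.net/sourcecontainer /Dest:https://destaccount.blob.core.windows.net/destcontainer /SourceKey:<source key> /DestKey:<dest key> /Pattern:<some pattern> /SyncCopy /BlobType:Page /S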
