AzCopy in Cloud Shell fails attempting to copy source storage account contents to target storage account

The error is: 'failed to perform copy command due to error: no SAS token or OAuth token is present and the resource is not public'
The command is formed per the MS doc page:
azcopy copy "http://source.blob.core.windows.net?sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&se=2020-02-15T02:43:29Z&st=2020-02-14T18:43:29Z&spr=https&sig=Jojd%2FWIfjCza7yNtkt%2FeaFOepyxaunBnjH6O3xCKOto%3D" "http://target.blob.core.windows.net" --recursive
The SAS token was generated within minutes of the attempted execution, and public access on the source storage account is set to container, not private. The command syntax is identical to that in the MS documentation.
I'm thinking I have overlooked something basic, probably staring me in the face. Does anyone see anything obvious?

If you're following that documentation, then please specify a SAS token for the target storage URL as well.
Here is a test with the latest version of AzCopy, v10.3.4.
The command:
azcopy copy "https://yy7.blob.core.windows.net/?sasToken" "https://yy88.blob.core.windows.net/?sasToken" --recursive
The test result (screenshot not reproduced): the copy completed successfully.

Related

How to use SAS Token in Azure File Copy Pipeline Task?

I am trying to copy ADF ARM templates to a storage account using the Azure File Copy Task in my Azure release pipeline. Since the storage account has firewall and networking set up, I want to use the SAS token to allow the Pipeline agent to copy the files to the storage account.
However, I am not able to find any documentation on how to pass the SAS token as an optional argument (or anywhere else).
The task version is 2.*.
How do I use the SAS token for copying the files?
I changed the file copy task version to 4.*, and then I was able to add
--sas-token=$(sasToken) to the Optional Arguments, and it worked for me.

ADLS storage to ADLS storage file transfer

I am trying to send files from one ADLS storage account to another using azcopy on a remote on-premises server (a weird requirement, but needed).
azcopy cp 'https://mysourceaccount.dfs.core.windows.net/mycontainer?sxxxxxx' 'https://mydestinationaccount.dfs.core.windows.net/mycontainer' --recursive
This throws an error (the error output screenshot is not reproduced here).
I tested this in my environment and it is working for me. Please use the command formatted below (you were missing the SAS token for the second container and --recursive=true):
azcopy copy "https://tstadlsstorage1.dfs.core.windows.net/testcontainer?SAS_token_for_container_in_source_storageaccount" "https://tstadlsstorage2.dfs.core.windows.net/testcontainer2?SAS_token_for_container_in_destination_storageaccount" --recursive=true
Output: (screenshot not reproduced)

Using AzCopy with only a connection string

I am provided with only a 'connection string' for azcopy.
Connection string: DefaultEndpointsProtocol=https;AccountName=someaccoutname;AccountKey=someaccountkey;EndpointSuffix=core.windows.net
URL: https://someaccoutname.blob.core.windows.net/somename
I do not have a SAS token or access to the Azure portal to create one.
How can I use AzCopy to sync a folder on a VM to that Azure storage account with only the connection string?
Use Azure Storage Explorer to obtain the SAS you need; download and install it first.
When it opens, connect to your storage account using the connection string, navigate to the container and keep it selected in the left containers pane.
Bottom left -> change from Properties tab to Actions -> Get Shared Access Signature (that's SAS)
Set the expiry date/time and check the permissions you'll need (Add, Write, etc. if you want to upload something, and probably Delete as well if you want to overwrite older files). Create.
Copy the URL. This will be your destination string, with the SAS included.
Note: if there's a $ sign in it, replace it with "%24"; at least on Linux that seems to be required.
Now form your azcopy command (uploading a folder here):
azcopy copy --from-to=LocalBlob "localfolder/" "destination-with-sas" --recursive
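For concreteness, a hypothetical filled-in version using the account and container names from the question (the query-string values are placeholders, not a real SAS; note the %24 substitution if the token contains a $):
azcopy copy --from-to=LocalBlob "localfolder/" "https://someaccoutname.blob.core.windows.net/somename?sv=2020-08-04&sp=rwd&se=2024-12-31T23:59:59Z&sig=aaa%24bbb" --recursive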
That is simple.
The connection string contains the two pieces of information you need:
[account] = someaccoutname
[accesskey] = something like 2iusdofiausd98273412934213/fsdf23409237409dfoasihdfasir9028742hvhxczoivhsadfSFAOIf34Jq==
azcopy cp "https://[account].blob.core.windows.net/folder/subfolder/file.txt?[accesskey from connection string, it ends with ==]" "c:\temp\"
You can use az storage container generate-sas to generate the SAS token; see https://learn.microsoft.com/en-us/azure/applied-ai-services/form-recognizer/generate-sas-tokens
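For example, a minimal sketch (the account and container names are taken from the question above; the key is the AccountKey field of the connection string):
az storage container generate-sas \
    --account-name someaccoutname \
    --account-key "<AccountKey from the connection string>" \
    --name somename \
    --permissions acdlrw \
    --expiry 2025-01-01T00:00Z \
    --https-only \
    --output tsv
Append the returned token to the container URL (https://someaccoutname.blob.core.windows.net/somename?<token>) and use that as the azcopy destination.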

Azcopy error "This request is not authorized to perform this operation."

I copied a container to another storage account (Data Lake Storage Gen2) based on the document linked below.
When trying, I got the following error:
This request is not authorized to perform this operation using this permission.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
If you are using an AAD token, this error is telling you that you need to add a role assignment to the user. Please go to Storage account -> Access Control (IAM) -> Add -> Add role assignment, then add Storage Blob Data Owner to your login account.
If this problem persists, please provide more details.
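If you prefer the CLI over the portal, a sketch of the equivalent role assignment (the assignee, subscription, resource group, and account names are placeholders):
az role assignment create \
    --assignee "user@contoso.com" \
    --role "Storage Blob Data Owner" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
Note that role assignments can take a few minutes to propagate before AzCopy succeeds.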
I also faced the same problem. For it to work, I just logged out and logged back in with the azcopy CLI after applying BowmanZhu's solution:
azcopy logout
azcopy login --tenant-id xxxx-xxxx-xxxx
If you don't want to log in that way, there is always the option of adding a SAS token at the end of the URL. If you don't want to attach the token every time, you can set up permanent access by going through any one of the steps in the official documentation page.
After granting myself the Storage Blob Data Owner role on the container, AzCopy behaved itself and succeeded in copying a file to the blob storage container.
Go to storage account -> container -> Access Control (IAM) -> Add role assignment -> Storage Blob Data Owner.
In my case, my Azure storage account's VNet/firewall settings were blocking AzCopy from copying the data to the storage account.
I added my client IP to the firewall's allowed addresses.
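A sketch of the same fix with the Azure CLI (the resource group, account name, and IP are placeholders):
az storage account network-rule add \
    --resource-group <resource-group> \
    --account-name <storage-account> \
    --ip-address <your-client-ip>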
The SAS token has probably expired.
When I had this, I discovered it was because I'd used Azure Storage Explorer to generate a SAS that didn't have Read permission, and I think AzCopy was trying to read the size/existence of a blob before writing it.
I got a clue from https://github.com/Azure/azure-storage-azcopy/issues/790, but ultimately I just regenerated a new SAS with Read permission and it worked.
I probably could have modified the C# code using the Azure Data Movement library to skip the length check, but the spec was later changed to "don't overwrite", so the Read permission is probably needed anyway.
Give appropriate permissions (Read, Write, Create) when generating SAS tokens.
Had a similar issue. Here's how it was resolved.
Command used was .\azcopy.exe copy "C:\Users\kriof\Pictures" "https://test645676535storageaccount.blob.core.windows.net/images?sp=rw&st=2022-02-23T11:03:50Z&se=2022-02-23T19:03:50Z&spr=https&sv=2020-08-04&sr=c&sig=QRN%2SMFtU3zaUdd4adRddNFjM2K4ik7tNPSi2WRL0%3D"
The SAS token had the default (Read) permission only. Adding Write permission in the Azure portal resolved the issue (note the sp=rw in the token above).

Azure cross-account copy using AzCopy and Shared Access Key

I want to use AzCopy to copy a blob from account A to account B. But instead of using the access key for the source, I only have access to a Shared Access Signature. I've tried appending the SAS after the URL, but it throws a 404 error. This is the syntax I tried:
AzCopy "https://source-blob-object-url?sv=blah-blah-blah-source-sas" "https://dest-blob-object-url" /destkey:base64-dest-access-key
The error I got was
Error parsing source location "https://source-blob-object-url?sv=blah-blah-blah-source-sas":
The remote server returned an error: (404) Not Found.
How can I get AzCopy to use the SAS URL? Or does it not support SAS?
Update:
With the SourceSAS and FilePattern options, I'm still getting the 404 error. This is the command I use:
AzCopy [source-container-url] [destination-container-url] [file-pattern] /SourceSAS:"?sv=2013-08-15&sr=c&si=ReadOnlyPolicy&sig=[signature-removed]" /DestKey:[destination-access-key]
This will get me a 404 Not Found. If I change the signature to make it invalid, AzCopy will throw a 403 Forbidden instead.
You're correct. A copy operation using SAS on both source and destination blobs is only supported when the source and destination blobs are in the same storage account. Copying across storage accounts using SAS is still not supported by Windows Azure Storage. This has been covered (though in a one-liner only) in this blog post from the storage team: http://blogs.msdn.com/b/windowsazurestorage/archive/2013/11/27/windows-azure-storage-release-introducing-cors-json-minute-metrics-and-more.aspx. From the post:
Copy blob now allows Shared Access Signature (SAS) to be used for the
destination blob if the copy is within the same storage account.
UPDATE
So I tried it, and one thing I realized is that it is meant for copying all blobs from one container to another. Based on my trial and error, a few things you need to keep in mind are:
The source SAS is for the source container, not the blob. Also ensure that you have both Read and List permissions on the blob container in the SAS.
If you want to copy a single file, ensure that it is specified as the "filepattern" parameter.
Based on these, can you please try the following:
AzCopy "https://<source account>.blob.core.windows.net/<source container>?<source container sas with read/list permission>" "https://<destination account>.blob.core.windows.net/<destination container>" "<source blob name to copy>" /DestKey:"destination account key"
UPDATE 2
Error parsing source location [container-location]: Object reference
not set to an instance of an object.
I was able to recreate the error. I believe the reason for this error is the version of the storage client library (and thus the REST API version) used to create the SAS token. If I try to list the contents of a blob container using a SAS token created with version 3.x of the library, this is the output I get:
<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ServiceEndpoint="https://cynapta.blob.core.windows.net/" ContainerName="vhds">
<Blobs>
<Blob>
<Name>test.vhd</Name>
<Properties>
<Last-Modified>Fri, 17 May 2013 15:23:39 GMT</Last-Modified>
<Etag>0x8D02129A4ACFFD7</Etag>
<Content-Length>10486272</Content-Length>
<Content-Type>application/octet-stream</Content-Type>
<Content-Encoding />
<Content-Language />
<Content-MD5>uflK5qFmBmek/zyqad7/WQ==</Content-MD5>
<Cache-Control />
<Content-Disposition />
<x-ms-blob-sequence-number>0</x-ms-blob-sequence-number>
<BlobType>PageBlob</BlobType>
<LeaseStatus>unlocked</LeaseStatus>
<LeaseState>available</LeaseState>
</Properties>
</Blob>
</Blobs>
<NextMarker />
</EnumerationResults>
However, if I try to list the contents of a blob container using a SAS token created with version 2.x of the library, this is the output I get:
<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ContainerName="https://cynapta.blob.core.windows.net/vhds">
<Blobs>
<Blob>
<Name>test.vhd</Name>
<Url>https://cynapta.blob.core.windows.net/vhds/test.vhd</Url>
<Properties>
<Last-Modified>Fri, 17 May 2013 15:23:39 GMT</Last-Modified>
<Etag>0x8D02129A4ACFFD7</Etag>
<Content-Length>10486272</Content-Length>
<Content-Type>application/octet-stream</Content-Type>
<Content-Encoding />
<Content-Language />
<Content-MD5>uflK5qFmBmek/zyqad7/WQ==</Content-MD5>
<Cache-Control />
<x-ms-blob-sequence-number>0</x-ms-blob-sequence-number>
<BlobType>PageBlob</BlobType>
<LeaseStatus>unlocked</LeaseStatus>
<LeaseState>available</LeaseState>
</Properties>
</Blob>
</Blobs>
<NextMarker />
</EnumerationResults>
Notice the difference in the <EnumerationResults> XElement.
Now, AzCopy uses version 2.1.0.4 of the storage client library. As part of the copy operation, it first lists the blobs in the source container using the SAS token. As we saw above, the XML returned differs between the two versions, so storage client library 2.1.0.4 fails to parse the XML returned by the storage service. Because it fails to parse the XML, it cannot create a Blob object, and you get the NullReferenceException.
Solution:
One possible solution to this problem is to create the SAS token using version 2.1.0.4 of the library. I tried that and was able to successfully copy the blob. Do give it a try; that should fix the problem you're facing.
Make sure you are using the latest version of AzCopy, and check this: http://blogs.msdn.com/b/windowsazurestorage/archive/2013/09/07/azcopy-transfer-data-with-re-startable-mode-and-sas-token.aspx
/DestSAS and /SourceSAS: these options allow access to storage containers and blobs with a SAS (Shared Access Signature) token. A SAS token, generated by the storage account owner, grants access to specific containers and blobs with specific permissions for a specified period of time.
Example: upload all files from a local directory to a container using a SAS token that grants List and Write permissions:
AzCopy C:\blobData https://xyzaccount.blob.core.windows.net/xyzcontainer /DestSAS:"?sr=c&si=mypolicy&sig=XXXXX" /s
/DestSAS here is where you specify the SAS token used to access the storage container; it should be enclosed in quotes.
You can use IaaS Management Studio to generate the PowerShell script for you. It is a commercial tool, but you can do that in the trial version. It does not use AzCopy, though, but the classic blob API in PowerShell.
Just "Share the VHD" to get the SAS link. Then use "Import from shared link" and paste the SAS link you got earlier. Check at the bottom and you'll see a script icon; put your cursor on it and the script shows up.
However, in the trial you can't copy the script; you'll need to type it by hand, but it is not very long.
