AzCopy 409 "Public access is not permitted on this storage account" - Linux

I am trying to copy a file from a Linux virtual machine on Azure (in a virtual network) to a storage account.
It works with azcopy login, but I want to do it with a SAS token.
I added my virtual network under "Networking".
And I generated a SAS key under "Shared access signature".
My Linux virtual machine has the IP address 10.0.3.4.
I run this command:
sudo azcopy cp ./myFile https://backupscanqa.blob.core.windows.net/backup/?[mySASKey]
My log shows the 409 error from the title ("Public access is not permitted on this storage account").
I don't know where the problem is, because when I try the same thing with an OAuth2 connection via azcopy login it works.
Thanks for your help!
Edit:
I tried generating a SAS key on my container with all permissions granted.
When I use it I get the same error.
The generated SAS key has sp=racwdli.

From the logs I can see that the SAS token you are using is incorrect. In your image the token contains only sp=r, whereas it should contain the full set of permissions if you generate the SAS token the way you describe.
I tested the same in my environment and added the firewall rule on the storage account in the same way.
Using a SAS token generated as you describe, the operation succeeds with the command below:
./azcopy copy "/home/ansuman/azcopy_linux_amd64_10.13.0/NOTICE.txt" "https://testansumansa123.blob.core.windows.net/backup?sv=2020-08-04&ss=bfqt&srt=sco&sp=rwdlacupitfx&se=2022-01-27T15:18:31Z&st=2022-01-27T07:18:31Z&spr=https&sig=XXXXXX"
which is in the format
./azcopy copy "SourcePath" "storageaccounturl/container<SASToken>"
As you can see, if the SAS is generated by the method in your image, it will have the permissions sp=rwdlacupitfx, i.e. all permissions on the storage account.
To resolve the issue, please check the SAS token you are using.
If you are generating it from the storage account as shown in your image, you can use the SAS token by appending it to your storage account URL/container.
If you are generating the SAS token from inside the container, make sure to select the necessary permissions from the drop-down, and then you can use the Blob SAS URL.

@AnsumanBal-MT put me on the right track.
As he noticed in the logs, my full SAS key does not appear.
However, I did copy the whole key.
So I understood that everything from the first '&' in the URL onwards was not being taken into account: the shell was interpreting the '&' characters.
After adding a '\' before each '&', the command worked correctly!
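For reference, the same command also works if the destination URL is simply quoted instead of escaping each '&' (a sketch using the placeholder token from the question):
sudo azcopy cp ./myFile "https://backupscanqa.blob.core.windows.net/backup/?[mySASKey]"
Without quotes or backslashes, the shell treats each '&' in the SAS token as a command separator, so everything after the first '&' never reaches azcopy.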
Thank you @AnsumanBal-MT!

Related

How to Access Azure Table Storage using Shared Access Key in Postman?

I use the SAS key to make a GET request for the table, but I get the error "This request is not authorized to perform this operation.". What values should I fill in on the Authorization tab to make this work?
The header x-ms-blob-type: BlockBlob works for blob storage, but I need the equivalent key and value for Table storage.
If you are using SAS, you don't need any API key. Set Authorization to 'Inherit Auth from Parent'.
Below are the settings I used to get my Table SAS.
And below are the settings in my Postman
The URL I used for my query was https://anumystorage.table.core.windows.net/mytable(PartitionKey='A',RowKey='B')<your_sas_token>, where <your_sas_token> is replaced with your own SAS starting with '?sv=...'.
The information on the Table storage REST API can be found here.
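Outside Postman, a minimal sketch of the same GET with Python's requests library (the account, table and keys mirror the example URL above; the SAS value is a placeholder):

import requests

# Mirrors the example URL above -- replace the account, table, keys and SAS token with your own
account = "anumystorage"
table = "mytable"
sas = "?sv=..."  # your Table SAS token, starting with '?sv='

url = f"https://{account}.table.core.windows.net/{table}(PartitionKey='A',RowKey='B'){sas}"

# The SAS token in the URL authorizes the request, so no Authorization header is needed;
# the Accept header asks the Table service to return JSON.
resp = requests.get(url, headers={"Accept": "application/json;odata=nometadata"})
print(resp.status_code)
print(resp.text)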
I tried to reproduce the same in my environment and got the results below successfully:
I created an Azure storage account and a table:
I generated the SAS URL by checking the options below:
Copy the Azure Table Storage SAS:
To access the table, include the table name in the URL and append the SAS token:
https://<StorageAccount>.table.core.windows.net/<TableName>?<SAS-token>
Response: the query returns the table entities successfully.

Unable to access Azure container through browser

This is the error shown in the browser when I hit the URL of the container:
This XML file does not appear to have any style information associated with it. The document tree is shown below.
<Error>
<Code>ResourceNotFound</Code>
<Message>The specified resource does not exist. RequestId:3fc3c275-301e-000f-3193-f99692000000 Time:2022-11-16T08:13:12.8837824Z</Message>
</Error>
But I am able to access the blob when I hit the URL of the blob itself.
I tried to reproduce the same in my environment and got the same error as below:
To resolve this issue, access the container URL with a SAS token, like below.
Generate a SAS token and include it in this URL:
https://<storage-account-name>.blob.core.windows.net/<containername>?restype=container&comp=list&<sas-token>
When I ran this, I got the listing successfully:
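The same listing call can also be made from code; a minimal sketch with Python's requests (the account, container and SAS token are the placeholders from the URL above):

import requests

# Placeholders from the URL above -- substitute your own values (SAS token without the leading '?')
account = "<storage-account-name>"
container = "<containername>"
sas_token = "<sas-token>"

# restype=container&comp=list asks the Blob service for the container's blob listing
url = f"https://{account}.blob.core.windows.net/{container}?restype=container&comp=list&{sas_token}"
resp = requests.get(url)
print(resp.status_code)
print(resp.text)  # XML listing of the blobs in the container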
Right-click your container, select "Change access level", and you are done!

Download a file from sas token

I have created a storage account with a container named data.
In that container I have a single .zip file.
I'm generating an Account Key SAS Token with Read permission directly on the data container :
The Blob SAS URL looks like this :
https://<STORAGE_ACCOUNT>.blob.core.windows.net/data?sp=r&st=2022-06-06T15:23:31Z&se=2022-06-06T23:23:31Z&spr=https&sv=2020-08-04&sr=c&sig=<SIGNATURE>
How am I supposed to download my zip file from that URI?
I keep running into an authorization error, even though I thought having the link was enough, and unfortunately the documentation didn't help me figure out what's wrong.
I would like to download the file with a plain HTTP call, not using AzCopy or PowerShell.
From your description and the URL you provided, I guess the issue is that you didn't include the name of the zip file in the URL.
so instead of
https://<STORAGE_ACCOUNT>.blob.core.windows.net/data?sp=r&st=2022-06-06T15:23:31Z&se=2022-06-06T23:23:31Z&spr=https&sv=2020-08-04&sr=c&sig=<SIGNATURE>
try
https://<STORAGE_ACCOUNT>.blob.core.windows.net/data/zipName?sp=r&st=2022-06-06T15:23:31Z&se=2022-06-06T23:23:31Z&spr=https&sv=2020-08-04&sr=c&sig=<SIGNATURE>
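To do the same from code rather than the browser, a minimal sketch with Python's requests (the blob name myFile.zip is a hypothetical name for the single zip file in the data container):

import requests

# Placeholders: your storage account, the container SAS query string (sr=c, sp=r) and the blob name
account = "<STORAGE_ACCOUNT>"
sas = "sp=r&st=2022-06-06T15:23:31Z&se=2022-06-06T23:23:31Z&spr=https&sv=2020-08-04&sr=c&sig=<SIGNATURE>"
blob_name = "myFile.zip"  # hypothetical name of the zip file inside the 'data' container

url = f"https://{account}.blob.core.windows.net/data/{blob_name}?{sas}"

resp = requests.get(url)  # plain HTTP GET; the container SAS authorizes reading any blob in 'data'
resp.raise_for_status()
with open(blob_name, "wb") as f:
    f.write(resp.content)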

How to get GS_SECRET_ACCESS_KEY and GS_ACCESS_KEY_ID in google cloud storage

I am trying to read data from a public Google Cloud Storage bucket (https://console.cloud.google.com/storage/browser/gcp-public-data-landsat) containing Landsat images.
I am using Python 3 with GDAL.
But I get this error:
ERROR 15: GS_SECRET_ACCESS_KEY+GS_ACCESS_KEY_ID, GS_OAUTH2_REFRESH_TOKEN or GOOGLE_APPLICATION_CREDENTIALS or GS_OAUTH2_PRIVATE_KEY+GS_OAUTH2_CLIENT_EMAIL configuration options and /home/qwerty/.boto not defined
How can I get GS_SECRET_ACCESS_KEY and GS_ACCESS_KEY_ID?
gs_access_key_id and gs_secret_access_key can be generated by:
1. Navigating to your Google Cloud Storage console and clicking on Settings.
2. On the Settings page, navigating to the Interoperability tab.
3. On this page, choosing a service account and clicking "Create a new key" to generate an access key and a matching secret.
Output:
New service account HMAC key
service-account@project.iam.gserviceaccount.com
Access key
xxxxxxxxxxxxxxxxxxx
Secret
xxxxxxxxxxxxxxxxxxx
Copy this key's secret if you'll need it in the future. Once you close this dialog, the secret can't be recovered.
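Once you have the pair, a minimal sketch of handing the two values to GDAL from Python (the key values and the object path are placeholders):

from osgeo import gdal

# Placeholders -- use the access key and secret generated on the Interoperability tab
gdal.SetConfigOption("GS_ACCESS_KEY_ID", "<your-access-key>")
gdal.SetConfigOption("GS_SECRET_ACCESS_KEY", "<your-secret>")

# Objects in the public Landsat bucket are read through GDAL's /vsigs/ virtual file system
path = "/vsigs/gcp-public-data-landsat/<path-to-a-band>.TIF"  # replace with a real object path
ds = gdal.Open(path)
if ds is not None:
    print(ds.RasterXSize, ds.RasterYSize)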
An alternative authentication approach for reading (and optionally writing) GeoTIFFs in GCS.
In the GCP Console:
Create a Service Account and download a JSON keyfile. Keep it safe.
In the Storage bucket browser, on the Permissions tab, give the new service account the roles:
Storage Legacy Bucket Owner and
Storage Legacy Bucket Reader
import rasterio

# path_to_credentials points to your Service Account JSON key file
path_to_credentials = "/path/to/keyfile.json"
GEOTIF_PATH = "/vsigs/<my-bucket>/<objectname>"

# Pass the credentials to GDAL/rasterio for the duration of the Env block
with rasterio.Env(GOOGLE_APPLICATION_CREDENTIALS=path_to_credentials):
    with rasterio.open(GEOTIF_PATH) as src:
        print(src.width, src.height)
        profile = src.profile
        print(profile)
However, sometimes I am unable to open a file that I think should be openable. Also, if the path is incorrect, for example including "gs://" or missing the /vsigs prefix, I get this error:
CPLE_NotSupportedError: CPLRSASHA256Sign() not implemented: GDAL must be built against libcrypto++ or libcrypto (openssl)
But that can't be right, or it wouldn't work at all. Restarting the Jupyter notebook seemed to clear it up.
I am running GDAL 2.3.3 (released 2018/12/14) and Python 3.7.11 on Windows.

SAS URLs not working

I'm trying to create a SAS URL for a blob storage container. I've tried multiple storage accounts and multiple methods of creating the SAS, and all of them give this result when I test the SAS URL in a browser:
<Error>
<Code>AuthenticationFailed</Code>
<Message>
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:d95bf34f-0001-0022-4430-b1a25b000000 Time:2016-05-18T18:12:30.5552096Z
</Message>
<AuthenticationErrorDetail>
Signature did not match. String to sign used was rl 2016-05-18T18:10:00Z 2016-05-19T18:10:00Z /blob/cloudappmanager/$root 2015-04-05
</AuthenticationErrorDetail>
</Error>
I tried Storage Explorer (right-click container, Get SAS, click OK with defaults):
I tried the old Storage Explorer:
And I tried PowerShell:
PS C:\Users\virklba> $context = New-AzureStorageContext -StorageAccountName msuscoreaprod
cmdlet New-AzureStorageContext at command pipeline position 1
Supply values for the following parameters:
(Type !? for Help.)
StorageAccountKey: xxxxxxxxx
PS C:\Users\virklba> New-AzureStorageContainerSASToken -Name aadlogs -Context $context -FullUri -Permission rl
https://msuscoreaprod.blob.core.windows.net/aadlogs?sv=2015-04-05&sr=c&sig=xxxxxxxx&se=2016-05-18T19%3A47%3A56Z&sp=rl
All with the same result. Is anyone else seeing this behavior, or is it just me?
You are creating a SAS on the container, and it looks like you are trying to read the container in the browser. When I paste the container SAS into the browser, I get the same error you are getting.
The container SAS (with read permissions) gives you read access to the blobs in the container. So you need to append a blob name to the SAS before you paste it into the browser, in order to read a blob.
For example, this will not work:
https://myaccount.blob.core.windows.net/lotsofblobs?st=2016-05-18T22%3A49%3A00Z&se=2016-05-19T22%3A59%3A00Z&sp=rl&sv=2015-04-05&sr=c&sig=62WHwaZGI60ub1hYcQyKg1%2FE%2F1w9HUrOPGorzoWDLvE%3D
This does work, with myblob.txt appended to the base URL:
https://myaccount.blob.core.windows.net/lotsofblobs/myblob.txt?st=2016-05-18T22%3A49%3A00Z&se=2016-05-19T22%3A59%3A00Z&sp=rl&sv=2015-04-05&sr=c&sig=62WHwaZGI60ub1hYcQyKg1%2FE%2F1w9HUrOPGorzoWDLvE%3D
Please also see Gaurav Mantri's detailed explanation here: Azure Shared Access Signature - Signature did not match
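The same idea in code: a small sketch that splices a blob name into the container SAS URL before requesting it (the URL and blob name are the placeholder values from the example above):

import requests

# Container-level SAS URL (sr=c) as produced by Storage Explorer or PowerShell -- placeholder values
container_sas_url = "https://myaccount.blob.core.windows.net/lotsofblobs?st=...&se=...&sp=rl&sv=2015-04-05&sr=c&sig=..."

# Insert the blob name between the container path and the SAS query string
base, query = container_sas_url.split("?", 1)
blob_url = f"{base}/myblob.txt?{query}"

print(requests.get(blob_url).status_code)  # succeeds once a blob name is appended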
To fix this, try connecting to the storage account first, and then to the blob.
