How to Access Azure Table Storage using Shared Access Key in Postman?

I use the SAS key to make a GET request for the table, but I get the error "This request is not authorized to perform this operation.". What values should I fill in on the authorization tab to work this out?
The x-ms-blob-type: BlockBlob header works for Blob storage, but I need the equivalent header key and value for Table storage.

If you are using SAS, you don't need any API key. Set Authorization to 'Inherit Auth from Parent'.
Below are the settings I used to get my Table SAS.
And below are the settings in my Postman
The URL I used for my query was https://anumystorage.table.core.windows.net/mytable(PartitionKey='A',RowKey='B')<your_sas_token>. Replace <your_sas_token> with your own SAS token, which starts with '?sv=...'.
More information on the Table Storage REST API can be found here.
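The URL construction described above can be sketched as follows; the account name, table name, and entity keys are the placeholder values from this answer, and the SAS token value is illustrative:

```python
# Sketch: build a Table storage entity query URL with a SAS token appended.
def build_table_query_url(account: str, table: str,
                          partition_key: str, row_key: str,
                          sas_token: str) -> str:
    # A SAS token already starts with '?sv=...', so it is appended as-is
    # and becomes the query string of the request.
    entity = f"{table}(PartitionKey='{partition_key}',RowKey='{row_key}')"
    return f"https://{account}.table.core.windows.net/{entity}{sas_token}"

url = build_table_query_url("anumystorage", "mytable", "A", "B",
                            "?sv=2021-06-08&sig=REDACTED")
print(url)
```

When sending the GET request, an Accept header such as application/json;odata=nometadata is typically also needed for Table storage to return JSON.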

I tried to reproduce the same in my environment and got the below results successfully:
I created an Azure Storage Account and created Table storage:
I generated SAS URL by checking the below options:
Copy the Azure Table Storage SAS:
To access Azure Table Storage, include the table name in the URL and append the SAS token, like below:
https://<StorageAccount>.table.core.windows.net/<TableName>?<SASToken>
Response:

Related

REST operations on Azure Data Lake Gen2

I want to perform operations in Azure Data Lake Gen2 using REST. I have a service principal with a client secret that has Owner access on the storage account.
I am confused about how to construct the requests for these operations, and I can't find a proper example demonstrating it.
What I want to do is:
Get access token
Make a put request with bearer authentication method
Below are the documents I am referring to:
Access token
Put blob operation
I want to do it through Postman. It would be really helpful if someone could suggest how.
I tried to reproduce the same in my environment and got below results:
I created one service principal named DataLake and added API permissions as below:
Now, I granted Storage Blob Data Contributor role to that service principal at storage account level like below:
Go to Azure Portal -> Storage Accounts -> Your storage account -> Access Control (IAM) -> Add role assignment -> Storage Blob Data Contributor
To generate the access token via Postman, I used below parameters:
POST https://login.microsoftonline.com/<tenantID>/oauth2/v2.0/token
client_id:<appID>
grant_type:client_credentials
client_secret:<secret>
scope: https://storage.azure.com/.default
Response:
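The token request above can be sketched as follows; the tenant ID, app ID, and secret are placeholders. POSTing the form as application/x-www-form-urlencoded to the URL returns a JSON body whose access_token field is the bearer token:

```python
# Sketch: assemble the client-credentials token request shown above.
def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    form = {
        "client_id": client_id,
        "client_secret": client_secret,
        "grant_type": "client_credentials",
        "scope": "https://storage.azure.com/.default",
    }
    return url, form

url, form = build_token_request("<tenantID>", "<appID>", "<secret>")
print(url)
```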
When I ran the below request with the above bearer token, I got Status 201 Created like below:
PUT https://<storageaccname>.blob.core.windows.net/<container_name>/test.txt
Authorization:Bearer <token>
x-ms-version:2017-11-09
x-ms-blob-type:BlockBlob
Response:
You need to attach the file in Postman before running the query like below:
When I checked the same in Azure Portal, file uploaded to storage account successfully like below:
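The Put Blob request above can be sketched as follows; the account, container, blob name, and token are placeholders matching this answer:

```python
# Sketch: URL and headers for the Put Blob call shown above.
def build_put_blob_request(account: str, container: str, blob: str, token: str):
    url = f"https://{account}.blob.core.windows.net/{container}/{blob}"
    headers = {
        "Authorization": f"Bearer {token}",
        "x-ms-version": "2017-11-09",
        "x-ms-blob-type": "BlockBlob",  # required when uploading a block blob
    }
    return url, headers

url, headers = build_put_blob_request("mystorage", "mycontainer",
                                      "test.txt", "<token>")
print(url)
```

The request body is the file content itself, which is what attaching the file in Postman provides.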

How can I import all records from an Airtable table using an Azure Synapse Analytics pipeline rather than just retrieving the first 100?

When using the REST integration in an Azure Synapse pipeline and supplying the proper authorization (api_key), I'm only getting 100 records loaded into my Azure Synapse data sink. How do I ensure all records are imported?
Airtable's JSON response includes a pagination offset. On the Source tab of the Copy data step in Synapse, under Pagination rules, select QueryParameter, enter offset (no quotes) into the field next to QueryParameter, and enter $['offset'] (no quotes) into the Value. That's it: no need for a relative URL or a parameter configuration. The pagination rule tells Synapse to look for the data element offset in the response and to keep fetching more data until a response no longer contains that element in the JSON. See the screenshot below; the second screenshot shows the authorization configuration.
The authorization configuration for the Airtable API is shown below. It causes Synapse to include the HTTP header "Authorization: Bearer <api_key>" on requests to the Airtable API. Just replace <api_key> with your Airtable API key, which can be found and/or created under your account settings in Airtable.
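The offset-based pagination Synapse performs can be sketched as follows, assuming the Airtable response shape {"records": [...], "offset": "..."}; fetch_page is a stand-in for the HTTP call, taking the current offset (None for the first page):

```python
# Sketch: keep requesting pages until a response omits the "offset" key.
def fetch_all_records(fetch_page):
    records, offset = [], None
    while True:
        page = fetch_page(offset)
        records.extend(page.get("records", []))
        offset = page.get("offset")
        if offset is None:  # last page: Airtable omits "offset"
            return records

# Stub illustrating the stop condition with two fake pages:
pages = {None: {"records": [1, 2], "offset": "abc"},
         "abc": {"records": [3]}}
print(fetch_all_records(lambda off: pages[off]))
```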

Azcopy 409 Public access is not permitted on this storage account

I'm trying to copy a file from a Linux virtual machine on Azure (in a virtual network) to a storage account.
With azcopy login it works, but I want to do it with a SAS token.
I added my virtual network under "Networking".
And I generated a SAS key under "Shared access signature".
My Linux virtual machine has the IP address 10.0.3.4.
I run this command sudo azcopy cp ./myFile https://backupscanqa.blob.core.windows.net/backup/?[mySASKey]
In my log I have this:
I don't know where the problem is, because when I try the same thing with OAuth2 via azcopy login, it works.
Thanks for your help !
Edit:
I tried to generate a SAS key on my container with all grants:
When I use it, I get the same error:
My SAS key's permission field comes out as sp=racwdli
From the logs I can see the SAS token you are using is incorrect: it contains only sp=r, whereas it should contain the full set of permissions if you generated the SAS token as you mentioned.
I tested the same in my environment, adding a firewall on the storage account like this:
Using a SAS token generated as you mentioned, the operation succeeded with the below command:
./azcopy copy "/home/ansuman/azcopy_linux_amd64_10.13.0/NOTICE.txt" "https://testansumansa123.blob.core.windows.net/backup?sv=2020-08-04&ss=bfqt&srt=sco&sp=rwdlacupitfx&se=2022-01-27T15:18:31Z&st=2022-01-27T07:18:31Z&spr=https&sig=XXXXXX"
which is in the format:
./azcopy copy "SourcePath" "<storageaccounturl>/<container><SASToken>"
As you can see, a SAS generated by the method in your image has the permissions sp=rwdlacupitfx, which is every permission on the storage account.
To resolve the issue, please check the SAS token you are using.
If you are generating it from the storage account as shown in your image, you can use the SAS token by appending it to your storage account URL/container.
If you are generating the SAS token from inside the container, please make sure to select the necessary permissions from the drop-down as shown below, and then you can use the Blob SAS URL:
@AnsumanBal-MT put me on the right track.
As he rightly noticed in the logs, my full SAS key does not appear.
However, I had copied my whole key.
So I understood that the characters from the first '&' in the URL onward were not being taken into account by the shell.
After adding '\' before each '&' (or quoting the whole URL), the command worked correctly!
Thank you @AnsumanBal-MT!
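The shell-quoting fix above can be sketched as follows; the SAS values are placeholders. An unquoted '&' makes the shell treat everything after it as a separate background command, silently truncating the URL, so the URL must be passed as a single quoted argument:

```python
# Sketch: show the safely quoted command line for the azcopy call above.
import shlex

sas_url = ("https://backupscanqa.blob.core.windows.net/backup"
           "?sv=2020-08-04&sp=racwdli&sig=XXXX")
cmd = ["azcopy", "copy", "./myFile", sas_url]
quoted = shlex.join(cmd)  # quotes the URL because it contains '&'
print(quoted)
```

Running the command via subprocess.run(cmd) with the argument list also avoids the problem entirely, since no shell parsing happens.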

Creating an Azure Blob Container using a SAS

I'm trying to create a new container in a Blob Storage account using the Create Container API.
https://myaccount.blob.core.windows.net/mycontainer?restype=container
I can't get this to work; I'm struggling to get the format of the Authorization header right. Other blob services I've used allow this to be passed as a query parameter.
I have the SAS token, similar to ?sv=2019-12-12&ss=bfqt&srt=sco&sp=rwdlacupx&se=2022-02-01T16:52:59Z&st=2021-02-02T08:52:59Z&spr=https&sig=r4%2B7dlSfSO8kyd8mKawHhXNtRzInq7YI%2FIbqSr1g%2FqE%3D
How do I form the Authorization header correctly to pass this?
Thanks.
To create a blob container using the Create Container REST API with a SAS token, you don't need to add an "Authorization" header at all.
Assuming you have a correct SAS token, the request URL should look like this (note: remove the leading "?" from the SAS token, since the query string already starts with restype=container):
https://myaccount.blob.core.windows.net/mycontainer?restype=container&<your_sas_token>
And in the headers, you just need to pass x-ms-date and x-ms-version.
Here is the test by using Postman:
By the way, here is the screenshot about how to generate the "sas token" for creating blob container:
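The URL assembly described in this answer can be sketched as follows; the account and container names are the placeholders used above:

```python
# Sketch: combine restype=container with a SAS token in one query string.
def build_create_container_url(account: str, container: str,
                               sas_token: str) -> str:
    # Drop the SAS token's leading '?': the query string already starts
    # with restype=container, so the token is appended with '&'.
    return (f"https://{account}.blob.core.windows.net/{container}"
            f"?restype=container&{sas_token.lstrip('?')}")

url = build_create_container_url("myaccount", "mycontainer",
                                 "?sv=2019-12-12&sig=REDACTED")
print(url)
```

A PUT to this URL with only x-ms-date and x-ms-version headers then creates the container.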

Getting blob content using a user delegation SAS created with a user delegation key

I have created an AAD app as per https://learn.microsoft.com/en-us/azure/storage/common/storage-auth-aad-app.
The AAD app created has been granted access to the Azure storage account.
Got the client id and client secret.
To create a user delegation key and user delegation sas, I am using the approach and code as defined in
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-user-delegation-sas-create-dotnet.
(set environment variables as mentioned in article).
I am able to generate the user delegation SAS using the method GetUserDelegationSasBlob.
The container and blob file already exist.
Now I am using the method ReadBlobWithSasAsync to read the contents of the blob using the SAS URI generated above.
But, I get error as below.
This request is not authorized to perform this operation using this
permission. RequestId:5d127eaf-101e-00be-6666-6a3875000000
Time:2019-09-13T19:04:15.4109144Z
Status: 403 (This request is not authorized to perform this operation
using this permission.)
ErrorCode: AuthorizationPermissionMismatch
In another approach, I am generating the user delegation key using the REST API.
https://learn.microsoft.com/en-us/rest/api/storageservices/get-user-delegation-key
I am able to get user delegation key in xml format.
I am creating the SAS from it as per the steps in
https://learn.microsoft.com/en-us/rest/api/storageservices/create-user-delegation-sas
For the signature, I am using this code, with StringToSign and the secret value received from the delegation key. Note that the delegation key's Value is base64-encoded and must be decoded before signing, and the string-to-sign should be UTF-8 encoded; hashing the raw base64 string with ASCIIEncoding produces an invalid signature.
// The user delegation key's Value is base64; decode it to get the signing key.
byte[] keyBytes = Convert.FromBase64String(secret);
byte[] messageBytes = System.Text.Encoding.UTF8.GetBytes(ToSign);
using (var hmacsha256 = new HMACSHA256(keyBytes))
{
    byte[] hashmessage = hmacsha256.ComputeHash(messageBytes);
    String sig = Convert.ToBase64String(hashmessage);
}
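The same signing step can be sketched in Python; the key point is that the user delegation key's Value is base64 and must be decoded before use, and the string-to-sign is UTF-8 encoded. The inputs below are illustrative only, not a real key or string-to-sign:

```python
# Sketch: HMAC-SHA256 signing of a SAS string-to-sign.
import base64
import hashlib
import hmac

def sign_string(string_to_sign: str, delegation_key_value_b64: str) -> str:
    key = base64.b64decode(delegation_key_value_b64)  # decode base64 Value
    digest = hmac.new(key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode()

sig = sign_string("example-string-to-sign",
                  base64.b64encode(b"example-key").decode())
print(sig)
```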
I am doing the GET request.
I have tried various sets of parameter values, such as:
sr: b and c
sks: b and c
sp: racwd, r, rw, and a few more
skv and sv are 2018-11-09, because this version is required for creating a user delegation key.
But the GET API still returns the error below.
AuthenticationFailed
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the
signature. RequestId:e4bc8f0f-d01e-0046-7367-6af368000000
Time:2019-09-13T19:12:27.7780695Z
Signature fields not well formed.
Try assigning the Storage Blob Data Contributor role to your AAD app on the storage account.
The Reader role is an Azure Resource Manager role that permits users to view storage account resources but not modify them. It does not provide read access to data in Azure Storage, only to account management resources.
Refer to this article.
