I'm trying to create a table using this operation:
https://msdn.microsoft.com/en-us/library/azure/dd135729.aspx
with a JSON request body. However, every attempt is rejected with the following response:
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
<code>JsonFormatNotSupported</code>
<message xml:lang="en-US">JSON format is not supported.
RequestId:41192a52-0002-007b-5334-b57662000000
Time:2016-05-23T20:48:17.4360778Z</message>
</error>
The error is mentioned here:
https://msdn.microsoft.com/en-us/library/azure/dd179438.aspx
But that's all I can find.
Here's what I'm sending:
http://requestb.in/1l9sye21?inspect#1jmf39
I think the problem is that you need to add the x-ms-version header:
x-ms-version: 2015-04-05
This is required when using Shared Key / Shared Key Lite auth for the Table Service. See https://msdn.microsoft.com/en-us/library/azure/dd894041.aspx for more information.
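For example, a Create Table request with the version header and a JSON body would look roughly like this (the account name, table name, date, and signature below are placeholders):

POST https://myaccount.table.core.windows.net/Tables HTTP/1.1
x-ms-version: 2015-04-05
x-ms-date: Mon, 23 May 2016 20:48:00 GMT
Authorization: SharedKeyLite myaccount:<computed-signature>
Content-Type: application/json
Accept: application/json;odata=nometadata

{ "TableName": "mytable" }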
Adding to Adam's answer: you need to specify the storage service version on authenticated requests.
For requests using Shared Key or Shared Key Lite, you must pass the x-ms-version header on the request.
For requests using a Shared Access Signature (SAS), the SignedVersion (sv) parameter specifies the service version to use to authorize and authenticate.
See https://msdn.microsoft.com/en-us/library/azure/dd894041.aspx for more details.
The storage service version used to authenticate may also be incompatible with the version used to process the request. In that case some features, such as JSON, are not available, and the REST request fails with error (415) JSON format is not supported.
Refer to https://github.com/Azure/azure-storage-net/issues/32 for some related information, though that case involves SAS rather than Shared Key auth.
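For example, in a SAS token the service version travels in the sv query parameter (placeholder values):

?sv=2015-04-05&sp=r&se=2016-06-01T00%3A00%3A00Z&sig=<signature>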
I have a requirement that has to be done programmatically using the Postman REST client: I have to upload a file/blob to an Azure storage account, retrieve the unique URL of the uploaded file, and share that URL with a third party so that they can view it in a browser.
This is what I have done in Postman:
Request:
PUT https://{storage-account-name}.blob.core.windows.net/{container-name}/{file-name}{SAS-token}
Headers:
x-ms-version: 2020-04-08
x-ms-blob-type: BlockBlob
x-mock-response-name: Upload Blob
Body: attached a file from my local machine
Response:
I received a 200 response and the file was uploaded successfully. However, in the response headers I don't see any URL or unique SAS token that I can share with my third-party client.
I have also tried adding se and sp to the SAS token, and I got the error below:
AuthenticationFailed
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:65282b4e-401e-0050-2337-43ee90000000 Time:2023-02-18T01:20:28.3522177Z
Signature did not match. String to sign used was r 2023-02-18T09:12:15Z /blob/storage-account-name/container-name/file-name.txt 2021-06-08 b
Note: we don't want to generate a SAS token manually from the Azure portal for each file, construct the URL, and share it with the client, because of the high volume of incoming traffic. Once this works in Postman, I have to implement the same in IBM App Connect Enterprise (ESQL coding).
All suggestions are much appreciated. Thank you in advance.
Retrieve the unique URL of the specific file that I have uploaded programmatically and share that URL with a third party so that they can view it in a browser.
In addition to the se and sp parameters, the following parameters are required to construct the correct SAS URL:
signed version (sv)
signed resource (sr)
signature (sig)
Your error message says that the signature does not match the rest of the URL. The signature is a hash-based message authentication code (HMAC) that you compute over the string-to-sign and the key by using the SHA256 algorithm, and then encode by using Base64 encoding.
You can find how to construct the string-to-sign and signature depending on the version on this documentation page.
Postman has a built-in JavaScript library (CryptoJS) that can help you calculate the HMAC, for example:
CryptoJS.HmacSHA256("string-to-sign", CryptoJS.enc.Base64.parse("base64-account-key")).toString(CryptoJS.enc.Base64)
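As a fuller illustration, a minimal Postman pre-request sketch for a blob-level service SAS could look roughly like the following. It assumes sv=2021-06-08 and read-only permission; the account name, key, container, and blob name are placeholders, and you should verify the string-to-sign layout against the documentation for the sv you actually use.

// build a read-only service SAS for one blob and store it in a Postman variable
var accountName = "storage-account-name";      // placeholder
var accountKey  = "<base64-account-key>";      // placeholder
var container   = "container-name";
var blobName    = "file-name.txt";
var sv = "2021-06-08";
var sp = "r";                                  // read permission
var st = "";                                   // optional start time, left empty
var se = "2023-02-18T09:12:15Z";               // expiry time (UTC)
var sr = "b";                                  // resource type: blob
var canonicalizedResource = "/blob/" + accountName + "/" + container + "/" + blobName;
// 16 fields for sv >= 2020-12-06: sp, st, se, canonicalizedResource, si, sip, spr, sv, sr,
// snapshot time, encryption scope, rscc, rscd, rsce, rscl, rsct (unused fields stay empty)
var stringToSign = [sp, st, se, canonicalizedResource, "", "", "", sv, sr, "", "", "", "", "", "", ""].join("\n");
var sig = CryptoJS.HmacSHA256(stringToSign, CryptoJS.enc.Base64.parse(accountKey)).toString(CryptoJS.enc.Base64);
var sasUrl = "https://" + accountName + ".blob.core.windows.net/" + container + "/" + blobName
           + "?sv=" + sv + "&sp=" + sp + "&se=" + encodeURIComponent(se) + "&sr=" + sr
           + "&sig=" + encodeURIComponent(sig);
pm.variables.set("sasUrl", sasUrl);            // the URL you can hand to the third party

Note that the blob URL itself never comes back in the upload response; the shareable link is simply the request URL (https://{account}.blob.core.windows.net/{container}/{blob}) plus a SAS token such as the one built above.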
We're trying to use a client certificate to authenticate when calling an OData service in SAP S/4HANA, and we're calling from an Azure APIM instance. As the certificate we've created a self-signed certificate and configured SAP S/4HANA according to this blog (https://blogs.sap.com/2020/05/03/x.509-certificate-based-logon-to-odata-services/).
When we test this from the browser, it works like a charm.
But calling from Azure APIM, we get the following response from SAP S/4HANA:
<?xml version="1.0" encoding="utf-8"?> <error xmlns:xsi="http://www.w3.org/2001/XMLSchema-Instance">
<code>HTTP/404/E/Not Found</code>
<message> Service /sap/opu/odata/sap/xxxxyyyy/xxyyzz call was terminated because the corresponding service is not available.The termination occurred in system UFI with error code 404 and for the reason Not found. Please select a valid URL. If it is a valid URL, check whether service /sap/opu/odata/sap/xxxxyyyy/xxyyzz is active in transaction SICF. If you do not yet have a user ID, contact your system administrator. </message>
SAP S/4HANA support says that when calling from the browser they can 'see' the certificate in the payload, but when calling from APIM the payload is 'empty'.
I've got the trace logs from the SAP S/4HANA gateway server and I've noticed this subtle difference between calling from the browser and calling from APIM:
Browser call (successful):
[Thr 140708195055360] HttpModGetDefRules: determined the defactions: COPY_CERT_TO_MPI (1)
APIM call (failed):
[Thr 140708197697280] HttpModGetDefRules: determined the defactions: NOTHING (0)
So the certificate is obviously reaching the SAP S/4HANA gateway server but not the SAP S/4HANA OData server. Somehow, for some reason, it gets lost on the SAP S/4HANA gateway server, but only when the call comes from Azure APIM.
I've tried to make the calls 100% identical (same headers, same values), but I can't control the way the certificate is added in Azure APIM (or can one?).
I read that one can set the certificate body using the policy below:
<authentication-certificate body="@(context.Variables.GetValueOrDefault<byte[]>("byteCertificate"))" password="optional-certificate-password" />
but I can't figure out how to get a proper value for "byteCertificate".
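One approach I was considering (untested) is to keep the Base64 of the .pfx in a named value, decode it into a variable, and hand that to the policy, roughly:

<set-variable name="byteCertificate" value="@(Convert.FromBase64String("{{client-certificate-base64}}"))" />
<authentication-certificate body="@(context.Variables.GetValueOrDefault<byte[]>("byteCertificate"))" password="optional-certificate-password" />

where {{client-certificate-base64}} would be a named value holding the Base64-encoded .pfx, but I haven't verified that this works.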
Has anyone done this? Or has anyone had a similar issue?
We finally found the solution!
Thanks to the Microsoft APIM support team, thanks a lot :)
APIM acts like a reverse proxy and adds headers related to this role. The header "X-Forwarded-For" causes SAP to deny the request with the above misleading error message. We found a solution that SAP could configure:
ICM parameter "icm/HTTPS/accept_ccert_for_x_forwarded_for_requests" has to be set to "true" - per default it's set to "false".
(The header can't be deleted with a policy on APIM side.)
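For reference, this is an ICM profile parameter, so on the SAP side it would typically be set in the instance profile, roughly like:

icm/HTTPS/accept_ccert_for_x_forwarded_for_requests = true

(your Basis team can confirm where and how this is applied in your landscape).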
I am trying to make a REST call to Azure Storage using the following code.
But it shows the following error:
<?xml version="1.0" encoding="UTF-8"?>
<Error>
<Code>AuthenticationFailed</Code>
<Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:11c07be7-301e-0003-556f-42091d000000
Time:2021-05-06T11:59:40.1049759Z</Message>
<AuthenticationErrorDetail>Audience validation failed. Audience did not match.</AuthenticationErrorDetail>
</Error>
I have already assigned roles:
And have API permissions set:
But I still get this error. Can anyone help?
The audience of your access token is not correct. The aud (audience) claim should look like https://xxxx.blob.core.windows.net.
Make sure the scope is https://{account-name}.blob.core.windows.net/user_impersonation when requesting the access token.
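A quick way to double-check is to decode the token and look at the aud claim, for example with a small Node.js sketch (accessToken below is a placeholder for your raw JWT):

// decode the JWT payload (base64url) and print the audience claim
const payload = accessToken.split(".")[1];
const claims = JSON.parse(Buffer.from(payload, "base64").toString("utf8"));
console.log(claims.aud); // expect https://<account-name>.blob.core.windows.net

You can also paste the token into https://jwt.ms to inspect the claims.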
I am trying to authenticate against the Azure Storage emulator using its fixed account/key with Shared Key authentication.
When sending an anonymous request I get the correct response,
but when adding the Authorization header I get:
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<error xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
<code>InvalidAuthenticationInfo</code>
<message xml:lang="en-US">Authentication information is not given in the correct format. Check the value of Authorization header.
RequestId:6d2cc79e-6bce-451c-a6f0-f10e0876f640
Time:2019-07-29T19:22:48.6402756Z</message>
</error>
This is the key-value pair for the Authorization header:
Any idea how to resolve this? I have followed the documentation but no luck.
Considering you're using a Shared Access Signature (SAS) URL, you don't need to add an Authorization header, as the authorization information is already included in your SAS URL (the sig part of the URL).
One thing you may want to do is change the value of the Accept header to application/json;odata=fullmetadata.
The Authorization header comes into the picture when you don't use SAS. I noticed that you're simply passing your account key as part of your Authorization header. That won't work. You actually need to compute the authorization signature. Please see this link for more details: https://learn.microsoft.com/en-us/rest/api/storageservices/authorize-with-shared-key.
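For example, a minimal Postman pre-request sketch for Shared Key Lite against the emulator's Table service could look roughly like this (devstoreaccount1 and the key below are the emulator's well-known development credentials; the resource path /Tables is a placeholder for whatever your request targets):

// compute a Shared Key Lite Authorization header for the Table service emulator
var account = "devstoreaccount1";
var key = "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==";
var date = new Date().toUTCString();
// emulator URIs look like http://127.0.0.1:10002/devstoreaccount1/Tables,
// so the account name appears twice in the canonicalized resource
var canonicalizedResource = "/" + account + "/" + account + "/Tables";
var stringToSign = date + "\n" + canonicalizedResource;   // Shared Key Lite, Table service
var signature = CryptoJS.HmacSHA256(stringToSign, CryptoJS.enc.Base64.parse(key)).toString(CryptoJS.enc.Base64);
pm.request.headers.upsert({ key: "x-ms-date", value: date });
pm.request.headers.upsert({ key: "Authorization", value: "SharedKeyLite " + account + ":" + signature });
pm.request.headers.upsert({ key: "Accept", value: "application/json;odata=fullmetadata" });
pm.request.headers.upsert({ key: "x-ms-version", value: "2019-02-02" });  // adjust to a version your emulator supports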
I want to add a service certificate to my Azure cloud service via the REST API. Currently I have this:
Here's my request body, based on this documentation:
<?xml version="1.0" encoding="utf-8"?>
<CertificateFile xmlns="http://schemas.microsoft.com/windowsazure">
<Data>MIIB3TCCAUagAwIBAgIQfgPuTBadfItGHpKyYRiRoTANBgkqhkiG9w0BAQUFADAtMSswKQYDVQQDHiIAYwBlAGQAdgBpAGwAbABcAGMAZQBkAHYAaQBsAGwAMAAwMB4XDTEzMDcxNTA4MjIwN1oXDTE0MDcxNTE0MjIwN1owLTErMCkGA1UEAx4iAGMAZQBkAHYAaQBsAGwAXABjAGUAZAB2AGkAbABsADAAMDCBnzANBgkqhkiG9w0BAQEFAAOBjQAwgYkCgYEA2AB6hYWdqu/IG2Jo17tuHpmVsxNqGG5ROnNTtKZd4c7UtQ287EW5McRXqkfBmiwGIe3Pb3S0bd0q51YhT1WhQkGMMwwYLiAmALBct4OK3KNodl0t+rSO5R0Wq9YIaLq3o5HTGAR4wdKhaC/n1uJWPzD+TjkRRHnfEagWPtVjsMECAwEAATANBgkqhkiG9w0BAQUFAAOBgQCBLwjd3e1PaxvEy+Y1nqpTX8q/0ZsS2jVjCgRti0vehLGFlDEbL2rMhzGMo9zjQbXFzGCInMukFUhEI+OWkOBnBIIiYvTkKIFQWpLN7imIiRVuuqGFwslmESBySSO40M56jPXZ7/D0g8d8WDCfO1YoneDv4CuJE97lMTTyEjUJwg==</Data>
<CertificateFormat>cer</CertificateFormat>
</CertificateFile>
But it gives me this error: The specified certificate's file format is invalid. The certificate file must be a Base64-encoded .pfx file.
I'm certain the Base64 of the certificate is correct. To get it, I manually uploaded the same My.cer file to a different cloud service and used List Service Certificates to view its Base64.
The simple answer to your problem is that the Service Management API documentation is wrong. Essentially, the API does not care about the CertificateFormat node and you always have to pass pfx there. You also need to provide a Password node with no value, because you're uploading a .cer file. Based on all of this, please try the following as your request body:
<?xml version="1.0" encoding="utf-8"?>
<CertificateFile xmlns="http://schemas.microsoft.com/windowsazure">
<Data>MIIB3TCCAUagAwIBAgIQfgPuTBadfItGHpKyYRiRoTANBgkqhkiG9w0BAQUFADAtMSswKQYDVQQDHiIAYwBlAGQAdgBpAGwAbABcAGMAZQBkAHYAaQBsAGwAMAAwMB4XDTEzMDcxNTA4MjIwN1oXDTE0MDcxNTE0MjIwN1owLTErMCkGA1UEAx4iAGMAZQBkAHYAaQBsAGwAXABjAGUAZAB2AGkAbABsADAAMDCBnzANBgkqhkiG9w0BAQEFAAOBjQAwgYkCgYEA2AB6hYWdqu/IG2Jo17tuHpmVsxNqGG5ROnNTtKZd4c7UtQ287EW5McRXqkfBmiwGIe3Pb3S0bd0q51YhT1WhQkGMMwwYLiAmALBct4OK3KNodl0t+rSO5R0Wq9YIaLq3o5HTGAR4wdKhaC/n1uJWPzD+TjkRRHnfEagWPtVjsMECAwEAATANBgkqhkiG9w0BAQUFAAOBgQCBLwjd3e1PaxvEy+Y1nqpTX8q/0ZsS2jVjCgRti0vehLGFlDEbL2rMhzGMo9zjQbXFzGCInMukFUhEI+OWkOBnBIIiYvTkKIFQWpLN7imIiRVuuqGFwslmESBySSO40M56jPXZ7/D0g8d8WDCfO1YoneDv4CuJE97lMTTyEjUJwg==</Data>
<CertificateFormat>pfx</CertificateFormat>
<Password></Password>
</CertificateFile>
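For completeness, this body goes into the Add Service Certificate call, which looks roughly like this (subscription ID and cloud service name are placeholders; the call is authenticated with your management certificate):

POST https://management.core.windows.net/<subscription-id>/services/hostedservices/<cloudservice-name>/certificates
x-ms-version: 2012-03-01
Content-Type: application/xml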
Many thanks to #AzureCoder from http://elastacloud.com/ for pointing me in the right direction.