I am building an ASP.NET Azure Web Application (Web Role) which controls access to files stored in Azure Blob Storage.
On a GET request, my HttpHandler authenticates the user and creates a Shared Access Signature for this specific file and user with a short time frame (say 30 minutes). The client is a media player which checks for updated media files using HEAD, and if the Last-Modified header differs, it makes a GET request. Therefore I do not want to create a SAS URL for the HEAD request, but rather return the Last-Modified, ETag and Content-Length headers directly. Is this bad practice? If the file is up to date, there is no need to download it again and thus no need to create a SAS URL.
Example request:
GET /testblob.zip
Host: myblobapp.azurewebsites.net
Authorization: Zm9v:YmFy
Response:
HTTP/1.1 303 See Other
Location: https://myblobstorage.blob.core.windows.net/blobcontainer/testblob.zip?SHARED_ACCESS_SIGNATURE_DATA
Any thoughts?
Is there a specific reason to force the client to make a HEAD request first? It can instead authenticate using your service, get a SAS token, make a GET request using If-Modified-Since header against Azure Storage, and download the blob only if it was modified since the last download. Please see Specifying Conditional Headers for Blob Service Operations for more information on conditional headers that Azure Storage Blob service supports.
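A minimal sketch of that client-side flow, using only the Python standard library (the SAS URL and the date are placeholders): the client attaches `If-Modified-Since` to the GET against Azure Storage, and only downloads when the service answers 200 rather than 304 Not Modified.

```python
import urllib.request

def conditional_get(sas_url, last_modified):
    """Build a GET against the SAS URL that asks Azure Storage to
    return the blob only if it changed after `last_modified`
    (an RFC 1123 date string); an unchanged blob yields a 304."""
    req = urllib.request.Request(sas_url, method="GET")
    req.add_header("If-Modified-Since", last_modified)
    return req
```

On a 304 the media player keeps its cached copy, so the separate HEAD round trip disappears entirely.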
I am trying to call an Azure-hosted API endpoint using a 3rd-party application. The application sends the HTTP request with the header value ContentType = "*/*";
Azure Application Gateway WAF blocks the request, showing the diagnostics logs below.
I am aware that I can add an exclusion in the Web Application Firewall settings; however, I am not able to extract the request header name, since the logs do not show the value due to the wildcard content type (ContentType = "*/*";) sent by the 3rd-party app.
I added the below rule, but it is still blocking the request.
How can I allow the request via the AGW?
Please find the logs below.
Thanks in advance.
Hello, I am trying to deploy my Azure Machine Learning pipeline with a REST endpoint. My problem is that I was able to generate an endpoint, but it has some sensitive information in it (e.g. subscription ID, resource group, etc.). How can I generate a URL that forwards the request body to my Azure ML REST endpoint?
Also, here is an approach I've tried:
Used Application Gateway redirect (this approach didn't forward the request body; it instead turned my POST request into a GET request when it redirected to the correct URL).
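A redirect can never carry the body, but a small reverse proxy can: it reads the incoming POST body and re-issues it against the ML endpoint, keeping the sensitive URL server-side. A stdlib-only sketch (the endpoint URL and key are hypothetical placeholders):

```python
import urllib.request

# Hypothetical scoring URL; kept server-side, never shown to callers.
AML_ENDPOINT = "https://example.azureml.net/score"

def forward(body: bytes, api_key: str) -> urllib.request.Request:
    """Rebuild the client's POST against the hidden AML endpoint,
    preserving the request body and adding the AML auth header."""
    req = urllib.request.Request(AML_ENDPOINT, data=body, method="POST")
    req.add_header("Content-Type", "application/json")
    req.add_header("Authorization", "Bearer " + api_key)
    return req
```

The same shape fits an Azure Function or any thin web app sitting in front of the endpoint: whatever the proxy receives in the body is sent on verbatim.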
The issue is raised because some of the default security headers differ between REST API and web-based requests. You need to set the CSP header for the REST API; check the request and response headers in the config file of the web application.
I set up CORS in Azure Storage (allowed origin, for example: www.abc.com, GET, *, *, 200).
Then I just copied the link from storage:
https://demo.blob.core.windows.net/demo/demo.png
and used it in Postman or on localhost (web), but it can still display the picture. Is this normal?
I assumed Postman and a localhost website would not be able to get images once CORS was set up for Azure Storage.
CORS prevents cross-domain requests that are usually sent by AJAX requests. If such a request is sent from your browser, it will perform a preflight request to see if your current domain is allowed to make such a request. As an example, it would prevent this site from sending a POST request in the background to api.<yourbank>.com to transfer money.
It won't stop anybody from embedding an image or other file on their website, as the browser won't perform such a preflight request unless they call the resource through an AJAX request. Likewise, Postman won't do that, as it's a testing tool where you explicitly define the request you want to send without being on another 'domain'.
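The distinction can be summarized with the "simple request" rules from the CORS/Fetch spec: only non-simple requests trigger a preflight. A simplified sketch (the real rules also constrain header values and a few other details):

```python
SIMPLE_METHODS = {"GET", "HEAD", "POST"}
SIMPLE_HEADERS = {"accept", "accept-language", "content-language", "content-type"}
SIMPLE_CONTENT_TYPES = {
    "application/x-www-form-urlencoded", "multipart/form-data", "text/plain",
}

def needs_preflight(method, headers):
    """Roughly: True if a cross-origin fetch of this shape would be
    preflighted by the browser, False for 'simple' requests."""
    if method.upper() not in SIMPLE_METHODS:
        return True
    for name, value in headers.items():
        if name.lower() not in SIMPLE_HEADERS:
            return True
        if name.lower() == "content-type" and value not in SIMPLE_CONTENT_TYPES:
            return True
    return False
```

An `<img>` tag or a pasted URL is a plain GET with no custom headers, so it is "simple", no preflight happens, and the storage CORS rules never get a chance to block it; Postman ignores CORS entirely because it is not a browser enforcing an origin.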
Is there a way to save the output of an Azure Data Factory Web Activity into a dataset?
Here is my current use case:
I have to dynamically build a JSON post request
The API I'm trying to reach requires an SSL certificate, so I have to use the Web Activity's Client Certificate authentication option.
The API also requires Basic authentication, so I put the Content-Type and authorization GUID in the header section of the Web Activity.
Once I get the JSON response from my post request I need to save the response into a blob storage some where.
I tried using the Copy activity with an HTTPS or REST API dataset as the source, but both only allow one type of authentication: client certificate or Basic authentication.
Is there a way I can configure the REST API or HTTPS dataset source to handle both types of authentication (SSL certificate and Basic authorization), or capture all the Web Activity output into blob storage?
Thank you all for your help! I'm desperate at the moment lol..
Here is what my Web Activity looks like (sorry, I had to hide part of the credentials for security purposes):
Please use an HTTP dataset in your Copy activity.
When we create the linked service of the HTTP dataset, select the client certificate option and embedded data; then we need to upload the SSL certificate.
The official document is here.
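Outside ADF, combining the two mechanisms in one call is straightforward, which is essentially what the Web Activity does under the hood: the client certificate lives in the TLS layer while Basic auth is just a header. A Python stdlib sketch (URL, credentials, and certificate paths are placeholders):

```python
import base64
import ssl
import urllib.request

def basic_auth_header(user: str, password: str) -> str:
    """RFC 7617 Basic credentials for the Authorization header."""
    return "Basic " + base64.b64encode(f"{user}:{password}".encode()).decode()

def call_api(url, user, password, cert_file, key_file, body: bytes):
    """POST with a TLS client certificate plus Basic authentication."""
    ctx = ssl.create_default_context()
    ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)  # SSL cert auth
    req = urllib.request.Request(url, data=body, method="POST")
    req.add_header("Authorization", basic_auth_header(user, password))
    req.add_header("Content-Type", "application/json")
    return urllib.request.urlopen(req, context=ctx)
```

The point is that the two credentials do not compete for one "authentication type" slot the way they do in the HTTPS/REST dataset UI.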
I am using the Azure 12-month trial account and hosting an Excel file on the storage account through the Azure Portal.
I generate a Shared Access Signature with an end date three months from today and affix the generated SAS token to the file's URL.
I am able to access the file this way. However, the token appears to expire quickly after some invocations of the URL. The issue was most recently observed after overwriting the file with an updated file on the Azure storage account, followed by regenerating the SAS token.
The URL with SAS token suffixed to it looks like:
https://xxxxxx.file.core.windows.net/folder_name/yyyyy.xlsx?sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-12-30T16:04:08Z&st=2019-10-22T08:04:08Z&spr=https,http&sig=xxxxx%yyyyy%zzzz
Here is the error I see:
<Error>
<Code>ConditionHeadersNotSupported</Code>
<Message>
Condition headers are not supported. RequestId:<XXXXX> Time:<YYYYYY>
</Message>
</Error>
The error is random and the URL works intermittently.
Has anyone observed this issue and what could be a fix?
I can reproduce your error.
This does not mean that the SAS token has expired, because if you test the same thing against Azure Blob Storage, everything is OK. The error comes from the browser we are using: the browser adds a conditional If-* header.
When there is no If-* header, you can use the URL normally.
That is because File Storage does not support conditional If-* headers, and a request which carries one will not be accepted by File Storage.
This is the official doc about what kinds of headers File Storage supports.
So it is not the fault of the SAS token; it is the browser's behavior. If you have no special requirement, I suggest you use Azure Blob Storage instead; it will not cause this problem.
The problem you're having is due to Microsoft's Windows-Azure-File/1.0 Microsoft-HTTPAPI/2.0 setup.
The service hasn't been designed primarily for browsers to access files, and is limited in what headers it supports.
Browsers will typically look at the locally cached copies of files before attempting to download a new copy. They do this by examining the local file attributes and asking the web server to give them the file "if" it has been modified after the date corresponding to the cache, through the use of the If-Modified-Since header, just as BowmanZhu said.
Instead of ignoring the header, the server throws an error. To overcome this, you need to perform a hard reload of the page. In Chrome, you can do this by pressing CTRL + SHIFT + R.
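If you control the client (for example a script rather than a browser), a simpler workaround is to strip the conditional headers before sending the request to File Storage. A small sketch; the header list is the usual set of HTTP conditional headers, not an exhaustive Azure-specific list:

```python
# Conditional headers that trigger ConditionHeadersNotSupported
# on Azure Files; dropping them forces a plain unconditional GET.
CONDITIONAL = {"if-modified-since", "if-unmodified-since", "if-match", "if-none-match"}

def strip_conditionals(headers: dict) -> dict:
    """Return a copy of the headers without any If-* conditional
    headers, so Azure Files accepts the request (the file is then
    always re-downloaded rather than served from cache)."""
    return {k: v for k, v in headers.items() if k.lower() not in CONDITIONAL}
```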