CDN does not return cached response - azure

I have a function app running in Azure and I have wrapped the response to be returned as an HttpResponseMessage:
public static HttpResponseMessage CreateOkRequestMessageResponse(
    this HttpRequestMessage req,
    string fileName,
    bool cache = false,
    TimeSpan? maxAge = null)
{
    HttpResponseMessage result = new HttpResponseMessage(HttpStatusCode.OK);
    result.Content = new StreamContent(new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.None, 4096, FileOptions.DeleteOnClose));
    result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
    result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
    {
        FileName = Path.GetFileName(fileName),
    };
    result.Headers.Add("Cache-Control", cache ? $"public, max-age={maxAge.GetValueOrDefault().TotalSeconds}" : "no-cache");
    return result;
}
When I inspect the calls to the service directly using Postman I can see e.g.:
Then, when I call it via CDN I see:
Which I assume means that it did not return from the cache but instead called the service again.
After a few calls I get finally the response looking like this:
Which is finally cached.
The CDN is set up using a Bicep template:
Is there something missing to have CDN work as expected?
The call to the service takes around 3 minutes and I have run multiple calls in a row with the same request but it is still not being returned from the cache.
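For reference, a minimal sketch of the same Cache-Control logic expressed through the typed CacheControlHeaderValue API instead of a raw header string (it assumes the same cache/maxAge parameters as the extension method above and is not necessarily what is deployed):

// Hedged sketch: equivalent Cache-Control handling via the typed header API.
// MaxAge is serialized as whole seconds, so there is no risk of a fractional value.
result.Headers.CacheControl = cache
    ? new CacheControlHeaderValue
    {
        Public = true,
        MaxAge = maxAge.GetValueOrDefault()
    }
    : new CacheControlHeaderValue { NoCache = true };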

Related

REST Api is not returning expected data

I wrote an Auth API that should retrieve the details of my user, but I'm getting a 404 error instead. All my users are stored in an Azure Storage Account, and I am using the TableClient class to work with my table. However, I have not been able to get any further since I started on this Auth. I have spent over a week on this function alone with no progress. Here is my code:
[FunctionName(nameof(Auth))]
public static async Task<IActionResult> Auth(
    [HttpTrigger(AuthorizationLevel.Admin, "POST", Route = "auth")] HttpRequest req,
    [Table("User", Connection = "AzureWebJobsStorage")] TableClient tdClient,
    ILogger log)
{
    string url = String.Format("http://localhost:7235/api/");
    HttpMessageHandler handler = new HttpClientHandler()
    {
    };
    var httpClient = new HttpClient(handler)
    {
        BaseAddress = new Uri(url),
        Timeout = new TimeSpan(0, 2, 0)
    };
    httpClient.DefaultRequestHeaders.Add("ContentType", "application/json");
    var plainTextBytes = System.Text.Encoding.UTF8.GetBytes("roy.mitchel#somecompany.com:pass1234");
    string val = System.Convert.ToBase64String(plainTextBytes);
    httpClient.DefaultRequestHeaders.Add("Authorization", "Basic " + val);
    HttpResponseMessage response = httpClient.GetAsync(url).Result;
    return new OkObjectResult(response);
}
How can I auth my user using this class? Am I doing this the right way?
Thanks.
Debugging any API issue by looking at just the code is (as you have discovered) a horribly painful process.
I'd strongly recommend using a MITM proxy (like burp) to get visibility of exactly what is sent to, and received from the API. By using this approach, it generally becomes really obvious what is wrong.
If you can't use this approach, then enable logging for the raw HTTP request and response, as outlined in this guide.
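As a rough sketch of that logging approach (not from the guide itself), a DelegatingHandler can dump the raw request and response; it assumes the usual System.Net.Http / System.Threading / System.Threading.Tasks usings and logs to the console, but any ILogger would do:

// Hedged sketch: a DelegatingHandler that logs the raw request/response.
// Wire it up with: new HttpClient(new LoggingHandler(new HttpClientHandler()))
public class LoggingHandler : DelegatingHandler
{
    public LoggingHandler(HttpMessageHandler inner) : base(inner) { }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        Console.WriteLine($"{request.Method} {request.RequestUri}");
        if (request.Content != null)
            Console.WriteLine(await request.Content.ReadAsStringAsync());

        var response = await base.SendAsync(request, cancellationToken);

        Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}");
        Console.WriteLine(await response.Content.ReadAsStringAsync());
        return response;
    }
}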

Azure.Storage.Blobs.BlobServiceClient CopyFromUri() doesn't seem to be finished copying before returning latest ETag

I am trying to use StartCopyFromUri or StartCopyFromUriAsync to copy a blob from one storage account to another. Even though status.HasCompleted is true, when I try to get the ETag either through
1. var etag = await _siteStorageClient.GetBlobETag(containerPath, asset.BlobName);
//this is the response from WaitForCompletionAsync
2. var etag = complete.GetRawResponse().Headers.Where(x => x.Name == "ETag").FirstOrDefault().Value;
I've tried both methods and both return an ETag that doesn't match what is shown in the blob's properties when I log in through the Azure Portal. It is almost as if the file wasn't done copying (or there is a race condition) when the ETag check was executed. I couldn't find any usage samples on GitHub for the SDK.
Any ideas what could be going awry?
This is a similar question, but using the older SDK: How to copy a blob from one container to another container using Azure Blob storage SDK
//Storage class
public async Task<CopyFromUriOperation> CopyFile(string containerName, string blobName, Uri sourceUri)
{
    var container = _blobServiceClient.GetBlobContainerClient(containerName);
    var blockBlobClient = container.GetBlockBlobClient(blobName);
    //Made this the synchronous version to try and block
    //this is the target client
    var status = await blockBlobClient.StartCopyFromUriAsync(sourceUri);
    while (!status.HasCompleted)
    {
        //Per documentation this calls UpdateStatusAsync() periodically
        //until status.HasCompleted is true
        await status.WaitForCompletionAsync();
    }
    return status;
}
//Calling Code
var status = await _siteStorageClient.CopyFile(container,BlobName, sasUri);
var etag = await _siteStorageClient.GetBlobETag(container, BlobName);
I was able to get this working after a few tries and troubleshooting. It would only happen in the Azure environment and not when running the web app locally.
Initially the status.WaitForCompletionAsync() call was inside the loop and I started getting a socket error. I believe it was being called too many times and was causing port exhaustion (just speculation at this point).
But this is what is working now.
public async Task<CopyFromUriOperation> CopyFile(string containerName, string blobName, Uri sourceUri)
{
    var container = _blobServiceClient.GetBlobContainerClient(containerName);
    var blockBlobClient = container.GetBlockBlobClient(blobName);
    var status = await blockBlobClient.StartCopyFromUriAsync(sourceUri);
    await status.WaitForCompletionAsync();
    while (status.HasCompleted == false)
    {
        await Task.Delay(100);
    }
    return status;
}
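For completeness, a hedged sketch of reading the destination blob's ETag from the blob's own properties once the copy reports completion, rather than from the copy response headers (it reuses _blobServiceClient from the snippets above; the variable names are illustrative only):

// Hedged sketch: fetch the destination ETag via GetPropertiesAsync after the copy.
var container = _blobServiceClient.GetBlobContainerClient(containerName);
var blobClient = container.GetBlobClient(blobName);

var copy = await container.GetBlockBlobClient(blobName).StartCopyFromUriAsync(sourceUri);
await copy.WaitForCompletionAsync();

var properties = await blobClient.GetPropertiesAsync();
var etag = properties.Value.ETag;   // the ETag of the copied blob itself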

Stream media file using asp.net core and azure cloudblob

I want to create an endpoint that streams a video stored in an Azure CloudBlob. Here is a snippet of my code:
public async Task<IActionResult> GetVideo(string videoId)
{
    var videoStream = await _contentStorage.Get(videoId);
    var fileStreamResult = new FileStreamResult(videoStream, mimeType);
    fileStreamResult.EnableRangeProcessing = true;
    return fileStreamResult;
}
and in ContentStorage
public async Task<Stream> Get(string id)
{
    var block = _blobContainer.GetBlobClient(id);
    var ms = await block.OpenReadAsync();
    return ms;
}
I had everything working fine except on iPhones and Safari; after some debugging it turned out that my endpoint was returning HTTP 200 when it should return 206 (Partial Content). So I made some changes to my code, here is the snippet:
public async Task<IActionResult> GetVideo(string videoId)
{
    var videoStream = await _contentStorage.Get(videoId);
    var ms = new MemoryStream();
    await videoStream.CopyToAsync(ms);
    var fileStreamResult = new FileStreamResult(ms, mimeType);
    fileStreamResult.EnableRangeProcessing = true;
    return fileStreamResult;
}
Now when I test it on an iPhone or with Postman the response is 206 and it works fine. But I don't think that copying the video stream into a new MemoryStream is a valid approach.
Correct me if I'm wrong, but as I understand this code, for every partial request for the video I'm downloading the whole video from blob storage, cutting it, and then returning just the piece within the requested range.
I'm not sure how to handle this case. Is there an out-of-the-box solution for it, or do I need to read the Range header from the request and call OpenReadAsync with the position and buffer size as parameters? Or is there another way?
The solution for me was to update the Azure.Storage.Blobs library. I had 12.6.0 and after updating to 12.7.0 it started working as expected, since that release added:
Added seekability to BaseBlobClient.OpenRead().
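In other words, once OpenReadAsync returns a seekable stream, the original approach should be enough and the MemoryStream copy can be dropped. A minimal sketch under that assumption (mimeType and _blobContainer as in the question):

// Hedged sketch: with Azure.Storage.Blobs >= 12.7.0 the stream from OpenReadAsync
// is seekable, so ASP.NET Core can serve range requests without buffering the blob.
public async Task<IActionResult> GetVideo(string videoId)
{
    var blob = _blobContainer.GetBlobClient(videoId);
    var stream = await blob.OpenReadAsync();

    return new FileStreamResult(stream, mimeType)
    {
        EnableRangeProcessing = true   // lets the framework answer with 206
    };
}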

Azure Functions: No such file or directory

I have a problem with my Azure Function. I use a Linux-based App Service plan (B1). After deploying my code and running the Azure Function I get this error:
2020-03-11T07:02:55.985 [Error] Executed 'PdfRender_dotnet_framework' (Failed, Id=52201ad7-8012-4f93-bc17-0accae6a1540)
No such file or directory
The strange thing about this is that I can change to this directory in the console and in Kudu with bash/SSH. So it seems that everything was deployed fine, but why is it still showing me this error message?
I checked many times whether everything was deployed correctly and I didn't find any issues.
Error Message
Directory in Console
Directory in Bash (Kudu)
This is my function.json
I also noticed that the Azure Function reaches the code.
My function code:
public static class dotnet_core_pdf
{
    [FunctionName("dotnet_core_pdf")]
    public static HttpResponseMessage Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req, TraceWriter log, ExecutionContext executionContext)
    {
        string name = req.Query["url"];
        log.Info(name);
        //Initialize HTML to PDF converter
        HtmlToPdfConverter htmlConverter = new HtmlToPdfConverter();
        WebKitConverterSettings settings = new WebKitConverterSettings();
        //Set WebKit path
        settings.WebKitPath = Path.Combine(executionContext.FunctionAppDirectory, "QtBinariesWindows");
        //Assign WebKit settings to HTML converter
        htmlConverter.ConverterSettings = settings;
        //Convert URL to PDF
        PdfDocument document = htmlConverter.Convert(name);
        MemoryStream ms = new MemoryStream();
        //Save the PDF document
        document.Save(ms);
        ms.Position = 0;
        HttpResponseMessage response = new HttpResponseMessage(HttpStatusCode.OK);
        response.Content = new ByteArrayContent(ms.ToArray());
        response.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
        {
            FileName = "HTMLToPDFAzure.pdf"
        };
        response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/pdf");
        return response;
    }
}
NOTICE: I changed the folder's name; I am using .NET Core, not .NET Framework.
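Not an answer as such, but a small diagnostic sketch that could confirm what path the function actually resolves at runtime and whether it exists on the Linux worker (it only uses the executionContext and log already present in the code; the folder name is the one from the snippet):

// Hedged diagnostic sketch: log the resolved path from inside the function,
// since what Kudu/bash shows is not necessarily what the worker sees.
var webKitPath = Path.Combine(executionContext.FunctionAppDirectory, "QtBinariesWindows");
log.Info($"FunctionAppDirectory: {executionContext.FunctionAppDirectory}");
log.Info($"WebKitPath: {webKitPath}, exists: {Directory.Exists(webKitPath)}");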

.net core webapi causes iis application pool to shutdown

Background:
I'm building a .NET Core Web API that does practically nothing more than check whether a given URL exists and return the result. If a URL exists and is a redirect (301, 302), the API follows the redirect and returns that result as well. The Web API is called by an SPA which makes an API call for every URL in a check-request queue. So, if someone adds 500 URLs to the queue, the SPA will loop through it and send 500 calls to the API – something I could improve upon.
The problem:
My IIS application pool is being shut down on a regular basis due to high CPU usage and/or memory usage:
A worker process serving application pool 'api.domain.com(domain)(4.0)(pool)' has requested a recycle because it reached its private bytes memory limit.
The only way to get my API going again is to manually restart the application. I don't think the operations performed by the API are that demanding, but I surely must be doing something wrong here. Can somebody help me please? The code called by the SPA is:
var checkResponse = new CheckResponse();
var httpMethod = new HttpMethod(request.HttpMethod.ToUpper());
var httpRequestMessage = new HttpRequestMessage(httpMethod, request.Url);
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;
var httpResponseMessage = await httpClient.SendAsync(httpRequestMessage);
checkResponse.RequestMessage = httpResponseMessage.RequestMessage;
checkResponse.Headers = httpResponseMessage.Headers;
checkResponse.StatusCode = httpResponseMessage.StatusCode;
switch (httpResponseMessage.StatusCode)
{
    case HttpStatusCode.Ambiguous:
    case HttpStatusCode.Found:
    case HttpStatusCode.Moved:
    case HttpStatusCode.NotModified:
    case HttpStatusCode.RedirectMethod:
    case HttpStatusCode.TemporaryRedirect:
    case HttpStatusCode.UseProxy:
        var redirectRequest = new CheckRequest
        {
            Url = httpResponseMessage.Headers.Location.AbsoluteUri,
            HttpMethod = request.HttpMethod,
            CustomHeaders = request.CustomHeaders
        };
        checkResponse.RedirectResponse = await CheckUrl(redirectRequest);
        break;
}
The Action on my ApiController:
[HttpPost]
public async Task<IActionResult> Post([FromBody] CheckRequest request)
{
    if (!ModelState.IsValid)
    {
        return BadRequest(ModelState);
    }
    var result = await CheckService.CheckUrl(request);
    return Ok(result);
}
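The snippets don't show how httpClient is created. If a new HttpClient is constructed per check, one thing worth ruling out is handler/socket churn by sharing a client through IHttpClientFactory. A hedged sketch (ICheckService is a hypothetical interface name; CheckService is the name used in the controller above):

// Hedged sketch: register a typed client so all URL checks share pooled handlers
// instead of creating a new HttpClient per call. In Startup.ConfigureServices:
services.AddHttpClient<ICheckService, CheckService>(client =>
{
    client.Timeout = TimeSpan.FromSeconds(30);   // assumed limit, not from the question
});

// CheckService then receives the HttpClient via constructor injection:
public class CheckService : ICheckService
{
    private readonly HttpClient _httpClient;
    public CheckService(HttpClient httpClient) => _httpClient = httpClient;
    // CheckUrl uses _httpClient.SendAsync as in the snippet above
}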
