Make Azure function blob binding case-insensitive

I have a C# Azure HttpTrigger function that retrieves a file from my blob storage.
This is a public REST method. The caller sends a string, which can be in any case.
Because the file names are stored case-sensitively, I'm not using a direct [Blob] binding. Instead I use the more generic IBinder binding, convert the input to upper-case and create a new Blob binding to get the blob:
public async Task<IActionResult> GetFile(
    [HttpTrigger(AuthorizationLevel.Function, "get", Route = "file/{name}")] HttpRequest req,
    IBinder binder, ExecutionContext context, string name)
{
    // The file name is always stored upper-case:
    var nameUpper = name.ToUpperInvariant();
    // Get the file (it has no extension), using the IBinder parameter instead of a [Blob] parameter because the name has to be upper-cased first:
    var blobBytes = await binder.BindAsync<byte[]>(new BlobAttribute($"%BlobContainerName%/{nameUpper}", FileAccess.Read),
        req.HttpContext.RequestAborted).ConfigureAwait(false);
    // Return the blob contents (the content type here is an assumption):
    return new FileContentResult(blobBytes, "application/octet-stream");
}
Is this the optimal way, or can this be made simpler/easier?
It feels a bit hacky.

Azure Functions .NET 5 Isolated HttpTrigger Path Variable Input Binding

I'm just trying to figure out how to do something in .NET 5 that worked in 3.1 and before.
In 3.1, the route variable binds correctly to the Guid parameter of the same name:
[FunctionName("Function1")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "records/{clientId:Guid}")] HttpRequest req,
Guid clientId,
ILogger log)
{
return new OkObjectResult(clientId);
}
A comparable .NET 5 version of this same function fails to bind the path variable:
[Function("Function1")]
public static HttpResponseData Run([HttpTrigger(AuthorizationLevel.Function, "get", Route = "records/{clientId:Guid}")] HttpRequestData req,
Guid clientId,
FunctionContext executionContext)
{
var response = req.CreateResponse(HttpStatusCode.OK);
response.WriteString(clientId.ToString());
return response;
}
The error that is thrown is as follows:
Exception:
Microsoft.Azure.Functions.Worker.Diagnostics.Exceptions.FunctionInputConverterException: Error converting 1 input parameters for Function 'Function1': Cannot convert input parameter 'clientId' to type 'System.Guid' from type 'System.String'.
I can change the type of the parameter to string and then parse it into a Guid after the fact, of course, but I'd like to know if it's still possible to do it the aforementioned way.
There are two things going on here:
1. Why isn't the route constraint taken into account (i.e. why does a conversion need to occur at all)?
I found a related GitHub issue. That doesn't seem to be fixed, though it makes me wonder how you managed to make it work with .NET Core 3.1 :)
2. Why isn't the input string automatically converted to a Guid?
I had a look at the code to understand where the exception is raised.
The model binding uses a list of IConverter implementations to convert between the input type and the binding type. In your case the input type is string and the binding type is Guid, and there's no built-in converter that can do that. You can't even write your own IConverter, because it's an internal interface.
Note: here's an example of an IConverter that converts a string to a byte array: StringToByteConverter
So basically, there's nothing you can do apart from your suggestion to parse the Guid yourself.
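For reference, here's a minimal sketch of that workaround in the isolated model: bind the route value as a string and parse it yourself. The route constraint is dropped and the BadRequest handling is my own addition, not from the original post:
[Function("Function1")]
public static HttpResponseData Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", Route = "records/{clientId}")] HttpRequestData req,
    string clientId,
    FunctionContext executionContext)
{
    // The isolated worker has no built-in string-to-Guid converter, so parse manually.
    if (!Guid.TryParse(clientId, out var parsedClientId))
    {
        var badRequest = req.CreateResponse(HttpStatusCode.BadRequest);
        badRequest.WriteString("clientId is not a valid GUID.");
        return badRequest;
    }

    var response = req.CreateResponse(HttpStatusCode.OK);
    response.WriteString(parsedClientId.ToString());
    return response;
}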

Get function key (name) used when calling Azure Function

I need to be able to identify the key (ideally key name) provided in the header (x-functions-key) for the POST to the Azure Function in the Run method, e.g.
Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req, ILogger log, ClaimsPrincipal principal)
It is great to be able to protect access to the Azure Function by adding Function Keys in the Azure Portal, but I must be able to tell which function key was used. Ideally it would be possible to associate claims with each function key, but as long as I can at least figure out which key was used I will be happy.
Simply get the claim "http://schemas.microsoft.com/2017/07/functions/claims/keyid" from the req.HttpContext.User.Claims collection. It contains the key id when a function key was used.
Works like a charm, and does not require external lookups.
const string KEY_CLAIM = "http://schemas.microsoft.com/2017/07/functions/claims/keyid";
public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    var claim = req.HttpContext.User.Claims.FirstOrDefault(c => c.Type == KEY_CLAIM);
    if (claim == null)
    {
        log.LogError("Something went SUPER wrong");
        throw new UnauthorizedAccessException();
    }
    else
    {
        log.LogInformation("Processing call from {callSource}", claim.Value);
    }

    // ... rest of the function; return a result as usual:
    return new OkResult();
}
Sajeetharan answered how you can get the keys using the REST API.
As for the ability to use RBAC, you need to use Managed Identities; you can find more information about how to set that up here: https://learn.microsoft.com/en-us/azure/app-service/overview-managed-identity?tabs=dotnet
In Azure Functions v1:
[FunctionName("MyAuthenticatedFunction")]
public static async Task<HttpResponseMessage> MyAuthenticatedFunction([HttpTrigger(AuthorizationLevel.Function)] System.Net.Http.HttpRequestMessage reqMsg, ILogger log)
{
if (reqMsg.Properties.TryGetValue("MS_AzureFunctionsKeyId", out object val))
log.LogInformation($"MS_AzureFunctionsKeyId: {val}");
}
Code reference: WebJobs.Script.WebHost/Filters/AuthorizationLevelAttribute.cs#L77

Delete files older than X number of days from Azure Blob Storage using Azure function

I want to create an Azure Function that deletes files from Azure Blob Storage when their last modified date is older than 30 days.
Can anyone help, or point me to documentation on how to do that?
Assuming your storage account's type is either General Purpose v2 (GPv2) or Blob Storage, you actually don't have to do anything by yourself. Azure Storage can do this for you.
You'll use Blob Lifecycle Management and define a policy there to delete blobs if they are older than 30 days; Azure Storage will take care of the deletion for you.
You can learn more about it here: https://learn.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts.
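For illustration, a minimal lifecycle management policy of that kind could look like the following (the rule name and the blockBlob filter are assumptions; adjust the filters and day count to your container):
{
  "rules": [
    {
      "enabled": true,
      "name": "delete-blobs-older-than-30-days",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 30 }
          }
        },
        "filters": {
          "blobTypes": [ "blockBlob" ]
        }
      }
    }
  ]
}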
You can create a Timer Trigger function, fetch the list of items from the blob container and delete the files whose last modified date is older than your threshold.
Create a Timer Trigger function.
Fetch the list of blobs using CloudBlobContainer.
Cast the blob items to the proper type and check the LastModified property.
Delete the blobs that are older than the threshold.
I hope that answers the question.
I have used HTTP as the trigger since you didn't specify one and it's easier to test, but the logic would be the same for a Timer trigger (a sketch of that follows after the code). I've also assumed C#:
[FunctionName("HttpTriggeredFunction")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
[Blob("sandbox", Connection = "StorageConnectionString")] CloudBlobContainer container,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
// Get the first segment of blobs in your container (use the continuation token to page through larger containers)
BlobResultSegment result = await container.ListBlobsSegmentedAsync(null);
// Iterate each blob
foreach (IListBlobItem item in result.Results)
{
// cast item to CloudBlockBlob to enable access to .Properties
CloudBlockBlob blob = (CloudBlockBlob)item;
// Work out how long ago the blob was last modified
TimeSpan? diff = DateTime.Today - blob.Properties.LastModified;
if (diff?.Days > 30)
{
// Delete as necessary
await blob.DeleteAsync();
}
}
return new OkObjectResult(null);
}
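If you do want to run this on a schedule instead, a sketch of the Timer-triggered variant could look like this (the function name and the daily-at-midnight CRON expression are just examples; the body reuses the same listing and deletion logic as above):
[FunctionName("DeleteOldBlobsDaily")]
public static async Task Run(
    [TimerTrigger("0 0 0 * * *")] TimerInfo timer,
    [Blob("sandbox", Connection = "StorageConnectionString")] CloudBlobContainer container,
    ILogger log)
{
    // Same logic as the HTTP-triggered version: list the blobs,
    // check LastModified and delete anything older than 30 days.
    BlobResultSegment result = await container.ListBlobsSegmentedAsync(null);
    foreach (IListBlobItem item in result.Results)
    {
        CloudBlockBlob blob = (CloudBlockBlob)item;
        TimeSpan? diff = DateTime.Today - blob.Properties.LastModified;
        if (diff?.Days > 30)
        {
            await blob.DeleteAsync();
        }
    }
}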
Edit - How to download a JSON file and deserialize it to an object using Newtonsoft.Json:
public class MyClass
{
public string Name { get; set; }
}
var json = await blob.DownloadTextAsync();
var myClass = JsonConvert.DeserializeObject<MyClass>(json);

How to pass file as parameter from Azure logic apps and receive it in Azure function?

I am able to pass parameters and values from Logic Apps to an Azure Function, but I am wondering how I could pass a file as a parameter and receive it in the Azure Function.
A snippet of how I am passing a parameter from Azure Logic Apps:
In the Azure Function, to receive a simple parameter and its value:
public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    string jsonContent = await req.Content.ReadAsStringAsync();
    dynamic data = JsonConvert.DeserializeObject(jsonContent);
    string fileName = data.fileName;
    // Return something so the method compiles (echoing the received value here):
    return req.CreateResponse(HttpStatusCode.OK, fileName);
}
Given that the function will be receiving a JSON payload, a couple of options are:
If the file content is small, you can read it and send it along in the body as a simple string property (e.g. { fileName: '...', content: '...' }).
Send along only the path to the file contents (e.g. a blob path or an Azure Files path) and read the file contents in your function (a sketch of this follows below).
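For the second option, a minimal sketch could look like the following. It assumes the Logic App posts a JSON body such as { "containerName": "...", "blobName": "..." } and that the function reads the blob via the storage SDK; the property names and the use of the AzureWebJobsStorage connection string are illustrative assumptions, not from the original post:
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    string jsonContent = await req.Content.ReadAsStringAsync();
    dynamic data = JsonConvert.DeserializeObject(jsonContent);
    string containerName = data.containerName;
    string blobName = data.blobName;

    // Connect to the storage account and locate the blob the Logic App pointed us to.
    CloudStorageAccount account = CloudStorageAccount.Parse(Environment.GetEnvironmentVariable("AzureWebJobsStorage"));
    CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference(containerName);
    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

    // Download the file contents (text here; use DownloadToStreamAsync for binary files).
    string fileContent = await blob.DownloadTextAsync();
    log.Info($"Downloaded {blobName} ({fileContent.Length} characters).");

    return req.CreateResponse(HttpStatusCode.OK);
}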
