Azure Functions - StorageBlob Trigger method signature using CloudBlockBlob not Stream - azure

Creating Azure Functions targeting .Net Standard 2.0 using Visual Studio 2017.
Using the Add New Azure Function wizard, a blob trigger method is successfully created with the following method signature.
public static void Run([BlobTrigger("attachments-collection/{name}")] Stream myBlob, string name, ILogger log)
This method compiles and works fine.
However, we want to be able to access the metadata attached to the CloudBlockBlob being saved to storage, which as far as I know is not possible using a Stream. Other answers on this site, such as (Azure Function Blob Trigger CloudBlockBlob binding), suggest you can bind to a CloudBlockBlob instead of a Stream and access the metadata that way. But the suggested solution does not compile in the latest version of Azure Functions.
Microsoft's online documentation (https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#trigger---usage) also seems to confirm that it is possible to bind the trigger to a CloudBlockBlob rather than a Stream, but gives no example of the syntax.
Could someone please clarify the exact syntax required to enable Azure Function Blob storage trigger to bind to a CloudBlockBlob instead of the standard Stream?
Thanks

Thanks to Jerry Liu's insights, this problem has been solved.
Method:
Use the default storage package for Azure Storage that is installed when you create a new Function App
Microsoft.Azure.WebJobs.Extensions.Storage (3.0.1)
This installs the dependency
WindowsAzure.Storage (9.3.1)
Then both of the following method signatures will run correctly
public static async Task Run([BlobTrigger("samples-workitems/{name}")]Stream myBlob, string name, ILogger log)
and
public static async Task Run([BlobTrigger("samples-workitems/{name}")]CloudBlockBlob myBlob, string name, ILogger log)

Actually, CloudBlockBlob does work; we don't need FileAccess.ReadWrite because this is a BlobTrigger rather than a Blob input or output binding.
public static Task Run([BlobTrigger("samples-workitems/{name}")]CloudBlockBlob blob, string name, ILogger log)
Update for Can't bind BlobTrigger to CloudBlockBlob
There's an issue tracking this: the Functions SDK has problems integrating with WindowsAzure.Storage >= v9.3.2. So just remove any explicit WindowsAzure.Storage package reference; the Functions SDK references v9.3.1 internally by default.
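To illustrate the metadata access that motivated the original question, here is a minimal sketch of a CloudBlockBlob-bound trigger reading blob metadata. The container name mirrors the question; everything else (function name, logging) is illustrative:

```csharp
// Sketch, assuming Microsoft.Azure.WebJobs.Extensions.Storage 3.0.1
// (which brings in WindowsAzure.Storage 9.3.1 as described above).
[FunctionName("ReadBlobMetadata")]
public static async Task Run(
    [BlobTrigger("attachments-collection/{name}")] CloudBlockBlob myBlob,
    string name,
    ILogger log)
{
    // FetchAttributesAsync refreshes properties and metadata from storage.
    await myBlob.FetchAttributesAsync();
    foreach (var pair in myBlob.Metadata)
    {
        log.LogInformation($"{name}: {pair.Key} = {pair.Value}");
    }
}
```

Binding to CloudBlockBlob rather than Stream is what makes the Metadata dictionary available inside the trigger.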

Related

Azure Function: Http Trigger instead of Blob Trigger for more resilient operations

I have a scenario where I write some data to an Azure Storage blob, trigger an Azure Function to process the data, and write the result to another blob storage record. I've come across a weird scenario where, if the function hasn't been triggered for a while (a couple of days), it stops responding to the trigger unless I navigate to the Azure portal and restart the function. This also happens when I use VSTS to publish the function in my CI/CD pipeline. Again, a restart is required.
To get around this, I would prefer to use an HTTP trigger where I can at least get a status code response to my request and have a better sense that my function has actually been triggered.
Here is the method for the blob trigger that is working:
[FunctionName("ProcessOpenOrders")]
public static async Task Run([BlobTrigger("%TriggerBlobPath%/{name}")]Stream myBlob, string name, TraceWriter traceWriter, [Blob("%OutboundBlobPath%/{name}", FileAccess.Write)] Stream outputStream, ExecutionContext context)
The TriggerBlobPath and OutboundBlobPath are slot setting configurations. This is important because I need the blob storage record name as a parameter so I know what to read, and I use the same record name for the output.
For an HTTP trigger, I would need that name in a similar way. My question is how?
Something like this I'm guessing:
public static async Task Run([HttpTrigger] HttpRequestMessage request, [Blob("%InboundBlobPath%/{name}", FileAccess.Read)]Stream myBlob, string name, TraceWriter traceWriter, [Blob("%OutboundBlobPath%/{name}", FileAccess.Write)] Stream outputStream, ExecutionContext context)
but I get the following error:
Microsoft.Azure.WebJobs.Host: Unable to resolve binding parameter 'name'. Binding expressions must map to either a value provided by the trigger or a property of the value the trigger is bound to, or must be a system binding expression (e.g. sys.randguid, sys.utcnow, etc.).
If anyone knows how to implement an HttpTrigger in place of a blob trigger, but get the same functionality, that would be very helpful. Otherwise, if someone has an idea on how to guarantee that the blob trigger actually triggered, that would also be very helpful.
Thanks!
I think the official guidance is to use Event Grid trigger to react on blob events. See Reacting to Blob storage events and Event Grid trigger for Azure Functions.
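A minimal sketch of what an Event Grid-triggered function for blob events might look like. The function name is an assumption, and the exact payload handling depends on the event schema described in the linked docs:

```csharp
// Sketch: reacting to Microsoft.Storage.BlobCreated events via Event Grid.
// Assumes the Microsoft.Azure.WebJobs.Extensions.EventGrid package.
[FunctionName("OnBlobCreated")]
public static void Run(
    [EventGridTrigger] EventGridEvent eventGridEvent,
    ILogger log)
{
    // The subject contains the blob path; the data payload carries the blob URL.
    log.LogInformation($"Event: {eventGridEvent.EventType}, subject: {eventGridEvent.Subject}");
}
```

Unlike a polling-based blob trigger, Event Grid pushes the event to the function, which avoids the "cold trigger after idle" behavior described in the question.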

Azure Blob Trigger. Configure Blob Path in Configuration Files

I am using Blob trigger in my project to process the content of files.
I am using Azure Blob Trigger to initiate the process of a file execution.
[FunctionName("FunctionImportCatalogue")]
public static void Run([BlobTrigger("importcontainer/{name}", Connection = "StorageConnection")]Stream myBlob, string name, TraceWriter log)
{}
Depending on where the code is published, the blob container should change accordingly. In other words, I want "importcontainer" to be configurable in config files. Can I do that?
As far as I know, you can configure it in local.settings.json.
Add the line below to the Values section of the file; my sample container is named 'workitems'.
"importcontainer": "workitems"
Then change the code below in the .cs file.
public static void Run([BlobTrigger("%importcontainer%/{name}", Connection = "StorageConnection")]Stream myBlob, string name, TraceWriter log)
Then publish the Function to Azure. You should also set importcontainer in the Application settings in the portal, because that is the setting used at runtime.
Run the function and add a blob to the container, it works fine on my side.
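For local development, the corresponding local.settings.json might look like the fragment below. The connection string values are placeholders (local storage emulator); only the importcontainer entry comes from the answer above:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "StorageConnection": "UseDevelopmentStorage=true",
    "importcontainer": "workitems"
  }
}
```

The %importcontainer% token in the BlobTrigger path is resolved against these app settings, so the same code can point at different containers per environment.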

How do i write an Azure Blob storage trigger that transfers to a data lake

I want to create a blob storage trigger that takes any files put into blob storage (a fast process) and transfers them to Data Lake storage (NOT to another Blob Storage location).
Can this be done?
Can it be done using JavaScript, or does it require C#?
Does sample code exist showing how to do this? If so, would you be so kind as to point me to it?
Note: we've created a pipeline that will go from Blob Storage to Data lake storage. That's not what I'm asking about here.
You could potentially use an Azure Function or Azure Logic App to detect new files on Blob Storage and either call your webhook to trigger the pipeline or do the move itself.
Can this be done?
As jamesbascle mentioned, we could use an Azure Function to do that.
Can it be done using JavaScript, or does it require C#?
It can be done with either JavaScript or C#.
Does sample code exist showing how to do this? If so, would you be so kind as to point me to it?
For how to create a Blob storage triggered function, please refer to this document. We can also get the C#/JavaScript demo code from it.
JavaScript code
module.exports = function(context) {
    context.log('Node.js Blob trigger function processed', context.bindings.myBlob);
    context.done();
};
C# code
[FunctionName("BlobTriggerCSharp")]
public static void Run([BlobTrigger("samples-workitems/{name}")] Stream myBlob, string name, TraceWriter log)
{
    log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
}

Azure Function Blob Trigger CloudBlockBlob binding

I have the following Azure function triggered when a file is uploaded to Blob Storage
[FunctionName("ImageAnalysis")]
public static async void Run(
    [BlobTrigger("imageanalysis/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob,
    string name,
    TraceWriter log)
{
    log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
}
I want to process the Blob that has been uploaded so ideally I would like it as a CloudBlockBlob instead of a Stream. Then I can just do some work and then delete the blob.
myBlob.DeleteIfExists()
Is there an easy way to cast or convert my Stream to CloudBlockBlob or do I need to use input/output bindings or something else?
Looking through the docs I see examples which use CloudBlockBlob but I can't seem to get it to work so think I am missing something?
Use this syntax for the binding. The trick is specifying FileAccess.ReadWrite in the attribute. The docs rather confusingly refer to this as "inout" for some reason.
[Blob("imageanalysis/{name}", FileAccess.ReadWrite, Connection = "AzureWebJobsStorage")] CloudBlockBlob blob, string name
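Putting the pieces together, a full function using this binding might look like the sketch below. The container and connection names mirror the question; keeping the Stream trigger alongside the CloudBlockBlob input binding is one possible arrangement, not the only one:

```csharp
// Sketch: BlobTrigger fires on upload; the inout-style Blob binding
// gives a CloudBlockBlob over the same blob for metadata access/deletion.
[FunctionName("ImageAnalysis")]
public static async Task Run(
    [BlobTrigger("imageanalysis/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob,
    string name,
    [Blob("imageanalysis/{name}", FileAccess.ReadWrite, Connection = "AzureWebJobsStorage")] CloudBlockBlob blob,
    TraceWriter log)
{
    log.Info($"Processing blob {name}, size {myBlob.Length} bytes");
    // ... do some work with the stream ...
    await blob.DeleteIfExistsAsync();
}
```

Because both bindings use the same {name} expression, they resolve to the same blob.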

Table Attribute binding not working when using IQueryable<T>

I have an Azure function app that I would like to read from an Azure Storage table. I am new to Azure Functions and Table storage, and I cannot get the samples to work properly.
The functions are created in Visual Studio 2017 and published to Azure.
The function app has two functions, one for posting and one for getting.
The post function works as expected.
The get function fails with the error below:
internal error 500: "'TableName' can't be invoked from Azure WebJobs SDK. Is it missing Azure WebJobs SDK attributes?"
Get function:
[FunctionName("FunctionName")]
public static HttpResponseMessage Run([HttpTrigger(AuthorizationLevel.Anonymous, "get")]HttpRequestMessage req, [Table("TableName", Connection = "Default")]IQueryable<Person> inTable, TraceWriter log)
{
    return req.CreateResponse(HttpStatusCode.OK);
}
The only difference between the post and get methods is their signature.
The post method sets the Table attribute on an ICollector parameter:
[Table("TableName", Connection = "Default")]ICollector<Person> outTable
And the get method sets the Table attribute on an IQueryable parameter:
[Table("TableName", Connection = "Default")]IQueryable<Person> inTable
Any input is appreciated.
This is almost the default sample provided for working with table storage.
When using IQueryable, the type T must implement ITableEntity (or derive from TableEntity) from the Storage SDK. ICollector does not have this constraint.
Be sure to use the same version of the storage SDK that your project already gets from its Microsoft.NET.Sdk.Functions reference (and not pull in a new, different storage SDK that loads side by side).
There are some more details at https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-table, although those examples are *.csx (where the bindings are in a separate function.json file) rather than *.cs (where the bindings are inline attributes). The C# types and rules are the same.
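Concretely, the fix is to make the entity type derive from TableEntity. A sketch is below; the Name property and the PartitionKey filter value are illustrative assumptions, while the table and connection names mirror the question:

```csharp
// Sketch: an entity type usable with the IQueryable<T> table binding.
// TableEntity supplies PartitionKey, RowKey, Timestamp, and ETag.
public class Person : TableEntity
{
    public string Name { get; set; }
}

[FunctionName("GetPeople")]
public static HttpResponseMessage Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestMessage req,
    [Table("TableName", Connection = "Default")] IQueryable<Person> inTable,
    TraceWriter log)
{
    // Filter on PartitionKey/RowKey where possible so the query runs server-side.
    var people = inTable.Where(p => p.PartitionKey == "People").ToList();
    return req.CreateResponse(HttpStatusCode.OK, people);
}
```

With the TableEntity constraint satisfied, the SDK can materialize rows into Person instances and the "can't be invoked" error should go away.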
