I am using an Azure Blob Trigger in my project to kick off processing of a file's content.
[FunctionName("FunctionImportCatalogue")]
public static void Run([BlobTrigger("importcontainer/{name}", Connection = "StorageConnection")]Stream myBlob, string name, TraceWriter log)
{}
Depending on where the code is published, the blob container should change accordingly. In other words, I want the "importcontainer" name to come from a config file. Can I do that?
As far as I know, you can configure it in local.settings.json.
Add the setting below to the Values section of that file; my sample container is named 'workitems'.
"importcontainer": "workitems"
Then change the code in the .cs file as shown below.
public static void Run([BlobTrigger("%importcontainer%/{name}", Connection = "StorageConnection")]Stream myBlob, string name, TraceWriter log)
When you publish the Function to Azure, you should also set importcontainer in the function app's Application settings in the portal, because that is the setting that will be used once deployed.
Run the function and add a blob to the container; it works fine on my side.
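For reference, a minimal local.settings.json along these lines should work locally; the connection string value is just a placeholder and the runtime/storage entries are the usual defaults, not taken from the question:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "StorageConnection": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net",
    "importcontainer": "workitems"
  }
}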
Related
This question is similar to Azure Blob Storage trigger Function not firing
However, their problem was that their Azure Function wasn't waking up immediately, giving the impression it wasn't processing triggers from Azure Blob Storage, when in fact it was processing them after about 10 minutes, exactly as the MS docs describe.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-trigger?tabs=csharp
My problem is different. My blob has been sitting in the container now for 9 hours and it still hasn't been processed.
All it does is post a message onto ServiceBus.
[FunctionName("IncomingFileDetected")]
[return: ServiceBus("incoming-file-received", EntityType = Microsoft.Azure.WebJobs.ServiceBus.EntityType.Topic)]
public static IncomingFile Run(
[BlobTrigger("incoming-files/{filename}", Connection = "ConnectionStrings:MutableStorage")]
Stream contents,
string filename,
ILogger log)
{
log.LogInformation($"Detected new blob file: {filename}");
return new IncomingFile(filename);
}
No messages have appeared in the service bus.
Now, after 9 hours, I have restarted the function app and the blob was processed within about 10 minutes.
Update:
Thanks to Peter Morris for sharing: the problem came from his service plan being D1. So first make sure you are on one of the three supported plans: Consumption plan, Premium plan, or App Service plan. When we use Azure Functions, even just for testing, we should use a Consumption plan. The smallest App Service tier used in production is S1, which is also what is normally used for testing.
Original Answer:
The code below works fine on my side, even on a Consumption plan.
using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
namespace FunctionApp35
{
public static class Function1
{
[FunctionName("Function1")]
[return: ServiceBus("test", Connection = "ServiceBusConnection")]
public static string Run([BlobTrigger("samples-workitems/{name}", Connection = "str")]Stream myBlob, string name, ILogger log)
{
log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
string a = "111111111111111";
return a;
}
}
}
This is my local settings:
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=0730bowmanwindow;AccountKey=lti/ThmF+mw9BebOacp9gVazIh76Q39ecikHSCkaTcGK5hmInspX+EkjzpNmvCPWsnvapWziHQHL+kKt2V+lZw==;EndpointSuffix=core.windows.net",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"str": "DefaultEndpointsProtocol=xxxxxx",
"ServiceBusConnection": "Endpoint=sb://bowmantestxxxxxx"
}
}
The str value is the storage account connection string (taken from the storage account's Access keys blade), and ServiceBusConnection is the Service Bus namespace connection string (taken from its Shared access policies blade).
Please also notice that the blob will not be removed from the container after the Azure Function is triggered. And don't forget to create at least one subscription in your Service Bus topic.
All of the above also works fine after the function is deployed to Azure. (The difference from local is that you need to add the settings in the portal's configuration settings instead of local.settings.json.)
I have an Azure storage account with a queue in it.
Now in my function (created in Visual Studio) I have this:
public IActionResult Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]
HttpRequest req,
[Queue("temperatuurmeting"), StorageAccount("AzureWebJobsStorage")]
ICollector<string> messages,
ILogger log)
{
    // ...function body omitted...
}
I host the code in BitBucket, and when I commit something it gets deployed to Azure, and it works.
But... the AzureWebJobsStorage connection string (automatically created when I created my function in Visual Studio) is in local.settings.json, which I just read should not be in the code repository.
But when I do not include local.settings.json in the code repo and then deploy to Azure, where should I put the AzureWebJobsStorage connection string so that my running function can find it?
Or is AzureWebJobsStorage a default name and already somewhere in Azure created by something other than me and can I just remove the local.settings.json?
Locally:
"Connection": "Endpoint=sb://****.servicebus.windows.net/;SharedAccessKeyName=****;SharedAccessKey=*********"
in local.settings.json
public static void Run([ServiceBusTrigger("attended", Connection = "Connection")]string myQueueItem, ILogger log)
in the function
When you publish, go to the portal, open your function app's settings (application settings / variables), and add the same variable you used locally.
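For what it's worth, both locally and in Azure these settings end up as environment variables at runtime, so a quick way to confirm that AzureWebJobsStorage is visible to the deployed function is a sketch like the following (the function name here is just an example, not from the question):
using System;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class CheckSettings
{
    [FunctionName("CheckSettings")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
        ILogger log)
    {
        // Locally this is read from the Values section of local.settings.json;
        // in Azure it is read from the function app's Application settings.
        string storage = Environment.GetEnvironmentVariable("AzureWebJobsStorage");

        bool configured = !string.IsNullOrEmpty(storage);
        log.LogInformation($"AzureWebJobsStorage configured: {configured}");
        return new OkObjectResult($"AzureWebJobsStorage configured: {configured}");
    }
}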
Creating Azure Functions targeting .Net Standard 2.0 using Visual Studio 2017.
Using the Add New Azure Function wizard, a blob trigger method is successfully created with the following method signature.
public static void Run([BlobTrigger("attachments-collection/{name}")] Stream myBlob, string name, ILogger log)
This method compiles and works fine.
However, we want to be able to access the metadata attached to the CloudBlockBlob being saved to storage, which as far as I know is not possible using a Stream. Other answers on this site, such as (Azure Function Blob Trigger CloudBlockBlob binding), suggest you can bind to a CloudBlockBlob instead of a Stream and access the metadata that way, but the suggested solution does not compile in the latest version of Azure Functions.
Microsoft's online documentation (https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#trigger---usage) also seems to confirm that it is possible to bind the trigger to a CloudBlockBlob rather than a Stream, but gives no example of the syntax.
Could someone please clarify the exact syntax required to enable Azure Function Blob storage trigger to bind to a CloudBlockBlob instead of the standard Stream?
Thanks
Thanks to Jerry Liu's insights, this problem has been solved.
Method:
Use the default storage package for Azure Storage that is installed when you create a new Function App
Microsoft.Azure.WebJobs.Extensions.Storage (3.0.1)
This installs the dependency
WindowsAzure.Storage (9.3.1)
Then both of the following method signatures will run correctly
public static async Task Run([BlobTrigger("samples-workitems/{name}")]Stream myBlob, string name, ILogger log)
and
public static async Task Run([BlobTrigger("samples-workitems/{name}")]CloudBlockBlob myBlob, string name, ILogger log)
Actually, CloudBlockBlob does work; we don't need FileAccess.ReadWrite because this is a BlobTrigger rather than a Blob input or output binding.
public static Task Run([BlobTrigger("samples-workitems/{name}")]CloudBlockBlob blob, string name, ILogger log)
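Since the original question was about reading metadata, here is a minimal sketch of how that could look once the trigger binds to CloudBlockBlob (the function name and container are just examples, and it assumes the WindowsAzure.Storage 9.3.1 types pulled in by the Storage extension):
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Blob;

public static class ReadBlobMetadata
{
    [FunctionName("ReadBlobMetadata")]
    public static async Task Run(
        [BlobTrigger("samples-workitems/{name}")] CloudBlockBlob myBlob,
        string name,
        ILogger log)
    {
        // Pull the blob's properties and user-defined metadata from the service.
        await myBlob.FetchAttributesAsync();

        log.LogInformation($"Blob {name} has {myBlob.Metadata.Count} metadata entries");
        foreach (var pair in myBlob.Metadata)
        {
            log.LogInformation($"  {pair.Key} = {pair.Value}");
        }
    }
}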
Update for "Can't bind BlobTrigger to CloudBlockBlob"
There's an issue tracking this: the Functions SDK has a problem integrating with WindowsAzure.Storage >= v9.3.2. So just remove any explicit WindowsAzure.Storage package reference; the Functions SDK references v9.3.1 internally by default.
Current Scenario:
We have created an Azure Function which generates a thumbnail of each image added to a container named testcontainer and saves it to another container named thumbnail-testcontainer. The problem is that we have to create thumbnail-testcontainer manually. We want this thumbnail container creation to be automated (using an Azure Function), as the number of containers in the Azure storage account is large.
Can this container creation step be automated using Azure Functions executed via triggers?
We took a look at https://www.michaelcrump.net/azure-tips-and-tricks75/ but it didn't help much.
Have a look at output bindings; we don't need to manually create the container used to save the output. Once we specify a container in the output binding path, the function will automatically create it if it doesn't exist.
[FunctionName("FunctionTest")]
public static void Run(
[BlobTrigger("one/{name}")]Stream myBlob,
[Blob("another/{name}", FileAccess.Write)]Stream anotherBlob,
string name, TraceWriter log)
{
myBlob.CopyTo(anotherBlob);
log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
}
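If you ever do need to create a container from code yourself (for example when the name is only known at runtime), a sketch like the following should work with the WindowsAzure.Storage SDK; the helper name is hypothetical and AzureWebJobsStorage is just the usual default setting, not something from the question:
using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class ContainerHelper
{
    // Creates the container if it does not already exist and returns a reference to it.
    public static async Task<CloudBlobContainer> EnsureContainerAsync(string containerName)
    {
        // Assumption: the storage connection string is available as the
        // AzureWebJobsStorage setting (environment variable at runtime).
        string connectionString = Environment.GetEnvironmentVariable("AzureWebJobsStorage");
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient client = account.CreateCloudBlobClient();

        CloudBlobContainer container = client.GetContainerReference(containerName);
        await container.CreateIfNotExistsAsync();
        return container;
    }
}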
I have an Azure WebJob that has a few blob-triggered functions in it. I uploaded it to Azure via the Add job dialog on the portal and set it to "Run Continuously". The expected behavior was that any time a blob is added to or modified in the containers specified in the blob triggers, the corresponding function would be called. This, however, does not happen.
The only way to trigger the functions (after having uploaded blobs) is to stop the WebJob and restart it.
Every time I restart the job the functions seem to be triggered, and triggered only once; any subsequent blob updates don't seem to trigger them again.
On the portal the WebJob shows as 'Running'; however, no functions get triggered after the initial one.
The main function for this web job looks like this :
static void Main()
{
var host = new JobHost();
host.RunAndBlock();
}
What could be the issue ?
The trigger functions are standard blob-triggered functions and do work the first time - hence I am not yet sharing that code.
UPDATE
The function signature looks like this
public static void UpdateData([BlobTrigger("inputcontainer/{env}-update-{name}")] Stream input, string name, string env, TextWriter logger)
public static void DeleteData([BlobTrigger("inputcontainer/{env}-delete-{name}")] Stream input, string name, string env, TextWriter logger)
Because of how the blob triggers are implemented, it can take up to 10 minutes for the function to be invoked.
If the function is not triggered even after 10 minutes, please share with us the function signature and the names of blobs that you are uploading.
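For completeness, here is a minimal Main sketch for the WebJobs SDK 2.x (assumed here, since the question constructs a JobHost directly). UseDevelopmentSettings only shortens the polling intervals for local testing and does not change the production blob scan described above:
using Microsoft.Azure.WebJobs;

class Program
{
    static void Main()
    {
        var config = new JobHostConfiguration();

        if (config.IsDevelopment)
        {
            // Shorter polling intervals, intended for local development only.
            config.UseDevelopmentSettings();
        }

        // RunAndBlock keeps the process alive so continuous blob triggers can fire.
        var host = new JobHost(config);
        host.RunAndBlock();
    }
}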