How to make Azure Function fire on queue change when using containers? - node.js

I've created an Azure Functions app and dockerized it. The container is running and doesn't give me any errors. Now I want this function app to be triggered by a storage queue insertion. I've read the Microsoft docs and added the trigger binding to my function.json file:
{
  "name": "myQueueItem",
  "type": "queueTrigger",
  "direction": "in",
  "queueName": "searches",
  "connection": "StorageAccountConnectionString"
}
StorageAccountConnectionString is an app setting which I added to the function app in the Azure portal under Configuration.
I verified the existence of the searches queue as well.
Now nothing seems to happen when I insert a message into the searches queue. The dequeue count doesn't increment, and the function execution counter doesn't change either.
Does anyone know how I can make this work?

If you are deploying the Docker image to the function app, you can set the environment variable in the Dockerfile itself.
Just add the following instruction to your Dockerfile:
ENV StorageAccountConnectionString="<Your Connection String>"
This way the variable is available inside the Docker container. My storage queue trigger started working after I added this instruction.
You can also refer to this article by Kevin Lee.
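As a quick sanity check (a sketch, not from the original answer), you can confirm the setting is actually visible inside the running container with a small Node.js one-off script:
// check-env.js (illustrative): run inside the container, e.g. `node check-env.js`.
const value = process.env.StorageAccountConnectionString;
console.log('StorageAccountConnectionString is ' + (value ? 'set' : 'NOT set'));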

I have no idea why, but it was fixed by generating the function with the VS Code Azure Functions extension. After that, the function picked up the queue changes.

Related

Azure Function App Service Bus subscription trigger not consuming messages

I implemented an Azure Function App with a Service Bus subscription trigger. It works well on my laptop when debugging from Visual Studio, getting triggered every time a message is pushed to the Service Bus topic. However, after deploying it to Azure, it is not triggered when a message is published to the topic.
After some debugging and research, I found that it was working locally because it was using an emulated storage account; in the cloud, however, it needs a real storage account. In my case, the problem was that the configuration settings were missing the storage account details. It needs either a connection string setting (if you are using SAS tokens) or, as in my case, the two following entries, since I use managed identities instead (why it needed both representations is still unclear to me):
{
  "name": "AzureWebJobsStorage:accountName",
  "value": "yourstorageaccountname",
  "slotSetting": false
},
{
  "name": "AzureWebJobsStorage__accountName",
  "value": "yourstorageaccountname",
  "slotSetting": false
}

Azure Function Blob trigger not responding to manual import

I am trying to get an Azure Function to trigger when a blob file is uploaded. The function was deployed from an Azure DevOps release.
Azure deploy steps (most relevant information shown below):
Storage account:
- has a folder [blob folder] where I upload files to.
Azure function code:
public static async Task Run(
    [BlobTrigger("[blob folder]/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob,
    string name,
    ILogger log)
{
    // any breakpoint here is never hit.
}
function.json:
{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.24",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "blobTrigger",
      "connection": "AzureWebJobsStorage",
      "path": "[blob folder]/{name}",
      "name": "myBlob"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/[dllname].dll",
  "entryPoint": "[namespace].[function].[command]"
}
The storage account and the function are part of the same resource group. The function app settings contain values for AzureWebJobsDashboard and AzureWebJobsStorage; I read in this post that these should be available in the function settings. I also turned off the "Always On" setting for the function, and made sure the function is running.
Running and debugging the function locally (with the Azure storage emulator and storage explorer) works fine. The function is triggered after I upload a file to the blob folder.
In Azure, it seems like nothing happens. I'm not super familiar with the Azure environment, so any help is appreciated.
The issue was caused by the FUNCTIONS_EXTENSION_VERSION setting in the application settings. I needed to update this setting to the correct version.
I managed to get debugging running by following Jason Robert's blog post and debugged the trigger event of my function.

Azure function ApiHubFileTrigger doesn't execute with consumption plan when not monitoring in portal

I'm using the apiHubFileTrigger with the OneDrive for Business connector (onedriveforbusiness_ONEDRIVEFORBUSINESS).
{
  "bindings": [
    {
      "type": "apiHubFileTrigger",
      "name": "myFile",
      "direction": "in",
      "path": "/InputFile/{file}",
      "connection": "onedriveforbusiness_ONEDRIVEFORBUSINESS"
    }
  ],
  "disabled": false
}
It works well while I am monitoring the script inside the Azure portal. But as soon as I close the editor and wait for some time, the function is no longer triggered by new files copied to OneDrive for Business. There is no error or anything in the invocation logs (no invocations at all).
The function is written in C#. It uses the input file and performs some operations based on it. Since it works while I am inside the portal monitoring it, the issue is not related to the code.
I'm running on the Consumption plan, so the problem has nothing to do with "Always On".
To summarize the discussion in the comments: there is an outstanding bug with Consumption plan functions where triggers sometimes get out of sync and, depending on how the app was published, don't fire unless the portal is open (details here: https://github.com/Azure/Azure-Functions/issues/210).
In those cases, hitting the "refresh" button next to the function app in the portal will sync the triggers.
In this particular case, there was an issue with parsing the storage account connection string. The workaround is to switch to an App Service plan function (making sure that "Always On" is enabled).

How to get all blobs (new or updated) using an Azure Functions timer trigger

I have a requirement where I need to get all blobs that were updated or added after a particular point in time.
Example: in a container I have a list of zip files as blobs, and I need to get all the updated or newly added blobs within a given interval, e.g. every hour I need to pick up all blobs added or updated since the last run.
I used an Azure Function with a timer trigger, but I was not able to get all the (updated or newly added) blobs.
Could anyone let me know how I can solve this problem?
function.json file
"bindings": [
{
"name": "myTimer",
"type": "timerTrigger",
"direction": "in",
"schedule": "0 */2 * * * *"
},
{
"type": "blob",
"name": "myBlob",
"path": "*****/******.zip",
"connection": "***************",
"direction": "in"
}
],
"disabled": false
}
index.js
module.exports = function (context, myTimer, myBlob) {
    // Parameter names here match the binding names (myTimer, myBlob) declared in function.json.
    context.log(myBlob);
    // it's also available on context.bindings
    context.log(context.bindings.myBlob); // will log the same thing as above
    context.done();
};
Thanks in Advance.
Ideally, Functions work best when used in a reactive way, i.e. when the function runs on the blob change event directly (or via Event Grid).
If you have to stick to a timer and then find all changed blobs, Azure Functions bindings won't help you. In this case, remove the input binding you were trying to declare and search for changed blobs with the Blob Storage API. I believe the Azure Storage SDK for Node.js supports listing blobs, but I haven't used it; a rough sketch is shown below.
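A rough sketch of this approach (not from the original answer), assuming the legacy azure-storage npm package, a container named "mycontainer", and the AzureWebJobsStorage connection string; in practice you would persist the last run time rather than hard-coding a one-hour window:
// Sketch only: list blobs and keep the ones modified since a cutoff time.
const azure = require('azure-storage'); // legacy SDK, assumed for illustration

module.exports = function (context, myTimer) {
    const blobService = azure.createBlobService(process.env.AzureWebJobsStorage);
    const cutoff = Date.now() - 60 * 60 * 1000; // e.g. blobs changed in the last hour

    // Reads only the first segment for brevity; use result.continuationToken to page.
    blobService.listBlobsSegmented('mycontainer', null, function (error, result) {
        if (error) {
            context.log.error(error);
            return context.done(error);
        }
        const changed = result.entries.filter(function (blob) {
            return new Date(blob.lastModified).getTime() > cutoff;
        });
        changed.forEach(function (blob) {
            context.log('Changed blob: ' + blob.name);
        });
        context.done();
    });
};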
Your scenario is also a good candidate for an Azure Event Grid (now in preview) solution with an event-driven blob storage publisher. More details here.
Basically, there is no limit on the number of containers, blob storage accounts, or Azure subscriptions. If your blob storage is subscribed for the events of interest, such as a blob being created or deleted, the filtered event message is delivered to the subscriber, for instance an EventGridTrigger function (a minimal sketch of such a function is shown below).
[Screenshot: an example of event-driven blob storages.]
[Screenshot: function logs showing the event message received when a blob was deleted.]
Note that the event message sent by the blob storage publisher can be filtered in the subscription based on the subject and/or eventType properties. In other words, each subscription tells Event Grid which events from the source it is interested in.
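As an illustration (a sketch, not from the original answer; the binding name eventGridEvent and the use of JavaScript are assumptions), an Event Grid-triggered function reacting to blob events could look like this:
// Sketch only: requires an "eventGridTrigger" binding named eventGridEvent in function.json.
module.exports = function (context, eventGridEvent) {
    // eventType is e.g. "Microsoft.Storage.BlobCreated" or "Microsoft.Storage.BlobDeleted".
    context.log('Event type:', eventGridEvent.eventType);
    context.log('Blob URL:', eventGridEvent.data && eventGridEvent.data.url);
    context.done();
};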
For streaming and analyzing events, an Event Hub can also be a subscriber of Event Grid:
[Screenshot: Event Grid delivering events to an Event Hub.]
All events of interest are ingested into the Event Hub, which represents the entry point of the stream pipeline. The stream of events, such as messages for created/deleted blobs across accounts and/or Azure subscriptions, is analyzed by a Stream Analytics (ASA) job based on your needs. The output of the ASA job can then trigger an Azure Function to fulfil the business requirements.
More details about Event Hubs as a destination for Event Grid are here.

How to write an Azure Function that gets triggered when an item appears in an Azure Service Bus queue using C#?

The new Azure Functions preview contains a few templates for C#, but there is no Service Bus queue template for C#. There is a trigger template for Node with Service Bus, but on closer inspection it only supports Notification Hubs and not Service Bus queues. Is it even possible to write an Azure Function that is triggered only when an item appears in an Azure Service Bus queue? If it is not possible now, will there be such a template in the near future?
Thanks.
Raghu/..
Update: the steps and information below still hold; however, we now have a "ServiceBusQueueTrigger - C#" template live in the portal, so the workaround steps are no longer necessary :)
ServiceBus IS already supported for C#; we just need to add a template for it (we'll add one very soon). In general, templates are just starting points - you can always modify them by adding additional bindings, or start from the empty template and build up your own Function.
Until we get the template uploaded, you can get this working yourself by starting from the C# Empty template. For example, you can enter binding info like the following in the Advanced Editor on the Integrate tab:
{
  "bindings": [
    {
      "type": "serviceBusTrigger",
      "name": "message",
      "direction": "in",
      "queueName": "samples-input",
      "connection": "myServiceBus"
    }
  ]
}
Make sure your Function App has an AppSetting matching the name of the connection property, containing your ServiceBus connection string. It looks like we currently have some issues with the connection string picker for ServiceBus (which will also be fixed very soon), but you can use "Function app settings"/"Go to App Service Settings"/"Application Settings" to add this app setting. Then you can use the corresponding Function code:
using System;
using Microsoft.Azure.WebJobs.Host;

public static void Run(string message, TraceWriter log)
{
    log.Verbose($"C# ServiceBus Queue function processed message: {message}");
}
This function will then be invoked whenever new messages are added to the ServiceBus queue samples-input.
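To exercise the trigger end to end, you can drop a test message into the queue. A sketch (not from the original answer) using the legacy azure-sb npm package; the package choice and reading the connection string from an environment variable named myServiceBus are assumptions:
// Sketch only: send one test message to the samples-input queue.
const azureSb = require('azure-sb');
const serviceBus = azureSb.createServiceBusService(process.env.myServiceBus);

serviceBus.sendQueueMessage('samples-input', { body: 'hello from a test sender' }, function (error) {
    if (error) {
        console.error('Failed to send message:', error);
    } else {
        console.log('Test message sent to samples-input');
    }
});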
Per https://azure.microsoft.com/en-us/documentation/articles/functions-reference/, there is no binding for Service Bus.
Rather than building something that may already be in the works within the product group, the best approach is to submit your idea on UserVoice: https://feedback.azure.com/forums/355860-azure-functions
