I am following along with this guide: https://learn.microsoft.com/en-us/azure/storage/queues/storage-dotnet-how-to-use-queues and attempting to create a simple queue in a timer-triggered function. It is not recognizing CloudStorageAccount, CloudConfigurationManager, CloudQueueClient, etc.
Here is my run.csx file
using Microsoft.Azure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;
using System;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    // Retrieve the storage account from the connection string.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
        CloudConfigurationManager.GetSetting("StorageConnectionString"));

    // Create the queue client.
    CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

    // Retrieve a reference to a queue.
    CloudQueue queue = queueClient.GetQueueReference("myqueue");

    // Create the queue if it doesn't already exist.
    queue.CreateIfNotExists();
}
Here is my project.json file:
{
  "frameworks": {
    "net45": {
      "dependencies": {
        "Microsoft.WindowsAzure.ConfigurationManager": "3.2.3",
        "Microsoft.WindowsAzure.Storage": "8.0.0"
      },
      "frameworks": {
        "netcoreapp1.0": {
          "dependencies": {
            "Microsoft.NETCore.App": {
              "type": "platform",
              "version": "1.0.0"
            }
          },
          "imports": "dnxcore50"
        }
      }
    }
  }
}
The package Microsoft.WindowsAzure.Storage is referenced by Azure Functions itself by default. Remove the whole project.json file and add this line to the top of your function:
#r "Microsoft.WindowsAzure.Storage"
But you might not even need that. Azure Functions has a higher-level API for working with Storage Queues, both for sending (output bindings) and receiving (triggers). Refer to Azure Queue storage bindings for Azure Functions.
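For C# script (run.csx), a queue output binding is declared in function.json rather than created in code. A minimal sketch follows; the queue name, schedule, and binding names are illustrative placeholders, not taken from your project:

```json
{
  "bindings": [
    {
      "type": "timerTrigger",
      "name": "myTimer",
      "schedule": "0 */5 * * * *",
      "direction": "in"
    },
    {
      "type": "queue",
      "name": "outputQueueItem",
      "queueName": "myqueue",
      "connection": "AzureWebJobsStorage",
      "direction": "out"
    }
  ]
}
```

The run.csx body then just assigns to the output parameter, e.g. `public static void Run(TimerInfo myTimer, out string outputQueueItem, TraceWriter log) { outputQueueItem = "hello"; }` — the runtime creates the queue if it doesn't exist, so no CloudQueueClient code is needed.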
One more piece of advice: it's preferable to use precompiled functions deployed as class libraries, compiled with Visual Studio or VS Code. That makes it much easier to manage dependencies and troubleshoot.
I've tried to build a solution in which I have multiple function apps deployed (in multiple Azure regions) which all share the same task hub storage account (set via host.json). My idea was that since each of them should listen for new work for the activity functions, the load would get somewhat distributed. But that's not happening, at least with what I have tried so far. It looks like the function app for the activity functions is already determined by which orchestrator is picked. (I had to deploy the orchestrator together with the activity functions or else it wouldn't even kick off.)
So my question is: Would such a scenario be possible to achieve using Durable Functions?
host.json
{
  "version": "2.0",
  "extensions": {
    "durableTask": {
      "hubName": "MyTaskHub",
      "storageProvider": {
        "connectionStringName": "DurableStorage",
        "useAppLease": false
      }
    }
  }
}
Thank you Anand Sowmithiran. Posting your suggestion as an answer so that it will be helpful for other community members who face similar issues.
Now my idea was that since each of them should listen to new work for the Activity Functions, the load should get somewhat distributed. But that's not happening - at least with what I have tried so far. It looks like the Function App for the Activity Functions is already determined by which Orchestrator is picked.
As per the Microsoft documentation, the task hub should be different for each durable function app, i.e., each app should have its own task hub. Below is a sample task hub configuration:
{
  "version": "2.0",
  "extensions": {
    "durableTask": {
      "hubName": "%MyTaskHub%"
    }
  }
}
In addition to host.json, task hub names can also be configured in orchestration client binding metadata.
[FunctionName("HttpStart")]
public static async Task<HttpResponseMessage> Run(
    [HttpTrigger(AuthorizationLevel.Function, methods: "post", Route = "orchestrators/{functionName}")] HttpRequestMessage req,
    [DurableClient(TaskHub = "%MyTaskHub%")] IDurableOrchestrationClient starter,
    string functionName,
    ILogger log)
{
    // Function input comes from the request content.
    object eventData = await req.Content.ReadAsAsync<object>();
    string instanceId = await starter.StartNewAsync(functionName, eventData);
    log.LogInformation($"Started orchestration with ID = '{instanceId}'.");
    return starter.CreateCheckStatusResponse(req, instanceId);
}
I'm developing an application where IoT devices are connected to Azure IoT Hub and their real-time data can be viewed on the web. However, I'm facing an error: I'm trying to bind the data from an Azure Function to SignalR, but when I run the application I receive the following error message.
The listener for function 'SignalR' was unable to start. Microsoft.Azure.EventHubs.Processor: Encountered error while fetching the list of EventHub PartitionIds. System.Private.CoreLib: The link address '$management' did not match any of the expected formats.
Error Description Image
I've tried everything to fix it but failed every time. I'd really appreciate it if someone would help me find a solution to this problem.
Here is the script I'm using from this link
Here is my SignalR.cs class
public static class SignalR
{
    [FunctionName("SignalR")]
    public static async Task Run(
        [IoTHubTrigger("messages/events", Connection = "IoTHubTriggerConnection", ConsumerGroup = "$Default")] EventData message,
        [SignalR(HubName = "broadcast")] IAsyncCollector<SignalRMessage> signalRMessages,
        ILogger log)
    {
        var deviceData = JsonConvert.DeserializeObject<DeviceData>(Encoding.UTF8.GetString(message.Body.Array));
        deviceData.DeviceId = Convert.ToString(message.SystemProperties["iothub-connection-device-id"]);
        log.LogInformation($"C# IoT Hub trigger function processed a message: {JsonConvert.SerializeObject(deviceData)}");
        await signalRMessages.AddAsync(new SignalRMessage()
        {
            Target = "notify",
            Arguments = new[] { JsonConvert.SerializeObject(deviceData) }
        });
    }
}
Here is my SignalRConnection.cs class
public static class SignalRConnection
{
    [FunctionName("SignalRConnection")]
    public static SignalRConnectionInfo Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = null)] HttpRequest req,
        [SignalRConnectionInfo(HubName = "broadcast")] SignalRConnectionInfo info,
        ILogger log) => info;
}
Here is my local.settings.json file
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "AzureSignalRConnectionString": "",
    "MSDEPLOY_RENAME_LOCKED_FILES": 1,
    "IoTHubTriggerConnection": ""
  },
  "Host": {
    "LocalHttpPort": 7071,
    "CORS": "*"
  }
}
For IoTHubTriggerConnection, I'm using the connection string of iothubjohnsoncontrol (displayed in the image below).
IoT Hub Keys Image
For AzureSignalRConnectionString, I'm using the connection string of signalrjohnsoncontrol (displayed in the image below).
SignalR Keys Image
Could you please check whether you have used the Event Hub-compatible name and the Event Hub-compatible connection string from here?
Please try replacing messages/events with the Event Hub-compatible name, and set IoTHubTriggerConnection to the Event Hub-compatible endpoint from the portal.
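For reference, the Event Hub-compatible connection string (found in the portal under IoT Hub → Built-in endpoints) has roughly the following shape; the angle-bracket values below are placeholders you would fill in from your own hub:

```
Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=iothubowner;SharedAccessKey=<key>;EntityPath=<event-hub-compatible-name>
```

The `EntityPath` segment is the Event Hub-compatible name that would replace `messages/events` in the trigger attribute.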
A similar discussion here:
https://github.com/Azure/azure-event-hubs-dotnet/issues/103
I have a similar use case pushing IoT data to Azure Data Explorer, and this is what my function looks like:
IoT Hub connection string, which is Event Hub-compatible:
Hope this helps.
I have an Azure Function which is triggered when a zip file is uploaded to an Azure Blob Storage container. I unzip the file in memory, process the contents, and add/update the results in a database. While for the DB part I can use the in-memory DB option, I'm not sure how to simulate the blob trigger for unit testing this Azure Function.
All the official samples and some blogs mostly talk about HTTP triggers (mocking HttpRequest) and queue triggers (using IAsyncCollector).
[FunctionName("AzureBlobTrigger")]
public void Run([BlobTrigger("logprocessing/{name}", Connection = "AzureWebJobsStorage")] Stream blobStream, string name, ILogger log)
{
    log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {blobStream.Length} Bytes");
    // processing logic
}
There is a project on GitHub with unit and integration tests for Azure Functions, including a blob trigger; please give it a try. Note that the unit test code is in the FunctionApp.Tests folder.
A code snippet for the blob trigger from GitHub:
Unit test code of BlobFunction.cs:
namespace FunctionApp.Tests
{
    public class BlobFunction : FunctionTest
    {
        [Fact]
        public async Task BlobFunction_ValidStreamAndName()
        {
            Stream s = new MemoryStream();
            using (StreamWriter sw = new StreamWriter(s))
            {
                await sw.WriteLineAsync("This is a test");
                await sw.FlushAsync();
                s.Position = 0; // rewind so the function reads the written content
                BlobTrigger.Run(s, "testBlob", log);
            }
        }
    }
}
I have an existing function app to which I need to deploy a serviceBusTrigger function after a user action in the application. To do this, I have been following the post Deploy Azure function from code (c#).
The function app is on Azure Functions version 2.0.
I am currently creating a zip containing <functionName>/function.json that is posted to the api/zip endpoint using the following method:
public void CreateAzureFunctionToMonitorQueue(string functionName, string serviceBusQueueName, string path)
{
    // Create zip of the new function.
    // ZipFile.CreateFromDirectory will create a zip with the directory's name containing the directory's contents,
    // so CreateZipOfFunc creates a structure <functionName>/<functionName>/function.json
    // so that the zip contains <functionName>/function.json,
    // which is the required input for api/zip/ from the Kudu API.
    var zipFile = CreateZipOfFunc(functionName, serviceBusQueueName, path);
    var file = File.ReadAllBytes(zipFile);
    MemoryStream stream = new MemoryStream(file);
    using (var client = new HttpClient())
    {
        // Deploy the zip using the Kudu REST API.
        client.DefaultRequestHeaders.Add("Authorization", "Basic " + _base64Auth);
        var baseUrl = new Uri($"https://{_webFunctionAppName}.scm.azurewebsites.net/");
        var requestURl = baseUrl + "api/zip/site/wwwroot";
        var httpContent = new StreamContent(stream);
        var response = client.PutAsync(requestURl, httpContent).Result;
    }
    // Remove files.
    Directory.Delete($"{path}{functionName}", true);
    File.Delete($"{path}{functionName}.zip");
    // Deployment using the Kudu REST API requires the function triggers to be manually synced.
    SyncTriggers();
}
At the end, I call SyncTriggers, which manually syncs the function app's triggers, because I read that deploying this way requires it for all triggers except HTTP (see https://learn.microsoft.com/en-us/azure/azure-functions/functions-deployment-technologies). I am using the second method of manually syncing triggers. Below is the method:
public void SyncTriggers()
{
    using (var client = new HttpClient())
    {
        var requestUrl = $"https://{_webFunctionAppName}.azurewebsites.net/admin/host/synctriggers?code={_MASTER_KEY}";
        var httpContent = new StringContent("");
        var response = client.PostAsync(requestUrl, httpContent).Result;
    }
}
Both requests are successful, and when I look in the Azure portal, the new function is there with a function.json file that matches a working serviceBusTrigger deployed using Visual Studio web deploy.
For testing this, I first disable the working azure function, run the above code and then push a new message to the monitored queue; however, nothing happens when the message becomes active.
If I enable the function deployed using VS that already existed, that function will fire and handle the message.
Existing function's function.json file looks like the following
{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.26",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "serviceBusTrigger",
      "connection": "ServiceBusConnection",
      "queueName": "myqueue",
      "name": "queueItem"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/MyProject.MyLibrary.dll",
  "entryPoint": "MyProject.MyLibrary.MyClass.RunAsync"
}
The function deployed using the above method has a function.json like the following
{
  "generatedBy": "Microsoft.NET.Sdk.Functions - 1.0.26",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "serviceBusTrigger",
      "connection": "ServiceBusConnection",
      "queueName": "myqueue",
      "name": "queueItem"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/MyProject.MyLibrary.dll",
  "entryPoint": "MyProject.MyLibrary.MyClass.RunAsync"
}
Am I missing something? My use case is: after I create a new queue, I want to create a monitoring Azure Function to listen to it. Creating the queue using NamespaceManager works fine, and pushing messages to it works fine as well. I just can't seem to get this test case working where I am creating a function to monitor an existing queue.
Possibly the function is not registered and properly set up when I call SyncTriggers?
Thanks
Edit: I just saw this post https://blogs.msdn.microsoft.com/benjaminperkins/2018/08/07/why-does-my-azure-function-sometimes-stop-being-triggered/
which says:
Your endpoint must trigger/bind to only one Azure Function
Does that mean that I can only have one function registered to an entry point in my uploaded dll? Should I instead upload a copy of the entry point method I want, with a different name, as a .csx alongside the function.json file?
Edit2: That seems to be about binding an Azure Function to a resource; it does not help here.
Edit3: After a lot of research, it seems that the configurationSource for my generated function.json should not be "attributes" but instead "config". Going to test this now.
Edit4: The issue was fixed by removing generatedBy and changing configurationSource to "config" in the function.json file I am generating.
Edit5: removed misleading questions
The issue is that functions deployed using VS have two properties in function.json:
"generatedBy": "Microsoft.NET.Sdk.Functions - 1.0.26",
"configurationSource": "attributes",
These really just tell Azure where the function was generated from, and to use the attributes of the method that is the function's entry point as the configuration source. Removing generatedBy allows one to edit the function in the Azure portal, and changing configurationSource to "config" tells the Azure Functions runtime to use the function.json file for the binding configuration.
However, in the azure portal, I am getting a message
Error:
Function (MyFunction) Error: Configuration error: all functions in D:\home\site\wwwroot\bin\MyProject.MyLibrary.dll must have the same value for 'configurationSource'.
Session Id: MY_SESSION_ID
Timestamp: 2019-06-30T21:04:22.709Z
My deployed function still worked, and other existing functions with HTTP bindings that the deployed function calls still work (they have function.json files with generatedBy Sdk.Functions and configurationSource: "attributes"), so this does not seem to be a make-or-break error at the moment; however, I will be changing my deployment so that all functions have configurationSource: "config".
So changing the output function.json to:
{
  "configurationSource": "config",
  "bindings": [
    {
      "type": "serviceBusTrigger",
      "connection": "ServiceBusConnection",
      "queueName": "myqueue",
      "name": "queueItem"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/MyProject.MyLibrary.dll",
  "entryPoint": "MyProject.MyLibrary.MyClass.RunAsync"
}
resolved my issue
I am trying to upgrade my DocumentDB NuGet package from 1.13 to 1.18.
I am facing an issue upgrading my Azure Function, which has a DocumentClient binding.
In DocumentDB 1.13 the binding section did not take {Id} as a binding parameter and created the DocumentClient object perfectly, whereas DocumentDB 1.18 needs {Id} as a binding parameter [which I don't want, as I want to iterate through all documents in the collection].
My project.json binding before 1.18 was:
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Dynamitey": "1.0.2",
        "Microsoft.Azure.DocumentDB": "1.13.0",
        "Microsoft.Azure.WebJobs.Extensions.DocumentDB": "1.0.0"
      }
    }
  }
}
my local.settings.json had only
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=xxxxx/xxxxx==;EndpointSuffix=core.windows.net",
    "AzureWebJobsDashboard": "",
    "AzureWebJobsDocumentDBConnectionString": "AccountEndpoint=xxxxx/;AccountKey=xxxx==;"
  }
}
and my azure function looks like
[FunctionName("DeleteAVFeedAuditData")]
public static async Task Run([TimerTrigger("0 0/1 * * * *")] TimerInfo myTimer, [DocumentDB] DocumentClient client,
    TraceWriter log)
{
    var c = client;
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
    var value = ConfigurationManager.AppSettings["AVAuditFlushAfterDays"];
    var collectionUri = UriFactory.CreateDocumentCollectionUri("AVFeedAudit", "AuditRecords");
    //var documents = client.CreateDocumentQuery(collectionUri, "Select * from c where c.EndedAt");
    //foreach (Document d in documents)
    //{
    //    await client.DeleteDocumentAsync(d.SelfLink);
    //}
}
Now, when running the Azure Function with the updated DocumentDB 1.18 package, it asks to bind {Id}, which returns only the single document with the specified id, whereas my requirement is the same as with DocumentDB 1.13.
Please tell me how I can get all documents bound to my DocumentClient with the new updated package.
According to your description, I checked and reproduced this issue as follows:
Please tell how can i get the entire documents binded with my DocumentClient with the new updated package.
Based on your scenario, I would recommend you construct the DocumentClient by yourself instead of using the binding to DocumentClient for a workaround to achieve your purpose.
DocumentClient client = new DocumentClient(new Uri("https://<your-account-name>.documents.azure.com:443/"), "<your-account-key>");
And you could configure the service endpoint and account key in your local.settings.json file, just like the app setting AzureWebJobsStorage. Then you could use the following code to retrieve your setting value:
ConfigurationManager.AppSettings["your-appsetting-key"];
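For example, local.settings.json could carry the two settings next to AzureWebJobsStorage; the key names DocumentDbEndpoint and DocumentDbKey below are illustrative, not required by the SDK:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "DocumentDbEndpoint": "https://<your-account-name>.documents.azure.com:443/",
    "DocumentDbKey": "<your-account-key>"
  }
}
```

The client would then be constructed as `new DocumentClient(new Uri(ConfigurationManager.AppSettings["DocumentDbEndpoint"]), ConfigurationManager.AppSettings["DocumentDbKey"])`.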
Moreover, here is an issue about constructing the DocumentClient from the connection string that you could refer to.
UPDATE:
For 1.18, the following code could work as expected:
[FunctionName("Function1")]
public static void Run([TimerTrigger("*/10 * * * * *")] TimerInfo myTimer, [DocumentDB("brucedb01", "brucecoll01", ConnectionStringSetting = "AzureWebJobsDocumentDBConnectionString")] IEnumerable<dynamic> documents, TraceWriter log)
{
    foreach (JObject doc in documents)
    {
        //doc.SelectToken("_self").Value<string>();
        log.Info(doc.ToString());
    }
}