Azure Function (Service Bus Trigger) not starting when a new message arrives in the Service Bus queue

I created an Azure Function with a Service Bus trigger in Visual Studio and published it to Azure from Visual Studio.
The function runs fine locally when I run it manually after a message lands in the queue, but the expectation is that it should trigger automatically whenever a message arrives.
I add a new message manually and watch the logs to see whether the function triggers automatically, but it does not. When I checked Application Insights I found the error log below:
The listener for function 'ProcessVideos' was unable to start. Service Bus account connection string 'connection' does not exist. Make sure that it is a defined App Setting.
Code for local.settings.json, where the Service Bus connection string is set:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "connection": "Endpoint=sb://videoupload10000.servicebus.windows.net/;SharedAccessKeyName=Listen;SharedAccessKey=80n8a0MCmh+3UZN4+4B7gDy4gp3hKCxfDI/9urDmaP8=;"
  }
}
Code for the actual function:
using System;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace ReceiveMessages
{
    public static class Process
    {
        private static string blob_connection_string = "DefaultEndpointsProtocol=https;AccountName=videostorage1000;AccountKey=y6CVtXafqKuShZuv6BMbVj9DrymzVdNDpjDVxp6hZMvuRRjcCz/i8TrOGfM5T/JCvfG33sY3xqqW+ASt3p6V+Q==;EndpointSuffix=core.windows.net";
        private static string source_container_name = "unprocessed";
        private static string destination_container_name = "processed";
        private static readonly string _connection_string = "AccountEndpoint=https://videodbupdate.documents.azure.com:443/;AccountKey=gmR051bG7uq7o2i519m7J9nh6tb4LLctfOQ3nPMUxMu9QJWsmh1SPiY8ylvxoY3bn7kWR4cS2qwanBdIoXSrpg==;";
        private static readonly string _database_name = "appdb";
        private static readonly string _container_name = "video";

        [FunctionName("ProcessVideos")]
        public static async Task Run([ServiceBusTrigger("videoqueue", Connection = "connection")] ServiceBusReceivedMessage myQueueItem, ILogger log)
        {
            ReceivedMessage _message = JsonSerializer.Deserialize<ReceivedMessage>(Encoding.UTF8.GetString(myQueueItem.Body));
            BlobServiceClient _client = new BlobServiceClient(blob_connection_string);
            BlobContainerClient _source_container_client = _client.GetBlobContainerClient(source_container_name);
            BlobClient _source_blob_client = _source_container_client.GetBlobClient(_message.VideoName);
            BlobContainerClient _destination_container_client = _client.GetBlobContainerClient(destination_container_name);
            BlobClient _destination_blob_client = _destination_container_client.GetBlobClient(_message.VideoName);
            CosmosClient _cosmosclient = new CosmosClient(_connection_string, new CosmosClientOptions());
            Container _container = _cosmosclient.GetContainer(_database_name, _container_name);
            BlobDownloadInfo _info = _source_blob_client.Download();
            // Copy the blob to the destination container
            await _destination_blob_client.StartCopyFromUriAsync(_source_blob_client.Uri);
            log.LogInformation(_info.Details.LastModified.ToString());
            log.LogInformation(_info.ContentLength.ToString());
            BlobDetails _blobdetails = new BlobDetails();
            _blobdetails.BlobName = _message.VideoName;
            _blobdetails.BlobLocation = "https://videostorage100.blob.core.windows.net/processed/" + _message.VideoName;
            _blobdetails.ContentLength = _info.ContentLength.ToString();
            _blobdetails.LastModified = _info.Details.LastModified.ToString();
            _blobdetails.id = Guid.NewGuid().ToString();
            //_container.CreateItemAsync(_blobdetails, new PartitionKey(_message.VideoName)).GetAwaiter().GetResult();
            // await _container.CreateItemAsync(_blobdetails, new PartitionKey(_message.VideoName));
            Console.WriteLine("Item created");
            // Delete the blob from the unprocessed container
            _source_blob_client.Delete();
            // Add the details of the blob to an Azure Cosmos DB account
        }
    }
}

The local settings are not uploaded to the cloud. To add your connection string you need to do the following: go to your Function App in Azure and select "Configuration" under "Settings" in the left-side menu. On this screen, click the "+ New application setting" button. When the popup opens, add "connection" as the name and your connection string as the value. Click "OK", then on the next screen click "Save" to save and apply the settings. Hope this helps.
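If you prefer to script this instead of clicking through the portal, the same app setting can be created with the Azure CLI. This is a minimal sketch; the Function App name, resource group, and connection string are placeholders you would substitute with your own:

# Hypothetical resource names; replace with your own Function App,
# resource group, and Service Bus connection string.
az functionapp config appsettings set \
  --name MyFunctionApp \
  --resource-group MyResourceGroup \
  --settings "connection=Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=Listen;SharedAccessKey=<key>"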

For a Python project, the connection value in function.json needs to refer to the key in local.settings.json. It should be similar for you:
function.json:
"connection": "AzureWebJobsMyServiceBus"
local.settings.json:
"AzureWebJobsMyServiceBus": "Endpoint=sb://..."

Related

When deploying an Azure Function to Azure, how does it know to read AzureFunctionSettings from App Settings instead of from local.settings.json?

I created my first Azure Function, which integrates with a SharePoint Online list, using these main steps:
1. I created an Azure AD app with a self-signed certificate to authorize my Azure Function.
2. I created a new Azure Function project using Visual Studio 2019. Here are the main components:
Function.cs:
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
using PnP.Core.Services;
using PnP.Core.Model.SharePoint;
using System.Collections.Generic;

namespace FunctionApp1
{
    public class Function1
    {
        private readonly IPnPContextFactory pnpContextFactory;

        public Function1(IPnPContextFactory pnpContextFactory)
        {
            this.pnpContextFactory = pnpContextFactory;
        }

        [FunctionName("Function1")]
        public void Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, ILogger log)
        {
            log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
            using (var context = pnpContextFactory.Create("Default"))
            {
                var myList = context.Web.Lists.GetByTitle("SubFolders");
                Dictionary<string, object> values = new Dictionary<string, object>
                {
                    { "Title", System.DateTime.Now }
                };
                // Use the AddBatch method to add the request to the current batch
                myList.Items.AddBatch(values);
                context.Execute();
            }
        }
    }
}
Startup.cs:
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using PnP.Core.Auth;
using System.Security.Cryptography.X509Certificates;

[assembly: FunctionsStartup(typeof(FunctionApp1.Startup))]

namespace FunctionApp1
{
    class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            var config = builder.GetContext().Configuration;
            var azureFunctionSettings = new AzureFunctionSettings();
            config.Bind(azureFunctionSettings);
            builder.Services.AddPnPCore(options =>
            {
                options.DisableTelemetry = true;
                var authProvider = new X509CertificateAuthenticationProvider(
                    azureFunctionSettings.ClientId,
                    azureFunctionSettings.TenantId,
                    StoreName.My,
                    StoreLocation.CurrentUser,
                    azureFunctionSettings.CertificateThumbprint);
                options.DefaultAuthenticationProvider = authProvider;
                options.Sites.Add("Default", new PnP.Core.Services.Builder.Configuration.PnPCoreSiteOptions
                {
                    SiteUrl = azureFunctionSettings.SiteUrl,
                    AuthenticationProvider = authProvider
                });
            });
        }
    }
}
local.settings.json:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "SiteUrl": "https://***.sharepoint.com/",
    "TenantId": "0b***",
    "ClientId": "92***",
    "CertificateThumbPrint": "EB***",
    "WEBSITE_LOAD_CERTIFICATES": "EB***"
  }
}
Then I deployed it to Azure and it works well: every 5 minutes it adds a new list item.
But what I am unable to understand is this: when I test the function locally, it reads its settings from the local.settings.json file, but after deploying it to Azure it starts reading its settings from the online Azure App Settings. How does it do this behind the scenes?
This is by design.
App settings in a function app contain configuration options that affect all functions for that function app. When you run locally, these settings are accessed as local environment variables.
and
You can use application settings to override host.json setting values without having to change the host.json file itself. This is helpful for scenarios where you need to configure or modify specific host.json settings for a specific environment. This also lets you change host.json settings without having to republish your project.
Taken from App settings reference for Azure Functions.
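The AzureFunctionSettings class that Startup.cs binds is not shown in the question. A minimal sketch of what it presumably looks like, with property names matching the keys used in the Startup code and in local.settings.json:

using System;

namespace FunctionApp1
{
    // Hypothetical POCO: config.Bind(azureFunctionSettings) fills these properties
    // from whichever configuration source is active, i.e. local.settings.json when
    // running locally, App Settings (surfaced as environment variables) in Azure.
    public class AzureFunctionSettings
    {
        public string SiteUrl { get; set; }
        public string TenantId { get; set; }
        public string ClientId { get; set; }
        public string CertificateThumbprint { get; set; }
    }
}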

Adding Custom Dimension to Request Telemetry - Azure functions

I am creating a new Function App using v2.x and integrating Application Insights for request logging, which happens automatically now that Azure Functions is integrated with App Insights (as mentioned in the documentation link). What I need to do is log a few custom fields in the custom dimensions of the Application Insights request telemetry. Is this possible without using custom request logging (the TrackRequest method)?
About adding custom properties, you could refer to this tutorial: Add properties: ITelemetryInitializer. Below is my test of an HTTP trigger function.
using System.IO;
using System.Threading.Tasks;
using Microsoft.ApplicationInsights;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class Function1
{
    private static string key = "Your InstrumentationKey";
    private static TelemetryClient telemetry = new TelemetryClient() { InstrumentationKey = key };

    [FunctionName("Function1")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");
        if (!telemetry.Context.Properties.ContainsKey("Function_appName"))
        {
            telemetry.Context.Properties.Add("Function_appName", "testfunc");
        }
        else
        {
            telemetry.Context.Properties["Function_appName"] = "testfunc";
        }
        telemetry.TrackEvent("eventtest");
        telemetry.TrackTrace("tracetest");
        string name = req.Query["name"];
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(requestBody);
        name = name ?? data?.name;
        return name != null
            ? (ActionResult)new OkObjectResult($"Hello, {name}")
            : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
    }
}
After running this function, go to Application Insights Search to check the data, or go to Logs (Analytics).
Update:
You should use an ITelemetryInitializer (which can add a custom dimension to a specific telemetry type, e.g. only to requests) in your Function App. Please follow the steps below:
1. In Visual Studio, create a Function App (in my test, I created a blob-triggered function), and install the following NuGet packages:
Microsoft.ApplicationInsights, version 2.10.0
Microsoft.NET.Sdk.Functions, version 1.0.29
2. Then in Function1.cs, write code like below:
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using System.IO;

[assembly: WebJobsStartup(typeof(FunctionApp21.MyStartup))]

namespace FunctionApp21
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static void Run([BlobTrigger("samples-workitems/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob, string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
        }
    }

    internal class MyTelemetryInitializer : ITelemetryInitializer
    {
        public void Initialize(ITelemetry telemetry)
        {
            // Check that the telemetry is RequestTelemetry so the property is only added to requests
            if (telemetry != null && telemetry is RequestTelemetry && !telemetry.Context.GlobalProperties.ContainsKey("my_custom_dimen22"))
            {
                telemetry.Context.GlobalProperties.Add("my_custom_dimen22", "Hello, this is custom dimension for request!!!");
            }
        }
    }

    public class MyStartup : IWebJobsStartup
    {
        public void Configure(IWebJobsBuilder builder)
        {
            builder.Services.AddSingleton<ITelemetryInitializer, MyTelemetryInitializer>();
        }
    }
}
3. Publish it to Azure, then navigate to the Azure portal -> the published Function App -> Monitor -> add an Application Insights resource.
4. Run the function from Azure, wait a few minutes, then navigate to the Application Insights portal and check the telemetry data. You can see the custom dimension is added only to request telemetry.
The other solutions don't quite answer the question of how to add custom properties to the request telemetry. There is a very simple solution: add the following within your function's code:
Activity.Current?.AddTag("my_prop", "my_value");
You'll need:
using System.Diagnostics;
This can then be dynamic per function invocation/request, rather than a fixed global property.
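Put together, a minimal sketch of this approach inside an HTTP-triggered function (the function name and tag values are just examples):

using System.Diagnostics;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class TagExample
{
    [FunctionName("TagExample")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req,
        ILogger log)
    {
        // Activity.Current is the ambient activity for this invocation;
        // tags added here surface as custom dimensions on the request telemetry.
        Activity.Current?.AddTag("my_prop", "my_value");
        return new OkObjectResult("tagged");
    }
}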

How to get a notification when a WebJob is aborted in Azure

Azure WebJob: how can I be notified if it is aborted?
(1) "Always On" is enabled for the service.
(2) SCM_COMMAND_IDLE_TIMEOUT = 2000.
WEBJOBS_IDLE_TIMEOUT = 2000.
But as I'm new to this, can you please help me with where I can put the logic?
You could add the logic in the Functions.cs file. For more information you could refer to the detailed steps below.
Steps:
1. Follow the official document to create a WebJob project.
2. Add Functions.cs to the project:
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions;
using SendGrid;

public class Functions
{
    // Demo WebJob trigger
    public static void ProcessQueueMessage([QueueTrigger("queue")] string message, TextWriter log)
    {
        log.WriteLine(message);
    }

    // Error monitor
    public static void ErrorMonitor([ErrorTrigger("0:30:00", 10, Throttle = "1:00:00")] TraceFilter filter, [SendGrid] SendGridMessage message)
    {
        message.Subject = "WebJobs Error Alert";
        message.Text = filter.GetDetailedMessage(5);
    }
}
3. If you want to use ErrorTrigger and SendGrid, you need to configure them in the Program.cs file:
static void Main()
{
    var config = new JobHostConfiguration();
    if (config.IsDevelopment)
    {
        config.UseDevelopmentSettings();
    }
    config.UseCore();
    config.UseSendGrid(new SendGridConfiguration
    {
        ApiKey = "xxxxx",
        FromAddress = new Email("emailaddress", "name"),
        ToAddress = new Email("emailaddress", "name")
    });
    var host = new JobHost(config);
    // The following code ensures that the WebJob will be running continuously
    host.RunAndBlock();
}
4. If you want to test it locally, you need to add the Storage connection string to the connectionStrings section of App.config:
<connectionStrings>
  <add name="AzureWebJobsStorage" connectionString="{storage connection string}" />
</connectionStrings>

Verify or debug an Azure Hybrid Connection

I've got an Azure Function running with a Hybrid Connection to an on-premises server. Everything works nicely. In the future we will have multiple hybrid connections in multiple locations, and it's possible they might have the same server name and port combination.
Is there a way to verify (or debug) the Service Bus namespace or UserMetadata properties of the hybrid connection being used, before the SQL operation is executed?
Here is run.csx from the function app:
#r "System.Data"
using System.Net;
using System.Collections.Generic;
using System.Configuration;
using Newtonsoft.Json;
using System.Data.SqlClient;
using System.Text;
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
string connectionString = "Server=MyServerName;Initial Catalog=BillingDB;";
string queryString = "select top 2 * from Billing";
List<Billing> bills = new List<Billing>();
SqlConnection conn = new SqlConnection(connectionString);
conn.Open();
/***************
Here is where I would like to be able to verify/validate
the namespace or UserMetadata of the hybrid connection so that I'm sure
we're connecting to the desired database. "Server" in the connection string
is not enough in our case.
******************/
SqlCommand DBCmd = new SqlCommand(queryString, conn);
SqlDataReader myDataReader;
myDataReader = DBCmd.ExecuteReader();
while (myDataReader.Read())
{
bills.Add(new Billing
{
Student_ID = Convert.ToInt32(myDataReader[0]),
Transaction_Number = Convert.ToInt32(myDataReader[1]),
Log = myDataReader[2].ToString(),
Amount_Owed = Convert.ToDecimal(myDataReader[3])
}
);
}
myDataReader.Close();
conn.Close();
var json = JsonConvert.SerializeObject(bills);
log.Info("json: " + json);
return req.CreateResponse(HttpStatusCode.OK,json);
}
public class Billing {
public int Student_ID { get; set; }
public int Transaction_Number { get; set; }
public string Log { get; set; }
public decimal Amount_Owed { get; set; }
}
I was eventually able to solve this by making a GET request to the Azure Resource Manager REST API (clicking on "Resource Manager" in the Application Settings provides the URL to call as well as the expected response body). In addition, an Azure Active Directory application needs to be created in order to acquire a token to access the resources.
https://management.azure.com/subscriptions/{subscriptionid}/resourceGroups/{resourcegroupname}/providers/Microsoft.Web/sites/{functionname}/hybridConnectionRelays?api-version=2016-08-01
This returns a JSON object listing the properties of the Hybrid Connections that are 'Connected' to the individual application/Function App.
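A minimal sketch of that call from C#, assuming you have already acquired a bearer token for https://management.azure.com (the token acquisition itself is omitted, and the method name is just an example):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class HybridConnectionCheck
{
    public static async Task<string> GetHybridConnectionsAsync(
        string subscriptionId, string resourceGroup, string functionAppName, string accessToken)
    {
        var url = $"https://management.azure.com/subscriptions/{subscriptionId}" +
                  $"/resourceGroups/{resourceGroup}/providers/Microsoft.Web/sites/{functionAppName}" +
                  "/hybridConnectionRelays?api-version=2016-08-01";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);
            // The response JSON includes the relay namespace and UserMetadata
            // for each hybrid connection attached to the app.
            var response = await client.GetAsync(url);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}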

Azure Functions and DocumentDB triggers

Is it possible to specify that DocumentDB should fire triggers when writing to DocumentDB?
I have an Azure Function that pulls JSON messages off a Service Bus queue and puts them into DocumentDB like so:
using System;
using System.Threading.Tasks;

public static string Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    return myQueueItem;
}
This inserts new documents into the database as they are added to the Service Bus queue, but I need DocumentDB to process them as they are added and add attachments. This cannot be done in the present setup, so I would like to tell DocumentDB to fire a trigger.
I have tried something like this:
using System;
using System.Threading.Tasks;

public static string Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    return "x-ms-documentdb-post-trigger-include: addDocument\n" + myQueueItem;
}
It doesn't work and gives me errors like this:
Exception while executing function: Functions.ServiceBusQueueTriggerCSharp1. Microsoft.Azure.WebJobs.Host: Error while handling parameter _return after function returned:. Newtonsoft.Json: Unexpected character encountered while parsing value: x. Path '', line 0, position 0.
I like this setup because I can saturate the queue with requests to add records and they simply buffer until the database can deal with them, which handles spikes in demand. It allows data to be offloaded from the client machine as fast as the network can carry it, and then the queue/database combination catches up when demand drops again.
You could refer to the following code sample to create a document with the trigger enabled in Azure Functions:
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static void Run(string myQueueItem, TraceWriter log)
{
    string EndpointUri = "https://{documentdb account name}.documents.azure.com:443/";
    string PrimaryKey = "{PrimaryKey}";
    DocumentClient client = new DocumentClient(new Uri(EndpointUri), PrimaryKey);
    client.CreateDocumentAsync(UriFactory.CreateDocumentCollectionUri("{databaseid}", "{collectionid}"),
        new MyChunk { MyProperty = "hello" },
        new RequestOptions
        {
            PreTriggerInclude = new List<string> { "YourTriggerName" },
        }).Wait();
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
}

public class MyChunk
{
    public string MyProperty { get; set; }
}
Note: to use the Microsoft.Azure.DocumentDB NuGet package in a C# function, you need to upload a project.json file to the function's folder in the Function App's file system.
project.json:
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.Azure.DocumentDB": "1.13.1"
      }
    }
  }
}
Besides, please make sure you have created the triggers in your DocumentDB account. For details about creating triggers, please refer to this article.
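If the trigger does not exist yet, it can also be registered from C# with the same SDK. This is a minimal sketch: the trigger name matches the YourTriggerName placeholder above, while the trigger body is a hypothetical JavaScript stub (DocumentDB server-side triggers are written in JavaScript):

using System;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

public static class TriggerSetup
{
    public static void CreateTrigger(DocumentClient client)
    {
        var collectionUri = UriFactory.CreateDocumentCollectionUri("{databaseid}", "{collectionid}");
        // Hypothetical pre-trigger; the Body is JavaScript executed server-side
        // before each document create operation that includes this trigger.
        var trigger = new Trigger
        {
            Id = "YourTriggerName",
            TriggerType = TriggerType.Pre,
            TriggerOperation = TriggerOperation.Create,
            Body = "function() { /* your server-side logic here */ }"
        };
        client.CreateTriggerAsync(collectionUri, trigger).Wait();
    }
}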
