Azure Insights: tracking a custom property through the chain of function executions

I have an Azure Function1 -> Function2 -> Service flow of calls in my Azure app. There are multiple concurrent calls to Function1, and each can be identified by a unique input Document Id. I wonder how, in C# code, I can set something in the Azure Insights context to that document id at the beginning of Function1, so that any [traces], [exceptions] or [dependencies] logged to Azure Insights in any of the follow-up calls contain the document id. I noticed all of them have a customDimensions nested list of properties, so maybe I can somehow add one more property there. Also, if Function1 runs multiple times in parallel, I do not want these document ids to get mixed up.
The goal is to be able to track this document id in all kinds of logs with a minimal amount of additional C# code, and to avoid passing the document id from function to function and on to other services, so that looking at any type of log (whether it's traces, exceptions or something else) I can immediately identify the document the execution belonged to. Is it possible?

Attaching a custom property to all logs within an Azure Function is not that difficult; you can simply use a telemetry initializer to do it:
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

public class TelemetryEnrichment : ITelemetryInitializer
{
    public void Initialize(ITelemetry telemetry)
    {
        if (!(telemetry is ISupportProperties item)) return;

        // Demonstrate a static property
        item.Properties["Environment"] = "Production";
    }
}
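For the initializer to take effect it has to be registered with the Application Insights pipeline. In an in-process Azure Functions project that is typically done through dependency injection in a Startup class; a minimal sketch, assuming the Microsoft.Azure.Functions.Extensions package is referenced and MyApp is your namespace:

using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(MyApp.Startup))]

namespace MyApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // Applied to every telemetry item the Functions host sends to Application Insights
            builder.Services.AddSingleton<ITelemetryInitializer, TelemetryEnrichment>();
        }
    }
}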
If it is an HTTP-triggered function, you can enrich the request telemetry like this:
var requestTelemetry = req.HttpContext.Features.Get<RequestTelemetry>();
requestTelemetry.Properties.Add("aProp", "aValue");
You also want the property to be logged by the other functions that are called from the entry function. That is not easily done: you will need to pass the id to the other function manually, for example as part of the url you call it with.
However, once the id is attached to the logs of the entry function, you can easily write a query that correlates the logs. Based on the operation id you can reconstruct the whole communication flow between the functions and services; see the Application Insights docs on telemetry correlation.
That way, you do not need to include the id as a custom property on each and every telemetry item.
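If you do want the document id stamped on every telemetry item emitted inside Function1 itself, without threading it through every method call, one option is to keep it in an AsyncLocal<string> and read it from a telemetry initializer; the ambient value follows the async call chain, so parallel invocations don't see each other's ids. A rough sketch; the DocumentIdContext class and the "DocumentId" property name are made up for illustration, and the initializer is registered the same way as above:

using System.Threading;
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

// Hypothetical ambient holder for the current document id
public static class DocumentIdContext
{
    private static readonly AsyncLocal<string> _current = new AsyncLocal<string>();

    public static string Current
    {
        get => _current.Value;
        set => _current.Value = value;
    }
}

public class DocumentIdTelemetryInitializer : ITelemetryInitializer
{
    public void Initialize(ITelemetry telemetry)
    {
        var documentId = DocumentIdContext.Current;
        if (string.IsNullOrEmpty(documentId)) return;

        if (telemetry is ISupportProperties item)
            item.Properties["DocumentId"] = documentId;
    }
}

// At the start of Function1:
// DocumentIdContext.Current = documentId;

Note that this only covers telemetry produced within the same invocation's async flow; Function2 runs as a separate invocation, so there you still need to pass the id along or fall back on operation id correlation as described above.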

Related

ServiceStack: business logic that depends on the database itself

I'm exploring ServiceStack and I'm not sure what is the best way to implement some business logic.
Using the "Bookings CRUD" example I would like to enforce the following rule:
a given Booking can only be saved (either created or updated) if the hotel has enough free rooms for the particular dates of that booking
Please note that I'm not asking how to calculate "free rooms".
What I'm asking is, from the architectural point of view, how should this be done.
For example, one way would be:
create a request DTO to query the number of configured rooms (let's call it "QueryRooms")
use the existing "QueryBookings" to query current bookings present in database
create a " : Service" class to customize the Booking Service, in order to intercept the "CreateBooking" and "UpdateBooking" requests
inside the custom methods for "CreateBooking" and "UpdateBooking", somehow get the results of "QueryRooms" and "QueryBookings", check if there are enough free rooms for the current request, and proceed only if so
This doesn't look very clean, because the "CreateBooking" and "UpdateBooking" services would depend on "QueryRooms" and "QueryBookings".
What would be an elegant and efficient solution, using ServiceStack?
You can override AutoQuery CRUD operations with your own Service implementation that accepts the AutoQuery DTO.
There you can use the Service Gateway to call existing Services, which lets you perform any additional validation and modify the request DTO before executing the AutoQuery operation that implements the API, e.g.:
public class MyCrudServices : Service
{
    public IAutoQueryDb AutoQuery { get; set; }

    public object Post(CreateBooking request)
    {
        // Use the in-process Service Gateway to check availability first
        var response = Gateway.Send(new QueryRooms
        {
            From = request.BookingStartDate,
            To = request.BookingEndDate,
        });

        if (response.Results.Count == 0)
            throw new Exception("No rooms available during those dates");

        request.RoomNumber = response.Results[0].Id;

        // Let AutoQuery perform the actual insert
        return AutoQuery.Create(request, base.Request);
    }
}
Note: calling in-process Services with the Service Gateway is efficient as it calls the C# method implementation directly, i.e. without incurring any HTTP overhead.
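UpdateBooking can be intercepted in the same way, with the method sitting alongside Post(CreateBooking) in the same MyCrudServices class. A sketch under the assumption that UpdateBooking also carries the booking dates and is an IUpdateDb DTO (if it is an IPatchDb DTO, use Patch() and AutoQuery.Patch instead):

public object Put(UpdateBooking request)
{
    // Re-run the availability check before letting AutoQuery apply the update
    var response = Gateway.Send(new QueryRooms
    {
        From = request.BookingStartDate,
        To = request.BookingEndDate,
    });

    if (response.Results.Count == 0)
        throw new Exception("No rooms available during those dates");

    return AutoQuery.Update(request, base.Request);
}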

Why are all activities showing Unknown for the User/Actor .NET Server API w/React-JS client components

I am working on a proof of concept with GetStream.io, using the .NET server-side API to add activities and the react-js client components to render feeds. For some reason every activity comes into my feeds with Unknown in bold at the top; I assume this is supposed to be the username or something. I read a post about passing in a reference to the user instead of the string userId, but the .NET API constructor for creating a new Activity only takes a string userId parameter. I have verified that I am passing in a valid userId. Any suggestions on what I am doing wrong here?
Stream stores the unique reference and replaces it at read time. In some complex cases, you need to be able to generate a reference to an existing object and embed that inside of an activity.
So can you try it this way:
// Since we know their IDs we can create a reference without reading from the APIs
var userRef = Users.Ref(userId);

// And then add an activity with that reference as the actor
var activity = new Activity(userRef, activityAction, message);
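From there the activity is added to a feed as usual; a minimal sketch with stream-net, where client stands for your configured StreamClient instance and the "user" feed group is an assumption about your setup:

// At read time Stream resolves the user reference into the full user object,
// so the react-js components can show the real name instead of Unknown
var userFeed = client.Feed("user", userId);
await userFeed.AddActivity(activity);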

Timer based Azure function with Table storage, HTTP request, and Azure Service Bus

I currently have a process, written as a console application, that fires on a scheduled task. It reads data from Azure Table storage and, based on that data, makes API calls to a third-party vendor we use, deserializes the response data, loops over an array in the results, saves each iteration of the loop into a different table in Azure Table storage, and then publishes a message for each iteration to Azure Service Bus, where those messages are consumed by another client.
In an effort to get more of our tasks into the cloud, I've done some research and it seems that an Azure function would be a good candidate to replace my console application. I spun up a new Azure function project in Visual Studio 2019 as a "timer" function and then dove into some reading where I got lost really fast.
The reading I've done talks about using "bindings" in my Run() method arguments, decorated with attributes for connection strings etc., but I'm not sure that is the direction I should be heading. It sounds like that would make authentication to my table storage easier, but I can't figure out how to use those "hooks" to query my table and then perform inserts. I haven't even gotten to the Service Bus part yet, nor looked into making HTTP calls to our third-party vendor's API.
I know this is a very broad question and I don't have any code to post because I'm having a tough time even getting out of the starting blocks with this. The MS documentation is all over the map and I can't find anything specific to my needs and I promise I've spent a fair bit of time trying.
Are Azure functions even the right path I should be travelling? If not, what other options are out there?
TIA
You should stick with Azure Functions and a Timer trigger to replace your console app.
The bindings (which can be used for input/output) are helpers that save you some lines of code. For example, rather than using the following code to insert data into an Azure table:
// Retrieve storage account information from the connection string
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnectionString);

// Create a table client for interacting with the table service
CloudTableClient tableClient = storageAccount.CreateCloudTableClient(new TableClientConfiguration());

// Get a reference to the target table
CloudTable table = tableClient.GetTableReference("MyTable");

// Populate an entity (it must implement ITableEntity, e.g. by deriving from TableEntity)
var entity = new MyPoco { PartitionKey = "Http", RowKey = Guid.NewGuid().ToString(), Text = input.Text };

// Create the InsertOrMerge table operation
TableOperation insertOrMergeOperation = TableOperation.InsertOrMerge(entity);

// Execute the operation
TableResult result = await table.ExecuteAsync(insertOrMergeOperation);
you would use:
[FunctionName("TableOutput")]
[return: Table("MyTable")]
public static MyPoco TableOutput([HttpTrigger] dynamic input, ILogger log)
{
log.LogInformation($"C# http trigger function processed: {input.Text}");
return new MyPoco { PartitionKey = "Http", RowKey = Guid.NewGuid().ToString(), Text = input.Text };
}
PS: the trigger in the previous code is an HTTP trigger, but it was only used to show how an output binding works.
You can find more information here:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-table
and you should also work through: https://learn.microsoft.com/en-us/learn/modules/chain-azure-functions-data-using-bindings/
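Putting the pieces together for the scenario in the question (Timer trigger, Table storage in and out, Service Bus out), a rough sketch of an in-process function could look like the following. The table names, queue name, connection setting name and the SourceEntity/ResultEntity types are all placeholders, and the exact table SDK namespace depends on the version of the storage extension you install:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos.Table;   // CloudTable/TableEntity types used by the Table binding (4.x storage extension)
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class SourceEntity : TableEntity { public string Payload { get; set; } }
public class ResultEntity : TableEntity { public string Payload { get; set; } }

public static class ScheduledSync
{
    [FunctionName("ScheduledSync")]
    public static async Task Run(
        [TimerTrigger("0 */5 * * * *")] TimerInfo timer,                              // fires every 5 minutes
        [Table("SourceTable")] CloudTable sourceTable,                                // input: table to read from
        [Table("ResultTable")] IAsyncCollector<ResultEntity> results,                 // output: rows to insert
        [ServiceBus("result-queue", Connection = "ServiceBusConnection")]
            IAsyncCollector<string> messages,                                         // output: Service Bus messages
        ILogger log)
    {
        // Read the source rows (first segment only here; loop the continuation token in real code)
        var segment = await sourceTable.ExecuteQuerySegmentedAsync(new TableQuery<SourceEntity>(), null);

        foreach (var row in segment)
        {
            // Call the third-party API with HttpClient here, deserialize the response,
            // then persist each item and publish a message for the downstream consumer
            var entity = new ResultEntity
            {
                PartitionKey = "Sync",
                RowKey = Guid.NewGuid().ToString(),
                Payload = row.Payload
            };

            await results.AddAsync(entity);
            await messages.AddAsync($"Processed {entity.RowKey}");
        }

        log.LogInformation("Sync run complete");
    }
}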

Azure Functions - How to change the Invocation ID within the function?

I have a series of Azure Functions, and I'd like to keep track of them by the InvocationId. In Application Insights, the InvocationId is called the operation_Id.
What I'm trying to do is set the operation_Id to be the same across several different Azure Functions.
I can read this property inside the Azure Function when I pass in ExecutionContext by following this answer, but I can't seem to alter it. Is there some way to change this value from inside the Azure Function?
public static class TestOperationId
{
    [FunctionName("TestOperationId")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequestMessage req,
        ILogger log,
        ExecutionContext exeCtx)
    {
        var input = await req.Content.ReadAsStringAsync();
        log.LogInformation(input);

        // This is the part that does not take effect: trying to overwrite the invocation id
        exeCtx.InvocationId = Guid.Parse(input);

        return req.CreateResponse(HttpStatusCode.OK);
    }
}
The InvocationId property is defined as:
Provides the invocation ID, uniquely identifying the current invocation
Azure Functions doesn't allow changing this, as that would let code override the platform's way of detecting unique invocations of Functions, which would interfere with things like billing and metrics for the platform.
It sounds like what you really want is cross-function correlation. There is work being done with the Application Insights team to help support this, but in the meantime, you can see solutions that others are currently utilizing, like here.
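One workaround people use in the meantime is to leave InvocationId alone and instead start an Application Insights operation explicitly, reusing an operation id supplied by the caller. A sketch, assuming a TelemetryClient instance is available to the function and that the callers agree on how the shared id is passed (e.g. a header or the request body):

using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;

// sharedOperationId is the value you want to see as operation_Id in Application Insights,
// e.g. read from a header or the request body sent by the upstream function
public static void DoCorrelatedWork(TelemetryClient telemetryClient, string sharedOperationId)
{
    using (var operation = telemetryClient.StartOperation<RequestTelemetry>(
        "TestOperationId", operationId: sharedOperationId))
    {
        // Telemetry emitted inside this scope is tagged with the supplied operation id
        telemetryClient.TrackTrace("Doing correlated work");
    }
}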

Extending log4net - Adding additional data to each log

We're working on logging in our applications, using log4net. We'd like to capture certain information automatically with every call; the code calling log.Info or log.Warn should call them normally, without specifying this information.
I'm looking for a way to create something we can plug into log4net, something between the ILog that applications use to log and the appenders, so that we can put this information into the log message somehow, either into the ThreadContext or the LogEventInfo.
The information we're looking to capture is ASP.NET related: the request url, user agent, etc. There's also some information from the app's .config file we want to include (an application id).
I want to get in between the normal ILog.Info and the appenders so that this information is also automatically included for third-party libraries that use log4net (NHibernate, NServiceBus, etc.).
Any suggestions on where the extensibility point I'm after would be?
Thanks
What you are looking for is called log event context. This tutorial explains how it works:
http://www.beefycode.com/post/Log4Net-Tutorial-pt-6-Log-Event-Context.aspx
In particular, the chapter 'Calculated Context Values' will be interesting for you.
Update:
My idea was to use the global context. It is easy to see how this works for something like an application ID (in fact, for that you do not even need a calculated context object). Dynamic information like the request url could be handled like this:
using System.Web;

public class RequestUrlContext
{
    public override string ToString()
    {
        // Evaluated lazily whenever log4net renders the property,
        // on the thread that is handling the request
        var url = HttpContext.Current?.Request?.Url?.ToString() ?? string.Empty;
        return url;
    }
}
The object is global but the method is called on the thread that handles the request and thus you get the correct information. I also recommend that you create one class per "information entity" so that you have a lot of flexibility with the output in the log destination.
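Wiring this up is then a matter of registering the calculated objects with log4net's GlobalContext once at startup and referencing them from the appender's pattern layout. A small sketch; the property names and the "ApplicationId" app setting are just examples:

// At application startup (e.g. Global.asax Application_Start)
log4net.GlobalContext.Properties["requestUrl"] = new RequestUrlContext();
log4net.GlobalContext.Properties["appId"] =
    System.Configuration.ConfigurationManager.AppSettings["ApplicationId"];

// In the appender's layout, emit the values with the %property pattern, e.g.:
// <conversionPattern value="%date %-5level %logger - %property{appId} %property{requestUrl} - %message%newline" />

Because log4net resolves the stored object via ToString() when the event is rendered, the calculated value reflects the request being handled at that moment, which is what makes this work for third-party libraries as well.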
