I have the following code:
public static void Main()
{
    _servicesBusConnectionString = ConfigurationManager.AppSettings["Microsoft.ServiceBus.ConnectionString"];
    _namespaceManager = NamespaceManager.CreateFromConnectionString(_servicesBusConnectionString);
    _applicationInsightsInstrumentationKey = ConfigurationManager.AppSettings["appInsightsInstrumentationKey"];

    JobHostConfiguration config = new JobHostConfiguration();
    ServiceBusConfiguration serviceBusConfig = new ServiceBusConfiguration
    {
        ConnectionString = _servicesBusConnectionString
    };
    config.UseServiceBus(serviceBusConfig);

    config.LoggerFactory = new LoggerFactory()
        .AddApplicationInsights(_applicationInsightsInstrumentationKey, null)
        .AddConsole();
    config.Tracing.ConsoleLevel = TraceLevel.Off;

    var host = new JobHost(config);
    host.RunAndBlock();
}
and in the function:
public static async Task ProcessMessages([ServiceBusTrigger(ServiceBusQueueNames.SomeQueueName)] BrokeredMessage brokeredMessage, TextWriter log)
{
    try
    {
        _log = log;
        _log.WriteLine("WebJob started processing of a message");
        await _log.FlushAsync();
    }
    // rest of the method omitted
It logs the message to the console, but not to Application Insights.
The instrumentation key is set properly.
Can you please tell me why it does not log to Application Insights?
I wrote this code according to https://github.com/Azure/azure-webjobs-sdk/wiki/Application-Insights-Integration
First thing to check: is your instrumentation key valid at runtime?
Then, if the ikey is what you expect it to be, is it for the resource you expect (you aren't looking at a dev Application Insights instance while the telemetry goes to a prod instance)?
If you debug this locally, do you see Application Insights assemblies getting loaded?
Do you see console output from Application Insights while the debugger is running?
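One more thing worth trying (this is my assumption, not something spelled out on the wiki page linked above): with the LoggerFactory-based integration, it is logging done through an ILogger parameter bound by the WebJobs SDK that flows to Application Insights, while the TextWriter parameter feeds the console/dashboard. A minimal sketch of the signature change:
// Sketch only: same ServiceBusTrigger binding as in the question,
// but taking an ILogger (Microsoft.Extensions.Logging) instead of TextWriter.
public static Task ProcessMessages(
    [ServiceBusTrigger(ServiceBusQueueNames.SomeQueueName)] BrokeredMessage brokeredMessage,
    ILogger logger)
{
    logger.LogInformation("WebJob started processing of a message");
    return Task.CompletedTask;
}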
After searching a lot on Google and trying to find what the problem could be, I logged an issue in the GitHub repo from which I had read about the Serilog implementation in a .NET Core function app: https://github.com/serilog/serilog-sinks-applicationinsights/issues/179
Serilog is not logging the complete message in Azure Application Insights, and I have no idea what the reason could be. On the console, however, it logs the complete message. Below is the code snippet from Startup.cs:
public override void Configure(IFunctionsHostBuilder builder)
{
    var logger = ConfigureLogging();
    builder.Services.AddLogging(lb => lb.AddSerilog(logger));
}

private Logger ConfigureLogging()
{
    var telemetryConfiguration = TelemetryConfiguration.CreateDefault();
    telemetryConfiguration.InstrumentationKey =
        Environment.GetEnvironmentVariable("APPINSIGHTS_INSTRUMENTATIONKEY");

    int defaultLoggingSwitch = 3; // Warning
    int tloggingSwitch = 3;       // Warning
    int tSloggingSwitch = 3;      // Warning
    Int32.TryParse(Environment.GetEnvironmentVariable("DefaultLogging"), out defaultLoggingSwitch);
    Int32.TryParse(Environment.GetEnvironmentVariable("TMPLoggingSwitch"), out tloggingSwitch);
    Int32.TryParse(Environment.GetEnvironmentVariable("TESLoggingSwitch"), out tSloggingSwitch);

    LoggingLevelSwitch SeriLogLevelSwitch = new LoggingLevelSwitch((LogEventLevel)defaultLoggingSwitch);
    LoggingLevelSwitch TMPLoggingSwitch = new LoggingLevelSwitch((LogEventLevel)tloggingSwitch);
    LoggingLevelSwitch TESLoggingSwitch = new LoggingLevelSwitch((LogEventLevel)tSloggingSwitch);

    var logger = new LoggerConfiguration()
        .MinimumLevel.ControlledBy(SeriLogLevelSwitch)
        .MinimumLevel.Override("ClassName", TMPLoggingSwitch)
        .MinimumLevel.Override("IEventsService", TESLoggingSwitch)
        .Enrich.FromLogContext()
        .WriteTo.ApplicationInsights(telemetryConfiguration, TelemetryConverter.Events)
        .CreateLogger();
    return logger;
}
It is consumed in an Event Hub based function app as shown below.
The logger is injected in the function app class:
public EventHubProcessing(ITypeService teService, IConfiguration configuration, IServiceScopeFactory serviceScopeFactory, ILogger<ISampleClass> logger)
{
    log = logger;
}
The Run method:
public async Task Run([EventHubTrigger("%EVENTHUB-RECIEVE%", Connection = "EVENTHUB-RECIEVE-CONN", ConsumerGroup = "%ConsumerGroup%")] EventData[] events, Microsoft.Azure.WebJobs.ExecutionContext executionContext, CancellationToken cancellationToken)
{
    foreach (var eventData in events)
    {
        string json = Encoding.UTF8.GetString(eventData.Body.Array, eventData.Body.Offset, eventData.Body.Count);
        log.LogInformation($"Event Hub trigger function processed a message: {json}");
    }
}
Below are the NuGet package versions: [screenshot of Serilog NuGet package versions]
Please let me know if anything else is required.
There are two places in Azure Application Insights that show the log message. In one place it was not showing the complete message, which I think is only supposed to show the initial part of the message, but in the other field the complete message was shown. So it is not an issue at all.
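As an aside (an editorial note, not part of the original resolution): which field carries the full text also depends on the telemetry converter passed to the Serilog sink. The configuration above uses TelemetryConverter.Events, which emits custom events; TelemetryConverter.Traces emits traces whose message field carries the rendered log text, roughly like this:
// Sketch: same sink as above, but emitting traces instead of custom events.
var logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .WriteTo.ApplicationInsights(telemetryConfiguration, TelemetryConverter.Traces)
    .CreateLogger();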
I am working on a .NET Core 2.2 console application that uses Microsoft.Extensions.Logging and is configured to send logs to Azure Application Insights using Microsoft.ApplicationInsights.Extensibility:
services.AddSingleton(x =>
    new TelemetryClient(
        new TelemetryConfiguration
        {
            InstrumentationKey = "xxxx"
        }));
...
var loggerFactory = serviceProvider.GetService<ILoggerFactory>();
loggerFactory.AddApplicationInsights(serviceProvider, logLevel);
It works OK: I can read the logs in Application Insights. But the application can be started simultaneously in several instances (in different Docker containers). How can I distinguish traces from different instances? I could use the source FileName, but I don't know how to inject it.
I tried to use Scope:
var logger = loggerFactory.CreateLogger<Worker>();
logger.BeginScope(dto.FileName);
logger.LogInformation($"Start logging.");
It's interesting that my configuration is almost identical to the one in this example: https://github.com/MicrosoftDocs/azure-docs/issues/12673
But in my case I can't see the property "FileName" in Application Insights.
For a console project, if you want to use a custom ITelemetryInitializer, you should register it like this: configuration.TelemetryInitializers.Add(new CustomInitializer());
The official doc is here.
I tested it on my side, and it works; the role name can be set.
Sample code is below:
static void Main(string[] args)
{
    TelemetryConfiguration configuration = TelemetryConfiguration.CreateDefault();
    configuration.InstrumentationKey = "xxxxx";
    configuration.TelemetryInitializers.Add(new CustomInitializer());
    var client = new TelemetryClient(configuration);

    ServiceCollection services = new ServiceCollection();
    services.AddSingleton(x => client);
    var provider = services.BuildServiceProvider();

    var loggerFactory = new LoggerFactory();
    loggerFactory.AddApplicationInsights(provider, LogLevel.Information);
    var logger = loggerFactory.CreateLogger<Program>();
    logger.LogInformation("a test message 111...");

    Console.WriteLine("Hello World!");
    Console.ReadLine();
}
Check the role name in the Azure portal:
If you really have no other way to distinguish them, you can use a custom telemetry initializer like this:
public class CustomInitializer : ITelemetryInitializer
{
    public void Initialize(ITelemetry telemetry)
    {
        telemetry.Context.Cloud.RoleName = Environment.MachineName;
    }
}
and/or you can add a custom property:
public class CustomInitializer : ITelemetryInitializer
{
    public void Initialize(ITelemetry telemetry)
    {
        if (telemetry is ISupportProperties)
        {
            ((ISupportProperties)telemetry).Properties["MyIdentifier"] = Environment.MachineName;
        }
    }
}
In this example I used Environment.MachineName, but you can of course use something else if needed, like that work Id parameter of yours.
Then wire it up using:
services.AddSingleton<ITelemetryInitializer, CustomInitializer>();
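A related option (an aside, not from the answer above): properties supplied as message-template arguments through Microsoft.Extensions.Logging also surface as custom properties on the trace, which is another way to attach something like the FileName from the question:
// Sketch: the {FileName} placeholder becomes a custom property on the trace item
// when the Application Insights logger provider is wired up.
logger.LogInformation("Start processing {FileName}", dto.FileName);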
Azure WebJob: how can I be notified if it is aborted?
(1) Always On is enabled for the service.
(2) SCM_COMMAND_IDLE_TIMEOUT = 2000 and WEBJOBS_IDLE_TIMEOUT = 2000.
But as I'm new to this, can you please help me with where I can put the logic?
You could add the logic in the Functions.cs file. For more information, refer to the detailed steps below.
Steps:
1. Follow the official document to create a WebJob project.
2. Add Functions.cs to the project:
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions;
using SendGrid;

public class Functions
{
    // demo webjob trigger
    public static void ProcessQueueMessage([QueueTrigger("queue")] string message, TextWriter log)
    {
        log.WriteLine(message);
    }

    // error monitor
    public static void ErrorMonitor([ErrorTrigger("0:30:00", 10, Throttle = "1:00:00")] TraceFilter filter, [SendGrid] SendGridMessage message)
    {
        message.Subject = "WebJobs Error Alert";
        message.Text = filter.GetDetailedMessage(5);
    }
}
3. If you want to use ErrorTrigger and SendGrid, you need to configure them in the Program.cs file:
static void Main()
{
    var config = new JobHostConfiguration();
    if (config.IsDevelopment)
    {
        config.UseDevelopmentSettings();
    }
    config.UseCore();
    config.UseSendGrid(new SendGridConfiguration
    {
        ApiKey = "xxxxx",
        FromAddress = new Email("emailaddress", "name"),
        ToAddress = new Email("emailaddress", "name")
    });

    var host = new JobHost(config);
    // The following code ensures that the WebJob will be running continuously
    host.RunAndBlock();
}
4. If you want to test it locally, you need to add the Storage connection string in the connectionStrings section of App.config:
<connectionStrings>
  <add name="AzureWebJobsStorage" connectionString="{storage connection string}" />
</connectionStrings>
I have written the following code to read data from a text file, split it into sentences, and send them to an Azure Event Hub.
I am able to send the data to the Event Hub, but not able to push the data into DocumentDB.
How do I push data into DocumentDB using WebJobs? I am running the WebJob from a console application; do I need to add more configuration to run it locally?
Program.cs
static void Main(string[] args)
{
    JobHostConfiguration config = new JobHostConfiguration();
    config.Tracing.ConsoleLevel = System.Diagnostics.TraceLevel.Error;

    var eventHubConfig = new EventHubConfiguration();
    eventHubConfig.AddReceiver(eventHubName, connectionString);
    config.UseEventHub(eventHubConfig);

    JobHost host = new JobHost(config);
    config.DashboardConnectionString = StorageConnectionString;
    if (config.IsDevelopment)
    {
        config.UseDevelopmentSettings();
    }

    //Send test messages
    Task.Run(() =>
    {
        SendMessagesToEventHub();
    });

    host.RunAndBlock();
}
function.cs
class Functions
{
    public static void Run([EventHubTrigger("azurepochub")] EventData message, [Microsoft.Azure.WebJobs.DocumentDB("testcosmosdb01122018", "Items", ConnectionStringSetting = "dbConnctionString")] out dynamic document)
    {
        string data = Encoding.UTF8.GetString(message.GetBytes());
        document = data;
        Console.ForegroundColor = ConsoleColor.Green;
        Console.WriteLine($"Message received. Data: '{data}'");
        Console.ResetColor();
    }
}
You can find the answer in the Azure WebJobs SDK Extensions GitHub repo. If you want to insert a document into Azure DocumentDB, you need to bind an object as the output; in your case the output is a string.
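In other words, the minimal change to the function in the question is to assign an object rather than the raw string (a sketch; the full demo follows below):
// Sketch: keep the out dynamic parameter, but assign an object so the
// DocumentDB output binding can serialize it as a JSON document.
string data = Encoding.UTF8.GetString(message.GetBytes());
document = new { Text = data };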
Do I need to add more configuration to run it locally?
I also made a demo for that. The following are the detailed steps.
1. Create a .NET Framework WebJob project.
2. Add AzureWebJobsDocumentDBConnectionString in the App.config file:
<appSettings>
  <!-- Service Bus specific app settings for messaging connections -->
  <add key="Microsoft.ServiceBus.ConnectionString" value="Endpoint=sb://[your namespace].servicebus.windows.net;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[your secret]" />
  <add key="AzureWebJobsDocumentDBConnectionString" value="xxxxx" />
</appSettings>
3. Add the following code in the Program.cs file:
using System;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DocumentDB;
using Microsoft.Azure.WebJobs.ServiceBus;
using Microsoft.ServiceBus.Messaging;

namespace WebJobTest
{
    // To learn more about Microsoft Azure WebJobs SDK, please see https://go.microsoft.com/fwlink/?LinkID=320976
    class Program
    {
        // Please set the following connection strings in app.config for this WebJob to run:
        // AzureWebJobsDashboard and AzureWebJobsStorage
        private static string eventHubName = "eventhubNam";
        private static string connectionString = "eventhub connectionstring";

        static void Main()
        {
            JobHostConfiguration config = new JobHostConfiguration();
            config.Tracing.ConsoleLevel = System.Diagnostics.TraceLevel.Error;

            var eventHubConfig = new EventHubConfiguration();
            eventHubConfig.AddReceiver(eventHubName, connectionString);
            config.UseDocumentDB(new DocumentDBConfiguration
            {
                ConnectionString = "DocumentDB ConnectionString"
            });
            config.UseEventHub(eventHubConfig);
            config.DashboardConnectionString = "storage connection string";

            JobHost host = new JobHost(config);
            if (config.IsDevelopment)
            {
                config.UseDevelopmentSettings();
            }

            //Send test messages
            Task.Run(() =>
            {
                SendMessagesToEventHub();
            });

            host.RunAndBlock();
        }

        static void SendMessagesToEventHub()
        {
            var eventHubClient = EventHubClient.CreateFromConnectionString(connectionString, eventHubName);
            try
            {
                var message = Guid.NewGuid().ToString();
                Console.WriteLine("{0} > Sending message: {1}", DateTime.Now, message);
                eventHubClient.Send(new EventData(Encoding.UTF8.GetBytes(message)));
            }
            catch (Exception exception)
            {
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine("{0} > Exception: {1}", DateTime.Now, exception.Message);
                Console.ResetColor();
            }
            Thread.Sleep(200);
        }
    }
}
4. In the function.cs file:
public static void Run([EventHubTrigger("eventhub name")] EventData message, [DocumentDB("document Database name", "collection", ConnectionStringSetting = "AzureWebJobsDocumentDBConnectionString")] out Item document)
{
    string data = Encoding.UTF8.GetString(message.GetBytes());
    document = new Item { Text = data };
    Console.ForegroundColor = ConsoleColor.Green;
    Console.WriteLine($"Message received. Data: '{data}'");
    Console.ResetColor();
}

public class Item
{
    public string Text;
}
5. Check the result in the console.
Finish debugging from the console, then check in Azure DocumentDB.
I have an ASP.NET Web API application with a supporting Azure WebJob whose functions are triggered by messages added to a storage queue by the API's controllers. Testing the Web API is simple enough using OWIN, but how do I test the WebJobs?
Do I run a console app in memory in the test runner? Execute the function directly (that wouldn't be a proper integration test though)? It is a continuous job, so the app doesn't exit. To make matters worse, Azure WebJob functions are void, so there's no output to assert.
There is no need to run a console app in memory. You can run the JobHost in the memory of your integration test:
var host = new JobHost();
You can use host.Call() or host.RunAndBlock(). You need to point to an Azure storage account, as WebJobs are not supported against localhost.
It depends on what your function is doing, but you could manually add a message to a queue, add a blob, or whatever, and then assert by querying the storage where your WebJob wrote its result.
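For the host.Call() route, a minimal sketch (the function and its message argument here are illustrative, not taken from the original post):
// Sketch: invoke a single WebJob function directly from a test.
// Assumes a Functions class with ProcessQueueMessage(string message, TextWriter log).
var host = new JobHost();
host.Call(typeof(Functions).GetMethod("ProcessQueueMessage"), new { message = "test message" });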
While @boris-lipschitz is correct, when your job is continuous (as the OP says it is), you can't do anything after calling host.RunAndBlock().
However, if you run the host in a separate thread, you can continue with the test as desired, although you have to do some kind of polling at the end of the test to know when the job has run.
Example
Function to be tested (a simple copy from one blob to another, triggered by the created blob):
public void CopyBlob(
    [BlobTrigger("input/{name}")] TextReader input,
    [Blob("output/{name}")] out string output)
{
    output = input.ReadToEnd();
}
Test function:
[Test]
public void CopyBlobTest()
{
    var blobClient = GetBlobClient("UseDevelopmentStorage=true;");

    //Start host in separate thread
    var thread = new Thread(() =>
    {
        Thread.CurrentThread.IsBackground = true;
        var host = new JobHost();
        host.RunAndBlock();
    });
    thread.Start();

    //Trigger job by writing some content to a blob
    using (var stream = new MemoryStream())
    using (var stringWriter = new StreamWriter(stream))
    {
        stringWriter.Write("TestContent");
        stringWriter.Flush();
        stream.Seek(0, SeekOrigin.Begin);
        blobClient.UploadStream("input", "blobName", stream);
    }

    //Check every second for up to 20 seconds to see if the blob has been created in output, and assert its content if it has
    var maxTries = 20;
    while (maxTries-- > 0)
    {
        if (!blobClient.Exists("output", "blobName"))
        {
            Thread.Sleep(1000);
            continue;
        }

        using (var stream = blobClient.OpenRead("output", "blobName"))
        using (var streamReader = new StreamReader(stream))
        {
            Assert.AreEqual("TestContent", streamReader.ReadToEnd());
        }
        break;
    }
}
I've been able to simulate this really easily by simply doing the following, and it seems to work fine for me:
private JobHost _webJob;

[OneTimeSetUp]
public void StartupFixture()
{
    _webJob = Program.GetHost();
    _webJob.Start();
}

[OneTimeTearDown]
public void TearDownFixture()
{
    _webJob?.Stop();
}
Where the WebJob code looks like:
public class Program
{
    public static void Main()
    {
        var host = GetHost();
        host.RunAndBlock();
    }

    public static JobHost GetHost()
    {
        ...
    }
}
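For completeness, a minimal sketch of what GetHost() might contain (an assumption on my part; the original implementation is elided above), following the JobHostConfiguration pattern used elsewhere in this thread:
// Sketch only: the real GetHost() body is elided in the post above.
public static JobHost GetHost()
{
    var config = new JobHostConfiguration();
    if (config.IsDevelopment)
    {
        config.UseDevelopmentSettings();
    }
    return new JobHost(config);
}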