I'm using log4net.Appender.AzureAppendBlobAppender to log my web app's info & errors. Sometimes I get a "BlockCountExceedsLimit" exception. This happens because an append blob accepts at most 50,000 committed blocks; after that, the service returns a 409 Conflict. I checked the code and found that the appender buffers 512 log events, but then flushes each log entry to the append blob as a separate block. So we can log only 50,000 log entries per day.
Can anyone please help me with this? Does anyone know of an alternative?
Thanks,
Karthik
According to your description, I assume you are using the log4net.Appender.Azure NuGet package. As you can see in AzureAppendBlobAppender.cs:
private static string Filename(string directoryName)
{
    return string.Format("{0}/{1}.entry.log.xml",
        directoryName,
        DateTime.Today.ToString("yyyy_MM_dd",
            DateTimeFormatInfo.InvariantInfo));
}
Per my understanding, you could follow AzureAppendBlobAppender.cs to write your own custom appender and adjust the Filename and SendBuffer methods to meet your requirements.
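For example, a minimal sketch of an adjusted Filename (only the date format changes; the rest of the appender is assumed to stay as in the package) that rolls to a new blob every hour, so each blob stays well under the 50,000-block limit:

// Hypothetical variant: include the hour in the blob name so a fresh append
// blob is started every hour instead of once per day.
private static string Filename(string directoryName)
{
    return string.Format("{0}/{1}.entry.log.xml",
        directoryName,
        DateTime.UtcNow.ToString("yyyy_MM_dd_HH",
            DateTimeFormatInfo.InvariantInfo));
}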
"I'm using log4net.Appender.AzureAppendBlobAppender to log my web app's info & errors."
Since you use an Azure web app to host your application, you could instead use the built-in Application Logging (Blob), and Azure will generate the logs hourly for you. Log in to the Azure Portal, choose your web app, enable Application Logging (Blob), and set the logging level to Information; for details, follow Enable diagnostics logging for web apps in Azure App Service.
For your application, you could use the following code to log info and errors.
System.Diagnostics.Trace.TraceError("xxxxx");
System.Diagnostics.Trace.TraceInformation("xxxxx");
I've changed the code a little: it now buffers log entries and, once the buffer reaches the threshold (512 log entries), flushes all of them to the append blob in a single commit.
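For reference, a sketch of what such a SendBuffer could look like (assuming _appendBlob is the CloudAppendBlob field the appender already holds, using the WindowsAzure.Storage client):

// Hypothetical SendBuffer: render all buffered events into one payload and
// append it as a single block, so 512 entries cost one block instead of 512.
protected override void SendBuffer(LoggingEvent[] events)
{
    var payload = new StringBuilder();
    foreach (var loggingEvent in events)
    {
        payload.Append(RenderLoggingEvent(loggingEvent));
    }
    using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(payload.ToString())))
    {
        _appendBlob.AppendBlock(stream); // one commit per buffer flush
    }
}

Note that a single append block is capped at 4 MB, so an unusually large buffer would still need to be split across multiple blocks.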
I have written some simple functions and enabled Application Insights. It's all showing as connected, and I can see that it's tracking HTTP statuses, e.g. I get a failed request count, server response times, etc.
I understand that I can add Application Insights to Node with the following code:
let appInsights = require("applicationinsights");
appInsights.setup("[your ikey]").start();
But I was hoping it would just work without this. I can see that the function is outputting logs when I use the log stream, but when I use App Insights I don't see anything in any of the log tables.
Do I need to add App Insights to my function via code, or am I missing some secret config option?
It's also a good idea to add the Application Insights module to your Node project to get monitoring for your function. Both code-based and codeless setups are good choices.
In my opinion, the biggest difference between the code and codeless monitors is support for custom telemetry data. But I think in most scenarios the information collected by default is enough for daily use; the official doc says:
"Application Insights collects log, performance, and error data, and automatically detects performance anomalies."
So I think it's fine for you if you get traces and error messages after adding the appinsights module and creating a new appinsights instance. You can also try the codeless configuration I mentioned in the comment (Azure Portal -> your Node.js function app -> Application Insights -> Enable -> Create new resource).
This is a similar post to Azure Web App Trace logs not appearing in log, however the original poster seems to have abandoned the question without resolving/accepting an answer.
I am trying to trace an issue that only happens on the Azure web app (now called app service). I'm unable to perform any remote debugging due to our company policies, so tracing is our best tool.
However, I've tried following various tutorials, but I still can't seem to get any of my trace information logged.
I've tried:
Setting the Application Logging (Filesystem) Level to Verbose, Information, Error -- nothing.
Looking for the logs in
the FTP server at /LogFiles/Application
the KUDU interface at https://{mysite}.scm.azurewebsites.net and again, navigated to /LogFiles/Application
portal's Monitoring > Live stream (the section under Diagnostic Logs for the website)
Nada. I've even waited a few hours (thinking it might be a delay), and still nothing.
I set up a very basic hello-world ASPX page, and all it does (in Page_Load) is try to write 'hello' to the trace log using:
Trace.TraceError
Trace.TraceInformation
Trace.TraceWarning
Trace.WriteLine
Console.Out.WriteLine
Console.Error.WriteLine
Some weird stuff I've also tried:
setting my debug=true in my web.config
setting CustomErrors from RemoteOnly to Off
trying to use System.Diagnostics.TextWriterTraceListener (rough wiring sketch below)
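For that last item, this is roughly the wiring I tried in Global.asax (the file path is just what I used for testing):

// Rough attempt: attach a file-based trace listener explicitly so the trace
// output lands somewhere I control, independent of Azure's logging pipeline.
protected void Application_Start()
{
    var listener = new System.Diagnostics.TextWriterTraceListener(
        Server.MapPath("~/App_Data/trace.log"), "fileLogger");
    System.Diagnostics.Trace.Listeners.Add(listener);
    System.Diagnostics.Trace.AutoFlush = true;
    System.Diagnostics.Trace.TraceInformation("listener attached");
}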
Anyone have any ideas I might try?
Exceptions in your live web app are reported by Application Insights. You can correlate failed requests with exceptions and other events at both the client and server, so that you can quickly diagnose the causes. You may refer to this article: https://learn.microsoft.com/en-us/azure/application-insights/app-insights-asp-net-exceptions.
If you use NLog, log4net or System.Diagnostics.Trace for diagnostic tracing in your ASP.NET application, you can have your logs sent to Azure Application Insights, where you can explore and search them. Your logs will be merged with the other telemetry coming from your application, so you can identify the traces associated with servicing each user request and correlate them with other events and exception reports. You may refer to this article: https://learn.microsoft.com/en-us/azure/application-insights/app-insights-asp-net-trace-logs.
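For example, with the Microsoft.ApplicationInsights.TraceListener NuGet package installed (and the instrumentation key configured in ApplicationInsights.config), a minimal sketch to route existing System.Diagnostics.Trace calls into Application Insights looks like this:

using System.Diagnostics;
using Microsoft.ApplicationInsights.TraceListener;

// Sketch: forward Trace output to Application Insights so it shows up
// alongside requests and exceptions in the portal.
public static class TraceWireUp
{
    public static void Register()
    {
        Trace.Listeners.Add(new ApplicationInsightsTraceListener());
        Trace.TraceInformation("traces now flow to Application Insights");
    }
}

Installing the package normally registers the listener in web.config for you; the programmatic registration above is only there to keep the sketch self-contained.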
I'm able to successfully call my functions and make them do what I want them to do. The problem is that the logs don't appear to be saved anywhere, and I don't see how I can view them, which I'll want to do in the event of an error. As a test, I have my working function do a log.Info as soon as it's called; when testing locally, it prints the message to the console. I believe I've enabled everything correctly, but let me explain what I've done in case I didn't.
In my app service, under Monitoring -> Diagnostic Logs, I have enabled everything. Application Logging (filesystem) verbose, Application Logging (Blob) verbose (with the storage location set), detailed error messages and failed request tracing turned on.
In my function, I'm using the TraceWriter object that's passed to my run method (I started from a template).
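For context, the templated function looks roughly like this (C# script from the portal's HTTP trigger template; names are illustrative):

using System.Net;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("Run was called"); // this is the output I can't find anywhere
    return req.CreateResponse(HttpStatusCode.OK, "hello");
}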
Please note that functions are set to require authentication. If I click on the "Monitor" tab nothing appears. It just says "Loading..." forever and there's no information. Perhaps this is because of the authentication?
I used the Azure Storage Explorer to browse to my blob. The "log" blob exists, and I do see a set of nested directories that lead up to now. However it just contains a 354 byte file that contains a few lines of some random info. This file never seems to update or get larger.
I used FTP to try and browse to where the logs might be, but there's no directory on there that contains any log files.
I also went to KUDU for my function app ({myfunctionapp}.scm.azurewebsites.net/azurejobs/#/functions). While I do see that my function was called successfully, I don't see anything from the call to log.Info anywhere.
I tried using a different logger, and as a test did: System.Diagnostics.Trace.TraceError("test error");
I also don't see this message appearing anywhere.
Am I missing something as far as setup goes? Is the problem the fact that I require authentication? If it's the latter, is there still a way to view logs? I definitely have to have auth enabled. Thanks. And if it helps, below are links to what my settings and the Monitor tab look like.
Settings: https://postimg.org/image/u57m2xbl5/
Monitor: https://postimg.org/image/uou10arch/
Authentication should not cause any problems with logging and Log.Info should work out of the box, no setup required.
I highly recommend that you enable AlwaysOn for your dedicated function app. The long loading of the Monitor tab could be because your site is in a 'cold' state, where it takes longer to start up.
If you go to {myfunctionapp}.scm.azurewebsites.net/DebugConsole and navigate to LogFiles/Application/Functions, do you see any expected logs there? Also, when you run a function from the portal, do you see logs in the log window?
The same thing happened to me when I had Fiddler open. Close Fiddler and all is good.
I've started looking at adding Azure Application Insights to my app. The documentation and the SDK seem to be a bit sparse ...
I've added a call to Microsoft.ApplicationInsights.WindowsAppInitializer.InitializeAsync and the data is being successfully reported to the Azure Portal.
However, I want to provide a setting in the app so that the user can turn collection on and off. Is there a way of stopping collection or can I only "not start" collection? In other words, if the user changes the setting value, can I react to it straight away or just when the app starts up?
Thanks.
I have done this:
To dynamically stop and start the collection and transmission of telemetry:
using Microsoft.ApplicationInsights.Extensibility;
TelemetryConfiguration.Active.DisableTelemetry = true;
To disable selected standard collectors (for example, performance counters, HTTP requests, or dependencies), delete or comment out the relevant lines in ApplicationInsights.config. You could do this, for example, if you want to send your own TrackRequest data.
Taken from the App Insights documentation.
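Applied to your on/off setting, a hypothetical handler could flip the flag the moment the user changes it, no restart required:

using Microsoft.ApplicationInsights.Extensibility;

// Hypothetical settings callback: telemetry stops (or resumes) immediately
// when the user toggles the option.
static void OnCollectionSettingChanged(bool collectionEnabled)
{
    TelemetryConfiguration.Active.DisableTelemetry = !collectionEnabled;
}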
I deployed a Node.js app on Google App Engine following this tutorial: https://github.com/GoogleCloudPlatform/appengine-nodejs-quickstart. It was successful, and now I want to check the logs of the Node.js server, like in development from the terminal console. The VMs are managed by Google, but even if I SSH into them I don't know where to look for the logs.
You can read the stdout of the docker container that your app runs in by doing docker logs <container id> on the VM instance. You can get the container id from docker ps.
No need to SSH into the instance though. You can simply fetch the logs from the Developers Console under Monitoring > Logs.
As #tamberg mentioned in a comment, the easiest option I've found for looking at logs produced by Google App Engine instances running Node.js is to simply use the log viewer at:
https://console.cloud.google.com/logs/viewer?resource=gae_app
Detailed instructions from https://cloud.google.com/appengine/docs/standard/nodejs/building-app/viewing-service-logs are as follows:
Open the Logs Viewer in the GCP Console: https://console.cloud.google.com/logs/viewer?resource=gae_app
In the first filter dropdown at the top of the page, ensure GAE Application is selected, and choose Default Service.
Use the second filter dropdown to select only stdout and click OK. This limits the viewer to logs sent to standard output.
Use the text field above the dropdowns to search for the name you used when you submitted your form. You should see the logs corresponding to your submissions.
The default logging is really awful. None of my console.log messages show up! There are a few ways you can fix this.
1) Write logs to a log file.
For example, /var/log/app_engine/custom_logs/applogs.log
https://cloud.google.com/appengine/articles/logging
"Cloud Logging and Managed VMs apps Applications using App Engine
Managed VMs should write custom log files to the VM's log directory at
/var/log/app_engine/custom_logs. These files are automatically
collected and made available in the Logs Viewer. Custom log files
must have the suffix .log or .log.json. If the suffix is .log.json,
the logs must be in JSON format with one JSON object per line. If the
suffix is .log, log entries are treated as plain text."
2) Use winston with winston-gae.
Create a transport that will send the logs to App Engine.
3) Use the gcloud-logging module
Too verbose for my liking, but it is another option.