I'm setting up logging for an Azure service.
Currently, the messages I get in the WADLogsTable look like this:
<Properties>
<EventTickCount SqlType="bigint">635193311660155844</EventTickCount>
<DeploymentId SqlType="nvarchar(max)">deployment21(67) </DeploymentId>
<Role SqlType="nvarchar(max)">HTMLConverterWebRole </Role>
<RoleInstance SqlType="nvarchar(max)">deployment21(67).HTMLConverterWrapper.Cloud.HTMLConverterWebRole_IN_0 </RoleInstance>
<Level SqlType="int">2</Level>
<EventId SqlType="int">0</EventId>
<Pid SqlType="int">6900</Pid>
<Tid SqlType="int">14840</Tid>
<Message SqlType="nvarchar(max)">2013-11-06 12:39:25.8449|ERROR|My error message</Message>
</Properties>
I haven't gone to production yet, but I suspect it will be pretty inconvenient to search in XML. What are the best practices for this? Can I customize the elements in it? I don't think I really need Pid and Tid, and I don't see the purpose of EventId either.
Update: I'm actually using NLog right now, but I'm doing it as described here: http://awkwardcoder.blogspot.com/2012/03/getting-nlog-working-with-azure-is-as.html
So it posts logs to a Trace target, and as I understand it, the traces are captured by DiagnosticMonitorTraceListener and end up in the Windows Azure table. So I'm using NLog to format my "Message" element in the resulting XML, and the "Level" and "EventId" elements depend on which NLog method I call (Logger.Debug*, Logger.Error*, etc.), but I don't have access to the overall format of the XML. Also, I would probably prefer a custom logging table with dedicated fields for "Level", "Date" and so on, so I don't have to parse them in each log query.
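For reference, the NLog side of that setup is just a Trace target. A minimal NLog.config sketch along the lines of that blog post (target name and layout here are my own choices, not prescribed by the post) that would produce the "Message" format shown above:

```xml
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <!-- Writes through System.Diagnostics.Trace, which the
         DiagnosticMonitorTraceListener picks up and ships to WADLogsTable -->
    <target name="trace" xsi:type="Trace"
            layout="${longdate}|${level:uppercase=true}|${message}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Debug" writeTo="trace" />
  </rules>
</nlog>
```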
Unfortunately, you don't have control over the format of the data that gets logged automatically by Windows Azure Diagnostics. You can get fine-grained control if you use custom logging, for example with NLog. In that scenario, the data logged by your application is stored in files that are automatically transferred to blob storage by Windows Azure Diagnostics.
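A hedged sketch of what the NLog side of that could look like (the file name and path are assumptions; Windows Azure Diagnostics would be configured separately to transfer that log directory to blob storage):

```xml
<targets>
  <!-- Local log file with dedicated, parseable fields; WAD ships this
       directory to blob storage on its scheduled transfer -->
  <target name="file" xsi:type="File"
          fileName="${environment:variable=RoleRoot}\approot\logs\${shortdate}.log"
          layout="${longdate}|${level:uppercase=true}|${logger}|${message}" />
</targets>
<rules>
  <logger name="*" minlevel="Info" writeTo="file" />
</rules>
```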
You can also use Performance Counters plus third-party tools to display the results (e.g. New Relic), or you can build your own dashboard.
http://www.windowsazure.com/en-us/develop/net/common-tasks/performance-profiling/
http://www.codeproject.com/Articles/303686/Windows-Azure-Diagnostics-Performance-Counters-In
http://michaelwasham.com/2011/09/19/windows-azure-diagnostics-and-powershell-performance-counters/
We need to convert XML data into a CSV/Excel table in the Azure cloud.
Below is a sample of the XML:
<SOAP-ENV:Envelope
xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
<SOAP-ENV:Body>
<ns2:getProjectsResponse
xmlns:ns2="http://www.logic8.com/eq/webservices/generated">
<ns2:Project>
<ns2:fileName>P10001</ns2:fileName>
<ns2:alias>project1</ns2:alias>
</ns2:Project>
<ns2:Project>
<ns2:fileName>P10002</ns2:fileName>
<ns2:alias>project2</ns2:alias>
</ns2:Project>
<ns2:Project>
<ns2:fileName>P10003</ns2:fileName>
<ns2:alias>project3</ns2:alias>
</ns2:Project>
</ns2:getProjectsResponse>
</SOAP-ENV:Body>
</SOAP-ENV:Envelope>
Expected output: a table with ProjectID and ProjectDescription columns.
Can anyone help me with this?
You could try this way: first convert the XML to JSON, then use the Create CSV table action. Below is my test flow.
I use the Get blob content action to read the XML. The Compose action input is json(xml(body('Get_blob_content'))), which yields the JSON data. Then comes the Create CSV table action; because its From input must be an array, it should be outputs('Compose')['SOAP-ENV:Envelope']['SOAP-ENV:Body']['ns2:getProjectsResponse']['ns2:Project'].
The last thing is to customize the header and the values: the ProjectID value should be item()['ns2:fileName'] and the ProjectDescription value should be item()['ns2:alias'].
And here is the flow output. I suppose this is what you want; hope this helps.
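If you want to sanity-check the same mapping outside of Logic Apps, here is a small Python sketch of the identical transformation (the namespace URI is taken from the sample XML; this is just for local verification, not part of the flow):

```python
import csv
import io
import xml.etree.ElementTree as ET

XML = """<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
<SOAP-ENV:Body>
<ns2:getProjectsResponse xmlns:ns2="http://www.logic8.com/eq/webservices/generated">
<ns2:Project><ns2:fileName>P10001</ns2:fileName><ns2:alias>project1</ns2:alias></ns2:Project>
<ns2:Project><ns2:fileName>P10002</ns2:fileName><ns2:alias>project2</ns2:alias></ns2:Project>
<ns2:Project><ns2:fileName>P10003</ns2:fileName><ns2:alias>project3</ns2:alias></ns2:Project>
</ns2:getProjectsResponse>
</SOAP-ENV:Body>
</SOAP-ENV:Envelope>"""

NS = {"ns2": "http://www.logic8.com/eq/webservices/generated"}

def projects_to_csv(xml_text: str) -> str:
    """Flatten each ns2:Project element into one CSV row."""
    root = ET.fromstring(xml_text)
    buf = io.StringIO()
    writer = csv.writer(buf)
    # Same headers as in the Create CSV table action
    writer.writerow(["ProjectID", "ProjectDescription"])
    # Walks Body/getProjectsResponse/Project, mirroring the Logic Apps expression path
    for project in root.iter("{http://www.logic8.com/eq/webservices/generated}Project"):
        writer.writerow([project.findtext("ns2:fileName", namespaces=NS),
                         project.findtext("ns2:alias", namespaces=NS)])
    return buf.getvalue()

print(projects_to_csv(XML))
```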
Do you have any experience with Azure? I ask because, from your question, it sounds like you're not sure where to start or which service to use. I'm also curious whether you were given a requirement to use Azure or thought Azure might be the solution yourself. Also, where is this XML coming from? It looks like a SOAP response.
If you are a developer, I'd consider authoring a web app in .NET (MVC, ASP.NET Core, or Web API) and using it to consume this SOAP response, translate it, and save the file.
For this I'd consider using the XmlDocument class to load the XML and parse through it.
But if you absolutely need to use Azure, the closest thing that would help automate this is Azure Logic Apps. It offers many "no-code" connectors that can transform and save data.
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-enterprise-integration-transform#how-to-use-a-transform
If you elaborate on your situation, I'd be happy to offer further suggestions.
I'm using log4net.Appender.AzureAppendBlobAppender to log my web app's info and errors. Sometimes I get a "BlockCountExceedsLimit" exception. This is because an append blob accepts only 50,000 committed blocks; after that, it throws the exception (409 Conflict). I have checked the code and found that it waits for 512 log events and then flushes each log entry separately to the append blob. So we can log only 50,000 log entries per day.
Can anyone please help me with this? Does anyone know of an alternative?
Thanks,
Karthik
According to your description, I assume that you are using the log4net.Appender.Azure NuGet package. As you can see in AzureAppendBlobAppender.cs:
private static string Filename(string directoryName)
{
return string.Format("{0}/{1}.entry.log.xml",
directoryName,
DateTime.Today.ToString("yyyy_MM_dd",
DateTimeFormatInfo.InvariantInfo));
}
Per my understanding, you could follow AzureAppendBlobAppender.cs to write your own custom append blob appender and adjust the Filename and SendBuffer methods to meet your requirements.
I'm using log4net.Appender.AzureAppendBlobAppender to log my web app's info & errors.
Since you use an Azure Web App to host your application, you could use the built-in Application Logging (Blob), and the Azure side will generate the logs hourly. Log in to the Azure Portal, choose your web app, enable Application Logging (Blob), and set the logging level to Information; for details, follow Enable diagnostics logging for web apps in Azure App Service.
For your application, you could use the following code to log info and errors.
System.Diagnostics.Trace.TraceError("xxxxx");
System.Diagnostics.Trace.TraceInformation("xxxxx");
I've changed the code a little bit: once the buffer reaches the threshold value (512 log entries), it now flushes the log entries to the append blob in a single commit.
I'm working with a production system that has a moderate amount of load. The volume of trace events AI sends up is way too detailed and makes it difficult to wade through the logs later.
Each request to the server has information such as:
Message='Selected formatter='JsonMediaTypeFormatter', content-type='application/json; charset=utf-8'', Operation=DefaultContentNegotiator.Negotiate
and
Message='Action returned 'RZ.API.Support.Controllers.OperationActionResult`1[System.Collections.Generic.List`1[RZ.Entity.System.ClientMessage]]'', Operation=ReflectedHttpActionDescriptor.ExecuteAsync
There are maybe 30 entries for each request!
I just need the request type:
12/16/2015, 9:17:29 AM - REQUEST
GET /api/v1/user/messages
And the result code - as well as any custom stuff I do along the way.
So basically I want to trim most of the traces except the request and the result (and any errors etc.).
I have my eye on this bad boy in the AI config:
<Add Type="Microsoft.ApplicationInsights.Web.RequestTrackingTelemetryModule, Microsoft.AI.Web"/>
... but I cannot for the life of me see any doco on how to ask it to reduce the amount of stuff that is sent!
Any help is much appreciated.
Jordan.
P.S. All the extra logging has put us over the 15m a month plan, we had to upgrade!
RequestTrackingTelemetryModule does not do anything like what you described. It adds request, exception, and dependency collection. In your example, you say you see verbose WebApi traces being forwarded to Application Insights, so I assume you are actually using the Application Insights logging adapter.
Here you can read how WebApi traces can be forwarded to AI Version 1: http://apmtips.com/blog/2014/11/13/collect-asp-dot-net-mvc-web-api-traces-with-application-insights/
Here you can read how WebApi traces can be forwarded to AI Version 2:
http://apmtips.com/blog/2016/01/05/webapi-tracing-powered-by-ai-in-vs2015-update1/
Source code of logging adapters: https://github.com/Microsoft/ApplicationInsights-dotnet-logging
Documentation: https://azure.microsoft.com/en-us/documentation/articles/app-insights-search-diagnostic-logs/#trace
So you have multiple options:
Do not use logging adapters
Change the verbosity of WebApi tracing (read http://www.asp.net/web-api/overview/testing-and-debugging/tracing-in-aspnet-web-api). I would prefer this one, since you probably still want to collect failures.
Remove WebApi tracing (as you did)
To answer my own question.
In my WebApiConfig file, I had:
config.EnableSystemDiagnosticsTracing();
Removing this line drastically cut down the clutter to what I was trying to achieve.
As of version 2.0 of the Application Insights SDKs, you can also limit the data sent by enabling sampling:
https://azure.microsoft.com/en-us/documentation/articles/app-insights-sampling/
If you add
<MaxTelemetryItemsPerSecond>5</MaxTelemetryItemsPerSecond>
to your ApplicationInsights.config, the SDK can limit how much goes out. The article above has a LOT more settings/configuration you can use to get other specific behavior, but the one above is the simplest.
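For context, in a default 2.x ApplicationInsights.config that element sits on the adaptive sampling telemetry processor. A sketch of the relevant section (the exact surrounding entries may differ in your file):

```xml
<TelemetryProcessors>
  <Add Type="Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel.AdaptiveSamplingTelemetryProcessor, Microsoft.AI.ServerTelemetryChannel">
    <!-- Target rate of telemetry items per second, across all telemetry types -->
    <MaxTelemetryItemsPerSecond>5</MaxTelemetryItemsPerSecond>
  </Add>
</TelemetryProcessors>
```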
As far as I know, there are no configuration options available for the RequestTrackingTelemetryModule. You could just turn it off (by uninstalling the respective NuGet package or commenting out the XML) and / or install different / additional telemetry modules.
See app-insights-configuration-with-applicationinsights-config for a list of modules and configuration options.
Is there a way, through browsers like Firebug or another browser plugin, to do traces or log to the console from a CFC file?
I'm completely new to CF so sorry if this seems like a stupid question.
If you want logs to be visible in the browser, ColdFire is your best choice. With it, you can see all of ColdFusion's extended debugging information, even on a production site. Unless you have the proper authentication via ColdFire, the server won't spit out the extended info.
As @gillesc recommended, you can use LogBox, which is extracted from the ColdBox framework. The ColdBox framework has a debugging mode that allows you to trace messages to the bottom of the page or to a separate window. This is useful even on production sites, since you can observe the tracer messages from other users.
Finally, you can simply print to the console using writeDump(var="my log message",output="console") for quick debugging, or use the <cflog> tag to save log messages to a named log file which you can monitor using tail. For a dead simple solution, you can save the log file to the root of your site and simply press F5 to see the new log entries; however, I do not recommend this practice (unless you are saving credit card information and share that file with me :).
Hope this reply helps.
Aaron
There is a cftrace tag that will allow you to log output to the console, among other spots in your application and development environment.
<cftrace category="init data" type="Information" var="myvartooutput" />
Calling this tag will output the relevant content in a few places:
The console in ColdFusion Builder, if you are using that IDE
In Dreamweaver, the Adobe docs mention a server debug tab/view (I don't use DW, so am not sure)
At the end of the request in the debug output
In cftrace.log, which is in your log directory (/COLDFUSION/INSTALL/DIR/logs/cftrace.log)
You can also use the tag cflog to write data to one of the standard log files or you may choose to have it write the desired data to a custom log file.
<cflog file="customlog" application="no" text="Output #somevar#!" />
If "customlog" does not exist, CF will create it for you (in the same location noted above).
Hope that helps!
EDIT: I offered this more as an alternative to using Firebug, in case you want the logs/traces but are not necessarily wed to a browser plug-in.
If you've got CF Builder you can actually set up a debugger, but it's terribly slow. Here's the documentation on that: http://help.adobe.com/en_US/ColdFusionBuilder/Using/WS0ef8c004658c1089-31c11ef1121cdfd6aa0-7fff.html
There's also ColdFire, which is a Firebug add-on. Never used it before but I hear good things: https://github.com/nmische/ColdFire/
Try ColdFire for firebug extension
http://coldfire.riaforge.org/
I am using log4net in a web app, and log all page errors to a SQL server. I was wondering if there was any way to retrieve the entry ID generated by it. I'm going off of the documentation found here
http://logging.apache.org/log4net/release/config-examples.html
I want to use this ID as a reference number I can show to a customer so that they may contact customer support to lookup in the system and not have to go through a log file.
Apart from writing your own appender as floyddotnet suggested you could consider:
Use a GUID. You can easily generate it in your application, and it will serve most of your purposes. Drawback: it may be inconvenient for customers to read it to your support staff over the phone. If you only have email support, this is probably not an issue.
Consider creating an incident number outside of the logging framework: a quick call to a stored procedure that returns an ID, which you save in a nullable field in your log table.
A combination of the above: use a GUID, and after logging, call a stored procedure that creates an incident and returns the ID.
Writing an appender that returns the ID creates a dependency between your application and the appenders that you normally would not have: log4net was designed with a clear separation between logging and writing the log messages somewhere. The appender you need would break that separation.
Since the ID is generated by the database and not by log4net, I don't believe this information is available to you.
What I've done with log4net in such cases is to include a datetime stamp in the message, down to the millisecond, and present that to the user as a reference number. You can then do a simple SQL query to get to the message in the log table.
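As a sketch, a millisecond-precision timestamp can be put into the log4net pattern layout like this (the surrounding appender configuration and the rest of the pattern are assumptions, not taken from the question):

```xml
<layout type="log4net.Layout.PatternLayout">
  <!-- %date{...} with .fff gives millisecond precision, usable as a reference number -->
  <conversionPattern value="%date{yyyy-MM-dd HH:mm:ss.fff} [%thread] %-5level %logger - %message%newline" />
</layout>
```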
I'm not sure it's possible, but you could write your own appender for log4net and store this information in the log4net context.
How to write an appender for log4net:
http://www.alteridem.net/2008/01/10/writing-an-appender-for-log4net/
Context-Description:
http://logging.apache.org/log4net/release/manual/contexts.html