I want to know if there is a way to see all the requests made to my Service Fabric application while it is published on Azure, or at least to capture all requests and save them somewhere. I need information like the source and body.
Thanks in advance!
To see all the requests and responses, you first need to log them somewhere. Here are the available approaches:
Streaming into VS Cloud Explorer
You could use ServiceEventSource to log the valuable information; you will then be able to see it by attaching to your SF cluster via Cloud Explorer in VS. You can find more info in Debug your Service Fabric application by using Visual Studio.
Windows Azure Diagnostics
The WAD extension, which you can install on your VMs, uploads logs to Azure Storage and can also send logs to Azure Application Insights or Event Hubs. Check out Event aggregation and collection using Windows Azure Diagnostics.
EventFlow
Using EventFlow allows you to have services send their logs directly to an analysis and visualization platform, and/or to storage. Other libraries (ILogger, Serilog, etc.) might be used for the same purpose, but EventFlow has the benefit of having been designed specifically for in-process log collection and to support Service Fabric services.
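As a hedged sketch, EventFlow pipelines are typically configured via an eventFlowConfig.json file packaged with the service; the provider name and instrumentation key below are placeholders:

```json
{
  "inputs": [
    {
      "type": "EventSource",
      "sources": [ { "providerName": "MyCompany-MyApp-MyService" } ]
    }
  ],
  "outputs": [
    {
      "type": "ApplicationInsights",
      "instrumentationKey": "<instrumentation-key>"
    }
  ],
  "schemaVersion": "2016-08-11"
}
```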
Event analysis and visualization with OMS
When OMS is configured, you will have access to a specific OMS workspace, from where data can be queried or visualized in dashboards.
After data is received by Log Analytics, OMS has several Management Solutions that are prepackaged solutions to monitor incoming data, customized to several scenarios. These include a Service Fabric Analytics solution and a Containers solution, which are the two most relevant ones to diagnostics and monitoring when using Service Fabric clusters. Find more info on Event analysis and visualization with OMS and Assess Service Fabric applications and micro-services with the Azure portal.
And there are a number of ways to capture the source, body, or whatever else you need. Here are a couple:
Apply an ActionFilterAttribute to your controller class if you don't have one, and log all the information you need within the OnActionExecuted method
Add middleware in the Startup class:
public static void ConfigureApp(IAppBuilder appBuilder)
{
    // Configure Web API for self-host.
    HttpConfiguration config = new HttpConfiguration();
    config.Routes.MapHttpRoute(
        name: "DefaultApi",
        routeTemplate: "api/{controller}/{id}",
        defaults: new { id = RouteParameter.Optional }
    );

    appBuilder.Use(async (IOwinContext context, Func<Task> next) =>
    {
        await next.Invoke();
        // Log anything you want here.
        ServiceEventSource.Current.Message($"Response status code = {context.Response.StatusCode}");
    });

    appBuilder.UseWebApi(config);
}
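The first approach above (an action filter) could be sketched as follows; the filter name and the logged fields are illustrative, assuming ASP.NET Web API 2 (System.Web.Http). Register it globally via config.Filters.Add(...) or apply it to a controller:

```csharp
using System.Net.Http;
using System.Web.Http.Filters;

// Illustrative name; a sketch only, not a drop-in implementation.
public class RequestLoggingFilter : ActionFilterAttribute
{
    public override void OnActionExecuted(HttpActionExecutedContext context)
    {
        HttpRequestMessage request = context.Request;
        // Log whatever you need: method, URI, response status code, etc.
        ServiceEventSource.Current.Message(
            $"{request.Method} {request.RequestUri} => {context.Response?.StatusCode}");
        base.OnActionExecuted(context);
    }
}
```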
Related
I'm new to the Azure environment and wondering how to monitor an Azure Container App. Currently I've deployed a Node.js application by running a container app, and I know how to query some logs by using the protocols section.
What I'm really looking into is how to get metrics like incoming requests or vCPU usage, but I don't know how to get those metrics using Azure monitoring.
How can I access those values?
You can add the Azure Application Insights SDK to your Node.js project. It will monitor your app's activity, such as incoming/outgoing requests, database operations, etc. There is also an option to enable automatic metric collection:
See this documentation link for details.
let appInsights = require("applicationinsights");
appInsights.setup("<instrumentation_key>")
    .setAutoDependencyCorrelation(true)
    .setAutoCollectRequests(true)
    .setAutoCollectPerformance(true, true)
    .setAutoCollectExceptions(true)
    .setAutoCollectDependencies(true)
    .setAutoCollectConsole(true)
    .setUseDiskRetryCaching(true)
    .setSendLiveMetrics(true)
    .start();
Are the resource logs (which are part of platform logs) from Azure supported in QRadar or do we need to build a custom parser for each of the resource type in the subscription?
I read the DSM documentation of QRadar, and it mentions platform activity logs, but not resource logs. Let’s take an example where we get gateway logs, websocket connection logs, request logs, etc. from our Azure deployment. Are all resource logs supported by QRadar to be taken from event hub and integrate to QRadar (list of supported resource logs by QRadar)?
If I understand your question correctly, you are looking to extend existing parsers in QRadar without having to implement custom properties.
For this IBM has published the "IBM QRadar Content Extension for Azure":
https://exchange.xforce.ibmcloud.com/hub/extension/7a89f51852efa37de0809457ef1006dd
I recommend installing another extension, "Microsoft Azure Security Center Connected Assets & Risks Connector" (https://exchange.xforce.ibmcloud.com/hub/extension/0dbfab6a22bca7add7a99fa19fdd426f), which allows you to monitor other risk events via ASC and integrate assets that are not yet parsed into QRadar.
Probably the best way to solve the issue with Azure log data is to run QRadar and Azure Sentinel side by side and turn on Sentinel's Data Connectors for the Azure-specific resources. This keeps you up to date with integration, data parsing, and the current built-in rules. We have this scenario deployed for selected sources (Exchange, Teams, risky sign-ins, etc.) and monitor them via the built-in rules in Sentinel; see https://techcommunity.microsoft.com/t5/microsoft-sentinel-blog/azure-sentinel-side-by-side-with-qradar/ba-p/1488333. We ultimately store the logs in QRadar, but we use Sentinel for the Azure-specific rules and then integrate the incidents into QRadar.
Regards.
In this question, regarding Azure Application Insights Analytics, an answer addressed a concern about managing the log output and the associated communication of those logs, or "telemetry", back to the Azure cloud service.
However, the ServerTelemetryChannel, which implements ITelemetryChannel, doesn't seem to be a part of the Node.js SDK.
The question becomes: is that ability therefore not a part of the Node SDK? If so, is there a workaround for similar functionality?
As per this code, it looks like there is an option to retry locally saved telemetry in the Application Insights Node.js SDK:
public setDiskRetryMode(value: boolean, resendInterval?: number, maxBytesOnDisk?: number) {
This option seems to be on by default:
let _isDiskRetry = true;
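For example, a configuration sketch that enables disk retry caching explicitly; the instrumentation key is a placeholder, and the optional arguments mirror the setDiskRetryMode signature above:

```javascript
let appInsights = require("applicationinsights");
appInsights.setup("<instrumentation_key>")
    // resendInterval in milliseconds, maxBytesOnDisk in bytes (both optional)
    .setUseDiskRetryCaching(true, 60 * 1000, 50 * 1024 * 1024)
    .start();
```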
We have logs (W3CIISLogs) in a Log Analytics workspace for websites hosted on VMs. Similarly, we have Application Insights enabled for websites hosted on App Service. Now we want to access the telemetry data of both types of websites through a single interface, either via Application Insights or via Log Analytics. Just wondering if that's possible and what the best way is.
With Azure Monitor you can now query not only across multiple Log Analytics workspaces, but also data from a specific Application Insights app in the same resource group, another resource group, or another subscription. This provides you with a system-wide view of your data. You can only perform these types of queries in Log Analytics.
Querying across Log Analytics workspaces and from Application Insights: to reference another workspace in your query, use the workspace identifier; for an app from Application Insights, use the app identifier.
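As a sketch, such a cross-resource query could look like this; "contosoretail-it" and "fabrikamapp" are placeholder resource names:

```kusto
// Combine a table from another Log Analytics workspace with a table
// from an Application Insights app, then count the rows.
union
    workspace("contosoretail-it").W3CIISLog,
    app("fabrikamapp").requests
| count
```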
Cross-resource query limits:
The number of Application Insights resources that you can include in a single query is limited to 100.
Cross-resource query is not supported in View Designer. You can author a query in Log Analytics, pin it to an Azure dashboard, and visualize a log search.
Cross-resource query in log alerts is supported in the new scheduledQueryRules API. By default, Azure Monitor uses the legacy Log Analytics Alert API for creating new log alert rules from the Azure portal, unless you switch from the legacy Log Alerts API. After the switch, the new API becomes the default for new alert rules in the Azure portal, and it lets you create cross-resource query log alert rules. You can create cross-resource query log alert rules without making the switch by using the ARM template for the scheduledQueryRules API, but such an alert rule is manageable through the scheduledQueryRules API and not from the Azure portal.
Documentation Reference - Cross-Resource Log queries in Azure Monitor
Hope the above information helps.
I have a MVC website that gets published to Azure where it uses an Azure SQL Database.
The time has come where we need to run a scheduled task to send SMS reminders. I was under the impression that Azure Web Jobs was a good fit for this but am having some issues getting it up and running.
I have added a console app to my website solution and referenced my EF data model from the console app (which I would like to publish as a web job).
The current console app looks as follows:
class Program
{
    static void Main(string[] args)
    {
        JobHost host = new JobHost();
        host.RunAndBlock();
    }

    public static void ProcessNotifications()
    {
        var uow = new KirkleesDatabase.DAL.UnitOfWork();
        uow.CommunicationRepository.SendPALSAppointmentReminders();
    }
}
Running the console app will then throw the exception:
Additional information: User account connection string is missing. This can be set via the 'AzureJobsData' connection string or via the constructor.
This suggests that the Web Job is specifically looking for a connection string that points at a storage account. However, I would like the web job to query an Azure database rather than work with storage.
Is this doable?
Thanks,
As Victor wrote, the SDK events are triggered by blobs, queues and tables.
A simple solution for your need: write a console application with a polling approach. No SDK needed. Details at the beginning of http://blog.amitapple.com/post/73574681678/git-deploy-console-app/
while (true)
{
    if (IsThereWorkToDo()) // replace with your own check
    {
        var uow = new KirkleesDatabase.DAL.UnitOfWork();
        uow.CommunicationRepository.SendPALSAppointmentReminders();
    }

    Thread.Sleep(10000); // set an appropriate sleep interval
}
Unfortunately the WebJobs SDK does not support SQL databases as triggers. For triggers, it only supports Azure Storage (blobs, queues and tables).
You can access Azure SQL Databases from the web job as demonstrated in this answer.
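For instance, a minimal sketch of opening the database from the job method; "KirkleesDatabase" is an assumed connection string name in App.config, and the class name is illustrative:

```csharp
using System.Configuration;
using System.Data.SqlClient;

public static class Notifications
{
    public static void ProcessNotifications()
    {
        // "KirkleesDatabase" is a placeholder connection string name.
        string connectionString =
            ConfigurationManager.ConnectionStrings["KirkleesDatabase"].ConnectionString;
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            // Query the Azure SQL database here, or call into your EF UnitOfWork.
        }
    }
}
```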
To create a web job you don't have to use the webjobs sdk, you can use several types of executables (.exe, .cmd, .js, .php, .py, ...).
Try using this new Visual Studio add-on: http://visualstudiogallery.msdn.microsoft.com/f4824551-2660-4afa-aba1-1fcc1673c3d0 — follow the steps there to link your web application and your console application, which will be deployed as a WebJob when published to your Azure site.