I have a VM running a service which I connect to from my application. Let's say it's a MongoDB service.
I would like to use one of Azure's existing tools to monitor whether the MongoDB service is up. The easiest way to accomplish this seems to be a simple TCP connect on the MongoDB port (27017): if it fails, say, 3 times, send an alert. I can't figure out how to configure this with any of:
Operation Management Suite (OMS)
Log Analytics
Network Watcher.
Is it possible to configure this kind of monitoring/alerts with these services? Or is there another service that is managed by Azure which I can use to accomplish this?
I spent some time researching and trying to find an answer, since I haven't set this up personally.
Easiest, I think, would be to configure Log Analytics to ingest any custom logs your application creates. Then create a log search with an alert that either notifies you or automatically runs some custom logic.
I could not find any way to monitor whether a port is open, e.g. the way you can with "psping www.google.com:80" for TCP 80.
In other words, the easiest approach to me seems to be:
Set up custom log collection in log analytics
Create a log search for the events you want to monitor
Set up an alert for the custom search
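If you end up driving this from your own code (e.g. an Azure Automation runbook or a small scheduled script that writes results into a custom log), the "psping"-style probe itself is just a TCP connect with a retry loop. A minimal Python sketch; the hostname here is a placeholder, not a real server:

```python
import socket

def tcp_port_open(host, port, timeout=5.0, retries=3):
    """Return True if a TCP connection to host:port succeeds
    within `retries` attempts; False if every attempt fails."""
    for _ in range(retries):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            # DNS failure, connection refused, or timeout: try again
            continue
    return False

# Example: probe a (hypothetical) MongoDB VM; log an alert line if all attempts fail.
if not tcp_port_open("my-mongo-vm.example.com", 27017):
    print("ALERT: MongoDB port 27017 unreachable after 3 attempts")
```

The printed line could then be written to a file that Log Analytics ingests as a custom log, closing the loop with the alert setup described above.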
I am following this tutorial and I am able to get the Logic App logs into Azure Log Analytics, but the problem is that Log Analytics for Logic Apps is still in preview.
https://learn.microsoft.com/en-us/azure/logic-apps/monitor-logic-apps-log-analytics
I have a few questions regarding this:
Should I use it for logging, given that it is still in preview?
If not, what other options do I have to log data in Azure Monitor?
Preview mode means the feature is not yet complete; previews are offered so that users can give feedback to help improve them.
If you ask me whether to use it or not: I usually use it, I get the desired results, and it works fine for me (Example-Reference).
The other way I monitor logs is with the following process: first, I send the logs to a Log Analytics workspace, and then I create another Logic App that retrieves the logs from that workspace.
There is also another way to collect the logs of Azure Logic Apps.
We have a table in Azure Log Analytics that keeps the logs from many different systems.
For example, our CommonSecurityLog table has the logs from different Firewalls. I have created a custom RBAC role that allows access to this specific table only but would like to go further and limit the access to specific rows only.
I did some research but can't find a way to do this, is it possible?
There's no way to do this natively in Azure - RBAC only supports controlling access at the Table level.
EDIT:
So, as #FidelCasto mentioned, there's also the option of using Custom Logs. This will be helpful in many cases where you need to collect custom Windows-related or application-related logs. It could be a more user-friendly option, but obviously there will be other cases where it does not apply, especially when you have devices sending non-standard logs.
If your requirements are not met by the option above, the only other catch-all option is to put a Log Collector between the firewalls and Azure, and use a script to filter the logs before sending them over via the Log Analytics (OpInsights) REST API. You could use a PowerShell script to handle this.
Each Firewall would send their logs to a local/remote Log Collector.
Have a script query and filter the logs, with if/else logic based on the firewall name.
For each Firewall, you would create a new Log-Type based on the Firewall name. Log-Type corresponds to the table name in Log Analytics.
Assign permission based on the newly created custom tables.
It's not as straightforward, but it gets the job done!
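To sketch the script side of this: the HTTP Data Collector (OpInsights) API expects each POST to carry a SharedKey authorization header built from an HMAC-SHA256 signature, plus a Log-Type header that becomes the custom table name. A minimal Python sketch of the signing step, assuming placeholder workspace credentials:

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id, shared_key, date_rfc1123, content_length,
                    method="POST", content_type="application/json",
                    resource="/api/logs"):
    """Build the SharedKey Authorization header value for the
    Log Analytics HTTP Data Collector API."""
    string_to_sign = (f"{method}\n{content_length}\n{content_type}\n"
                      f"x-ms-date:{date_rfc1123}\n{resource}")
    key = base64.b64decode(shared_key)  # workspace primary key is base64
    digest = hmac.new(key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode()
    return f"SharedKey {workspace_id}:{signature}"
```

Each firewall's filtered batch would then be POSTed to `https://<workspace-id>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01` with `Content-Type: application/json`, `Log-Type: <FirewallName>`, an `x-ms-date` header matching the date used in the signature, and the Authorization value above.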
I wanted to monitor Azure Logic Apps with the help of Azure Monitor alerts. In alerts, I came across a metric, Run Throttled Events, which has been showing some numbers in recent days. But I couldn't find the actual events anywhere to resolve the issue. Is it possible to view the actual run throttled events in the Azure portal?
You will need to set up diagnostic logging for Logic Apps; see here.
When you are done with the setup and an initial run-through of the logs, and you want to look at more advanced queries over this log data, go here.
Specifically on throttling, you need to see this. Also take a look at the limits set for Logic Apps here as well.
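Once diagnostic logging is flowing into a Log Analytics workspace, you can search for the throttled runs there. A query sketch (the table and field names are assumptions based on how Logic Apps diagnostics commonly land in AzureDiagnostics; adjust them to what your workspace actually contains):

```kusto
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.LOGIC"
| where Category == "WorkflowRuntime"
| where status_s == "Throttled" or code_s has "Throttled"
| project TimeGenerated, resource_workflowName_s, status_s, code_s
```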
We shut down our Azure VMs every day at 11pm and start them at 7am, except on weekends.
Is there a way we can check the heartbeat on weekdays only between 7.30am and 10.30pm to see if the server is alive and working?
If so, how can I send a mail for the servers that miss the heartbeat during that time?
There doesn't seem to be support in any Azure service for creating monitoring or alerts that fire only between certain hours.
Azure Monitor, Log Analytics and service-specific alerts (like those for a specific VM) all seem to assume the alerts should be active at all times.
Some possible solutions that might work that I came up with:
If you need to use Azure services, you could create an Azure Automation PowerShell runbook
Set it to trigger regularly (every hour?) and in code handle the time interval you are interested in
Also, in code do the testing if the service is up
In my opinion, this feels a bit hackish.
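For the runbook variant, the time-window logic itself is simple. A Python sketch of the "is monitoring active right now" check, using the weekday 7:30am-10:30pm window from the question (pair it with whatever heartbeat or connectivity test you use):

```python
from datetime import datetime

def in_monitoring_window(now: datetime) -> bool:
    """True on weekdays between 07:30 and 22:30 inclusive,
    i.e. the hours when the VMs should be up."""
    if now.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        return False
    minutes = now.hour * 60 + now.minute
    return 7 * 60 + 30 <= minutes <= 22 * 60 + 30

# Example: only probe the servers (and mail on failure) inside the window.
if in_monitoring_window(datetime.now()):
    pass  # ...check the heartbeat here and send mail if it is missing...
```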
Use some external tool for monitoring your services, like e.g. Nagios
See e.g. this link for how it could work
I would like to fetch Windows event logs from an Azure web role instance. When I connect to the instance over RDP, I can see Windows event logs of the "Application" and "Error" types in Event Viewer. Is there any way to access those logs directly using an API or something else, or do I need to use Diagnostics to transfer the log data into storage and access it from there?
If you want to access the Azure VM event logs directly, the best option is to use Azure Diagnostics and the Azure cmdlets to access the event log details. In my understanding this is very easy to set up, and once you have access to the event logs, you can download and save them to your local machine. The method is described below:
http://michaelwasham.com/2011/09/20/windows-event-logs-with-windows-azure-diagnostics-and-powershell/
There is another simple method: you can create an ASP page that uses the Event Log API to access the event log directly on your web role and customize it the way you would want to see it. You can find several examples on the net of how to do it, e.g. this one. This is a very simple way to get what you are looking for; the only drawback to this approach is that the ASP page will be available on the website, unless you find some way to protect it.
Although you can use either of the above methods, setting up Windows Azure Diagnostics to collect the event logs from the machine and send them to Windows Azure Storage is the best and preferred method. The steps are described here, in case you don't know them:
https://msdn.microsoft.com/en-us/library/windows/desktop/bb427443(v=vs.85).aspx
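For a classic web role, the relevant piece of that setup is the WindowsEventLog section of the role's diagnostics.wadcfg. A sketch of what the entry might look like (the transfer period, quota and log-level filter values here are arbitrary examples, not recommendations):

```xml
<DiagnosticMonitorConfiguration
    configurationChangePollInterval="PT1M"
    overallQuotaInMB="4096"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration">
  <!-- Collect the Application and System channels and push them
       to Windows Azure Storage every 5 minutes. -->
  <WindowsEventLog bufferQuotaInMB="256"
                   scheduledTransferPeriod="PT5M"
                   scheduledTransferLogLevelFilter="Warning">
    <DataSource name="Application!*" />
    <DataSource name="System!*" />
  </WindowsEventLog>
</DiagnosticMonitorConfiguration>
```

The transferred entries then land in the WADWindowsEventLogsTable table in the diagnostics storage account, where they can be read with the storage APIs or cmdlets.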
Any other method to collect these logs (using Azure Connect and Remoting etc) would be complex and troublesome.
Should be possible using http://technet.microsoft.com/en-us/library/cc766438.aspx
The port will be blocked by default, though, so you will need to change the firewall settings.
See this article for the port numbers (search for "event log"): http://support.microsoft.com/kb/832017/en