How to Fetch Only Azure Computers in an OMS Log Analytics Search Query

I am trying to find underutilized computers in Azure using an OMS Log Analytics query.
The query returns on-premises servers (fetched via SCOM or a direct agent) as well as Azure VMs. I need to filter the results to get only the Azure VMs. What is the best way (or query) to fetch only Azure computers in an OMS Log Analytics search query?
I know that I need to create a Computer Group and then use it in my query, as shown below.
Type=Perf ObjectName=Processor CounterName="% Processor Time" Computer IN $ComputerGroups[AzureComputers]
I need to know what query I should use to create the Computer Group "AzureComputers" referenced above. This group should contain only computers that are present in Azure, i.e. Azure VMs.

This feature is now provided out of the box in OMS Log Analytics. Finding Azure computers is as easy as running the query below in Log Analytics:
Heartbeat
| where ComputerEnvironment == "Azure" and notempty(ResourceId)
| distinct Computer
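If you would rather fold that filter straight into the original performance query instead of maintaining a separate computer group, a minimal sketch in the newer KQL syntax is below; the averaging step and the utilization threshold are assumptions to adjust for your own definition of "underutilized":

Perf
| where ObjectName == "Processor" and CounterName == "% Processor Time"
| where Computer in ((Heartbeat
    | where ComputerEnvironment == "Azure" and notempty(ResourceId)
    | distinct Computer))
| summarize AvgCPU = avg(CounterValue) by Computer
| where AvgCPU < 10   // assumed threshold for "underutilized"; tune as needed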

Related

Question regarding RDP Brute force location

I'm looking into where exactly I can find these details. We have an inbound rule for RDP,
but I received this image and cannot find it in the Azure portal.
To detect RDP brute force in Azure you need to use Azure Sentinel.
Once you enable Azure Sentinel you can view these events in the Azure Log Analytics workspace.
You need to install the Microsoft Monitoring Agent (MMA) on every VM you want to check for brute-force attempts.
In the Log Analytics workspace you need to configure the agent data collection.
After the workspace is enabled, you need to create a scheduled query rule from the Azure Sentinel dashboard.
Here is the KQL script for the Log Analytics workspace:
SecurityEvent
| where EventID == 4625   // 4625 = an account failed to log on
| project TimeGenerated, EventID, WorkstationName, Computer, Account, LogonTypeName, IpAddress
| extend AccountEntity = Account
| extend IPEntity = IpAddress
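For the scheduled query rule itself, a thresholded variant of the script above is usually closer to an actual brute-force detection. This is only a sketch; the one-hour window and the 10-failure threshold are assumptions to tune for your environment:

SecurityEvent
| where EventID == 4625   // failed logon attempts
| summarize FailedAttempts = count(), TargetAccounts = make_set(Account)
    by IpAddress, Computer, bin(TimeGenerated, 1h)
| where FailedAttempts > 10   // assumed threshold
| order by FailedAttempts desc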
Go through the Azure Sentinel documentation for complete information.

How to get the list of compute instance sizes under Azure Machine Learning and Azure Databricks?

The goal is to query the compute instance sizes frequently used under Azure Machine Learning and Azure Databricks with a Kusto query in Azure Resource Graph Explorer from the Azure portal. The documentation lists the resources that can be queried, but there is no compute entry under microsoft.machinelearningservices/ (not classic studio) or Microsoft.Databricks/workspaces.
Below is what was tried to get the VM instance size, but it does not show what we have under Azure Machine Learning / Azure Databricks.
Resources
| project name, location, type, vmSize=tostring(properties.hardwareProfile.vmSize)
| where type =~ 'Microsoft.Compute/virtualMachines'
| order by name desc
Unfortunately, Azure Resource Graph Explorer doesn't provide any query to get compute-related information from either Azure Machine Learning or Databricks.
Azure Resource Graph Explorer does support join functionality, allowing for more advanced exploration of your Azure environment by letting you correlate resources and their properties, but this only applies to a few Azure resource types such as VMs, storage accounts, Cosmos DB, SQL databases, network security groups, and public IP addresses.
Hence, there is no Kusto query in Azure Resource Graph Explorer that can list the compute instance sizes of the Machine Learning service or Databricks.
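For reference, Resource Graph can still return workspace-level metadata for both services, just not their compute or cluster sizes. A rough sketch, assuming the usual resource type names and that the sku property is populated:

Resources
| where type =~ 'microsoft.machinelearningservices/workspaces'
    or type =~ 'microsoft.databricks/workspaces'
| project name, type, location, sku = tostring(sku.name)
| order by name asc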
Workarounds
Machine Learning Service
For the Machine Learning service, you can manage the compute instances directly from the ML service by using the Python SDK. Refer to the Python SDK azureml v1 documentation to know more.
Azure Databricks
A cluster is the computational resource in Databricks. You can filter the cluster list from the Databricks UI and manage it there. Features such as cluster configuration, cluster cloning, and access control are available, which you can use based on your requirements. For more details, please check the Databricks documentation.

Is it possible to query Azure Data Warehouse within Log Analytics?

I have a scenario where I would like to query Azure Data Warehouse tables from the Log Analytics workspace, then use those records to build a result set and prepare a chart.
I do see some objects in the Log Analytics workspace, like a database and tables, but I am not sure what their purpose is, whether they are specific to a resource or generic, or how to use them. I couldn't find documentation for these objects; can somebody guide me on this?
Unfortunately, you cannot use Azure Log Analytics to query Azure SQL Data Warehouse.
Use Azure Data Studio to connect to and query data in Azure SQL Data Warehouse; it is one of the recommended tools for querying data in Azure SQL Data Warehouse.
Azure Log Analytics is used to write, execute, and manage Azure Monitor log queries in the Azure portal. You can use Log Analytics queries to search for terms, identify trends, analyze patterns, and gain many other insights from your data.
For more information about log queries, see Overview of log queries in Azure Monitor.
For a detailed tutorial on writing log queries, see Get started with log queries in Azure Monitor.
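To illustrate the distinction, Log Analytics can query telemetry that the data warehouse emits (assuming you have routed its diagnostic logs to the workspace), but not the warehouse tables themselves. A sketch, where the "ExecRequests" category name is an assumption based on SQL Data Warehouse diagnostic settings:

// Queries request telemetry only, not warehouse data
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.SQL"
| where Category == "ExecRequests"   // assumed diagnostic log category
| summarize Requests = count() by bin(TimeGenerated, 1h)
| render timechart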

Can I output to a VM running SQL Server from an Azure Stream Analytics job?

Generating output to an Azure SQL database is supported, but I was shocked to find that the portal does not allow specifying a SQL Server database running on a VM. Is this not supported?
We need to store lots of data coming through the ASA jobs and use SQL Jobs, which is why we were planning to use a SQL Server VM.
Thanks!
You cannot configure a SQL Server database running on a VM as an output of an ASA job.
However, Azure provides SQL services in two variants:
Microsoft Azure SQL Database (Azure SQL Database) as PaaS, where the lower stack is managed by Microsoft Azure and billed on a pay-as-you-go model, and
SQL Server in an Azure Virtual Machine (VM) as IaaS, where the user owns the VM and can make any changes, including licensing for the SQL database.
Only the Microsoft Azure SQL Database provided as PaaS is configurable as an ASA output.
One idea might be to create an Event Hub output for the ASA job and then consume it from there using any sort of application that writes into the IaaS SQL database. The application that consumes the data can also be hosted as a Web App.
Hope this helps.

Determine SQL Azure Region without using Admin Console

I am working on a solution that uses SQL Azure. Part of the project deals with backups and using the DAC Web Services for backups.
The issue is that there is a different endpoint depending on which region the Azure SQL database is in. As I am working with multiple groups and cannot be sure which region a database will be in, I am looking for a way to determine the region programmatically.
The region is also important, as I want to copy the backups to a different region just to be on the safe side.
I know that I can look in the Admin console, but I would like to use code to solve this problem.
Additional information:
The application is running on Azure using Worker Roles for functionality.
I do not have access to all of the account IDs needed to use the full REST API.
I do have access to the master database on the Azure SQL server.
I am working on this in C# (I originally failed to mention the language).
You can use the Get Servers request (GET https://management.database.windows.net:8443/<subscription-id>/servers) of the Azure REST API to enumerate SQL servers, which returns the Location (region) of each server. More info at MSDN: http://msdn.microsoft.com/en-us/library/windowsazure/gg715269.aspx
