Extract alert logs from Azure without Azure Security Center - azure

I want to extract the alert logs in CSV format to show that I have received these types of alerts.
But I am unable to extract them from an Azure log query. Or do I have to install some agent?

You may list all existing alerts, where the results can be filtered on the basis of multiple parameters (e.g. time range). The results can then be sorted on the basis of specific fields, with the default being lastModifiedDateTime:
GET https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.AlertsManagement/alerts?api-version=2018-05-05
The same request with optional parameters:
GET https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.AlertsManagement/alerts?targetResource={targetResource}&targetResourceType={targetResourceType}&targetResourceGroup={targetResourceGroup}&monitorService={monitorService}&monitorCondition={monitorCondition}&severity={severity}&alertState={alertState}&alertRule={alertRule}&smartGroupId={smartGroupId}&includeContext={includeContext}&includeEgressConfig={includeEgressConfig}&pageCount={pageCount}&sortBy={sortBy}&sortOrder={sortOrder}&select={select}&timeRange={timeRange}&customTimeRange={customTimeRange}&api-version=2018-05-05
To check the other URI parameters for logging, you may refer to this URL.
Finally, once you have received the response(s) in JSON format, you can convert them to CSV automatically using any of the freely available online conversion utilities (like this service HERE).
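If you would rather script the whole thing instead of using an online converter, below is a minimal sketch of the same idea in Python. It assumes the requests library, a bearer token for management.azure.com obtained separately (for example via az account get-access-token), and the usual value/properties.essentials response shape of this API; adjust the field names to whatever you actually need in the CSV.

# Minimal sketch: list alerts via the Alerts Management REST API and
# write a few of the "essentials" fields to a CSV file.
# The subscription id, token, and timeRange value are placeholders.
import csv
import requests

subscription_id = "<subscriptionId>"   # placeholder
token = "<bearer-token>"               # placeholder, e.g. from az account get-access-token

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    "/providers/Microsoft.AlertsManagement/alerts"
)
params = {"api-version": "2018-05-05", "timeRange": "7d"}
headers = {"Authorization": f"Bearer {token}"}

resp = requests.get(url, params=params, headers=headers)
resp.raise_for_status()
alerts = resp.json().get("value", [])

with open("alerts.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "severity", "alertState", "monitorCondition", "lastModifiedDateTime"])
    for alert in alerts:
        essentials = alert.get("properties", {}).get("essentials", {})
        writer.writerow([
            alert.get("name"),
            essentials.get("severity"),
            essentials.get("alertState"),
            essentials.get("monitorCondition"),
            essentials.get("lastModifiedDateTime"),
        ])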

Related

I want to create Datadog metric SLOs and monitors using Terraform. I want to loop through a list of services that I want Terraform to create resources for.

I have ~20 services that I want to monitor differently, so for example I want the monitor to alert me if ServiceA is over 1 second but ServiceB is over 3 seconds. I currently have a list-of-services text file that is set up like
ServiceName,Threshold
For example:
ServiceA,1
ServiceB,3
(For context, eventually I want other tools to access this list of services, so I kind of just want a central list to maintain for all the tools)
I use the for_each loop in Terraform to access each string (ServiceA,1)
Then use ${tolist(split(",", "${each.key}"))[0]} -> Name(ServiceA)
or ${tolist(split(",", "${each.key}"))[1]} -> Threshold(1)
In my Datadog dashboard it creates and separates the name from the threshold fine in the SLO. But when I want to create a monitor for this SLO I use:
query = "error_budget("${datadog_service_level_objective.latency_slo["${tolist(split(",", "${each.key}"))[0]}"].id}").over("7d") > 100"
But I am getting an error like this: Error Message
The ".id" Worked before and currently is working for the Availability monitor that is using a text file with just the names of the services. So no ",2" in the text file.
So I want to be able to loop through this list and have it create custom monitors based on the metadata I put in the text file. My end goal is to have multiple points of data to get really granular for over 100 services eventually. I do not want to do this manually.
I have tried creating a variable for the list of services, but I need to loop through the list inside the resource with the metadata. I really do not see how having a separate list for just the metadata would even work. I would love and appreciate any feedback or advice. Thank you in advance.
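Since part of the goal is a single central list that other tools can also read, here is a minimal sketch (Python, with a hypothetical services.txt path and JSON output chosen purely for illustration) of how another tool could consume the same ServiceName,Threshold file:

# Minimal sketch: read the shared "ServiceName,Threshold" list so that
# tools other than Terraform can consume the same central file.
# The file name and the JSON output are assumptions for illustration only.
import csv
import json

services = {}
with open("services.txt", newline="") as f:
    for row in csv.reader(f):
        if not row:
            continue  # skip blank lines
        name, threshold = row
        services[name.strip()] = float(threshold)

# e.g. ServiceA -> 1.0, ServiceB -> 3.0
print(json.dumps(services, indent=2))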

Get list of "new alerts" for Azure Monitor

I have KQL giving me counts of my alerts by severity. The only issue is that when the user closes them (i.e. updates the user response), no column in the alerts table is updated.
So here is the Azure triggered view,
but the alerts table has nothing.
This strikes me as a fairly normal ask.
I am assuming that you have a custom KQL query for Azure Resource Graph Explorer to identify Azure Monitor alerts.
Properties such as alertState and monitorCondition are not standalone columns, but are nested properties within the dynamically typed "properties" column. As this is querying Azure Resource Graph, the records are updated in place, rather than a new log being added (as it would be in Log Analytics).
Below is a query that extracts the two relevant properties.
alertsmanagementresources
| extend alertState = tostring(parse_json(properties.essentials.alertState))
| extend monitorCondition = tostring(parse_json(properties.essentials.monitorCondition))
| project name, alertState, monitorCondition
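If you want to run the same query outside Resource Graph Explorer (for example to export or post-process the results), below is a minimal sketch against the Resource Graph REST endpoint. The api-version, the token handling, and the assumption that rows come back as a list of objects under "data" are things to verify for your environment.

# Minimal sketch: run the Resource Graph query above via the REST API.
# Assumes a bearer token for https://management.azure.com and that this
# api-version returns rows as a list of objects under "data".
import requests

subscription_id = "<subscriptionId>"   # placeholder
token = "<bearer-token>"               # placeholder

query = """
alertsmanagementresources
| extend alertState = tostring(parse_json(properties.essentials.alertState))
| extend monitorCondition = tostring(parse_json(properties.essentials.monitorCondition))
| project name, alertState, monitorCondition
"""

resp = requests.post(
    "https://management.azure.com/providers/Microsoft.ResourceGraph/resources",
    params={"api-version": "2021-03-01"},
    headers={"Authorization": f"Bearer {token}"},
    json={"subscriptions": [subscription_id], "query": query},
)
resp.raise_for_status()
for row in resp.json().get("data", []):
    print(row["name"], row["alertState"], row["monitorCondition"])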
If you need help, please share your query and what information you are looking to query.
Alistair

Getting the successful deployments between dates or on/before a date

I am using the below API to get the successful deployments of a particular pipeline. I am not able to get the deployments that happened between particular dates or before a particular date.
I tried the below but am not getting the desired output. Is there any modification to the below API?
Documentation can be found here (https://learn.microsoft.com/en-us/rest/api/azure/devops/release/deployments/list?view=azure-devops-rest-6.0#releasequeryorder)
https://vsrm.dev.azure.com/ABC/DEF/_apis/release/deployments?definitionId=111&sourceBranch=master&createdOn=2022-10-06
Based on your requirement, you need to get the deployments between particular dates or before a particular date.
You need to use the parameters minStartedTime and maxStartedTime instead of createdOn to set the dates.
For example:
Between particular dates
REST API: use minStartedTime and maxStartedTime
Get https://vsrm.dev.azure.com/{OrganizationName}/{ProjectName}/_apis/release/deployments?definitionId=xx&minStartedTime=2022-02-01&maxStartedTime=2022-02-17&api-version=6.0
Before a particular date: use maxStartedTime
Get https://vsrm.dev.azure.com/{OrganizationName}/{ProjectName}/_apis/release/deployments?definitionId=xx&maxStartedTime=2022-02-17&api-version=6.0
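If you want to call this from a script rather than the browser, a minimal sketch in Python follows. The requests library, the PAT passed via basic auth, and the deploymentStatus filter are assumptions to adapt; the organization, project, definition id, and dates are placeholders.

# Minimal sketch: list successful deployments of a release definition
# between two dates using minStartedTime and maxStartedTime.
# Organization, project, definitionId, dates, and the PAT are placeholders.
import requests

organization = "{OrganizationName}"   # placeholder
project = "{ProjectName}"             # placeholder
pat = "<personal-access-token>"       # placeholder

url = f"https://vsrm.dev.azure.com/{organization}/{project}/_apis/release/deployments"
params = {
    "definitionId": 111,
    "deploymentStatus": "succeeded",   # assumption: only successful deployments
    "minStartedTime": "2022-02-01",
    "maxStartedTime": "2022-02-17",
    "api-version": "6.0",
}

resp = requests.get(url, params=params, auth=("", pat))
resp.raise_for_status()
for deployment in resp.json().get("value", []):
    print(deployment.get("id"), deployment.get("deploymentStatus"), deployment.get("startedOn"))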
For more detailed info, you can refer to this doc: Deployments - List

KQL (Keyword not Kusto) Nesting, Document Selecting for an Extranet

Longtime member, been a while since posting. I am working on building out an extranet and am running into a stupidly frustrating issue. This is my first time using SharePoint Online as a document repository for external (anonymous) users. In doing so, using Azure permissions, I have the documents split up into repositories on SharePoint based on access level. On top of that, I am attempting to display them in a Highlighted Content web part, but I am not able to filter them by location AND type. I have a custom column in each repository that defines what type they are, but when I try to add the AND portion to the KQL it doesn't work. Additionally, the internet seems to be massively void of actual documentation of KQL.
(
path:https://domain.sharepoint.com/sites/example/Level%201%20Resources/
OR
path:https://domain.sharepoint.com/sites/example/Level%202%20Resources/
OR
path:https://domain.sharepoint.com/sites/example/Level%203%20Resources/
OR
path:https://domain.sharepoint.com/sites/example/Level%204%20Resources/
OR
path:https://domain.sharepoint.com/sites/example/Level%205%20Resources/
OR
path:https://domain.sharepoint.com/sites/example/Level%206%20Resources/
AND
DocType:"Articles"
)
The above will simply pull all documents from those locations and ignore the AND statement. I have tried renaming it to call on the custom column identifier pulled from the source, and that doesn't work either.
The only real documentation I can find on this is: Here,
which doesn't appear to address filtering based on custom column tags.
EDIT: Reformatted to pull all docs from multiple locations using the query below, but the nesting portion still isn't working.
path:(
"https://domain.sharepoint.com/sites/example/Level%201%20Resources/"
OR
"https://domain.sharepoint.com/sites/example/Level%202%20Resources/"
OR
"https://domain.sharepoint.com/sites/example/Level%203%20Resources/"
OR
"https://domain.sharepoint.com/sites/example/Level%204%20Resources/"
OR
"https://domain.sharepoint.com/sites/example/Level%205%20Resources/"
OR
"https://domain.sharepoint.com/sites/example/Level%206%20Resources/"
)
So the additional issue I was running into was the creation of a column to separate documents out based on the category of file type (not the literal file type). Apparently SPO doesn't like it when you create a list and then reference that list to filter by via KQL. So this morning I found this.
Apparently the best way to do this is to create a custom "Choice" column, allow some time for it to flow and update, and then you can reference it via KQL.

Mapping columns from JSON in an Azure SQL Data Flow task

I am attempting a simple SELECT action on a source JSON dataset in an Azure Data Factory data flow, but I am getting an error message that none of the columns from my source are valid. I use the exact configuration shown in the video, except instead of a CSV file, I use a JSON file.
In the video, at 1:12, you can see that after configuring the source dataset, the source projection shows all of the columns from the source schema. Below is a screen shot from the tutorial video:
image.png
And below is a screen shot from my attempt:
(I blurred the column names because they match column names from a vendor app)
Note that in my projection, I am unable to modify the data types or the format. I'm not sure why not, but I don't need to modify either, so I moved on. I did try with a CSV and I was able to modify the data types. I'm assuming this is a JSON thing, but I'm noting it here just in case there is some configuration that I should take a look at.
At 6:48 in the video, you'll see the user add a select task, exactly as I have done. Below is a screen shot of the select task in the tutorial immediately following adding the task:
Notice the source columns all appear. Below is a screen shot of my select task:
I'm curious why the column names are missing. If I type them in manually, I get an error: "Column not found".
For reference, below are screen shots of my Data Source setup. I'm using a Data Lake Storage Gen2 Linked Service connected via Managed Identity and the AutoResolvingIntegrationRuntime.
Note that I tried to do this with a CSV as well. I was able to edit the datatype and format on a CSV, but I get the same column not found error on the next step.
Try doing this in a different browser or clear your browser cache. It may just be a formatting thing in the auto-generated JSON. This has happened to me before.
