Suppress logs while running Azure PowerShell commands

I am running Azure PowerShell commands to add network rules to a storage account, a diagnostic-logs storage account, and a Key Vault, and I am using the following cmdlets:
Add-AzStorageAccountNetworkRule
Add-AzKeyVaultNetworkRule
These cmdlets write a lot of log output to the PowerShell console while executing. It is not just these two; most Azure PowerShell commands write some output to the console, the above are just examples.
I want to know whether there is a flag I can add to an Azure PowerShell command so that it outputs no logs, or only minimal logs, to the console.

Thank you @Gaurav Mantri for the blog post you shared; it produced the output the OP wanted for the logs. Posting this as an answer to benefit other community members.
Following the blog, we checked whether the logs still appear when piping to | Out-Null, and confirmed that no logs are emitted. The blog also describes three other ways to achieve this.
We tried the following cmdlet:
Add-AzStorageAccountNetworkRule -ResourceGroupName "rgname" -Name "cloudtest*****" -IPAddressOrRange "10.0.0.0/7","28.2.0.0/16" | Out-Null
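Beyond piping to Out-Null, there are a few common patterns in PowerShell for suppressing cmdlet output; a sketch (the resource names and IP ranges below are placeholders, not values from the question):

```powershell
# 1. Pipe to Out-Null
Add-AzStorageAccountNetworkRule -ResourceGroupName "rgname" -Name "storageacct" `
    -IPAddressOrRange "10.0.0.0/7" | Out-Null

# 2. Assign the result to $null (generally the cheapest option)
$null = Add-AzKeyVaultNetworkRule -VaultName "vaultname" -ResourceGroupName "rgname" `
    -IpAddressRange "28.2.0.0/16"

# 3. Redirect all output streams to $null
Add-AzKeyVaultNetworkRule -VaultName "vaultname" -ResourceGroupName "rgname" `
    -IpAddressRange "28.2.0.0/16" *> $null

# 4. Suppress only the warning/information streams, keeping the object output
Add-AzStorageAccountNetworkRule -ResourceGroupName "rgname" -Name "storageacct" `
    -IPAddressOrRange "10.0.0.0/7" -WarningAction SilentlyContinue -InformationAction SilentlyContinue
```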

Related

PowerShell error for AzureUSGovernment/GCC High Sentinel

I am trying to run a PowerShell script to deploy Microsoft Sentinel to a GCC High / AzureUSGovernment environment. Whenever I run the script, line 17 of my script returns an error. Below is line 17 of the script:
Set-AzSentinel -subscriptionid $subscriptionId -WorkspaceName $workspaceName.Name -Confirm:$true
This is the error message for line 17:
"Response status code does not indicate success: 404 (Not Found)."
What did I do wrong? This script works in the commercial cloud environment.
One workaround you can follow:
This appears to be an environment issue. Make sure you have logged in to the Azure Government portal through PowerShell, selecting the correct account and subscription.
Please verify the limitations and availability of Microsoft Sentinel as mentioned here.
NOTE: As Microsoft Sentinel is generally available for Azure Government, you can perform the operation once you connect with your Government account.
For more information, please refer to the GitHub docs.
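Concretely, the login would target the Government cloud before the Sentinel call; a sketch, assuming $subscriptionId and $workspaceName are defined as in the original script:

```powershell
# Log in against the Azure US Government endpoints (not the commercial cloud)
Connect-AzAccount -Environment AzureUSGovernment

# Select the Government subscription explicitly
Set-AzContext -SubscriptionId $subscriptionId

# The previously failing line should then resolve against Government endpoints
Set-AzSentinel -SubscriptionId $subscriptionId -WorkspaceName $workspaceName.Name -Confirm:$true
```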

Job suspended: Run Login-AzureRmAccount to log in, using an Azure Automation account's system-assigned managed identity

I am trying to shut down a VM using the Azure Automation account's system-assigned managed identity option.
However, I end up with the error below.
Other articles suggest upgrading the modules with Update-ModulesInAutomationToLatestVersion, but I could not update due to the error below. I am not sure what the issue in the script is. The same script works with the AzureRunAsConnection option without issues (script). I even checked a simple login with the system-assigned managed identity; it successfully logs in and fetches the resource group names.
I have tested the shared script in my Automation account. Below are a couple of observations:
You need to use the Connect-AzureRMAccount -Identity cmdlet instead of Connect-AzAccount to connect to your subscription, because the rest of the script you have written uses AzureRM cmdlets.
If you use AzureRM cmdlets in your runbook script, the job gets suspended with a message stating that AzureRM is going to be retired and suggesting you use the Az module in your workflow.
You can refer to this documentation on how to migrate your PowerShell scripts automatically from AzureRM to Az modules.
If you want to start/stop your virtual machines, you can leverage the Azure Automation Start/Stop VMs during off-hours feature.
According to the Microsoft documentation, and looking at your script, the AzureRM module is no longer supported; it has been superseded by the latest version of the Az module.
For more information, please refer to the links below:
Microsoft documentation | Using a system-assigned managed identity for an Azure Automation account & Troubleshoot runbook issues.
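Once migrated to the Az module, the runbook login and shutdown could be sketched like this (the resource group and VM names are placeholders, not values from the question):

```powershell
# Log in with the Automation account's system-assigned managed identity;
# suppress the login output so it does not clutter the job stream
$null = Connect-AzAccount -Identity

# Stop (deallocate) the VM without an interactive confirmation prompt
Stop-AzVM -ResourceGroupName "myResourceGroup" -Name "myVM" -Force
```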

Deploying an ARM template with Logic Apps throws the error: Data sinks can’t be reused in different settings on the same category for the same resource

So I have set up a deployment of an ARM template containing some Logic Apps, with a related diagnostic setting for Event Hub; see the image.
Event Hub Settings
However, when deploying the same template again, we get the error: "Data sinks can’t be reused in different settings on the same category for the same resource".
The known solution is to remove the diagnostic settings before a new deploy, but I don't want to do this manually each time we deploy.
Have someone figured out a workaround for this?
Thanks!
You can either use PowerShell command or Azure CLI command to remove a diagnostic setting for the resource.
PowerShell command (You can find the documentation here):
Remove-AzDiagnosticSetting -ResourceId "Resource01" -Name myDiagSetting
Azure CLI command (You can find documentation here):
az monitor diagnostic-settings delete --name "myDiagSetting" --resource "Resource01"
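To avoid the manual step, the removal can be scripted as a pre-deployment step; a sketch, where the resource ID and setting name are placeholders you would replace with your Logic App's values:

```powershell
# Placeholders for the target Logic App and its diagnostic setting
$resourceId  = "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.Logic/workflows/<logicAppName>"
$settingName = "myDiagSetting"

# Remove the existing diagnostic setting only if it is present
$existing = Get-AzDiagnosticSetting -ResourceId $resourceId -ErrorAction SilentlyContinue |
    Where-Object { $_.Name -eq $settingName }

if ($existing) {
    Remove-AzDiagnosticSetting -ResourceId $resourceId -Name $settingName
}

# ...then run New-AzResourceGroupDeployment with the template as usual.
```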

Unable to Register Microsoft.DataFactory using Azure PowerShell

I am new to Azure Data Factory and PowerShell, and I am trying to register Microsoft.DataFactory in my Azure subscription using the following command in Azure PowerShell:
Register-AzureRmResourceProvider -ProviderNamespace Microsoft.DataFactory
but I am getting this error.
Could anyone help, please?
Close your PowerShell console and reopen it. Then log back in, make sure you are on the right subscription (if you have multiple), and try the command again.
Log in to Azure:
The new Az module command for that is Login-AzAccount; I think the old command was Login-AzureRmAccount.
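With the current Az module (AzureRM is retired), the whole flow could be sketched as follows; the subscription ID is a placeholder:

```powershell
# Log in interactively
Connect-AzAccount

# Select the subscription in which to register the provider
Set-AzContext -SubscriptionId "<subscription-id>"

# Register the Data Factory resource provider
Register-AzResourceProvider -ProviderNamespace Microsoft.DataFactory

# Verify the registration state
Get-AzResourceProvider -ProviderNamespace Microsoft.DataFactory |
    Select-Object ProviderNamespace, RegistrationState
```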

Azure: how to check for orphaned resources

I am looking for a solution to find stopped/deallocated (orphaned) resources in Azure. I can grab the VM data, but if someone spins up a VM and it shows as running, how do I check that the owner has not used that VM in the last 30 days?
az vm list -d --output table
Any automation suggestion will be welcome.
Sample output of az vm list -d --output table:
TESTSXG    VM    running
I see multiple queries here:
1. To identify whether someone created a resource (say a VM) and forgot to deallocate it.
2. To check whether the last login to a VM is older than 30 days.
3. To check whether the owner has not used the VM(s) in the last 30 days.
4. To check whether, even if no one has logged in to a VM for a while, some services (like Jenkins) are still running and untouched.
To audit actions on resources and determine the operations that were taken on them, you may use Activity Logs. For more information, refer to this link: https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-group-audit.
For #1, you may execute the command below.
Get-AzureRmVM -Status|select Name, PowerState
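Note that the AzureRM module has since been retired; with the current Az module the equivalent would presumably be:

```powershell
# Az-module equivalent: list VMs with their power state
Get-AzVM -Status | Select-Object Name, PowerState
```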
For #2 and #3, below is a command you can run manually inside the VM.
Get-WmiObject -Class Win32_NetworkLoginProfile |
Sort-Object -Property LastLogon -Descending |
Select-Object -Property * -First 1 |
Where-Object { $_.LastLogon -match "(\d{14})" } |
ForEach-Object { New-Object PSObject -Property @{ Name = $_.Name; LastLogon = [datetime]::ParseExact($matches[0], "yyyyMMddHHmmss", $null) } }
But I know that you are looking for an automated way to validate all the VMs under your subscription. So the requirement here is to automatically (i.e., remotely) connect to all the 'running' VMs from the Azure portal and then get the required output. If I am not wrong, we can most probably achieve this in multiple ways:
i. Log Analytics
ii. DSC
iii. Functions
iv. Runbook
v. Logic App
i. Create a Log Analytics (OMS) workspace and install the OMS agent on the VM(s) as instructed here (https://learn.microsoft.com/en-us/azure/azure-monitor/learn/quick-collect-azurevm). Then add the Azure Security Center (Security and Audit) solution in OMS so that security events are pushed to the OMS repository. Then go to Log Analytics -> OMSworkspaceName -> Logs and run the Kusto query below to get the required output.
SecurityEvent
| where EventID == 4624
| sort by TimeGenerated desc
Note that Event ID 4624 is logged whenever an account successfully logs on to a machine.
ii. Onboard Azure DSC on the VM(s) as instructed here (https://learn.microsoft.com/en-us/azure/automation/automation-dsc-onboarding) and write a DSC configuration using the 'Script' DSC resource that runs the above Get-WmiObject… command remotely on the DSC nodes (i.e., the VMs) and fetches the required output.
iii. Write an HTTP-triggered PowerShell function that runs the above Get-WmiObject… command remotely (e.g., open a new PS session and use Invoke-Command) on the VMs and fetches the required output. You may refer to this link to learn about Functions: https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-first-azure-function.
iv. Write a PowerShell runbook that runs the above Get-WmiObject… command remotely (again via a new PS session and Invoke-Command) on the VMs and fetches the required output.
v. Currently, Azure Logic Apps does not seem to support running PowerShell or CLI scripts directly. However, you may try the available Logic Apps Functions connector (or a similar connector) and internally call PowerShell to execute the above Get-WmiObject… command remotely. Just FYI, here (https://feedback.azure.com/forums/287593-logic-apps/suggestions/33913552-run-a-powershell-code-within-a-logic-app-action) is a request in Azure Feedback about running PowerShell code within a Logic App action; you can vote for it if you are interested in this option.
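For the function/runbook options, the remote call could be sketched roughly as follows; the computer name and credentials are placeholders, and the target VM must have PowerShell remoting (WinRM) enabled and reachable:

```powershell
# Placeholder credentials for the remote VM
$cred = Get-Credential

# Run the last-logon query remotely via PowerShell remoting
Invoke-Command -ComputerName "vm-host-name" -Credential $cred -ScriptBlock {
    Get-WmiObject -Class Win32_NetworkLoginProfile |
        Sort-Object -Property LastLogon -Descending |
        Select-Object -Property Name, LastLogon -First 1
}
```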
For #4, install the OMS agent on the VMs so that event details get stored in the OMS repository. For example, if no one is logging in to a VM but a Jenkins service is running on it, you may not want to disturb that VM. So, to validate whether a Jenkins service is running on a VM, you may have to run a Kusto query something like this:
Event
| where (EventLog == "System")
| where (RenderedDescription has "jenkins" and RenderedDescription has "stopped")
Hope this helps!!
