I'm trying to deploy Sentinel alert rules into Microsoft Sentinel from an Azure Automation runbook using the command below:
Import-AzSentinelAlertRule -WorkspaceName "xxx" -SettingsFile "test_alert.json"
The -SettingsFile parameter of this command expects the path to a JSON file. How can we pass the JSON file to the runbook?
I have reproduced this in my environment, followed the Microsoft documentation, and got the expected results as below:
# Runbook parameter: the JSON content is passed in as a string
Param(
    [Parameter(Mandatory=$true)]
    [object]$json
)
# Convert the incoming JSON string into a PowerShell object
$json = $json | ConvertFrom-Json
Then save and publish the runbook.
Then open your local Windows PowerShell and follow the steps below:
Step1:
Connect-AzAccount
Step2:
$json = (Get-Content -Path "C:\Downloads\xy.json") | Out-String
Step3:
$RBParams = @{
    AutomationAccountName = 'rithwikrunning'
    ResourceGroupName     = 'XX'
    Name                  = 'xy'
    Parameters            = @{ json = $json }   # map the runbook's 'json' parameter to the file content
}
XX - name of the resource group
xy - name of the runbook
Step4:
$job = Start-AzAutomationRunbook @RBParams
Now the JSON file is passed to the runbook and the job has started.
The content of the file is available in the $json variable inside the runbook.
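Because Import-AzSentinelAlertRule expects a file path rather than raw JSON, inside the runbook you may need to write the received string back to a temporary file (before, or instead of, converting it with ConvertFrom-Json) and pass that path to the cmdlet. A minimal sketch, reusing the workspace name placeholder from the question:

# Write the raw JSON string received in the 'json' parameter to a temp file
$tempFile = Join-Path $env:TEMP 'test_alert.json'
Set-Content -Path $tempFile -Value $json
# Import the alert rule from that file
Import-AzSentinelAlertRule -WorkspaceName "xxx" -SettingsFile $tempFile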
References:
Transferring Microsoft Sentinel scheduled alert rules between different workspaces using PowerShell - Microsoft Community Hub
I have created a number of resources in Azure via the portal. How do I add tags to each existing resource using PowerShell or the CLI?
AZURE POWERSHELL
To add tags to a resource that has no tags yet, use New-AzTag:
New-AzTag -ResourceId $resource.id -Tag $tags
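Here $tags is assumed to be a hashtable of tag names and values, for example:
$tags = @{ "Dept" = "Finance"; "Status" = "Normal" }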
To add tags to a resource that already has tags, use Update-AzTag with the Merge operation:
Update-AzTag -ResourceId $resource.id -Tag $tags -Operation Merge
AZURE CLI
The az tag create command replaces all tags on the resource, resource group, or subscription:
az tag create --resource-id $resource --tags Dept=Finance Status=Normal
To add tags to a resource that already has tags, use az tag update.
az tag update --resource-id $resource --operation Merge --tags Dept=Finance Status=Normal
This document gives you more details.
You can use the following PowerShell script to add a tag to resources whose tags are currently empty:
Connect-AzAccount
# List all resources whose tags are empty
$resources = Get-AzResource | Where-Object Tags -eq $null
# Or, at the resource group level, list all resource groups whose tags are empty
$resources = Get-AzResourceGroup | Where-Object Tags -eq $null
$resources | ForEach-Object { New-AzTag -Tag @{ "Env" = "test" } -ResourceId $_.ResourceId }
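If instead you want to add a tag to every existing resource regardless of whether it already has tags, a minimal sketch using Update-AzTag with the Merge operation (the Env/test tag is just a placeholder):

# Merge an "Env" tag into every resource in the current subscription,
# preserving any tags that are already set
Get-AzResource | ForEach-Object {
    Update-AzTag -ResourceId $_.ResourceId -Tag @{ "Env" = "test" } -Operation Merge
}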
As you may know, Microsoft is retiring the AzureRM cmdlets in favor of Az.
There are a lot of issues around this, since the proposed native alias mechanism, Enable-AzureRmAlias, no longer seems to be updated.
I have an AzureRM-based script in one repo that is triggered by an Azure DevOps release pipeline step (an Azure PowerShell task)
and contains the following piece of code:
$var = (Get-AzureKeyVaultSecret -VaultName $vaultName -Name $Key).SecretValueText
"Enable-AzureRmAlias" command activated as well..., that converts the code like this:
$var = (Get-AzKeyVaultSecret -VaultName $vaultName -Name $Key).SecretValueText
The problem is that .SecretValueText was deprecated a while ago. In its place, a new parameter was added to the Get-AzKeyVaultSecret cmdlet: -AsPlainText.
So, theoretically, the final construction has to look like this:
$var = Get-AzKeyVaultSecret -VaultName $vaultName -Name $Key -AsPlainText
Challenges:
I can't upgrade the original script in the repo to Az because backward compatibility must be preserved.
The only way I see to solve this is to create some kind of alias in the Azure PowerShell inline script (which triggers the main script in the repo).
I'm stuck on this .SecretValueText part.
My original idea of putting the following into the inline script doesn't seem to work:
function Get-AzKeyVaultSecretNew {
Param(
$vaultName,
$Key
)
$var = Get-AzKeyVaultSecret -VaultName $vaultName -Name $Key -AsPlainText
return $var
}
Set-Alias -Name Get-AzKeyVaultSecret -Value Get-AzKeyVaultSecretNew
Any ideas on how to accomplish this?
This should theoretically help your situation. You can run this code at the beginning of the PowerShell session that will be calling your scripts. Make sure any necessary modules are imported first so that the secret object types are available.
$Script = { Get-AzKeyVaultSecret -VaultName $this.VaultName -Name $this.Name -AsPlainText }
Update-TypeData -TypeName 'Microsoft.Azure.Commands.KeyVault.Models.PSKeyVaultSecretIdentityItem' -MemberName 'SecretValueText' -MemberType ScriptProperty -Value $Script
The idea is to add the SecretValueText property back to the Microsoft.Azure.Commands.KeyVault.Models.PSKeyVaultSecretIdentityItem objects.
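With that type extension in place, the legacy call site should keep working unchanged, for example:
$var = (Get-AzKeyVaultSecret -VaultName $vaultName -Name $Key).SecretValueText  # resolves via the re-added ScriptProperty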
You can try the workaround below to replace this piece of code:
(Get-AzureKeyVaultSecret -VaultName $vaultName -Name $Key).SecretValueText
with Get-AzKeyVaultSecret -VaultName $vaultName -Name $Key -AsPlainText by using a RegEx Find & Replace task. Check the steps below:
1. Add a RegEx Find & Replace task to replace the original code with the converted code. See below:
FindRegex: '\(Get-AzureKeyVaultSecret -VaultName \$vaultName -Name \$Key\)\.SecretValueText'
ReplaceRegex: 'Get-AzKeyVaultSecret -VaultName $vaultName -Name $Key -AsPlainText'
2. The -AsPlainText parameter is only available in newer Az versions (5.3.0 at the time of writing), while the version installed on the hosted agent was 4.7.0, so you need to install Az 5.3.0 before executing your script. Use a PowerShell task to install it:
New-Item -Path "C:\Modules" -Name "az_5.3.0" -ItemType "directory"
Save-Module -Name AZ -RequiredVersion 5.3.0 -Path "C:\Modules\az_5.3.0"
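Depending on the agent, you may also need to make the freshly downloaded module discoverable before your script runs, for example by prepending its folder to the module path (a sketch, assuming the path used above):
$env:PSModulePath = "C:\Modules\az_5.3.0;" + $env:PSModulePath   # let PowerShell find Az 5.3.0 first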
3. Then you can invoke your script in the Azure PowerShell task directly.
I have an Azure function app that makes the following web request:
Invoke-RestMethod -Uri $uri -Method 'GET' -Headers $headers -Body $body -SkipCertificateCheck
On my local machine, I can save the returned XML file to local disk with -OutFile. However, the actual environment is an Azure function app, so I don't think I can save the file to disk. Instead, I want to redirect it to a storage container in Azure.
I have tried Azure function output bindings in order to redirect the response there but failed to write the actual file:
Push-OutputBinding -Name outputBlob -Value $response.content
All this writes to the storage container is a string value. So how do I write the actual file received as a response from Invoke-RestMethod, within the Azure function app, to the Azure storage container in this environment?
Unless the file is huge, you can indeed save it to the local file system, even when running on Azure. Just select the proper location and don't forget to clean up:
$f = New-TemporaryFile
try {
Invoke-RestMethod ... -OutFile $f.FullName
# Do whatever you want with this file
...
} finally {
Remove-Item $f.FullName
}
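From there, the temporary file could be pushed to the storage container with Set-AzStorageBlobContent; a minimal sketch, with a hypothetical storage account and container name:

# Hypothetical storage account / container; replace with your own
$ctx = New-AzStorageContext -StorageAccountName 'mystorageaccount' -UseConnectedAccount
Set-AzStorageBlobContent -File $f.FullName -Container 'mycontainer' -Blob 'response.xml' -Context $ctx -Force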
Read your file into a byte array and pass it as the output binding value.
using namespace System.Net
# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)
# Write to the Azure Functions log stream.
Write-Host "PowerShell HTTP trigger function processed a request."
New-Item -Path 'test.txt' -ItemType File
Set-Content 'test.txt' 'Welcome to www.thecodemanual.pl'
$file = [System.IO.File]::ReadAllBytes('test.txt')
Push-OutputBinding -Name outputBlob -Value $file
# Associate values to output bindings by calling 'Push-OutputBinding'.
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
StatusCode = [HttpStatusCode]::OK
Body = $body
})
and then the file content will be written to the blob as binary rather than as a string.
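For completeness, the outputBlob binding used above has to be declared in the function's function.json bindings array; a minimal sketch (container name and blob path are placeholders):
{
  "name": "outputBlob",
  "type": "blob",
  "direction": "out",
  "path": "mycontainer/outfile.txt",
  "connection": "AzureWebJobsStorage"
}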
I am trying to execute the PowerShell script found at:
https://github.com/amanbedi18/Azure-KeyVault-Automation/tree/master/UploadSecrets
https://github.com/amanbedi18/Azure-KeyVault-Automation
This script is used to automate uploading secrets to Azure Key Vault.
I have created the below json file and named it /user/kv/AzureSecretsMetaData.json
[
{
"key": "test1",
"value": "1"
},
{
"key": "test2",
"value": "2"
}
]
I then copied the contents of setKeyVaultSecrets.ps1 to /user/kv/setKeyVaultSecrets.ps1.
I have tried the commands below, which are mentioned in the readme and in the PowerShell script, and get the errors shown. Could someone please help me execute this script correctly and point out what I am doing wrong?
./setkeyVaultSecret.ps1 -KeyVaultName 'avkv01'
setkeyVaultSecret.ps1: Parameter set cannot be resolved using the specified named parameters. One or more parameters issued cannot be used together or an insufficient number of parameters were provided.
./setkeyVaultSecret.ps1 -KeyVaultName 'avkv01' -KVSecretMetadataFilePath '/home/aditya/kv/AzureSecretsMetaData.json'
setkeyVaultSecret.ps1: Parameter set cannot be resolved using the specified named parameters. One or more parameters issued cannot be used together or an insufficient number of parameters were provided.
Any help is greatly appreciated. Thank You
If these are scripts you found, they are very poorly written and I would advise against using them. I would suggest installing the Az PowerShell module and using those scripts as a reference for writing your own. You can install the Az PowerShell module with PowerShellGet:
Install-Module Az -Force -AllowClobber
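A minimal Az-based sketch of what your own version could look like, assuming the AzureSecretsMetaData.json format and vault name from the question:

# Read the key/value pairs and push each one as a Key Vault secret
$secrets = Get-Content '/user/kv/AzureSecretsMetaData.json' -Raw | ConvertFrom-Json
foreach ($s in $secrets) {
    $value = ConvertTo-SecureString $s.value -AsPlainText -Force
    Set-AzKeyVaultSecret -VaultName 'avkv01' -Name $s.key -SecretValue $value
}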
The script you are trying to call is written very strangely. It defines two different parameter sets, but both parameters are mandatory in BOTH parameter sets, which doesn't make a lot of sense. If you must run these scripts, you will probably need to modify them a bit; they don't really seem finished. Why not modify the script and get rid of the parameter sets altogether? Also, the example in the script implies that the $KVSecretMetadataFilePath parameter is not mandatory, since it is never passed, yet it is declared as mandatory. You need to decide whether this parameter is required or not. If it is required, make the script look more like this:
<#
.PREREQUISITE
1. An Azure key vault and its name as parameter.
2. Json template should be properly populated with valid json schema in sampleSecretValues.json in KeyVaultjson directory.
.PARAMETER KeyVaultName
The name of the key vault.
.PARAMETER KVSecretMetadataFilePath
The path to the JSON file containing the secrets to upload.
.EXAMPLE
. setKeyVaultSecret.ps1 -KeyVaultName 'somekeyvault' -KVSecretMetadataFilePath './AzureSecretsMetaData.json'
#>
# provision keys and secrets to a key vault
Param(
[Parameter(Mandatory=$true)]
[String]
$KeyVaultName,
[Parameter(Mandatory=$true)]
[String]
$KVSecretMetadataFilePath
)
Install-Module -Name AzureADPreview -ErrorAction SilentlyContinue -Force
Import-Module Azure -ErrorAction SilentlyContinue
Import-Module AzureRM.Resources
Set-StrictMode -Version 3
$json = Get-Content $KVSecretMetadataFilePath | Out-String | ConvertFrom-Json
$json | ForEach-Object {
$secretToSearch = Get-AzureKeyVaultSecret -VaultName $KeyVaultName -Name $_.key -ErrorAction SilentlyContinue
if($secretToSearch -ne $null)
{
echo "The secret $_.key already exists !"
}
Else
{
$NewSecret = Set-AzureKeyVaultSecret -VaultName $KeyVaultName -Name $_.key -SecretValue (ConvertTo-SecureString $_.value -AsPlainText -Force ) -Verbose
Write-Host
Write-Host "Source Vault Resource Id: "$(Get-AzureRmKeyVault -VaultName $KeyVaultName).ResourceId
}
}
If it is not required, remove the line "[Parameter(Mandatory=$true)]" above the $KVSecretMetadataFilePath declaration.
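With both parameters kept mandatory, the invocation from the question would then be expected to work, for example:
./setKeyVaultSecret.ps1 -KeyVaultName 'avkv01' -KVSecretMetadataFilePath '/home/aditya/kv/AzureSecretsMetaData.json'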
Using the Az PowerShell module, I'm trying to enumerate a directory on disk and pipe the output to Set-AzStorageBlobContent to upload to Azure, while preserving the folder structure. This works great, except the ContentType property of all blobs is set to application/octet-stream. I'd like to set it dynamically based on the file extension of the blob being uploaded.
Here's example code for the base case:
Get-ChildItem $SourceRoot -Recurse -File |
Set-AzStorageBlobContent -Container $ContainerName -Context $context -Force
To set the ContentType, I need to add a Properties parameter to Set-AzStorageBlobContent with a value like @{ "ContentType" = "<content type>" }. The content type should be determined from the specific file extension being uploaded. I've written a separate pipelined function that can add a MimeType property to the file object, but I can't figure out how to reference that for the parameter in the pipeline. Example:
function Add-MimeType{
[cmdletbinding()]
param(
[parameter(
Mandatory = $true,
ValueFromPipeline = $true)]
$pipelineInput
)
Process {
$mimeType = Get-MimeType $pipelineInput.Extension
Add-Member -InputObject $pipelineInput -NotePropertyName "MimeType" -NotePropertyValue $mimeType
return $pipelineInput
}
}
function Get-MimeType(
[string]$FileExtension
)
{
switch ($FileExtension.ToLowerInvariant())
{
'.txt' { return 'text/plain' }
'.xml' { return 'text/xml' }
default { return 'application/octet-stream' }
}
}
Get-ChildItem $SourceRoot -Recurse -File |
Add-MimeType |
Set-AzStorageBlobContent -Container $ContainerName -Properties @{"ContentType" = "$($_.MimeType)"} -Context $context -Force
It seems that $_ isn't usable in this context. Is there another way to accomplish this?
The reason I'd like to continue using pipelining is that it appears to work much faster than using a ForEach-Object loop to call the function (where $_ does work).
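For reference, the ForEach-Object form mentioned above (where $_ does resolve per item) would look roughly like this:

Get-ChildItem $SourceRoot -Recurse -File |
    Add-MimeType |
    ForEach-Object {
        # $_ is valid here because this script block runs once per pipeline item
        $_ | Set-AzStorageBlobContent -Container $ContainerName -Context $context `
             -Properties @{ "ContentType" = $_.MimeType } -Force
    }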
If you are open to completely different solutions, you can also use AzCopy.
You can upload your whole folder with one command, and AzCopy can also automatically guess the correct mime type based on the file extension. There is also support for Azure Pipelines, if that is part of your setup.
Command could look something like this:
# AzCopy v10 will automatically guess the content type unless you pass --no-guess-mime-type
azcopy copy 'C:\myDirectory' 'https://mystorageaccount.blob.core.windows.net/mycontainer' --recursive
# AzCopy v8 (legacy syntax)
AzCopy /Source:C:\myDirectory /Dest:https://mystorageaccount.blob.core.windows.net/mycontainer /S /SetContentType
Taken from the output of AzCopy.exe copy --help:
AzCopy automatically detects the content type of the files when uploading from the local disk, based on the file extension or content (if no extension is specified).
The built-in lookup table is small, but on Unix, it is augmented by the local system's mime.types file(s) if available under one or more of these names:
/etc/mime.types
/etc/apache2/mime.types
/etc/apache/mime.types
On Windows, MIME types are extracted from the registry. This feature can be turned off with the help of a flag. Please refer to the flag section.