Deploying Azure Sentinel analytics rules in bulk with PowerShell

I am trying to deploy around 70 log analytics rules into an environment's Azure Sentinel with PowerShell so I don't have to do this manually one by one. I have the Analyticsrules.ps1 and rules.json, which includes all the rules I want to deploy.
When I run it, I get this error: https://i.stack.imgur.com/nz9S4.png
"Cannot bind argument to parameter 'path' because it is null"
Does anyone know what I am doing wrong? I can show the JSON file as well, but it's really long.

As mentioned in the error, the path parameter is null.
Please make sure $env:Pipeline_Workspace and $artifactPath are not null.
# Build the path to the downloaded pipeline artifact, then to the rules file inside it
$artifactPath = Join-Path $env:Pipeline_Workspace $artifactName
$rulesFilePath = Join-Path $artifactPath $RulesFile
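If it helps, a minimal guard before those Join-Path calls (using the same variable names as above, which I'm assuming are a script parameter and the artifact name) makes the failure obvious instead of surfacing as a null 'path' argument:

# Hypothetical guard clauses - adjust the variable names to match your Analyticsrules.ps1
if ([string]::IsNullOrEmpty($env:Pipeline_Workspace)) {
    throw "Pipeline_Workspace is empty - is the script running outside the pipeline?"
}
if ([string]::IsNullOrEmpty($artifactName) -or [string]::IsNullOrEmpty($RulesFile)) {
    throw "artifactName or RulesFile is empty."
}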

Related

Azure Template spec caching

I have an Azure template spec, version "ado", which has been working great. I recently changed a parameter name (i.e. "location" to "aslocation") and updated the template spec using PowerShell's Set-AzTemplateSpec command with the same version name, "ado".
But when I call the template spec using the new parameter name, "aslocation", it throws:
Error BCP037: The property "aslocation" is not allowed on objects of type "params". Permissible properties include "location"
Even if I try using the old parameter name, "location", it throws:
New-AzResourceGroupDeployment: Cannot retrieve the dynamic parameters for the cmdlet. D:\git\IaC\Azure\main.bicep(4,5) : Error BCP035: The specified "object" declaration is missing the following required properties: "location". D:\git\IaC\Azure\main.bicep(5,7) : Error BCP089: The property "aslocation" is not allowed on objects of type "params". Did you mean "location"?
So it seems something is being cached. Any ideas on how to resolve or avoid this problem?
I have confirmed:
Occurs using Azure CLI or PowerShell commands
Occurs using VS Code's integrated terminal or standalone PowerShell/CMD terminals
Template spec is indeed updated (verified via portal)
Issue persists through multiple days/reboots
I managed to resolve the issue. Leaving this here in case anyone else runs into this problem.
Bicep maintains a local cache for template specs at %USERPROFILE%\.bicep
This is NOT automatically updated when template spec versions are updated. You must use Bicep CLI's restore command with the --force flag to refresh it (or remove your cache).
This seems counterintuitive IMHO, but I digress...
Reference: https://learn.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-cli#restore
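As a concrete example (assuming the entry-point file is main.bicep; file and path names here are only illustrative), either of the following should force a refresh of the cached template spec:

# Re-download referenced template specs/modules instead of reusing the local cache
bicep restore .\main.bicep --force

# Or wipe the cache entirely and let the next build repopulate it
Remove-Item -Recurse -Force "$env:USERPROFILE\.bicep"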

Substitute Service Fabric application parameters during deployment

I'm setting up my production environment and would like to secure my environment-related variables.
For the moment, every environment has its own application parameters file, which works well, but I don't want every dev on my team knowing the production connection strings and other sensitive stuff that could appear in there.
So I'm looking for every possibility available.
I've seen that in Azure DevOps, which I'm using at the moment for my CI/CD, there is some variable substitution available (XML transformation). Is it usable in an SF project?
I've seen in another project something similar through Octopus.
Are there any other tools that would help me manage my variables by environment safely (and easily)?
Can I do that with my KeyVault eventually?
Any recommendations?
Thanks
EDIT: here is an example of how I'd like to manage those values; this is a screenshot from Octopus:
Something similar to this, that separates and injects the values, is what I'm looking for.
You can apply an XML transformation to the ApplicationParameter file to update the values in there before you deploy it.
The other option is to use PowerShell to update the application and pass the parameters as arguments to the script.
The Start-ServiceFabricApplicationUpgrade command accepts a hashtable of parameters; technically, the built-in task in VSTS\DevOps transforms the application parameters into a hashtable. The script would be something like this:
#Get the existing parameters
$app = Get-ServiceFabricApplication -ApplicationName "fabric:/AzureFilesVolumePlugin"
#Create a temp hashtable and populate with existing values
$parameters = @{ }
$app.ApplicationParameters | ForEach-Object { $parameters.Add($_.Name, $_.Value) }
#Replace the desired parameters
$parameters["test"] = "123test" #Here you would replace with your variable, like $env:username
#Upgrade the application
Start-ServiceFabricApplicationUpgrade -ApplicationName "fabric:/AzureFilesVolumePlugin" -ApplicationParameter $parameters -ApplicationTypeVersion "6.4.617.9590" -UnmonitoredAuto
Keep in mind that the existing VSTS task also performs other operations, like copying the package to SF and registering the application version in the image store; you will need to replicate these. You can copy the full script from the Deploy-FabricApplication.ps1 file in the Service Fabric project and modify it with your changes. The other approach is to get the source for the VSTS task here and add your changes.
If you are planning to use Key Vault, I would recommend the application access the values directly from Key Vault instead of passing them to SF; this way, you can change the values in Key Vault without redeploying the application. In the deployment, you would only pass the Key Vault credentials/configuration.
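As a rough illustration of that split (vault, secret, and parameter names below are placeholders, not part of the original answer): the deployment passes only the Key Vault reference, and the secret itself is read from Key Vault when needed, for example with the Az.KeyVault module:

# Deployment passes only the vault reference, not the secret value
$parameters["KeyVaultName"] = "my-prod-vault"

# The secret is read from Key Vault at runtime or by an admin script
# (-AsPlainText requires a recent Az.KeyVault module)
$connectionString = Get-AzKeyVaultSecret -VaultName "my-prod-vault" -Name "SqlConnectionString" -AsPlainText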

Azure function published but not running, "no data available"

I can publish an Azure Function from Visual Studio without an error.
This function is set to run every 4 seconds ("*/4 * * * * *"), but it is not running at all. Even if I try to run it manually, it does not run and shows the following error:
Status: 404 Not Found. The resource you are looking for has been removed, had its name changed, or is temporarily unavailable.
Under monitoring it does not show data; under success or error count it says "no data available" :(
Nothing is working, please help.
This is a pretty old thread, but in case anyone is facing the same issue after migrating their Function App to .NET Core 3.1, check that you have also updated the Function Runtime Version to 3. Update the Function App SDK, and in the Azure portal check that the function runtime setting is 3. Without updating this setting, the same 404 error appears whenever you try to call your function app.
To change the Function Runtime Version, open the Function App in the Azure Portal, then go to Configuration -> Function runtime settings. From the Runtime version dropdown choose ~3.
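If you prefer to script that change, the runtime version is controlled by the FUNCTIONS_EXTENSION_VERSION app setting, so something along these lines (app and resource group names are placeholders) should be equivalent to the portal change:

# Pin the Functions runtime to 3.x - names are placeholders
Update-AzFunctionAppSetting -Name "my-function-app" -ResourceGroupName "my-rg" -AppSetting @{ "FUNCTIONS_EXTENSION_VERSION" = "~3" }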
The resource you are looking for has been removed, had its name changed, or is temporarily unavailable.
According to your 404 error message, it means your function couldn't be found: for example, a wrong resource path, a changed or wrong function name, or a deleted function. You could check whether your class name and FunctionName attribute name are consistent. If you have changed code, remember to rebuild the project.
And please make sure you can run the Azure Function successfully in Visual Studio before publishing it to Azure. In debug mode, check whether the output logs are correct.
Under monitoring it do not shows data, under success or error count it says no data available
This info usually means the function has never been triggered. If you create a new function in Azure and click Monitor directly, you will also see this info. It goes away once you trigger the Azure Function successfully.
In my case I was deploying the Azure Function using an Azure Resource Manager (ARM) template. I had created the template manually and was missing some of the properties for the storage account.
For anyone deploying an Azure Function using an ARM template, I would highly recommend taking a template from the GitHub quickstart ARM templates: https://github.com/Azure/azure-quickstart-templates
It provides the minimum template to get your function (and other resources) up and running.
The issue with your function was that GetFTPData.cs is not a valid function name. VS build doesn't validate the function name and the portal isn't displaying these errors.
This issue is tracking the portal error display https://github.com/Azure/azure-functions-ux/issues/2316
and this is for VS build to validate functionName attribute https://github.com/Azure/azure-functions-vs-build-sdk/issues/174

Azure ARM Template Testing with Pester

I have been following the link Azure ARM Template Testing on how to carry out ARM testing with Pester.
Unfortunately, I'm unable to get a successful test.
For example, the script contains the following code:
It "Does Availability Set Have Correct SKU" {
$av = $deploymentOutput.validatedResources | Where-Object { $_.type -eq 'Microsoft.Compute/availabilitySets' }
$av.sku.name | Should Be **'Align'**
However, even though the result of the ARM template is 'Align', I get an error instead of the expected successful output.
[screenshots: error vs. expected success]
For a complete look at the code, it can be found here.
Any guidance will be greatly appreciated.
Regards
While this isn't a direct answer to your question, it is an indirect answer to your question :)
Just don't do this. Test-AzureRMResourceGroupDeployment doesn't do any real good. If you insist on using it, you can always use a one-liner to do that, or use VS Code tasks or whatever to kick off this cough test cough.
There is really no point in validating whether this particular resource type is the one you expect, because you don't really change resource types in a resource after you create it. Also, Test-AzureRMResourceGroupDeployment returning success doesn't mean your deployment will work; it only checks basic sanity. Just create a PowerShell script/task to deploy the template and kick it off automatically after commit. Pester adds nothing of value to this process, only complicates things.
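A minimal sketch of that "just deploy it" approach (resource group and file names are placeholders) could be as small as:

# Deploy the template and let ARM report real errors instead of relying on a pre-flight sanity check
New-AzResourceGroupDeployment -ResourceGroupName "rg-test" -TemplateFile ".\azuredeploy.json" -TemplateParameterFile ".\azuredeploy.parameters.json" -Verbose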

Error w/ Azure CLI on importing publish settings file

This is similar to the issue this SO user was having, except I'm getting a different error for the same behavior.
I downloaded the publishsettings file from Azure and issued this command in the Azure CLI: azure account import <MySite>.azurewebsites.net.PublishSettings
and I got the following error:
{ name: 'AssertionError',
message: undefined,
actual: 'UNIVERSAL-primative-0',
expected: 'UNIVERSAL-primative-6',
operator: '==' }
AssertionError: "UNIVERSAL-primative-0" == "UNIVERSAL-primative-6"
...Shortened for brevity. Let me know if you'd like the full stack trace...
I wasn't particularly anxious to wrap this Node project in a Visual Studio project, but I think in a pinch I could, and just format the publish settings from within VS. But if there is a way to do this correctly, I'd prefer that.
Where did you get the file? Were you using the following command to get it?
azure site download
It seems like you are using the publishsettings file of an Azure Web Site, while xplat-cli expects the publishsettings file of the subscription.
There are 2 kinds of publishsettings files. And yeah, it's confusing.
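For reference, the subscription-level flow with the old xplat-cli looked roughly like this (from memory, so verify against your CLI version; the file name is a placeholder):

# Opens the browser so you can download the subscription .publishsettings file
azure account download

# Import the subscription-level file (not the web site's PublishSettings)
azure account import "MySubscription.publishsettings"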
