I'm trying to add some Azure Vulnerability Assessment baseline definitions to my ARM templates. I use JSON for my ARM templates. I cannot find any documentation, though, on how to specify certain VA baseline definitions, namely ones that need multiple rows in the baseline.
Specifically, I'm trying to add a baseline definition for VA2109. I can locate the documentation for how to define a baseline VA entry in a general sense, which is here ...
https://learn.microsoft.com/en-us/azure/templates/microsoft.sql/servers/databases/vulnerabilityassessments/rules/baselines?tabs=json
And then I can locate the description of VA2109 in here ...
https://learn.microsoft.com/en-us/azure/azure-sql/database/sql-database-vulnerability-assessment-rules#authentication-and-authorization
But neither of those tells me how to include more than one user-role mapping. For example, below is what I currently have, which works and lets me specify that a user should have the data writer role. But I also want to specify that the user should have the data reader and DDL admin roles.
{
  "type": "Microsoft.Sql/servers/databases/vulnerabilityAssessments/rules/baselines",
  "apiVersion": "2021-02-01-preview",
  "name": "[concat(variables('sqlServerName'), '/', variables('databaseName'), '/default/VA2109/Default')]",
  "dependsOn": [
    "[resourceId('Microsoft.Sql/servers/databases', variables('sqlServerName'), variables('databaseName'))]"
  ],
  "properties": {
    "baselineResults": [
      {
        "result": ["wibuser", "db_datawriter"]
      }
    ]
  }
}
I was able to find an example of what I want using PowerShell. In PowerShell, you can just provide an array of arrays. The PowerShell example can be found here ...
https://learn.microsoft.com/en-us/powershell/module/sqlserver/new-sqlvulnerabilityassessmentbaseline?view=sqlserver-ps#example-2--create-a-new-security-check-baseline-manually
So I adjusted my ARM template to do the same thing, but it throws an error saying the ARM template is invalid. The adjusted ARM template I tried looks like this ...
{
  "type": "Microsoft.Sql/servers/databases/vulnerabilityAssessments/rules/baselines",
  "apiVersion": "2021-02-01-preview",
  "name": "[concat(variables('sqlServerName'), '/', variables('databaseName'), '/default/VA2109/Default')]",
  "dependsOn": [
    "[resourceId('Microsoft.Sql/servers/databases', variables('sqlServerName'), variables('databaseName'))]"
  ],
  "properties": {
    "baselineResults": [
      {
        "result": [
          ["wibuser", "db_datawriter"],
          ["wibuser", "db_datareader"]
        ]
      }
    ]
  }
}
Does anybody know how to specify multiple rows in a VA baseline resource when using ARM JSON? Or perhaps know where to find documentation for all of these VA definitions?
Note that baselineResults is an array of rows.
You will need to add each row as a JSON object to that array.
Also, note that each result row should include all columns, so you should also include the "Principal Type" and "Authentication Type" columns.
It should look something like this:
{
  "type": "Microsoft.Sql/servers/databases/vulnerabilityAssessments/rules/baselines",
  "apiVersion": "2021-02-01-preview",
  "name": "[concat(variables('sqlServerName'), '/', variables('databaseName'), '/default/VA2109/Default')]",
  "dependsOn": [
    "[resourceId('Microsoft.Sql/servers/databases', variables('sqlServerName'), variables('databaseName'))]"
  ],
  "properties": {
    "baselineResults": [
      {
        "result": ["wibuser", "db_datawriter", "SQL_USER", "NONE"]
      },
      {
        "result": ["wibuser", "db_datareader", "SQL_USER", "NONE"]
      }
    ]
  }
}
I added dummy values for the "Principal Type" and "Authentication Type" columns; fill in your own.
I am looking for a way to convert an array (e.g. of strings) into one object, where the properties are generated from the array values.
Use case: I want to generate a tags object with links to resources, based on a list of resource names. I need to do this, to link App Service resources to an Application Insights resource.
The list of resources could be supplied using a parameter:
"parameters": {
"appServices": {
"type": "array",
"metadata": {
"description": "Names of app services to link this application insights resource to via hidden tags"
}
}
}
Sample input:
['appName1', 'appName2', 'appName3']
Sample output:
"tags":
{
"[concat('hidden-link:', resourceId('Microsoft.Web/sites/', 'appName1'))]": "Resource",
"[concat('hidden-link:', resourceId('Microsoft.Web/sites/', 'appName2'))]": "Resource",
"[concat('hidden-link:', resourceId('Microsoft.Web/sites/', 'appName3'))]": "Resource"
}
I know you can use copy to loop over arrays but that will create an array of objects and not a single object (which is required for tags), for example:
[
  {
    "[concat('hidden-link:', resourceId('Microsoft.Web/sites/', 'appName1'))]": "Resource"
  },
  {
    "[concat('hidden-link:', resourceId('Microsoft.Web/sites/', 'appName2'))]": "Resource"
  },
  {
    "[concat('hidden-link:', resourceId('Microsoft.Web/sites/', 'appName3'))]": "Resource"
  }
]
It would be possible to use union to merge those objects again, but that function requires you to hardcode the objects you want to merge, so it does not work when you have an input of variable length.
What I am looking for is a way to do this dynamically.
There is no direct option to convert an array to an object.
But here's a hack to achieve what you need. It will work for an array of any length.
Steps:
1. append the hidden-link text to the service names
2. convert the array to a string
3. replace the necessary symbols to make it a valid JSON string
4. use json() to convert the string to an object
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "appServices": {
      "type": "array",
      "metadata": {
        "description": "Names of app services to link this application insights resource to via hidden tags"
      },
      "defaultValue": [ "appName1", "appName2", "appName3" ]
    }
  },
  "functions": [],
  "variables": {
    "copy": [
      {
        "name": "as",
        "count": "[length(parameters('appServices'))]",
        "input": "[concat('hidden-link:', resourceId('Microsoft.Web/sites/', parameters('appServices')[copyIndex('as')]))]"
      }
    ],
    "0": "[string(variables('as'))]",
    "1": "[replace(variables('0'), '[', '{')]",
    "2": "[replace(variables('1'), '\",', '\":\"Resource\",')]",
    "3": "[replace(variables('2'), '\"]', '\":\"Resource\"}')]"
  },
  "resources": [],
  "outputs": {
    "op1": {
      "type": "object",
      "value": "[json(variables('3'))]"
    }
  }
}
Now that lambdas have been added to Bicep, you can convert arrays to objects using reduce. Note that the array you reduce must consist of object items. If it doesn't, you can first convert the items to objects using map.
// With array of objects
var names = [
  {
    name: 'foo'
    id: 'foo-id'
  }
  {
    name: 'bar'
    id: 'bar-id'
  }
]
var nameIds = reduce(names, {}, (cur, next) => union(cur, {
  '${next.name}': next.id
}))
output test object = nameIds

// With array of strings
var names = [
  'foo'
  'bar'
]
var nameMaps = map(names, (name) => { name: name })
var nameIds = reduce(nameMaps, {}, (cur, next) => union(cur, {
  '${next.name}': next.name
}))
output test object = nameIds
I'm not sure if this is the best approach to this problem.
Tags are supposed to be metadata about a specific object/service. Wouldn't it make more sense to apply a tag (say your system name, environment, etc.) and then run a query against Azure on that tag?
This should achieve the same result, pulling back all related resources.
I don't know if it is still relevant, but since 2021 it has been possible to do this with the items() function.
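As a related sketch: alongside items() (which goes object to array), more recent ARM template language versions also have toObject(), which goes the other way (array to object) via lambda expressions. This is my own hedged addition, not something shown in the answers above, and the variable name hiddenLinkTags is just a placeholder:
"variables": {
  // hypothetical sketch: builds one { "hidden-link:<resourceId>": "Resource" } entry per app service name
  "hiddenLinkTags": "[toObject(parameters('appServices'), lambda('app', concat('hidden-link:', resourceId('Microsoft.Web/sites', lambdaVariables('app')))), lambda('app', 'Resource'))]"
}
The Application Insights resource could then simply use "tags": "[variables('hiddenLinkTags')]".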
I would like to be able to get custom output from an "Execute Pipeline Activity". During the execution of the invoked pipeline, I capture some information in a variable using the "Set Variable" activity. I would like to be able to use that value in the master pipeline.
I know that the master pipeline can read the invoked pipeline's name and runId using "@activity('InvokedPipeline').output", but those are the only properties available.
I have the invokable pipeline because it's configurable to be used by multiple other pipelines, assuming we can get the output from it. It currently consists of 8 activities; I would hate to have to duplicate them all across multiple pipelines just because we can't get the output from an invoked pipeline.
Reference: Execute Pipeline Activity
[
  {
    "name": "MasterPipeline",
    "type": "Microsoft.DataFactory/factories/pipelines",
    "properties": {
      "description": "Uses the results of the invoked pipeline to do some further processing",
      "activities": [
        {
          "name": "ExecuteChildPipeline",
          "description": "Executes the child pipeline to get some value.",
          "type": "ExecutePipeline",
          "dependsOn": [],
          "userProperties": [],
          "typeProperties": {
            "pipeline": {
              "referenceName": "InvokedPipeline",
              "type": "PipelineReference"
            },
            "waitOnCompletion": true
          }
        },
        {
          "name": "UseVariableFromInvokedPipeline",
          "description": "Uses the variable returned from the invoked pipeline.",
          "type": "Copy",
          "dependsOn": [
            {
              "activity": "ExecuteChildPipeline",
              "dependencyConditions": [
                "Succeeded"
              ]
            }
          ]
        }
      ],
      "parameters": {},
      "variables": {}
    }
  },
  {
    "name": "InvokedPipeline",
    "type": "Microsoft.DataFactory/factories/pipelines",
    "properties": {
      "description": "The child pipeline that makes some HTTP calls, gets some metadata, and sets a variable.",
      "activities": [
        {
          "name": "SetMyVariable",
          "description": "Sets a variable after some processing from other activities.",
          "type": "SetVariable",
          "dependsOn": [
            {
              "activity": "ProcessingActivity",
              "dependencyConditions": [
                "Succeeded"
              ]
            }
          ],
          "userProperties": [],
          "typeProperties": {
            "variableName": "MyVariable",
            "value": {
              "value": "@activity('ProcessingActivity').output",
              "type": "Expression"
            }
          }
        }
      ],
      "parameters": {},
      "variables": {
        "MyVariable": {
          "type": "String"
        }
      }
    }
  }
]
Hello Heather and thank you for your inquiry. Custom outputs are not an inbuilt feature at this time. You can request/upvote the feature in the Azure feedback forum. For now, I do have two workarounds.
Utilizing the invoked pipeline's runID, we can query the REST API (using Web Activity) for the activity run logs, and from there, the activity outputs. However, before making the query, it is necessary to authenticate.
REST call to get the activities of a pipeline
For authentication I recommend using the Web Activity to get an OAuth2 token. The URL would be https://login.microsoftonline.com/tenantid/oauth2/token. Headers: "Content-Type": "application/x-www-form-urlencoded", and body: "grant_type=client_credentials&client_id=xxxx&client_secret=xxxx&resource=https://management.azure.com/". Since this request is to get credentials, the Authentication setting for this request is type 'None'. These credentials correspond to an App you create via Azure Active Directory > App Registrations. Do not forget to assign the app RBAC in Data Factory Access Control (IAM).
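To make that shape concrete, here is a rough, untested sketch of the two Web activities, reusing the question's ExecuteChildPipeline activity; the activity names, placeholder values, Expression wrappers and the 2018-06-01 api-version are my assumptions rather than something given in this answer:
{
  "name": "GetToken",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    "method": "POST",
    "headers": { "Content-Type": "application/x-www-form-urlencoded" },
    "body": "grant_type=client_credentials&client_id=<app-id>&client_secret=<app-secret>&resource=https://management.azure.com/"
  }
},
{
  "name": "QueryActivityRuns",
  "type": "WebActivity",
  "dependsOn": [
    { "activity": "GetToken", "dependencyConditions": [ "Succeeded" ] },
    { "activity": "ExecuteChildPipeline", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "url": {
      "value": "https://management.azure.com/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.DataFactory/factories/<factory>/pipelineruns/@{activity('ExecuteChildPipeline').output.pipelineRunId}/queryActivityruns?api-version=2018-06-01",
      "type": "Expression"
    },
    "method": "POST",
    "headers": {
      "Authorization": {
        "value": "Bearer @{activity('GetToken').output.access_token}",
        "type": "Expression"
      }
    },
    "body": "{\"lastUpdatedAfter\": \"2018-01-01T00:00:00Z\", \"lastUpdatedBefore\": \"2030-01-01T00:00:00Z\"}"
  }
}
The output of the second activity should then contain the child pipeline's activity runs, including the Set Variable activity's output, which the parent can read with further expressions.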
Another workaround has the child pipeline write its output. It can write to a database table, or to a blob (I passed the Data Factory variable to a Logic App, which wrote to blob storage), or to something else of your choice. Since you are planning to use the child pipeline from many different parent pipelines, I would recommend passing the child pipeline a parameter that it uses to identify the output to the parent. That could mean a blob name, or writing the parent runID to a SQL table. This way the parent pipeline knows where to look to get the output.
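For that second workaround, the hand-off could be as simple as passing the parent's run ID into the child as a pipeline parameter. A small sketch against the question's ExecuteChildPipeline activity (the parameter name outputId is just a placeholder of mine):
"typeProperties": {
  "pipeline": {
    "referenceName": "InvokedPipeline",
    "type": "PipelineReference"
  },
  "waitOnCompletion": true,
  "parameters": {
    "outputId": {
      "value": "@pipeline().RunId",
      "type": "Expression"
    }
  }
}
The child then uses that value as the blob name, or writes it alongside the result into the SQL table, so the parent knows exactly where to read the output back.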
Just had a chat with the ADF team, and the response:
[10:11 PM] Mark Kromer
Brajesh Jaishwal: any plans on custom output from execute pipeline activity?
Yes, this work is on the engineering work plan
I know that I can get the host key and trigger_url of an Azure Function in an ARM template by using the listKeys/listSecrets method.
But I need the systemkey: I'm deploying an Event Grid Subscription, and it needs the Azure Function endpoint URL, which contains the system key:
"resources": [
{
"type": "Microsoft.Storage/StorageAccounts/providers/eventSubscriptions",
"name": "[concat(concat(parameters('publisherName'), '/Microsoft.EventGrid/'), parameters('name'))]",
"apiVersion": "2018-01-01",
"properties": {
"destination": {
"endpointType": "[parameters('endpointType')]",
"properties": {
"endpointUrl": "[parameters('endpointUrl')]"
}
},
"filter": {
"subjectBeginsWith": "[parameters('subjectBeginsWith')]",
"subjectEndsWith": "[parameters('subjectEndsWith')]",
"subjectIsCaseSensitive": "[parameters('subjectIsCaseSensitive')]",
"includedEventTypes": "[parameters('includedEventTypes')]"
},
"labels": "[parameters('labels')]"
}
}
]
where endpointUrl is in the form of:
https://<function-app-name>.azurewebsites.net/admin/extensions/EventGridExtensionConfig?functionName=<function-name>&code=XZvGU0ROPxxxxxxxxxxxxxxxxxxxxxxxxxxxxaaieD89gPQ==
The parameter named 'code' is the systemkey, which can be retrieved by doing a GET on
http://<function-app-name>.azurewebsites.net/admin/host/systemkeys/eventgridextensionconfig_extension?code=<master_key>
Is there a way to retrieve this systemkey (or the entire endpoint URL) in the ARM template without resorting to bash scripts that inject it, or other external systems?
The documentation does say: "However, you cannot use list operations that require values in the request body." So I don't think I'll be able to do it with a 'list' operation.
Yes, it is now possible:
"destination": {
"endpointType": "WebHook",
"properties": {
"endpointUrl": "[concat(variables('functionUrl'), listKeys(resourceId(variables('functionResourceGroupName'), 'Microsoft.Web/sites/host/', variables('functionAppName'), 'default'),'2016-08-01').systemkeys.eventgrid_extension)]"
}
},
Where functionUrl ends with &code=. Tested that on runtime ~2.
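For completeness, the functionUrl variable it concatenates onto could be defined roughly like this; the names and the /runtime/webhooks/eventgrid path are my assumptions for a runtime ~2 function app, so verify them against your own app:
"variables": {
  "functionAppName": "my-function-app",
  "functionResourceGroupName": "my-function-rg",
  "functionUrl": "[concat('https://', variables('functionAppName'), '.azurewebsites.net/runtime/webhooks/eventgrid?functionName=MyEventGridFunction&code=')]"
}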
This is not possible right now. You can return only function keys using the ARM template.
The same is described here:
https://blog.mexia.com.au/list-of-access-keys-from-output-values-after-arm-template-deployment#functions
I have a For_Each loop in an Azure Logic App that calls another, nested, Logic App. The result from each iteration of the nested Logic Apps is a JSON object that contains an array of strings, like this:
{
  "Results": ["string a", "string b"]
}
So the output from my For_Each loop in the parent Logic App looks like this:
[
  {"Results": ["string a", "string b"]},
  {"Results": ["string c", "string d"]}
]
I want to put all these strings into a single flat list that I can pass to another action.
How can I do this? Is it possible using the workflow definition language and built-in functions, or do I need to use an external function (in a service, or an Azure Function)?
There's a simpler solution, working with Array Variables.
At the top level, outside the For Each loop, declare a variable with an InitializeVariable action:
"Initialize_Items_variable": {
"inputs": {
"variables": [
{
"name": "Items",
"type": "Array",
"value": []
}
]
},
"runAfter": {},
"type": "InitializeVariable"
}
Inside the For Each, use an AppendToArrayVariable action. You can append the Response object of the Nested Logic App you just called.
"Append_to_Items_variable": {
"inputs": {
"name": "Items",
"value": "#body('Nested_Logic_App_Response')"
},
"runAfter": {
},
"type": "AppendToArrayVariable"
}
Hope it helps.
Picking up on @DerekLi's useful comment above, it seems this is not possible at the time of writing with Logic Apps schema version 2016-06-01.
One of the great strengths of Logic Apps is the ability to leverage the power of Azure Functions to solve problems like this that can't (yet) be solved in the schema language.
Re-writing the array is trivial in C# within a function:
// C# script (.csx) HTTP-triggered function that flattens the Results arrays into one list
using System.Net;
using System.Net.Http;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public class Result
{
    public List<string> Results { get; set; }
}

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Deserialize the incoming array of { "Results": [...] } objects
    var inputs = await req.Content.ReadAsAsync<List<Result>>();

    // Collect every non-empty string from each Results array into one flat list
    var outputs = new List<string>();
    foreach (var item in inputs)
    {
        log.Info(string.Join(", ", item.Results));
        outputs.AddRange(item.Results.Where(x => !string.IsNullOrEmpty(x)));
    }

    return req.CreateResponse(HttpStatusCode.OK, outputs);
}
And this function can then be passed the result of the For_Each loop:
"MyFunction": {
"inputs": {
"body": "#body('Parse_JSON')",
"function": {
"id": "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Web/sites/{function-app-name}/functions/{function-name}"
},
"method": "POST"
},
"runAfter": {
"For_each": [
"Succeeded"
]
},
"type": "Function"
}
There is also a way to do it using the workflow definition language. (https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-workflow-definition-language).
Using the functions string and replace, you can work on your JSON as a string rather than on objects.
Here is a Flat_List action that follows a Parse_JSON action with your data:
Your data:
[
  {"Results": ["string a", "string b"]},
  {"Results": ["string c", "string d"]}
]
Flat_List component:
"Flat_List": {
"inputs": "#replace(replace(replace(string(body('Parse_JSON')),']},{\"Results\":[',','),'}]','}'),'[{','{')",
"runAfter": {
"Parse_JSON": [
"Succeeded"
]
},
"type": "Compose"
},
What happens here? First we use string, which takes your JSON data and gives:
[{"Results":["string a", "string b"]},{"Results":["string c", "string d"]}]
We replace all the ]},{"Results":[ by ,.
We replace all the }] by }.
We replace all the [{ by {.
We get the string {"Results":["string a","string b","string c","string d"]}
Then you are free to parse it back to JSON with:
"Parse_JSON_2": {
"inputs": {
"content": "#outputs('Flat_List')",
"schema": {
"properties": {
"Results": {
"items": {
"type": "string"
},
"type": "array"
}
},
"type": "object"
}
},
"runAfter": {
"Flat_List": [
"Succeeded"
]
},
"type": "ParseJson"
}
You can see this as a proof of concept, as the Azure Function may be easier to re-read later, but there may be many reasons not to want to instantiate a new Azure Function when you can do the job in the Logic App.
Feel free to ask for more details if needed :)
This technique works pretty well, and only uses run-of-the-mill Logic App actions:
1. start with declaring an empty array variable (action Variable: Initialise variable)
2. iterate through your items (action Control: For each), e.g. the result set from a previous action:
   - in each iteration, first compose the JSON fragment you need (action Data Operations: Compose)
   - then append the output of your Compose action to the array (action Variable: Append to array variable)
3. then, outside the loop, join the elements of the array (action Data Operations: Join); see the sketch below
4. do what you need with the output of the Join action, e.g. send it as the response payload (action Request: Response)
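A rough sketch of the Join step, assuming the array variable is called Items (as in the earlier answer) and the loop is named For_each:
"Join_Items": {
  "inputs": {
    "from": "@variables('Items')",
    "joinWith": ","
  },
  "runAfter": {
    "For_each": [
      "Succeeded"
    ]
  },
  "type": "Join"
}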
You can use @body('nestedLogicApp') outside of the for-each loop to access all the nested Logic App responses in an array.