I'm almost sure that the last time I was playing with ARM templates there was some trick to keep the code tidy and use concat() across multiple lines. Yes, I know I could use
"myvar": "[concat(
'abc',
'def'
)]"
if I'm deploying with the CLI or PowerShell, but I won't control the tools of the template's users, so they'll very likely just copy-paste it into the web console.
Any ideas? Besides having a custom function to join multi-line text encoded as an array:
"variables": {
"splitStr": [
"first line",
"second line",
"third line"
],
"output": "[custom.join(variables('splitStr'))]"
},
"functions": [
{
"namespace": "custom",
"members": {
"join": {
"parameters": [
{
"type": "array",
"name": "splitStr"
}
],
"output": {
"type": "string",
"value": "[replace(replace(replace(string(parameters('splitStr')), '[\"', ''), '\"]', ''), '\",\"', '\\n')]"
}
}
}
}
],
We're talking about working sample code that pushes a hundred-ish lines of configuration text into a deployed VM, so text readability is important.
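For what it's worth, here is a minimal sketch of how the joined variable could be consumed, assuming the configuration text goes into the VM via customData; the vmName, adminUsername and adminPassword parameters are placeholders, not part of the original template:
"osProfile": {
  "computerName": "[parameters('vmName')]",
  "adminUsername": "[parameters('adminUsername')]",
  "adminPassword": "[parameters('adminPassword')]",
  "customData": "[base64(variables('output'))]"
}
base64() is a built-in template function, so the joined string survives a plain copy-paste into the web console without any CLI or PowerShell tricks.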
I'm trying to add a custom skill to the skillset and map it in the index. Here it is in detail.
I'm using the Azure Named Entity Recognition skill in my skillset, as
{
"#odata.type": "#Microsoft.Skills.Text.MergeSkill",
"description": "Merge text content with image tags",
"insertPreTag": " ",
"context": "/document",
"inputs": [
{
"name": "text",
"source": "/document/fullTextAndCaptions"
},
{
"name": "itemsToInsert",
"source": "/document/normalized_images/*/Tags/*/name"
}
],
"outputs": [
{
"name": "mergedText",
"targetName": "finalText"
}
]
}
and in the indexer as
{
"sourceFieldName": "/document/finalText/pages/*/entities/*/value",
"targetFieldName": "entities"
},
{
"sourceFieldName": "/document/finalText/pages/*/locations/*",
"targetFieldName": "locations"
},
and it works 100%. Now I want to add the Distinct custom skill from https://github.com/Azure-Samples/azure-search-power-skills/tree/master/Text/Distinct
I did publish the function, and when I test it manually it works as expected.
However, it's not working in the skillset overall. I want it to take the locations, filter them, and output only the distinct values in their own field in the search index.
I'm having a really hard time configuring the skillset and indexer to get this to work.
Any help please?
You'll need to add the distinct custom skill like this, assuming you want to dedupe over the whole document:
{
"skills": [
...
{
"#odata.type": "#Microsoft.Skills.Custom.WebApiSkill",
"description": "Distinct skill",
"uri": "<https://distinct-skill>",
"context": "/document",
"inputs": [
{
"name": "locations",
"source": /document/finalText/pages/*/locations/*"
}
],
"outputs": [
{
"name": "distinct",
"targetName": "distinctLocations"
}
]
}
...
]
}
and an output field mapping to put it into the index.
{
"sourceFieldName": "/document/distinctLocations",
"targetFieldName": "distinctLocations"
}
See https://learn.microsoft.com/en-us/azure/search/cognitive-search-custom-skill-interface#consuming-custom-skills-from-skillset for adding a custom skill.
The skill inputs for the custom skill must be configured to point at the data you want to deduplicate. In this case you don't really need to modify the code; all you have to do is give the skill an input with the name 'words' and the source '/document/finalText/pages/*/locations/*'.
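For reference, this is roughly the payload shape a WebApiSkill exchanges with the function, using the 'words' input and 'distinct' output names mentioned above; the recordId and the sample values are purely illustrative. Request sent to the function:
{
  "values": [
    {
      "recordId": "1",
      "data": {
        "words": [ "Seattle", "Seattle", "Portland" ]
      }
    }
  ]
}
Expected response from the function:
{
  "values": [
    {
      "recordId": "1",
      "data": {
        "distinct": [ "Seattle", "Portland" ]
      },
      "errors": null,
      "warnings": null
    }
  ]
}
If the input name in the skillset doesn't match the name the function code reads, the skill will still run, but the data field it looks for will be empty.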
I want to display an Azure GUI element in a solution offer only if the deployment is being done in a certain region. I know there is a "visible" field for CreateUiDefinition elements, but can I use the location() function with it?
I tried the below; it doesn't seem to work, though. What am I missing?
{
"name": "MyDropdown",
"type": "Microsoft.Common.DropDown",
"label": "Only show in EastUS",
"defaultValue": "blah",
"toolTip": "select from below",
"constraints": {
"allowedValues": [
{
"label": "yes",
"value": "yes"
},
{
"label": "no",
"value": "no"
}
]
},
"visible": "[ equals(location(), 'eastus') ]"
}
It looks like the above example doesn't work in CreateUiDefinition's basics section, though it does work in the steps section.
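Given that, a minimal sketch of moving the element into a step, where the location() comparison does evaluate; the step name and label here are made up:
"steps": [
  {
    "name": "regionStep",
    "label": "Region-specific options",
    "elements": [
      {
        "name": "MyDropdown",
        "type": "Microsoft.Common.DropDown",
        "label": "Only show in EastUS",
        "toolTip": "select from below",
        "constraints": {
          "allowedValues": [
            { "label": "yes", "value": "yes" },
            { "label": "no", "value": "no" }
          ]
        },
        "visible": "[equals(location(), 'eastus')]"
      }
    ]
  }
]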
For context, I currently have a Data Factory v2 pipeline with a ForEach Activity that calls a Copy Activity. The Copy Activity simply copies data from an FTP server to a blob storage container.
Here is the pipeline JSON file:
{
"name": "pipeline1",
"properties": {
"activities": [
{
"name": "ForEach1",
"type": "ForEach",
"typeProperties": {
"items": {
"value": "#pipeline().parameters.InputParams",
"type": "Expression"
},
"isSequential": true,
"activities": [
{
"name": "Copy1",
"type": "Copy",
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false
},
"typeProperties": {
"source": {
"type": "FileSystemSource",
"recursive": true
},
"sink": {
"type": "BlobSink"
},
"enableStaging": false,
"cloudDataMovementUnits": 0
},
"inputs": [
{
"referenceName": "FtpDataset",
"type": "DatasetReference",
"parameters": {
"FtpFileName": "#item().FtpFileName",
"FtpFolderPath": "#item().FtpFolderPath"
}
}
],
"outputs": [
{
"referenceName": "BlobDataset",
"type": "DatasetReference",
"parameters": {
"BlobFileName": "#item().BlobFileName",
"BlobFolderPath": "#item().BlobFolderPath"
}
}
]
}
]
}
}
],
"parameters": {
"InputParams": {
"type": "Array",
"defaultValue": [
{
"FtpFolderPath": "/Folder1/",
"FtpFileName": "#concat('File_',formatDateTime(utcnow(), 'yyyyMMdd'), '.txt')",
"BlobFolderPath": "blobfolderpath",
"BlobFileName": "blobfile1"
},
{
"FtpFolderPath": "/Folder2/",
"FtpFileName": "#concat('File_',formatDateTime(utcnow(), 'yyyyMMdd'), '.txt')",
"BlobFolderPath": "blobfolderpath",
"BlobFileName": "blobfile2"
}
]
}
}
},
"type": "Microsoft.DataFactory/factories/pipelines"
}
The issue I am having is that when specifying pipeline parameters, it seems I cannot use system variables and functions the same way I can when, for example, specifying folder paths for a blob storage dataset.
The consequence of this is that formatDateTime(utcnow(), 'yyyyMMdd') is not interpreted as a function call but rather as the literal string formatDateTime(utcnow(), 'yyyyMMdd').
To counter this, I am guessing I should be using a trigger to execute my pipeline and pass the trigger's execution time as a parameter to the pipeline, like trigger().startTime. But is this the only way? Am I simply doing something wrong in my pipeline's JSON?
This should work:
File_#{formatDateTime(utcnow(), 'yyyyMMdd')}
Or complex paths as well:
rootfolder/subfolder/#{formatDateTime(utcnow(),'yyyy')}/#{formatDateTime(utcnow(),'MM')}/#{formatDateTime(utcnow(),'dd')}/#{formatDateTime(utcnow(),'HH')}
You can't put a dynamic expression in the default value. You should define the expression either when you create a trigger, or on the dataset parameters in the copy activity's source/sink.
So either you create a dataset property FtpFileName with some default value in the source dataset, and then specify the dynamic expression in the copy activity's source section,
or you define a pipeline parameter and add the dynamic expression to that pipeline parameter when you define a trigger. Hope this is a clear answer to you. :)
Default values of parameters cannot be expressions; they must be literal strings.
You could use a trigger to achieve this, or you could extract the common part of your expressions and just put literal values into the ForEach items.
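If you go the trigger route, here is a sketch of what the trigger definition could look like; the trigger name, recurrence, and the RunDate parameter are hypothetical, while @trigger().scheduledTime is evaluated each time the trigger fires:
{
  "name": "DailyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2018-06-01T06:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "pipeline1",
          "type": "PipelineReference"
        },
        "parameters": {
          "RunDate": "@trigger().scheduledTime"
        }
      }
    ]
  }
}
The pipeline can then derive the file name from RunDate with an expression that is evaluated at run time, instead of relying on the parameter's defaultValue.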
I am currently working on building a few templates for some of the basic deployments that I use daily, and I have them working except for the last piece. The issue is that I am using t-shirt sizing to choose a specific VHD as I deploy, and I cannot seem to get the syntax correct for the URI string.
I know the syntax to call the variable on its own, but when I add it into the URI string, it fails.
[variables('vhd')[parameters('version')]]
Parameter:
"version": {
"type": "string",
"defaultValue": "10.5",
"allowedValues": [
"10.3.1",
"10.4.1",
"10.5"
]
},
Variable:
"vhd": {
"10.3.1": "20170524144905.vhd",
"10.4.1": "20170524144656.vhd",
"10.5": "20170524133408.vhd"
},
String:
"vhd": {
"uri": "[concat(concat(reference(resourceId(parameters('virtualMachineName'), 'Microsoft.Storage/storageAccounts', variables('storageAccountName')), '2015-06-15').primaryEndpoints['blob'], 'vhds/'), parameters('virtualMachineName'), '20170524133408.vhd')]"
},
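As a sketch, the lookup could be folded directly into the URI string, reusing the reference() call from the question and assuming the parameter and variable are defined exactly as above:
"vhd": {
  "uri": "[concat(reference(resourceId(parameters('virtualMachineName'), 'Microsoft.Storage/storageAccounts', variables('storageAccountName')), '2015-06-15').primaryEndpoints['blob'], 'vhds/', parameters('virtualMachineName'), variables('vhd')[parameters('version')])]"
},
Here variables('vhd')[parameters('version')] resolves to one of the three .vhd file names, so the blob name changes with the selected version instead of being hard-coded.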
We have an Azure ARM template that adds appSettings for a Microsoft.Web/sites resource.
"resources": [
{
"apiVersion": "2016-03-01",
"name": "myazurefunction",
"type": "Microsoft.Web/sites",
"properties": {
"name": "myazurefunction",
"siteConfig": {
"appSettings": [
{
"name": "MY_SERVICE_URL",
"value": "[concat('https://myservice-', parameters('env'), '.domain.ca')]"
}
]
}
}
}
]
We also have four parameters.environment.json files. For instance, this is the content of parameters.dev.json.
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01...",
"contentVersion": "1.0.0.0",
"parameters": {
"env": {
"value": "dev"
}
}
}
The template and its parameters favor convention over configuration. This is working nicely for the most part, and leads to the following MY_SERVICE_URL values.
https://myservice-dev.domain.ca
https://myservice-qa.domain.ca
https://myservice-ci.domain.ca
https://myservice-prod.domain.ca
The problem is that we want to break the convention for the dev environment. That is, we want it to have a MY_SERVICE_URL that looks something like this:
https://abc123.foo.bar.baz.ca
How can we configure the ARM template to break the convention for only one environment?
My first thought is to use a conditional like this, but such an ARM function appears not to be available:
"name": "MY_SERVICE_URL",
"value": "[parameters('env') -eq 'dev'
? 'https://abc123.foo.bar.baz.ca'
: concat('https://myservice-', parameters('env'), '.domain.ca')]"
Just create a variable that depends on the parameter:
"parameters": {
...
"DeploymentType": {
"type": "string",
"allowedValues": [
"Dev",
"Prod"
]
}
...
"variables": {
"Dev": "https://some_service-ci.domain.com",
"Prod": "https://abc123.foo.bar.baz.com",
"DeploymentVariable": "[variables(parameters('DeploymentType'))]",
...
"appSettings": [
"name": "MY_SERVICE_URL",
"value": "[variables('DeploymentVariable')]"
]
...
OK, so how does this work? You pass in the parameter 'DeploymentType'; it can be Prod or Dev. If you pass Dev, then "DeploymentVariable": "[variables(parameters('DeploymentType'))]" evaluates to "[variables('Dev')]" and picks up the value of "Dev": "https://some_service-ci.domain.com".
For the example in the question, the answer ended up looking like this:
"variables": {
"myServiceUrl_default": "[concat('https://myservice-', parameters('env'), '.domain.ca')]",
"myServiceUrl_dev": "https://abc123.foo.bar.baz.ca",
"myServiceUrl_ci": "[variables('myServiceUrl_default')]",
"myServiceUrl_qa": "[variables('myServiceUrl_default')]",
"myServiceUrl_prod": "[variables('myServiceUrl_default')]",
"myServiceUrl": "[variables(concat('myServiceUrl_', 'parameters('env')'))]"
},
...
"appSettings: [
{
"name": "MY_SERVICE_URL",
"value": "[variables('myServiceUrl')]"
}
]