Using parameters to locate files during trigger creation in Azure Data Factory

I am trying to create a trigger that I will use for starting a pipeline in ADF:
The folder I want to set my trigger on can have different paths:
202001/Test/TriggerFolder
202002/Test/TriggerFolder
202003/Test/TriggerFolder
etc..
Therefore, in the Blob path begins with field I would like to use a parameter (which I will set somewhere else through another pipeline) that tells the trigger where to look, instead of a static file name.
Unfortunately it doesn't seem to let me add dynamic content the way I can in, for example, a Dataset.
If this really isn't possible, perhaps because a trigger is something instantiated only once, is there a way to create a trigger as a step during a pipeline?
Thank you!

It is possible to pass the value as a parameter in the ARM template of the Azure Data Factory. During deployment of the pipelines, this parameter can be supplied with the necessary value. Below is example code for it.
Sample Code:
{
    "name": "[concat(parameters('factoryName'), '/trigger1')]",
    "type": "Microsoft.DataFactory/factories/triggers",
    "apiVersion": "2018-06-01",
    "properties": {
        "annotations": [],
        "runtimeState": "Stopped",
        "pipelines": [],
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "[parameters('trigger1_properties_typeProperties_blobPathBeginsWith')]",
            "ignoreEmptyBlobs": true,
            "scope": "[parameters('trigger1_properties_typeProperties_scope')]",
            "events": [
                "Microsoft.Storage.BlobCreated"
            ]
        }
    }
}
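For context, the matching parameter declaration lives in the template's parameters section and can be overridden per environment at deployment time. A minimal sketch, assuming a container named mycontainer (the default value is only illustrative):
"parameters": {
    "trigger1_properties_typeProperties_blobPathBeginsWith": {
        "type": "string",
        "defaultValue": "/mycontainer/blobs/202001/Test/TriggerFolder/"
    }
}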

Related

ARM Template - How to pass a dynamic array to a property?

I'm trying to create a storage account with several virtualNetworkRules, but it has to be dynamic: an array of X virtual networks will be in the parameters section of the ARM template, and I need this property to iterate over it.
The parameter will be something like ["subnetidstring1", "subnetidstring2"] and so on.
Property that needs to be populated:
"virtualNetworkRules": [
{
"action": "Allow",
"id": "string",
"state": "string"
}
I tried with copy but I can't get it to work. Maybe I'm doing something wrong. I would really appreciate your help.
Thanks!
I tried this:
https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/copy-properties
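For reference, a minimal sketch of the property copy approach from that article, assuming a hypothetical array parameter named subnetIds and placed inside the storage account's networkAcls property:
"networkAcls": {
    "defaultAction": "Deny",
    "copy": [
        {
            "name": "virtualNetworkRules",
            "count": "[length(parameters('subnetIds'))]",
            "input": {
                "id": "[parameters('subnetIds')[copyIndex('virtualNetworkRules')]]",
                "action": "Allow"
            }
        }
    ]
}
The copy element is named after the array property it generates, so copyIndex('virtualNetworkRules') returns the current iteration and one rule is emitted per subnet ID in the parameter.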

Azure Data Factory not interpreting an array global parameter correctly

We have an Azure Data Factory using global parameters. It's working fine on our Dev environment, but when we try to deploy it to the QA environment using an Azure DevOps pipeline, it seems it's not handling the only global parameter with type = array, even though all of the other parameters are fine.
This is the guide we're using to build the CI/CD pipelines.
We have something similar to this in the Global Parameters JSON file:
{
    "FilesToProcess": {
        "type": "array",
        "value": [
            "VALUE01",
            "VALUE02",
            "VALUE03",
            "VALUE04",
            "VALUE05",
            "VALUE06",
            "VALUE07",
            "VALUE08",
            "VALUE09",
            "VALUE10",
            "VALUE11",
            "VALUE12",
            "VALUE13",
            "VALUE14",
            "VALUE15",
            "VALUE16",
            "VALUE17",
            "VALUE18",
            "VALUE19",
            "VALUE20",
            "VALUE21",
            "VALUE22",
            "VALUE23",
            "VALUE24",
            "VALUE25",
            "VALUE26",
            "VALUE27"
        ]
    },
    "EmailLogicAppUrl": {
        "type": "string",
        "value": "URL"
    }
}
All of the parameters are deployed fine except for the array one, which is failing.
We have debugged the PowerShell script that updates the global parameters, and it seems to handle the array correctly, so it has to be something else.
Any help will be highly appreciated.
Thanks!

Is it possible to create a blob-triggered Azure function with a file name pattern?

I am developing a blob-triggered Azure function. Following is the configuration of my "function.json" file:
{
    "disabled": false,
    "bindings": [
        {
            "name": "myBlob",
            "type": "blobTrigger",
            "direction": "in",
            "path": "input/{name}",
            "connection": "BlobConnectionString"
        }
    ]
}
My function is working fine. It is triggered for all files in the "input" container. Now I want to filter files by their naming pattern. For example: I want to trigger my Azure function only for files that contain "~123~" in their name.
Is it possible to do with some change in "path" property of "function.json" file?
If yes, then what should be the value of the "path" property?
If not, please let me know if there is any other workaround possible.
Thanks,
input/{prefix}~123~{suffix} should work. In the function method signature, use prefix and suffix instead of name to reconstruct the blob name if needed.
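Applied to the function.json above, the binding could look roughly like this; {prefix} and {suffix} then become bindable parameters in the function signature:
{
    "disabled": false,
    "bindings": [
        {
            "name": "myBlob",
            "type": "blobTrigger",
            "direction": "in",
            "path": "input/{prefix}~123~{suffix}",
            "connection": "BlobConnectionString"
        }
    ]
}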

Scheduling an httpTrigger function

I'm trying to set the schedule field in function.json on my "httpTrigger" type function, but the timer functionality doesn't seem to run. My goal is to have a function that can be both scheduled and manually started, if needed, without having to add another function just for scheduling.
{
    "disabled": false,
    "bindings": [
        {
            "authLevel": "function",
            "type": "httpTrigger",
            "direction": "in",
            "name": "req",
            "methods": [
                "get"
            ],
            "schedule": "0 0 * * * *"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ]
}
The only trigger type that you can schedule is timerTrigger.
So, if you need a piece of code to both run on schedule and be runnable from a nice URL, you will have to create two Functions with two trigger types.
If you don't need a nice HTTP URL, you can call a timer Function manually; see this answer.
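For reference, the schedule property only takes effect on a timerTrigger binding; a minimal function.json for that could look roughly like this (the binding name myTimer is just an example):
{
    "disabled": false,
    "bindings": [
        {
            "name": "myTimer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "0 0 * * * *"
        }
    ]
}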
My goal is to have a function that can be both scheduled and manually started, if needed, without having to add another function just for scheduling.
As other members mentioned, HttpTrigger does not support a schedule. For your requirement, you could create another function or invoke your function programmatically, as Mikhail answered.
Per my understanding, you could also leverage Azure Scheduler as a simpler approach to scheduling your Azure Function:
GET https://{your-function-name}.azurewebsites.net/api/HttpTriggerCSharp1?code={your-function-key}
Or
GET https://{your-function-name}.azurewebsites.net/api/HttpTriggerCSharp1
Header x-functions-key:{your-function-key}

Azure Data Factory specify custom output filename when copying to Blob Storage

I'm currently using ADF to copy files from an SFTP server to Blob Storage on a scheduled basis.
The filename structure is AAAAAA_BBBBBB_CCCCCC.txt.
Is it possible to rename the file before copying to Blob Storage so that I end up with a folder-like structure like below?
AAAAAA/BBBBBB/CCCCCC.txt
Here is what worked for me:
I created 3 parameters in my Blob storage dataset, see the image below:
I specified the name of my file and added the file extension; you can put anything in the Timestamp parameter just to satisfy the ADF requirement, since a parameter can't be empty.
Next, click on the Connection tab and add the following code in the FileName box: @concat(dataset().FileName,dataset().Timestamp,dataset().FileExtension). This expression concatenates all the parameters so you end up with something like "FileName_Timestamp_FileExtension". See the image below:
Next, click on your pipeline, then select your Copy Data activity. Click on the Sink tab. Find the Timestamp parameter under Dataset properties and add this code: @pipeline().TriggerTime. See the image below:
Finally, publish your pipeline and run/debug it. If it worked for me then I am sure it will work for you as well :)
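In dataset JSON terms (rather than the UI), a hedged sketch of that setup could look like the following; the dataset name, linked service name, and folder path are placeholders, and the three parameters match the ones described above:
{
    "name": "BlobOutputDataset",
    "properties": {
        "linkedServiceName": {
            "referenceName": "Destination-BlobStorage-data",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "FileName": { "type": "String" },
            "Timestamp": { "type": "String" },
            "FileExtension": { "type": "String" }
        },
        "type": "AzureBlob",
        "typeProperties": {
            "folderPath": "output",
            "fileName": {
                "value": "@concat(dataset().FileName, dataset().Timestamp, dataset().FileExtension)",
                "type": "Expression"
            }
        }
    }
}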
With ADF V2, you could do that. First, use a Lookup activity to get all the file names from your source.
Then chain a ForEach activity to iterate over the source file names. The ForEach activity contains a Copy activity. Both the source dataset and the sink dataset of the copy activity have parameters for file name and folder path.
You could use the split and replace functions to generate the sink folder path and file name based on your source file names.
First you have to get the file names with a Get Metadata activity. You can then use them as parameters in a Copy activity and rename the files.
As mentioned in the previous answer, you can use a replace function to do this:
{
    "name": "TgtBooksBlob",
    "properties": {
        "linkedServiceName": {
            "referenceName": "Destination-BlobStorage-data",
            "type": "LinkedServiceReference"
        },
        "folder": {
            "name": "Target"
        },
        "type": "AzureBlob",
        "typeProperties": {
            "fileName": {
                "value": "@replace(item().name, '_', '\\')",
                "type": "Expression"
            },
            "folderPath": "data"
        }
    },
    "type": "Microsoft.DataFactory/factories/datasets"
}
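If you prefer the underscores to become real folder separators under the sink path (AAAAAA/BBBBBB/CCCCCC.txt), a hedged variant of the same typeProperties could split the source name instead; this assumes the ForEach item is a file name such as AAAAAA_BBBBBB_CCCCCC.txt coming from the Lookup or Get Metadata step:
"typeProperties": {
    "folderPath": {
        "value": "@concat('data/', split(item().name, '_')[0], '/', split(item().name, '_')[1])",
        "type": "Expression"
    },
    "fileName": {
        "value": "@split(item().name, '_')[2]",
        "type": "Expression"
    }
}
Note that the last segment (CCCCCC.txt) already carries the extension, so it is used as the file name unchanged.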
