Dynamic email attachment using Logic Apps via Data Factory - Azure

I need to build a generic Logic App that I can use to send mail with an attachment.
Is it possible to pass the path and file name as parameters, so I can use the same Logic App for different ADF pipelines?

Yes, you can do this with a single generic Logic App. You just need to set up the "When a HTTP request is received" trigger with two parameters, which you can do by specifying the trigger's request schema (shown below).
schema:
{
  "type": "object",
  "properties": {
    "path": {
      "type": "string"
    },
    "fileName": {
      "type": "string"
    }
  }
}
In the subsequent actions of your Logic App, you can use the path and fileName parameters when you get the file from Azure Data Lake.
You can then call the Logic App from Azure Data Factory with a "Web" activity; a sketch of the request body the Web activity could send is shown below.
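As a minimal sketch (the values are placeholders, not from the original question), the Web activity would POST a JSON body whose property names match the trigger schema above, with the Content-Type header set to application/json:
{
  "path": "adls-container/input-folder",
  "fileName": "invoice.pdf"
}
In later Logic App actions these values can be referenced as triggerBody()?['path'] and triggerBody()?['fileName'].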

Related

Is it possible to execute Event Hub triggered Azure Functions locally without using an actual Event Hub?

I just want to ask if it is possible to execute an Azure Function (Event Hub trigger) purely on my local machine, without depending on any Azure Event Hubs resource?
I've been following Microsoft's process for developing Azure Functions locally (Link), but it seems that I need to fill in an Event Hub connection string and name.
public static async Task Run([EventHubTrigger("samples-workitems", Connection = "eventhub-connectionstring")] EventData[] events, ILogger log)
Is there any way for this to be possible?
Each function has an HTTP admin endpoint for local testing and debugging:
http://localhost:7071/admin/functions/{FUNCNAME}
That's available for QueueTrigger, ServiceBusTrigger, TimerTrigger, EventHubTrigger at least.
Send a POST request with the expected data as JSON.
{ "input": "YOUR JSON SERIALIZED AND ESCAPED DATA" }
For triggers that need data, put the data as a serialized string into "input". See the EventHubTrigger example further below.
TimerTrigger
For TimerTrigger use this:
{ "input": null }
EventGridTrigger
Executing some triggers is a bit trickier. Here it is for the EventGridTrigger:
http://localhost:7071/runtime/webhooks/EventGrid?functionName={FUNCNAME}
Send a POST request to execute. See here for details. The body must be an array of events.
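As a sketch (assuming the Event Grid event schema; the id, eventType, subject and data values are placeholders), the POST body could look like this, sent with the header aeg-event-type: Notification:
[
  {
    "id": "0001",
    "eventType": "MyApp.Sample.Event",
    "subject": "sample/subject",
    "eventTime": "2021-01-01T00:00:00Z",
    "dataVersion": "1.0",
    "data": {
      "key": "value"
    }
  }
]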
EventHubTrigger
The EventHubTrigger receives data like the other triggers, as a single JSON object. The structure follows the EventData class, but the only required field is "SystemProperties". There seem to be no serializer-specific settings; property names do not change case, etc.
Post this as the body:
{
  "input": "{\"SystemProperties\":{},\"SomeId\":\"123\",\"Status\":\"Available\"}"
}
The event hub message body is the serialized and escaped value of "input".
Note that the same applies to IoT Hub; a complete local invocation is sketched below.
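Putting it together, a complete local invocation could look like the following raw HTTP request (the function name StateChange is taken from the metadata example below; adjust it to your own function):
POST http://localhost:7071/admin/functions/StateChange
Content-Type: application/json

{
  "input": "{\"SystemProperties\":{},\"SomeId\":\"123\",\"Status\":\"Available\"}"
}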
Metadata
For ALL triggers you can get metadata with a GET request. For an EventHubTrigger the response could look like this:
{
  "name": "StateChange",
  "script_root_path_href": "http://localhost:7071/admin/vfs/StateChange/",
  "script_href": "http://localhost:7071/admin/vfs/bin/MyApp.Notifications.Functions.dll",
  "config_href": "http://localhost:7071/admin/vfs/StateChange/function.json",
  "test_data_href": null,
  "href": "http://localhost:7071/admin/functions/StateChange",
  "invoke_url_template": null,
  "language": "DotNetAssembly",
  "config": {
    "generatedBy": "Microsoft.NET.Sdk.Functions-3.0.11",
    "configurationSource": "attributes",
    "bindings": [
      {
        "type": "eventHubTrigger",
        "consumerGroup": "regular",
        "connection": "EventHub_Hub_Status",
        "eventHubName": "%EventHub_Status%",
        "name": "statusUpdateMessage"
      }
    ],
    "disabled": false,
    "scriptFile": "../bin/MyApp.Notifications.Functions.dll",
    "entryPoint": "MyApp.Notifications.Functions.StateChange.Run"
  },
  "files": null,
  "test_data": null,
  "isDisabled": false,
  "isDirect": true,
  "isProxy": false
}
And, of course, you can use all the paths to retrieve the data, including the binaries. Very handy for writing sophisticated integration tests.
No, it is not possible.
This is because there is no official simulation tool.
For an HTTP trigger, you can use Postman or just use some code to hit the endpoint.
For Azure Storage triggers, you can use the local storage emulator together with Azure Storage Explorer.
But triggers such as Event Hubs, Service Bus, and so on cannot be fired without creating an Azure resource.

Can we run rest calls during ARM template deployment like Azure does for App Service Domain?

Can we create an ARM template UI definition that mimics this behavior of making REST calls when a text box loses focus during ARM template deployment?
Like below
There is a dedicated widget for calling Azure 'ARM' APIs.
https://learn.microsoft.com/en-us/azure/azure-resource-manager/managed-applications/microsoft-solutions-armapicontrol
The following example shows the schema for this control:
{
  "name": "testApi",
  "type": "Microsoft.Solutions.ArmApiControl",
  "request": {
    "method": "{HTTP-method}",
    "path": "{path-for-the-URL}",
    "body": {
      "key1": "val1",
      "key2": "val2"
    }
  }
}
But there is no way to just call generic, non-ARM REST endpoints.
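As a rough sketch of how the control can be used (the control name, step name and api-version below are illustrative assumptions, not from the original answer), a GET call against an ARM endpoint could look like this:
{
  "name": "resourceGroupsApi",
  "type": "Microsoft.Solutions.ArmApiControl",
  "request": {
    "method": "GET",
    "path": "[concat(subscription().id, '/resourceGroups?api-version=2021-04-01')]"
  }
}
Other controls in the same UI definition can then consume the response, for example with an expression such as [steps('step1').resourceGroupsApi.value], depending on where the control is placed.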

Validated Open API spec fails to upload on Azure API Management

I am trying to import an OpenAPI spec JSON, largely similar to https://github.com/ccouzens/keycloak-openapi/blob/master/keycloak/9.0.json, into my Azure API Management service.
I validated my version of the openapi.json on http://editor.swagger.io/.
When I try to create API resources using the above JSON, I get the error:
{"error":{"code":"ValidationError","message":"One or more fields contain incorrect values:","details":[{"code":"ValidationError","target":"templateParameters","message":"All template parameters used in the UriTemplate must be defined in the Operation, and vice-versa."}]}}
Please help.
Found the issue. The JSON API spec file has multiple API URLs with {id} occurring twice in the same URL, which Azure API Management doesn't allow, and there is no separate "parameter" definition for the two path parameters.
For example, see the URL below and the corresponding parameter definition in the API spec file:
/{realm}/client-scopes/{id}/protocol-mappers/models/{id}
"parameters": [
{
"in": "path",
"name": "realm",
"description": "realm name (not id!)",
"required": true,
"schema": {
"type": "string"
},
"style": "simple"
},
{
"in": "path",
"name": "id",
"description": "Mapper id",
"required": true,
"schema": {
"type": "string"
},
"style": "simple"
}
]
The Swagger editor does not flag these constraint violations. A sketch of a possible fix is shown below.
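As a hedged sketch (the names clientScopeId and mapperId are illustrative renames, not from the original spec), the fix is to give each occurrence a unique parameter name and define every path parameter in the spec:
/{realm}/client-scopes/{clientScopeId}/protocol-mappers/models/{mapperId}
"parameters": [
  {
    "in": "path",
    "name": "realm",
    "required": true,
    "schema": { "type": "string" }
  },
  {
    "in": "path",
    "name": "clientScopeId",
    "required": true,
    "schema": { "type": "string" }
  },
  {
    "in": "path",
    "name": "mapperId",
    "required": true,
    "schema": { "type": "string" }
  }
]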
So, to sum up, for uploading an OpenAPI spec to the Azure API Management service, you need to observe the following constraints in addition to the ones documented by Azure:
You cannot have two path parameters with the same identification string.
All path parameters must have a "parameter" definition in the spec JSON.
PS: My API spec JSON was largely similar to the linked JSON file, but not the same. It had other issues as well, such as a DELETE API with a request body.

Does the linked service support dynamic json in azure data factory?

Currently, I'm trying to set up a dynamic Key Vault linked service.
Unfortunately, whatever I try, I'm not able to successfully test the connection.
{
  "name": "AzureKeyVault1",
  "properties": {
    "type": "AzureKeyVault",
    "typeProperties": {
      "baseUrl": {
        "value": "@concat('https://vault.azure.net','/')",
        "type": "Expression"
      }
    }
  }
}
The above code using concat is not a real use case, but just a way to test whether dynamic JSON is possible for a linked service.
I was expecting (based on the documentation) that I could make the baseUrl property dynamic. Am I using the right formatting?
I get the following error:
Error: Error: Can't get property concat
Wim, based on the official documentation, parameterizing linked services only supports the services listed below, which do not include Azure Key Vault.
You could submit feedback here to push for the Azure Data Factory improvements you want.
@Wim
To answer your question, this is the correct formatting.
I have been able to parameterize a pipeline parameter and then pass it as the baseUrl in a dynamic expression; a sketch is shown below.
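As a rough sketch of what a parameterized Key Vault linked service can look like (the parameter name keyVaultName is an illustrative assumption, and whether a given linked service type accepts parameters depends on your Data Factory version):
{
  "name": "AzureKeyVault1",
  "properties": {
    "type": "AzureKeyVault",
    "parameters": {
      "keyVaultName": {
        "type": "String"
      }
    },
    "typeProperties": {
      "baseUrl": "https://@{linkedService().keyVaultName}.vault.azure.net/"
    }
  }
}
The value for keyVaultName can then be supplied at runtime, for example from a pipeline parameter passed through a dataset.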

Sending parameter from web activity in Data Factory to logic apps

I can successfully trigger a Logic App from my pipeline in ADFv2 via a Web activity. But now I would also like to send some user-defined parameters to the Logic App.
My questions are now:
How can I send parameters from the Web activity to the Logic App?
How can I extract these parameters in the Logic App?
On the Azure Data Factory v2 side:
Click on the Web activity and go to the Settings tab of the activity.
See this image for how to fill in the fields in the Settings tab.
You have already figured out what goes into the URL and Method fields in the Settings tab, as you have successfully triggered the Logic App.
Let's suppose we want to send the parameters as JSON (the preferred way). Set the 'NAME' Headers field to 'Content-Type' and 'VALUE' to 'application/json'.
In the body, send your parameters in the form of JSON. Let's send the following dummy parameters:
{"Location":"northeurope","Model":"dummy_model","Server_name":"dummy_service","Onwer_email":"dummy@dummy.com"}
On the Logic App side:
You have already used 'When a HTTP request is received' trigger for logic app.
In the 'Request Body JSON Schema' field, enter the following schema to catch the parameters sent from the ADFv2 Web activity:
{
  "properties": {
    "Location": {
      "type": "string"
    },
    "Model": {
      "type": "string"
    },
    "Onwer_email": {
      "type": "string"
    },
    "Server_name": {
      "type": "string"
    }
  },
  "type": "object"
}
See this image for help
You can also use 'Use sample payload to generate schema' instead of doing step 2 above. When using this option, simply paste the JSON that you passed in the body of the ADFv2 Web activity; it will automatically generate the JSON schema to catch the parameters.
Set the 'Method' field to the same method that you selected in the ADFv2 Web activity 'Method' field.
In subsequent steps in the Logic App (for example an Initialize variable step), you can now use the parameters set above (Location, Model, Onwer_email and Server_name) as dynamic content using the 'Add dynamic content' option, or reference them with an expression as sketched below. See this image for help.
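If you prefer expressions over the dynamic content picker, the same values are available from the trigger body. A minimal sketch, using the property names from the schema above:
triggerBody()?['Location']
triggerBody()?['Onwer_email']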
