I am creating an extensive Data Factory workflow that will create and fill a data warehouse for multiple customers automatically, but I'm running into an error. I'll post the questions first, since the remaining info is a bit long. Keep in mind I'm new to Data Factory and JSON.
Questions & comments
How do I correctly pass the parameter through to an Execute Pipeline activity?
How do I add said parameter to an Azure Function activity?
The issue may lie in correctly passing the parameter through, or in picking it up on the other side - I can't determine which. If you spot an error in the current setup, don't hesitate to let me know - all help is appreciated.
The Error
{
"errorCode": "BadRequest",
"message": "Operation on target FetchEntries failed: Call to provided Azure function
'' failed with status-'BadRequest' and message -
'{\"Message\":\"Please pass 'customerId' on the query string or in the request body\"}'.",
"failureType": "UserError",
"target": "ExecuteFullLoad"
}
The Setup:
The whole setup starts with a function call that gets new customers from an online economic platform. It then writes them to a SQL table, from which they are processed and loaded into the final table, after which a new pipeline is executed. This process works perfectly. From there the following pipeline is executed:
As you can see, it all works well until the ForEach loop tries to execute another pipeline, which contains an Azure function that calls a .NET scripted function that fills said warehouse (complex, I know). This Azure function needs a customerId to retrieve tokens and load the data into the warehouse. I'm trying to pass those IDs from the InternalCustomerID lookup through the ForEach, into the pipeline, and on into the function. The ForEach itself actually works, but fails "because an inner activity failed".
The Execute Pipeline activity contains the following settings, where I'm trying to pass through the parameter that comes from the ForEach loop. This part of the process also works, since it executes twice (as it should in this test phase):
I don't know whether it fails to pass the parameter through, or fails when adding it to the body of the Azure function.
The child pipeline (FullLoad) contains the following parameters. I'm not sure whether I should set a default value to be overwritten, or how that actually works. The guides I've looked at online haven't used a default value.
Finally, there are the settings for the Azure Function activity. I'm not sure what I need to write to correctly capture the parameter, or where to put it - in the header or the body, judging by the error message. I know a POST cannot be executed without a body.
If I run this specific function by hand (using the Function App blade of portal.azure.com), it works fine with the following settings:
I went through your detailed question, and I think the key to the issue is the format of the Azure Function request body.
I'm afraid yours is incorrect. Please see my steps below, based on your description:
Workflow:
Inside the ForEach activity, there is only one Azure Function activity:
The preview data of the Lookup activity:
Then the configuration of the ForEach activity: @activity('Lookup1').output.value
The configuration of the Azure Function activity: @json(concat('{"name":"', item().name, '"}'))
From the Azure function, I simply echo the input data. Sample output below:
Tip: I saw that your setup executes the Azure function in another pipeline via an Execute Pipeline activity (I don't know why you have to follow such steps), but I think it doesn't matter, because you only need to focus on the body format: if the function accepts JSON, use @json(...); if it accepts a string, use @concat(...). You can also check the sample in the ADF UI portal, which uses pipeline().parameters.
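Applied to your case, a minimal sketch (names assumed from your description: the Lookup returns a column called InternalCustomerID, and the child pipeline FullLoad declares a parameter called customerId):

In the parent pipeline, on the Execute Pipeline activity's parameters:
customerId: @item().InternalCustomerID

In the child pipeline (FullLoad), in the Azure Function activity's Body:
@json(concat('{"customerId":"', pipeline().parameters.customerId, '"}'))

The default value of the customerId parameter doesn't matter much here: it is only used when nothing is passed in, and the value supplied by the Execute Pipeline activity overwrites it.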
Related
I'm trying to add a pipeline parameter into the body of a post request to an Azure Function app in Azure Data Factory. It appears that the string isn't getting replaced, but the Microsoft documentation suggests I'm doing it the right way. See here: https://learn.microsoft.com/en-gb/azure/data-factory/control-flow-web-activity#request-payload-schema
This is a screenshot of an error I'm getting alongside it:
I'm confused about how to properly interpolate pipeline parameters into this request.
Thanks!
As mentioned in this document, wrap your parameter inside braces: @{pipeline().parameters.ParameterName}.
Below is the example with my sample API post body:
{
"tourist_name": "#{pipeline().parameters.name}",
"tourist_email": "#{pipeline().parameters.email}",
"tourist_location": "#{pipeline().parameters.location}"
}
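When the activity runs, Data Factory resolves each @{...} expression before the request is sent. So with (hypothetical) parameter values name = John, email = john@contoso.com and location = Seattle, the posted body would be:
{
"tourist_name": "John",
"tourist_email": "john@contoso.com",
"tourist_location": "Seattle"
}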
So I'm trying to do something I expected to be simple - using a Logic App to insert a JSON object into Cosmos DB.
I've created a Cosmos DB account (based on the Core SQL API) and a container called Archive with the partition key /username (which is in my JSON object).
Now to the Logic App.
First, I YOLO'ed the simple approach.
Which gave me the error: "The input content is invalid because the required properties - 'id; ' - are missing"
But my JSON object doesn't have an id field. Maybe use the IsUpsert parameter? I don't feel like manipulating my input document to add a field called 'id'.
Which gave me the error: "One of the specified inputs is invalid". Okay - that feels even worse.
I then tried the other Logic App connector (not V2, but the original),
which gave me the error: "The partition key supplied in the x-ms-partitionkey header has fewer components than defined in the collection."
I saw that this connector (unlike the V2 one) has a Partition Key Value parameter in the UI, which I added to pass the value of my username,
which gave me the error "The input content is invalid because the required properties - 'id; ' - are missing" again.
At this point I thought, let me just give the bloody machine what it wants, so I added an "id" field to my JSON object.
And yes, that actually worked.
So my questions are:
With the Logic Apps connectors, is it impossible to insert JSON objects into Cosmos DB without that magic field "id" in the message payload?
If the partition key is required, why is it not available as a parameter in the V2 connector UI?
Thanks.
The v1 version of this connector doesn't have a partition key parameter because it's so old that Cosmos DB didn't have partitioned containers yet. You will want to use the v2 preview; I think it will go GA at some point soon, since it's been in preview for a while now.
And yes, with Cosmos DB you can't insert anything without its id, so you will need to generate one in whatever calls your endpoint and pass it in the body of your HTTP request.
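If you'd rather not change the callers, one alternative (a sketch using standard Logic Apps expression functions; triggerBody() stands in for wherever your JSON object actually comes from) is to generate the id inside the Logic App, for example in a Compose action with the expression:

setProperty(triggerBody(), 'id', guid())

This returns the original document with a generated id property added, which you can then pass to the Create or update document action.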
I am new to Azure. I have created two functions, Master (masterFunction) and Child (childFunction), in JavaScript.
I want to invoke the Child function from the Master function.
I have tried childFunction.Run(req, log); but it's not working.
Please suggest how to fix the above issue.
Thanks
That does not work; an Azure function only executes when its trigger condition is met. For example, if your childFunction is triggered by a blob, then the way your masterFunction "calls" it is to satisfy the child's trigger condition, i.e. to write to the target blob from within masterFunction. If your childFunction is HTTP-triggered, then you need to send a request to the childFunction URL from the masterFunction code.
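As a minimal sketch of the HTTP case (assuming childFunction is HTTP-triggered, the runtime provides a global fetch such as Node 18+, and the URL and function key placeholders are replaced with your own):

module.exports = async function (context, req) {
    // Call the HTTP-triggered child function; the URL and key below are placeholders
    const url = "https://<your-app>.azurewebsites.net/api/childFunction?code=<function-key>";
    const response = await fetch(url, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ name: "from masterFunction" })
    });
    // Surface the child's response from the master function
    context.res = { body: await response.text() };
};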
Is there a clean way to query the names of all disabled functions in Azure Monitor?
Sure, I could hard-code the function names and check whether there have been any logs for them, but I imagine there is a smarter way.
Thanks!
You should use Application Insights for your Azure Functions. For details, please follow this article. Then you can use the query below to get all the disabled function names.
Note: the query below can be run in Application Insights or in Azure Monitor.
traces
// use sdkVersion to make sure the trace comes from an Azure function
| where sdkVersion contains "azurefunctions"
// then check whether the message contains the word "disabled"
| where message contains "disabled"
// extract the function name from the message
| extend functionname = substring(message, 10, indexof(message, "'", 10) - 10)
Explanation of the query:
1. If the function is disabled, the message field contains text like "Function 'xxx' is disabled."
2. To make sure it's an Azure function, I check whether the sdkVersion field contains the word "azurefunctions".
3. Finally, fetch the function name from the message field.
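For example, for a trace message of "Function 'MyTimerTrigger' is disabled" (the name here is just an illustration), "Function '" is 10 characters long, so the substring starts at index 10 and stops at the closing quote, yielding MyTimerTrigger.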
The test result:
I tested this with Functions v3; if you're using Azure Functions v2 or v1, you may (or may not) need to modify the query a little, but it should be easy.
I am new to Terraform and have been trying to understand its constructs. Let's say I have a service that exposes REST APIs, and I want to call those REST APIs as part of my Terraform script. What steps do I need to take?
My understanding is that I need to write a custom provider, but I am unable to connect the dots on how to add a new data source type for the new provider.
Also, assuming we do have the required provider, what protocol would be used to communicate with my service? Is it HTTP/S?
One more point to note: my service is currently used for configuring storage in the backend.
Recent versions of Terraform (> 0.9, I believe) support external data sources, so you don't have to create a custom provider. You can call any arbitrary shell or Python script that returns values you can use as data:
data "external" "example" {
program = ["python", "${path.module}/example-data-source.py"]
query = {
# arbitrary map from strings to strings, passed
# to the external program as the data query.
id = "abc123"
}
}
In your case you could use a simple curl call in a bash script to hit your endpoint and return data to Terraform as a map of strings; a minimal sketch of the protocol follows.
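For reference, the contract of the external data source is: Terraform writes the query map as a JSON object to the program's stdin, and the program must print a JSON object whose values are all strings to stdout. A minimal sketch of example-data-source.py (the REST call and the output key are placeholders):

import json
import sys

# Terraform passes the `query` map as a JSON object on stdin
query = json.load(sys.stdin)  # e.g. {"id": "abc123"}

# Placeholder: call your REST endpoint here (e.g. with urllib or requests)
# and build the result map from the response
result = {"storage_name": "storage-" + query["id"]}

# Every value in the output object must be a string
print(json.dumps(result))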
Do note the warnings at the top of that page.
This is considerably more difficult than it appears; it is impossible to debug the interaction between what Terraform is sending to my script and what the script is expecting. It just fails to parse the arguments and refuses to provide any feedback as to what is getting into the program.