I want to start my data factory pipeline with parameters. For that I defined a parameter "param" in my factory. The parameter is of type string and contains a JSON object like this:
{
  "url": "http://mySpecialFile.csv",
  "name": "testing",
  "destination": "farAway"
}
Now, how can I access these values in my factory? Is it necessary to define a variable for each attribute? And can I extract, for example, the value of url? I tried it like this:
@json(pipeline().parameters.param).url
But this does not work. How can I write this value into a variable?
Is there a better way to access the incoming JSON object? Also, if I define a variable for each JSON entry, won't I have to change my factory whenever the JSON changes?
Thanks for any advice
In Azure Data Factory, pipeline parameters have a dedicated Object data type for JSON data.
Here I created a pipeline parameter named json with the Object data type and your sample value.
To access a particular value from this parameter, you can use a dynamic expression like @pipeline().parameters.json['value_which_you_want']
e.g. @pipeline().parameters.json['url'], @pipeline().parameters.json['name'], @pipeline().parameters.json['destination']
And it fetches the proper data.
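For illustration, here is a minimal pipeline sketch of that approach (the pipeline, variable, and activity names are my own assumptions, not from the original post):
```json
{
  "name": "pl_read_json_param",
  "properties": {
    "parameters": {
      "json": {
        "type": "object",
        "defaultValue": {
          "url": "http://mySpecialFile.csv",
          "name": "testing",
          "destination": "farAway"
        }
      }
    },
    "variables": {
      "myUrl": { "type": "String" }
    },
    "activities": [
      {
        "name": "Set url variable",
        "type": "SetVariable",
        "typeProperties": {
          "variableName": "myUrl",
          "value": {
            "value": "@pipeline().parameters.json['url']",
            "type": "Expression"
          }
        }
      }
    ]
  }
}
```
Because the parameter is an Object rather than a string, no json() conversion is needed, and new keys can be added to the JSON without touching the pipeline definition.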
Related
Sorry if this is a bit vague or rambly, I'm still getting to grips with Data Factory and a lot of it seems a bit obtuse...
What I want to do is query my Cosmos Database for a list of Ids of records that need to be updated. For each of these records, I want to call a REST API using the Id (i.e. /Record/{Id}/Details)
I've created a Data Flow that took a string as a parameter and then called the REST API fine.
I then made a pipeline using a Lookup with a query (select c.RecordId from c where...) and passed that into a ForEach with Items set to @activity('Lookup1').output.value
I then set the activity of the ForEach to run my Data Flow. From research, I think I'm supposed to set the parameter value to "@item().RecordId", but that gives an error: "parameter [name] does not match parameter type 'string'".
I can change the type of the parameter to any (and use toString([parameter]) to cast it), and then when I try to debug it passes the parameter in, but it gives an error: "Job failed due to reason: at (Line 2/Col 14): Datatype any not found".
I'm not sure what the solution is. Is there a way to cast the result of the lookup to an integer or string? Is there a way to narrow an any down? Is there a better way than toString() that would work? Is there a better way than ForEach?
I tried to reproduce a scenario similar to what you are trying.
My sample data in Cosmos:
To query the Cosmos database for a list of Ids and call a REST API with the Id for each of these records:
First, I took a Lookup activity in Data Factory and selected the ids where the last_name is Bluth.
Its output and settings are as below:
Then I passed the output of the Lookup activity to the ForEach activity.
Then inside the ForEach activity I created a Data Flow activity, and for its data source I used a REST API. My REST API to call a specific user is https://reqres.in/api/users/2, so I gave the base URL as https://reqres.in/api/users.
Then I created a dataset parameter called demoId with datatype string, and in the relative URL I gave the dynamic value @dataset().demoId.
After this I set the source parameter value to @item().id, since after https://reqres.in/api/users only the id needs to be provided to get the data. In your case you can try Record/@{item().id}/Details.
For each id it successfully passes the id to the REST API and fetches the data:
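Put together, the ForEach activity in the pipeline JSON would look roughly like this (a sketch only; the activity, data flow, source, and parameter names are assumptions, and the exact shape may vary slightly by ADF version):
```json
{
  "name": "ForEach1",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@activity('Lookup1').output.value",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "Call record details",
        "type": "ExecuteDataFlow",
        "typeProperties": {
          "dataflow": {
            "referenceName": "df_call_rest_api",
            "type": "DataFlowReference",
            "datasetParameters": {
              "restSource": {
                "demoId": {
                  "value": "@item().id",
                  "type": "Expression"
                }
              }
            }
          }
        }
      }
    ]
  }
}
```
Keeping the dataset parameter typed as string and passing @item().id directly avoids the any datatype problem, since the value is resolved to a string before the data flow starts.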
I'm using a Lookup activity in ADF to get a list of tables that I want to pass to a Databricks notebook, which will be used to run the code.
The ForEach items dynamic content: @activity('Lookup IngestionControl').output.value
The error I'm getting is
The value type in key 'TABLENAME' is not expected type 'System.String'
Attempted solution: @string(activity('Lookup IngestionControl').output.value)
Warning: Expression of type: 'String' does not match the field: 'items'
I ran it despite the warning and got an error, because the object is of type array and cannot be converted to a string.
You can only pass a string into the Databricks API. ADF uses the Databricks Jobs API when it calls a notebook / jar.
https://docs.databricks.com/dev-tools/api/latest/jobs.html
What I usually do is convert the array into a JSON string. You can do this in SQL or in ADF; it doesn't really matter, though which one you pick changes how I would do it.
@activity('Lookup IngestionControl').output.value tells me it's a Lookup activity. I would just create the JSON from SQL and pass it through ADF into your notebook.
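If you'd rather do the conversion in ADF, a Databricks Notebook activity could pass the stringified array as a base parameter; a sketch, assuming hypothetical notebook path, linked service, and parameter names:
```json
{
  "name": "Run ingestion notebook",
  "type": "DatabricksNotebook",
  "linkedServiceName": {
    "referenceName": "AzureDatabricksLs",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "notebookPath": "/ingestion/load_tables",
    "baseParameters": {
      "tableList": {
        "value": "@string(activity('Lookup IngestionControl').output.value)",
        "type": "Expression"
      }
    }
  }
}
```
The notebook then receives a single string and can parse it back into a list (e.g. with json.loads in Python) instead of ADF trying to coerce the array itself.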
I'm reading binary data (a JPEG image) using an API (Web activity) and I want to store it as varbinary or base64 in Azure SQL Server.
As far as I can tell, there is no way to base64-encode binary data using Azure Data Factory. Is that correct?
So I am trying to pass it as byte[] using a varbinary parameter. The parameter of the stored procedure looks like this:
@Photo varbinary(max) NULL
The parameter in the Stored Procedure activity in ADF looks like this:
But this also seems not to work, because the pipeline fails with this error:
The value of the property 'Value' is invalid for the stored procedure parameter 'Photo'.
Is it possible to store that image using that approach? And if not, how can this be achieved (using ADF and a stored procedure)?
Just to be safe, are you missing a '@' before the activity?
Can't see it in the picture.
Peace
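For comparison, the parameter block of the Stored Procedure activity would normally look something like this (a sketch only; the Web activity name and output property are assumptions, and whether a varbinary parameter accepts the value at all is exactly what the question is about):
```json
{
  "name": "Save photo",
  "type": "SqlServerStoredProcedure",
  "typeProperties": {
    "storedProcedureName": "[dbo].[SavePhoto]",
    "storedProcedureParameters": {
      "Photo": {
        "value": {
          "value": "@activity('Web1').output.Response",
          "type": "Expression"
        },
        "type": "String"
      }
    }
  }
}
```
The leading '@' is what makes ADF evaluate the value as an expression; without it, the literal text of the expression is sent as the parameter value.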
Can we pass a SqlQuerySpec object as a parameter to stored procedure in Document Db? I think this gives us flexibility to send parameterized SQL text and parameters to procedure. If this is not possible, I would like to know if it is possible to access the complete SQL from the SqlQuerySpec.
Thank you,
Soma.
The server-side API takes the same JSON string as the REST and node.js APIs. The SqlQuerySpec type for the .NET SDK actually translates to this before sending it up when doing a regular query.
So, here are two ways you can compose the parameterized query from within your stored procedure.
In the package of parameters that you send to your stored procedure, include a JSON string like this:
```json
{
  "query": "SELECT * FROM books b WHERE (b.Author.Name = @name)",
  "parameters": [
    { "name": "@name", "value": "Herman Melville" }
  ]
}
```
You may be able to pass the string directly into a call to Collection.queryDocuments(), but you may have to do JSON.parse(<your string>) on it first.
If you know the shape of your query is going to be the same for every call to this stored procedure, then you can send each parameter for your query in as a parameter for the stored procedure. You then have to compose the filterQuery object in the JavaScript of your stored procedure.
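With the first approach, the parameter package sent to the stored procedure is just a JSON array whose element is that query-spec string. A sketch of the request body when executing the sproc through the REST API, assuming the sproc takes the spec as its first argument:
```json
[
  "{\"query\": \"SELECT * FROM books b WHERE (b.Author.Name = @name)\", \"parameters\": [{\"name\": \"@name\", \"value\": \"Herman Melville\"}]}"
]
```
Inside the stored procedure, JSON.parse turns that string back into the query-spec object before it is handed to Collection.queryDocuments().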
I'm writing an Azure Function to access multiple records in Azure Table Storage, and I want to apply my filter at runtime with a variable passed in to a WebHook. I have successfully run my Function with the filter in function.json, but I don't see anything in the docs on how to apply the filter inside index.js.
I tried this, but it had no effect on the entities returned. The same filter works correctly inside function.json.
context.bindings.inputTable.filter = "name eq 'test'";
You can't construct and set the filter in your function code. We do have an open issue here in our repo tracking support for more dynamic binding scenarios, which would enable this.
However, the function.json filter expression does support binding parameters, so if the parameters are part of the JSON payload coming in on the WebHook you can use them in your query. For example, if your payload contains properties region of type string and status of type int you can define a filter like "(Region eq '{region}') and (Status eq {status})" and the filter executed at runtime will be bound to the incoming values.
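A sketch of what that looks like in function.json, assuming a generic JSON WebHook trigger and a hypothetical table named records:
```json
{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "webHookType": "genericJson"
    },
    {
      "type": "table",
      "direction": "in",
      "name": "inputTable",
      "tableName": "records",
      "filter": "(Region eq '{region}') and (Status eq {status})",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
```
At runtime, {region} and {status} are bound from the matching properties of the incoming JSON payload, so the filter varies per request even though the function code never sets it.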