Set variable in Azure Data Factory

I’m wondering if someone can help us out here as we are going round in circles.
In short, we are trying to:
Read a value from a SQL table via a stored procedure in a Lookup task.
We want to use that value to set a variable so we can use it in a Copy Data task.
However, our Set Variable task using (@activity('Lookup1').output) returns the value, but it's wrapped in lots of JSON (see attached).
In the attached output we are only interested in the TokenGUID value, not the rest of the JSON.
So can someone please point us in the direction of how we would set a variable to be a string value?
Thanks,
Nic

You can use this expression to get TokenGUID:
@activity('Lookup1').output.firstRow.TokenGUID
My test:
Output of Lookup activity
Output of Set variable activity
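For context, here is a minimal sketch of where that expression goes in the pipeline JSON, assuming a String pipeline variable named TokenGUID and a Lookup activity named Lookup1 (names are illustrative):
{
    "name": "Set TokenGUID",
    "type": "SetVariable",
    "typeProperties": {
        "variableName": "TokenGUID",
        "value": {
            "value": "@activity('Lookup1').output.firstRow.TokenGUID",
            "type": "Expression"
        }
    }
}
The variable can then be referenced in the Copy Data task's dynamic content as @variables('TokenGUID').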

Related

How to use timestamp type parameters set in pipeline with timestamp type in data flow

I can't send the question due to some mysterious error, so I'll share a screenshot of the question.
Can anyone help me solve this?
I have reproduced the above and got the same error when the Expression checkbox is checked.
Uncheck the Expression checkbox in the data flow parameter assignment in the pipeline and pass the value as a plain string. Now it won't give the error.
It will take the data flow parameter like this.
Also, along with the date-time string, pass the format to the toTimestamp() function to avoid null values.
This is my sample input data:
sample filter condition:
toTimestamp(start_date,'yyyy-MM-dd\'T\'HH:mm:ss')
Filtered Result:
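For illustration, assuming a hypothetical timestamp column named order_date in the source, a filter that keeps rows on or after the converted start_date value could look like this in the data flow expression language:
order_date >= toTimestamp(start_date,'yyyy-MM-dd\'T\'HH:mm:ss')
(If start_date is a data flow parameter rather than a column, reference it with the $ prefix, i.e. $start_date.)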

ADF - passing parameters within string to SQL Lookup

I'm writing a pipeline, where I fetch SQL queries from a metadata database in a lookup with the hope to execute those later on in the pipeline. Imagine a string stored in a database:
"SELECT * FROM #{pipeline().parameters.SchemaName}.#{pipeline().parameters.TableName}"
My hope was when passing this string to another Lookup activity, it would pick up the necessary parameters. However it's being passed to the activity as-is, without parameter substitution and I'm getting errors as a result. Is there any clean fix for this or am I trying to implement something not supported by ADF natively?
I found a work-around is just wrapping the string in a series of replace() statements, but hoping something simpler exists.
Can you try the below query in the dynamic content text box:
@concat('SELECT * FROM ',pipeline().parameters.SchemaName,'.',pipeline().parameters.TableName)
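As a side note, when the query is typed directly into the dynamic content box (rather than fetched from the metadata table), ADF's string interpolation syntax gives the same result as the concat above and stays readable as SQL:
SELECT * FROM @{pipeline().parameters.SchemaName}.@{pipeline().parameters.TableName}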

How to pass JSON into an Azure Function with embedded dynamic content in Azure Data Factory V2

In ADFv2 I'm looking up a date and passing it to an Azure Function. I can pass just the data like so:
@activity('GetLastDateProcessed').output.firstRow.LastDateProcessed
However if I embed this into a JSON string like this:
{"lastProcessDate":"#activity('GetLastDateProcessed').output.firstRow.LastDateProcessed"}
I get this {"lastProcessDate":"#activity('GetLastDateProcessed').output.firstRow.LastDateProcessed"} instead of {"lastProcessDate":"2019-11-13"} as input into function.
Last I've tried to use a parameter with no success also.
#concat('{"lastProcessDate":"', string(pipeline().parameters.lastProcessDate), '"}')
The problem here is the parameter was not set. I set the parameter like this:
@activity('GetLastDateProcessed').output.firstRow.LastDateProcessed
However, this is a default value and is never dynamically updated. If I can update this string then the @concat method will work, but I haven't been able to figure out how to dynamically update a parameter for the pipeline.
Another option could be a pipeline variable, but I don't know how to reference the variable.
How do I concat strings together with dynamic content?
I think what you are missing is that when you use the at-sign '@' in the JSON string, you should follow it with a curly bracket '{'.
In your example it will look something like this:
{"lastProcessDate":"#{activity('GetLastDateProcessed').output.firstRow.LastDateProcessed}"}
Here is the source (found it in the comments):
https://azure.microsoft.com/en-us/blog/azure-functions-now-supported-as-a-step-in-azure-data-factory-pipelines/
I was able to get this to work by creating a second pipeline. This is not optimal, but works for people running into this same issue. Hopefully someone finds a better solution than this!
From the first pipeline I set the second pipeline's parameter with this:
@activity('GetLastDateProcessed').output.firstRow.LastDateProcessed
I named the parameter in the second pipeline lastProcessDate so then this worked:
#concat('{"lastProcessDate":"', string(pipeline().parameters.lastProcessDate), '"}')
This is not straight forward and can't be how Microsoft is expecting us to solve this!
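For reference, here is a minimal sketch of how the Execute Pipeline activity in the first pipeline might pass that value into the second pipeline's lastProcessDate parameter (activity and pipeline names are illustrative):
{
    "name": "Execute Pipeline1",
    "type": "ExecutePipeline",
    "typeProperties": {
        "pipeline": { "referenceName": "SecondPipeline", "type": "PipelineReference" },
        "waitOnCompletion": true,
        "parameters": {
            "lastProcessDate": {
                "value": "@activity('GetLastDateProcessed').output.firstRow.LastDateProcessed",
                "type": "Expression"
            }
        }
    }
}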
I was able to achieve this with the command below.
{
"storedprocedure":"storedProcName",
"params":"@{variables('currentDt')}"
}

How to pass content of file as a pipeline parameter

I have a pipeline that accepts an array as parameters.
Currently, an array has been hardcoded as the default value.
Is it possible to make this dynamic? There is a file called Array.txt in Azure Blob Storage which is updated frequently. How can I extract the content of Array.txt and pass it as parameter values to the pipeline?
I tried using a Lookup but receive the error 'Object cannot be passed, pipeline is expecting an Array'.
Please make sure the data in Array.txt is in an array-compatible format, then use a Lookup activity to extract the data and pass @array(activity('Lookup1').output.value) to the subsequent activity. Remember to use the @array() function to convert the data into an array.
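For example, assuming Array.txt holds array-compatible JSON such as the line below (file contents are illustrative), the downstream parameter can be set in dynamic content with the expression from the answer; the exact shape of output.value depends on the dataset the Lookup uses, so check the Lookup's debug output first:
["file1.csv","file2.csv","file3.csv"]
@array(activity('Lookup1').output.value)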

SSIS: Filtering Multiple GUIDs from String Variable as Parameter In Data Flow OLE Source

I have an SSIS package that obtains a list of new GUIDs from a SQL table. I then shred the GUIDs into a string variable so that I have them separated out by comma. An example of how they appear in the variable is:
'5f661168-aed2-4659-86ba-fd864ca341bc','f5ba6d28-7283-4bed-9f11-e8f6bef225c5'
The problem is in the data flow task. I use the variable as a parameter in a SQL query to get my source data and I cannot get my results. When the WHERE clause looks like:
WHERE [GUID] IN (?)
I get an invalid character error, so I found out the implicit conversion doesn't work with the GUIDs like I thought it would. I could resolve this by putting {} around the GUID if this were a single GUID, but there are a potential 4 or 5 different GUIDs this will need to retrieve at runtime.
Figuring I could get around it with this:
WHERE CAST([GUID] AS VARCHAR(50)) IN (?)
But this simply produces no results and there should be two in my current test.
I figure there must be a way to accomplish this... What am I missing?
You can't, at least not using the mechanics you have provided.
You cannot concatenate values and make that work with a parameter.
I'm open to being proven wrong on this point but I'll be damned if I can make it work.
How can I make it work?
The trick is to just go old school and make your query via string building/concatenation.
In my package, I defined two variables, filter and query. filter will hold the concatenation you are already performing.
query will be an expression (right-click, Properties: set EvaluateAsExpression to True; the Expression would be something like "SELECT * FROM dbo.RefData R WHERE R.refkey IN (" + @[User::filter] + ")").
Then, in your data flow, change your source to SQL command from variable. No mapping is required there.
The basic look and feel would be like this:
OLE Source query
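To make that concrete, with the sample GUIDs from the question in the filter variable, the query variable above would evaluate to something like this at runtime:
SELECT * FROM dbo.RefData R WHERE R.refkey IN ('5f661168-aed2-4659-86ba-fd864ca341bc','f5ba6d28-7283-4bed-9f11-e8f6bef225c5')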
