Passing timestamp to Azure mapping data flow

I'm trying to pass the last modified date and time to my data flow as a parameter. Can anyone tell me the correct way to pass it? I've tried multiple things: passing utcnow() from the activity throws an error saying the file was not found, whereas setting it directly in the data flow works fine.
I figured out that a Data Flow expression works fine for utcnow(), whereas a Pipeline expression fails.

The Pipeline expression language is different from, and a little more limited than, the Data Flow expression language. While Data Flow supports a richer type system, Pipelines only support the String, Boolean, and Array types. Since there are no Date or Timestamp types, the date functions in the Pipeline expression language return strings:
If you want to use the UTC value from the pipeline instead of the data flow, you will need to define a string parameter on the Data Flow:
Then pass the string produced by utcnow() to the Data Flow as a Pipeline Expression:
In the expression, use the utcnow() function to get the string value:
In the Data Flow, use a Derived Column to convert it to the desired type:
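The string-in, timestamp-out handoff described above can be sketched outside ADF. A minimal model in Python, assuming a Data Flow string parameter named lastModified (the name is hypothetical) and a pipeline-side format of 'yyyy-MM-dd HH:mm:ss':

```python
from datetime import datetime, timezone

# Pipeline side: @utcnow('yyyy-MM-dd HH:mm:ss') yields a plain string,
# modeled here with strftime.
last_modified = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")

# Data Flow side: toTimestamp($lastModified, 'yyyy-MM-dd HH:mm:ss') in a
# Derived Column parses the string back into a timestamp, modeled here
# with strptime using the matching format.
parsed = datetime.strptime(last_modified, "%Y-%m-%d %H:%M:%S")
print(parsed)
```

The key point the sketch illustrates: the format string used to produce the value in the pipeline must match the format string used to parse it in the Data Flow.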


How to use timestamp type parameters set in pipeline with timestamp type in data flow

I can't send the question due to some mysterious error, so I'll share a screenshot of the question.
Can anyone help me solve this?
I have reproduced the above and got the same error when the Expression checkbox is checked.
Uncheck the Expression checkbox in the pipeline's Data Flow parameter assignment and pass the value as a plain string. It then no longer gives the error.
The Data Flow will take the parameter like this.
Also, along with the date-time string, pass the format to the toTimestamp() function to avoid null values.
This is my sample input data:
sample filter condition:
toTimestamp(start_date,'yyyy-MM-dd\'T\'HH:mm:ss')
Filtered Result:
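The reason the format argument matters can be illustrated outside ADF. A sketch in Python using strptime, with a hypothetical sample value shaped like the ISO pattern in the filter condition above:

```python
from datetime import datetime

# ADF's toTimestamp(start_date, 'yyyy-MM-dd\'T\'HH:mm:ss') returns NULL when
# the string does not match the format; Python's strptime raises instead.
start_date = "2023-04-05T10:30:00"  # hypothetical sample value
parsed = datetime.strptime(start_date, "%Y-%m-%dT%H:%M:%S")
print(parsed)  # 2023-04-05 10:30:00
```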

ADF - passing parameters within string to SQL Lookup

I'm writing a pipeline, where I fetch SQL queries from a metadata database in a lookup with the hope to execute those later on in the pipeline. Imagine a string stored in a database:
"SELECT * FROM #{pipeline().parameters.SchemaName}.#{pipeline().parameters.TableName}"
My hope was that, when passing this string to another Lookup activity, it would pick up the necessary parameters. However, it is passed to the activity as-is, without parameter substitution, and I'm getting errors as a result. Is there a clean fix for this, or am I trying to implement something ADF doesn't support natively?
A work-around I found is wrapping the string in a series of replace() statements, but I'm hoping something simpler exists.
Try the below expression in the Dynamic Content text box:
@concat('SELECT * FROM ', pipeline().parameters.SchemaName, '.', pipeline().parameters.TableName)
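The replace()-based work-around mentioned in the question can be sketched in Python, assuming placeholder tokens of the form @{pipeline().parameters.Name} inside the stored query (parameter values here are hypothetical):

```python
# Stored query with interpolation tokens, as fetched from the metadata database.
query = "SELECT * FROM @{pipeline().parameters.SchemaName}.@{pipeline().parameters.TableName}"
params = {"SchemaName": "dbo", "TableName": "Orders"}  # hypothetical values

# One replace() per parameter, mirroring a chain of ADF replace() calls.
for name, value in params.items():
    query = query.replace("@{pipeline().parameters.%s}" % name, value)

print(query)  # SELECT * FROM dbo.Orders
```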

I have to find the file with the maximum size in Azure Data Factory

I created an array variable and tried to pass it to the max math function in ADF, but I'm getting an error. How do I use the max function there?
Array is one of the datatypes supported in ADF for both parameters and variables, so if you have a valid array then max will work on either. A simple example:
create a valid parameter of the Array datatype:
Create a variable and add a Set Variable activity. Use this expression as the Value argument:
@string(max(pipeline().parameters.pInputArray))
NB: I'm using the max function directly on the array and then wrapping the result in string, since only the String, Array, and Boolean datatypes are supported for variables (at this time).
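The expression above can be modeled in Python, assuming a hypothetical numeric value for the pInputArray parameter:

```python
# @string(max(pipeline().parameters.pInputArray)) sketched in Python:
# max over the array, then converted to a string, since ADF variables only
# support the String, Boolean, and Array types.
p_input_array = [3, 17, 9, 42, 5]  # hypothetical parameter value
result = str(max(p_input_array))
print(result)  # 42
```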

How to pass content of file as a pipeline parameter

I have a pipeline that accepts an array as parameters.
Currently, an array has been hardcoded as the default value.
Is it possible to make this dynamic? There is a file called Array.txt in Azure Blob Storage that is updated frequently. How can I extract the content of Array.txt and pass it as parameter values to the pipeline?
I tried using a Lookup but received the error 'Object cannot be passed, pipeline is expecting an Array'.
Make sure the data in Array.txt is in an array-compatible format, then use a Lookup activity to extract it and pass @array(activity('Lookup1').output.value) to the subsequent activity. Remember to use the @array() function to convert the data into an Array.
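A sketch of what the conversion does, with a hypothetical Lookup output (the item values are made up for illustration):

```python
import json

# The Lookup activity returns an object whose "value" field holds the rows;
# @array(activity('Lookup1').output.value) ensures that field is typed as an
# Array before it reaches the Array-typed pipeline parameter.
lookup_output = json.loads('{"value": ["item1", "item2", "item3"]}')  # hypothetical
array_param = list(lookup_output["value"])
print(array_param)  # ['item1', 'item2', 'item3']
```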

In Azure Logic App, how do I pass parameters in a http request?

I am trying to download a file from a website that requires me to pass parameters while making the HTTP request. The request is as follows
https://www.xxx.com/download/exportdata.go?pid=3276439&startdate=2015-01-01&enddate=2015-01-02
When I hard-code the request it works perfectly, but now that I want to download this file on a schedule, I need to be able to change the dates in the startdate and enddate parameters.
I was trying to explore the utcnow and adddays expressions, but without much success. What do I need to do to pass these parameters?
To pass arguments you can use the "@{}" syntax and the built-in functions such as utcnow, adddays, concat, base64, length, contains, int, string, float, addhours, rand, toLower, toUpper, etc.
To add formatting to the date returned by the utcnow() function, you can pass an optional argument to the call to use as the format string, something like:
http://api.example.org/weather?lat=35&lon=139&time=@{utcnow('yyyy-MM-dd')}
For more information on what that format string can look like, see the C# custom date and time format strings detailed on the following page:
https://learn.microsoft.com/en-us/dotnet/standard/base-types/custom-date-and-time-format-strings
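Putting it together for the scheduled download: the URL construction can be sketched in Python, assuming a one-day window ending at the current UTC date (the window size is an assumption; in the Logic App itself this corresponds to expressions like @{adddays(utcnow(), -1, 'yyyy-MM-dd')} and @{utcnow('yyyy-MM-dd')}):

```python
from datetime import datetime, timedelta, timezone

# Compute the start and end of a one-day window in UTC.
today = datetime.now(timezone.utc)
start = (today - timedelta(days=1)).strftime("%Y-%m-%d")
end = today.strftime("%Y-%m-%d")

# Build the request URL from the question, with the dates substituted in.
url = ("https://www.xxx.com/download/exportdata.go"
       f"?pid=3276439&startdate={start}&enddate={end}")
print(url)
```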
