Substring of a global parameter in ADF dynamic content - Azure

"#{concat(split(string(pipeline().globalParameters.DATABASE), 'JERICHO_'), ' Data Warehouse Load', ' ',substring(utcNow(),0 ,10 ))}"
"#{concat(substring(string(pipeline().globalParameters.DATABASE), 8), ' Data Warehouse Load', ' ',substring(utcNow(),0 ,10 ))}"
The full global parameter is JERICHO_DEV. However, I will be publishing this to different environments with different database names (although JERICHO_ will be common to all). Is there any way to standardise the database name above so that it takes the part after the _ regardless of how many characters it is?

If you want to concatenate the global parameter substring with custom names like that, you can use an array variable for the custom names and generate the different database names with a ForEach activity.
Please follow the steps below after creating the global parameter:
First, create an array variable with a Set Variable activity and give it the list of custom names, for example:
["Data Warehouse Load","AZURE SQL DB","SERVERLESS SQL"]
Set variable activity:
Then, connect this to a ForEach activity, give the Items value as
@variables('dbnames'), and check Sequential.
ForEach activity:
Now, go to the activities inside the ForEach and drag in an Append Variable activity. Click on it, create a new array variable in the variables section, and give your dynamic content:
@concat(substring(string(pipeline().globalParameters.DATABASE), 0, 8), item(), ' ', substring(utcNow(), 0, 10))
Append variable activity dynamic content:
Now create another Set Variable activity for the result array output and connect it to the ForEach, creating a new array variable with the value below. This is optional, as I am creating this array only to show the output; you can use the array variable created in the Append Variable activity as the result.
@variables('res_variable')
Set variable activity for output:
Execute the pipeline and you can see the common part of the global parameter DATABASE (JERICHO_) in all the database names in the output.
Output:
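As a side note, to take the part after the underscore regardless of its length (which is what the question asks), the first attempted expression only needs to pick an element out of the split result; a minimal sketch (untested):
@{concat(last(split(string(pipeline().globalParameters.DATABASE), 'JERICHO_')), ' Data Warehouse Load', ' ', substring(utcNow(), 0, 10))}
With JERICHO_DEV this yields DEV Data Warehouse Load plus the date, and it keeps working however long the environment suffix is.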

ForEach activity to loop through an SQL parameters table?

I'm working on an ETL pipeline in Azure Synapse.
In the previous version I used an Array set as a parameter of the pipeline and it contained JSON objects. For example:
[{"source":{"table":"Address"},"destination {"filename:"Address.parquet"},"source_system":"SQL","loadtype":"full"}}]
This was later used as the item() and I used a ForEach, switch, ifs and nested pipelines to process all the tables. I just passed down the item parameters to the sub pipelines and it worked fine.
My task is now to create a dedicated SQL pool and a table which stores parameters as columns. The columns are: source_table, destination_file, source_system and loadtype.
Example:
source_table | destination_file | source_system | loadtype
"Address" | "Address.parquet" | "SQL" | "full"
I don't know how to use this table in the ForEach activity and how to process the tables this way since this is not an Array.
What I've done so far:
I created the dedicated SQL pool and the following stored procedures:
create_parameters_table
insert_parameters
get_parameters
get_parameters is a SQL SELECT statement, but I don't know how to convert its result into something that can be used in the ForEach activity.
CREATE PROCEDURE get_parameters AS
BEGIN
    SELECT source_table, destination_filename, source_system, load_type FROM parameters
END
All these procedures are called in the pipeline as SQL pool stored procedure activities. I don't know how to loop through the tables; I need every row to represent one table, as one object like in the array.
Take a Lookup activity in your Azure Data Factory/Synapse pipeline and, in the source dataset of the Lookup activity, select the table that has the required parameter details.
Make sure to uncheck the First row only checkbox.
Then take a ForEach activity and connect it to the Lookup activity.
In the settings of the ForEach activity, click Add dynamic content on Items and enter
@activity('Lookup1').output.value
Then you can add other activities like Switch/If inside the ForEach activity.
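For reference, with First row only unchecked, the Lookup output contains a value array with one object per row, so each item() in the ForEach behaves just like the JSON objects from the previous version. Assuming the parameters table from the question, the output would look roughly like:
{
    "count": 1,
    "value": [
        {
            "source_table": "Address",
            "destination_filename": "Address.parquet",
            "source_system": "SQL",
            "load_type": "full"
        }
    ]
}
Inside the ForEach you can then reference @item().source_table, @item().load_type, and so on, and pass them down to the Switch/If activities or sub-pipelines.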

Azure Data Factory concat syntax

I'm trying to run a pipeline that produces a SELECT statement whose WHERE clause filters on the system trigger name.
I've tried to use :
@concat(concat(concat(concat('select * from ',item().DB_CARGA,'.',item().SC_CARGA,'.',item().tb_carga, 'where ' ,item().DS_CARGA ,'=', string(pipeline().TriggerName)))))
But I'm getting the following error:
Could anyone help me with the right syntax?
I reproduced this and got a similar error when I used your syntax.
In your dynamic expression, there should be a space between the table name and where, and the trigger name should be enclosed in single quotes.
Please go through the below procedure to resolve the error.
First, I have a Lookup activity which returns the database name, table name, and SQL table column (the trigger column name). Feed this to a ForEach.
I created sample tables with a trigger_column column and gave it some values.
In the pipeline I created a trigger with the matching name mytrigger. Inside the ForEach, for my demo, I used a Lookup query; you can use whatever activity your requirement calls for.
Dynamic content:
You can do this with a single concat function.
@concat('select * from ', item().database, '.', item().table_name, ' where ', item().trigger_name, '=', '''', string(pipeline().TriggerName), '''')
This will give a SQL query like the following when executed.
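For example, with sample values database = mydb and table_name = mytable (assumptions), the trigger_column column, and the trigger mytrigger, the expression evaluates to:
select * from mydb.mytable where trigger_column='mytrigger'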
Output when triggered:

How can I modify a property of a variable in a logic app?

I need to get counts of records matching a category and subcategory, then include the summary data in an email.
I was thinking of following this flow:
create a variable (Counts: object, {})
-> forEach category(
forEach subcategory(
Run Analytics Query
-> Set Variable (Counts)))
-> Visualize Analytics Query # gets raw data to attach to email
-> Send Email # somehow iterating over same category and subcategory to inject summary data in the email
My question is, how do I set just a property of the Counts variable? or, failing that, what is a good way I can keep track of my results such that I can use the same nested for loop to build the email?
Here is a sample that modifies just one property of an object variable, for your reference.
1. I initialize a variable named Counts as shown:
2. Then I initialize another variable named temp as shown:
3. I want to change the value of property2 in variable Counts from 20 to 19. So do it like below:
4. After that, do not forget to update the value of the temp variable:
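For clarity, those four steps expressed as expressions (the exact initial contents of Counts are an assumption here; only property2 is given):
1. Initialize Counts (Object), e.g. {"property1":"10","property2":"20"}
2. Initialize temp (String) to: string(variables('Counts'))
3. Set Counts to: json(replace(variables('temp'), '"property2":"20"', '"property2":"19"'))
4. Set temp to: string(variables('Counts'))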
In your situation, step 1 and step 2 should be outside of the "For each" loop, because a "For each" loop doesn't allow an "Initialize variable" action inside it.
================================ Update ================================
To avoid hard-coding the old value in the replace, first store the current property fragment in a string variable (named oneProperty here):
concat('"property2":"', string(variables('Counts')?['property2']), '"')
Then replace that fragment in the serialized temp string and parse the result back to JSON:
json(replace(variables('temp'), variables('oneProperty'), '"property2":"19"'))
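As an alternative, the Logic Apps expression language also has a built-in setProperty() function that updates a single property without the string replace; a minimal sketch (still hopping through temp, since a Set variable action cannot reference the variable it is setting):
Set temp to: string(setProperty(variables('Counts'), 'property2', '19'))
Set Counts to: json(variables('temp'))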

Azure Data Factory - Can we use local variables inside derived column in the data flow?

Experts,
I have a simple requirement where I need to store the value of a variable inside a column in my destination SQL Table (Sink).
Here's what I am doing:
Created a new pipeline and created a variable "createDate" with value "@utcNow()"
I created a "data flow" task, where I configured my source (a simple CSV file) and added a "Derived Column" task, because I want to store the date when the data was loaded into my destination table.
In my "Derived Column" task I added a new column "der_createDate", but I don't know how I can assign the value of the "createDate" variable to this derived column. I tried several expressions like @variables('createDate'), but the expression validation fails.
How can I use the value of a variable created in the pipeline in any of my Data Flows? Is it even possible? I have seen several use cases of variables in iterables or even in the "Copy Data" task, but I am using Data Flows and want to refer to those local variables I created in my pipeline. Let me know what you'd suggest.
Variables are not supported in data flows. You have to pass the value of the variable as a parameter (createDate) into the data flow and reference it as $createDate.
1. We can declare a variable createDate in the pipeline.
2. Inside the data flow we can declare a parameter parameter1 and temporarily assign it empty double quotes as the default value.
3. Then in the pipeline, select Pipeline expression and add the dynamic value @variables('createDate'), so we assign the value of createDate to parameter1.
4. In the DerivedColumn1 activity, we can generate a new column with the value of parameter1. There may be problems with the data preview, but the debug result is correct, because this column is assigned at runtime.
The debugging result is as follows:
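Putting it together, a minimal sketch using the names from above:
Data flow activity parameter value in the pipeline (Pipeline expression): @variables('createDate')
Data flow parameter: parameter1 (string), default value ""
DerivedColumn1 expression: der_createDate = $parameter1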

SSIS string variable's value should be used in Execute SQL Task

I am trying to create a table whose name contains a unique id
(MARKET_id is the name of the table).
The id has leading zeros which need to be preserved, so I declared id as a string variable.
Example: id = 02161515. In the parameter mapping I mapped it as VARCHAR.
An error occurs when I use the SQL statement CREATE TABLE MARKET_?;. It results in CREATE TABLE MARKET_'02161515'; the quotes are unnecessary and throw an error: found "'" in the SQL statement.
Please help!
I guess you have columns added to your table that are not posted here?
One way to solve this would be to create two variables, one for the table name and one for the CREATE TABLE statement.
Example (for demonstration only):
The first variable is TABLE_NAME, defined with the following expression to demonstrate dynamic naming:
"dmu.MY_TABLE_"+(DT_STR,30,1252)datepart("s",getdate()) + "( id int)"
The second variable is CREATE_TABLE_SQL, defined as follows:
"Create table "+ #[User::TABLE_NAME]
Next change the Settings of your Execute_SQL_Task:
Set SQLSourceType to Variable and SourceVariable to User::CREATE_TABLE_SQL
This will create the table in your database.
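Applied to the question's scenario, the same pattern would look roughly like this for CREATE_TABLE_SQL (the column list is a placeholder assumption; id is the existing string variable):
"CREATE TABLE MARKET_" + @[User::id] + " (col1 INT)"
Because the id is concatenated into the statement as a string expression instead of being mapped as a parameter, the leading zeros are preserved and no quotes are injected.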
