How to use Web activity in Data Factory - Azure

I have to implement a report using a Lookup activity and a Web activity.
The Lookup activity gets its output from a stored procedure that returns multiple records, like below:

Name  ActiveRecords  Active
Abc   500            0
XYZ   300            200

I have output like the above from the procedure, and then I have to use it in the Web activity.
Also, I have an existing Web activity and I have appended it.
Thanks in advance

1. Created the ADF pipeline as shown below.
2. In the Lookup activity, called the SQL stored procedure as shown below.
3. Output of the Lookup activity is as shown below.
4. The output of the Lookup activity is used in the Web activity as:
   @string(activity('Lookup').output)
5. Debug the pipeline; it picks up the output of the Lookup as shown below.
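As a rough sketch of what happens in step 4: a Lookup activity (with "First row only" unchecked) wraps the stored-procedure rows in a `value` array, and `@string(...)` serializes that whole object to a JSON string that can be sent as the Web activity's request body. The sample rows below are taken from the question; the exact metadata fields are an assumption.

```python
import json

# Hypothetical Lookup output shape: ADF returns the rows in a "value"
# array, alongside metadata such as "count".
lookup_output = {
    "count": 2,
    "value": [
        {"Name": "Abc", "ActiveRecords": 500, "Active": 0},
        {"Name": "XYZ", "ActiveRecords": 300, "Active": 200},
    ],
}

# @string(activity('Lookup').output) is roughly equivalent to json.dumps:
# the whole object becomes a single JSON string for the Web activity body.
body = json.dumps(lookup_output)
print(body)
```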

Related

ForEach activity to loop through an SQL parameters table?

I'm working on an ETL pipeline in Azure Synapse.
In the previous version I used an Array set as a parameter of the pipeline and it contained JSON objects. For example:
[{"source":{"table":"Address"},"destination":{"filename":"Address.parquet"},"source_system":"SQL","loadtype":"full"}]
This was later used as the item() and I used a ForEach, switch, ifs and nested pipelines to process all the tables. I just passed down the item parameters to the sub pipelines and it worked fine.
My task is now to create a dedicated SQL pool and a table which stores parameters as columns. The columns are: source_table, destination_file, source_system and loadtype.
Example:
source_table  destination_file    source_system  loadtype
"Address"     "Address.parquet"   "SQL"          "full"
I don't know how to use this table in the ForEach activity and how to process the tables this way since this is not an Array.
What I've done so far:
I created the dedicated SQL pool and the following stored procedures:
create_parameters_table
insert_parameters
get_parameters
The get_parameters is an SQL SELECT statement but I don't know how to convert it in a way that could be used in the ForEach activity.
CREATE PROCEDURE get_parameters AS
BEGIN
    SELECT source_table, destination_filename, source_system, load_type
    FROM parameters
END
All these procedures are called in the pipeline as SQL pool stored procedure. I don't know how to loop through the tables. I need to have every row as one table or one object like in the Array.
Take a Lookup activity in the Azure Data Factory / Synapse pipeline, and in the source dataset of the Lookup activity, select the table that has the required parameter details.
Make sure to uncheck the "First row only" check box.
Then take the ForEach activity and connect it to the Lookup activity.
In the settings of the ForEach activity, click "Add dynamic content" in Items and type:
@activity('Lookup1').output.value
Then you can add other activities like Switch/If inside the ForEach activity.
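The steps above can be simulated as a small sketch: with "First row only" unchecked, the Lookup returns every parameter row in `output.value`, the ForEach iterates that array, and a Switch on `item().loadtype` picks the branch. The row values come from the question's table; the dispatch logic is only an illustrative assumption.

```python
# Hypothetical Lookup output from the get_parameters stored procedure.
lookup_output = {
    "count": 1,
    "value": [
        {"source_table": "Address", "destination_file": "Address.parquet",
         "source_system": "SQL", "loadtype": "full"},
    ],
}

processed = []
# ForEach items: @activity('Lookup1').output.value
for item in lookup_output["value"]:
    # Inside the ForEach, a Switch on item().loadtype selects the branch.
    if item["loadtype"] == "full":
        processed.append(f"full load: {item['source_table']} -> {item['destination_file']}")
    else:
        processed.append(f"incremental load: {item['source_table']}")

print(processed)
```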

How to terminate pipelines in Azure Synapse when a query returns no rows

I have a pipeline A that is invoked by a main pipeline D. It invokes 2 other pipelines, B and C. When pipeline A is invoked, an extraction query is executed that can return rows or nothing.
In case it returns no rows, I would like it to terminate without sending an error message.
It should also terminate the main pipeline D. In other words, pipelines B and C shouldn't be invoked.
How can I invoke such a terminal activity in Azure Synapse? I would like to avoid using a Fail activity, as it would be a false negative.
Since your child pipeline has the lookup output count, and there is no direct way to pass the count to the master pipeline, you can consider changing the pipeline configuration.
Instead of using a lookup to get the count, you can directly use a Copy data activity and write the count of records to a new table.
You can then read this new table using a Lookup in the master pipeline and perform the check (whether the count is 0 or not).
Look at the following demonstration. I have a table with no records in my Azure SQL database. In pipeline A, I have used the following query as the source of the Copy data activity and auto-created a table in the sink.
-- in source. Querying the required table for count
select count(*) as count from demo
Now in the master pipeline, use an additional Lookup to read records from the count_val table created above. The output will be as follows:
Now you can use this count in an If Condition using the following dynamic content:
@equals(activity('Lookup1').output.value[0].count, 0)
The condition will be true when the count is 0 (as shown below), and hence the flow stops (no activities inside the true case). If there are records, then the next pipelines will be executed (false case).
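The master pipeline's check can be sketched as follows: the Lookup reads the count row written by pipeline A's Copy activity, and the If Condition evaluates the equivalent of `@equals(activity('Lookup1').output.value[0].count, 0)`. The output shape is an assumption based on how Lookup returns rows.

```python
# Hypothetical sketch of the If Condition in the master pipeline:
# true -> stop (no activities in the true case); false -> run B and C.
def should_stop(lookup_output: dict) -> bool:
    # @equals(activity('Lookup1').output.value[0].count, 0)
    return lookup_output["value"][0]["count"] == 0

# Table was empty -> condition true -> downstream pipelines are skipped.
print(should_stop({"value": [{"count": 0}]}))   # True
# Rows exist -> condition false -> pipelines B and C run.
print(should_stop({"value": [{"count": 42}]}))  # False
```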

Extracting data from API with pagination offset using Azure Data Factory

I have the below API:
http://example.com/?module=API&method=Live.getLastVisitsDetails&filter_limit=2000&filter_offset=0
I have to extract more data by increasing the offset by 2000 each time:
http://example.com/?module=API&method=Live.getLastVisitsDetails&filter_limit=2000&filter_offset=2000
http://example.com/?module=API&method=Live.getLastVisitsDetails&filter_limit=2000&filter_offset=4000
http://example.com/?module=API&method=Live.getLastVisitsDetails&filter_limit=2000&filter_offset=6000
I have a maximum of 6000 records. I don't know how to pass the offset value in increments of 2000 in Data Factory.
I can extract each page individually with the above links, but I want to do it automatically for all 6000 records.
Can anyone show me some documentation or advise me on how to do this in Data Factory?
I saw the pagination documentation, but had no success.
Extracting data using a Copy activity from a REST API:
Step 1: Create a new pipeline and add a Copy data activity.
Step 2: Configure the source of the Copy activity, adding a pagination rule configured as below.
Follow this sample URL and make sure it ends with ={offset}.
For the pagination rule, select either option 1 or option 2. In my case I selected Range:0:8:2.
In your scenario you can use the range as below:
Option 1: QueryParameters.{offset}: Range:0:6000:2000
Option 2: AbsoluteUrl.{offset}: Range:0:6000:2000
The Range option uses 0 as the start value, 6000 as the end value, and increases by 2000 each time.
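The effect of the Range:0:6000:2000 rule can be sketched as below: ADF substitutes each value of the range into the {offset} placeholder and issues one request per URL. This assumes the end value is inclusive, which matches the four URLs listed in the question.

```python
# Hypothetical expansion of the pagination rule Range:0:6000:2000.
# base_url is the example endpoint from the question, with {offset}
# marking the placeholder the rule fills in.
base_url = ("http://example.com/?module=API&method=Live.getLastVisitsDetails"
            "&filter_limit=2000&filter_offset={offset}")

# Range start:end:step -> offsets 0, 2000, 4000, 6000 (end inclusive).
urls = [base_url.format(offset=o) for o in range(0, 6000 + 1, 2000)]
for u in urls:
    print(u)
```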
For more detailed information, refer to this official article.

How to use Azure Data Factory IF Activity?

I am getting the value cnt=1 from my query in the Lookup activity.
Now I want to check in an If activity whether the cnt value is 1, and if so, run another activity.
Finally, I found the solution.
Below is the working code for my problem:
@equals(activity('Lookup1').output.firstRow.cnt, 1)
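For contrast with the ForEach case above: when "First row only" is checked, the Lookup exposes the single row as `output.firstRow` rather than a `value` array, so the If Condition reduces to the check sketched below (output shape assumed from how Lookup returns a single row).

```python
# Hypothetical sketch of the If Condition expression
# @equals(activity('Lookup1').output.firstRow.cnt, 1)
def run_next_activity(lookup_output: dict) -> bool:
    return lookup_output["firstRow"]["cnt"] == 1

print(run_next_activity({"firstRow": {"cnt": 1}}))  # True
print(run_next_activity({"firstRow": {"cnt": 0}}))  # False
```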

How to achieve dynamic column mapping in Azure Data Factory when Dynamics CRM is used as sink

I have a requirement where I need to pass the column mapping dynamically from a stored procedure to the Copy activity. This Copy activity performs an update operation in Dynamics CRM. The source is SQL Server (2014) and the sink is Dynamics CRM.
I am fetching the column mapping from a stored procedure using a Lookup activity and passing this parameter to the Copy activity.
When I directly provide the below JSON value as the default value of the parameter, the Copy activity updates the mapped fields correctly.
{"type":"TabularTranslator","columnMappings":{"leadid":"leadid","StateCode":"statecode"}}
But when the JSON value is fetched from the stored procedure, it does not work. I get the error "ColumnName is read only."
Please suggest if any conversion is required on the output of the Lookup activity before passing the parameter to the Copy activity. Below is the output of the Lookup activity:
{\"type\":\"TabularTranslator\",\"columnMappings\":{\"leadid\":\"leadid\",\"StateCode\":\"statecode\"}}
Appreciate a quick turnaround.
Using the parameter directly and using the Lookup output are different. Can you share how you set the parameter from the output of the Lookup activity?
You can refer to this doc: https://learn.microsoft.com/en-us/azure/data-factory/control-flow-lookup-activity
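One likely cause, sketched below under assumptions: the escaped output shown in the question suggests the Lookup returns the mapping as a JSON *string*, while the Copy activity's translator property expects a JSON *object*. Wrapping the value in ADF's `@json(...)` function performs the equivalent of `json.loads` here; the column name `ColumnMapping` is a hypothetical placeholder for whatever the stored procedure returns.

```python
import json

# The Lookup returns the mapping as a string (the backslashes in the
# question are just display escaping of this value).
escaped = '{"type":"TabularTranslator","columnMappings":{"leadid":"leadid","StateCode":"statecode"}}'

# @json(activity('Lookup1').output.firstRow.ColumnMapping) would convert
# the string to an object before it is passed to the Copy activity;
# ColumnMapping is a hypothetical column name.
mapping = json.loads(escaped)
print(mapping["type"])
```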
