Azure Data Factory - if condition expression builder - azure

I have a lookup that runs and returns a status of 0/1, as per below:
Status
1
I want to use the if condition, where if the value = 1, then execute another pipeline.
I am trying to do this using If Condition, with the following expression:
@equals(activity('Dependency Checker').output.firstRow,1)
But it does not evaluate as true, and therefore the activity does not run.
When I check the output via debug, the output is as follows:
And it does not execute the pipeline (the true activity).
Edit -
Here is the output of the lookup, which shows it is capturing the value 1:
I added a Wait task to the true/false activities and, as a result, I've noticed it is initiating the False activity.

Based on the Lookup activity output:
You need to use the below expression
@equals(activity('Dependency Checker').output.firstRow['Status'],'1')
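For reference, here is a minimal sketch of how that could look in the If Condition activity's JSON (the activity and pipeline names are placeholders; if the Status column actually comes back as a number rather than a string, compare against 1 without the quotes):

{
    "name": "If Status Is 1",
    "type": "IfCondition",
    "dependsOn": [
        { "activity": "Dependency Checker", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "expression": {
            "value": "@equals(activity('Dependency Checker').output.firstRow['Status'], '1')",
            "type": "Expression"
        },
        "ifTrueActivities": [
            {
                "name": "Run Dependent Pipeline",
                "type": "ExecutePipeline",
                "typeProperties": {
                    "pipeline": { "referenceName": "DependentPipeline", "type": "PipelineReference" },
                    "waitOnCompletion": true
                }
            }
        ],
        "ifFalseActivities": []
    }
}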

Related

How to validate the output of "Execute a SQL query" action in Azure Logic App

I have created an Azure Logic App by adding Execute a SQL query (V2) action. In the Execute a SQL query action, I have used the following query to get the specific column data.
select XXXX from [dbo].[XXXX] where XXXX=@{triggerBody()?['XXXX']?['XXXX']}
I’m getting the column data by using the following expression:
body('Execute_a_SQL_query_(V2)')?['resultsets']?['Table1'][0][<'Name of table column'>]
But for some scenarios Execute a SQL query action returns the following response:
{
"ResultSets": {},
"OutputParameters": {}
}
Whenever the Execute a SQL query action returns an empty response, I get the following error:
InvalidTemplate. Unable to process template language expressions in action 'Set_variable' inputs at line '0' and column '0': 'The template language expression 'int(body('Execute_a_SQL_query_(V2)')?['resultsets']?['Table1'][0]['XXXX'])' cannot be evaluated because property '0' cannot be selected
So, can anyone suggest how to validate whether the “ResultSets” object is null or not?
You can use the below expression in a condition to evaluate whether the ResultSets object is empty, and then proceed with further steps.
equals(string(outputs('Execute_a_SQL_query_(V2)')?['body']['ResultSets']),'{}')
Example:
If it's true, that means it is an empty result set.
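For illustration, a rough sketch of the Condition in the Logic App's code view (action and variable names here are assumptions, not from the original app). The true branch handles the empty case, and the else branch reads Table1 only when data is present:

"Check_ResultSets": {
    "type": "If",
    "runAfter": {
        "Execute_a_SQL_query_(V2)": [ "Succeeded" ]
    },
    "expression": {
        "and": [
            {
                "equals": [
                    "@string(outputs('Execute_a_SQL_query_(V2)')?['body']['ResultSets'])",
                    "{}"
                ]
            }
        ]
    },
    "actions": {
        "Terminate_no_data": {
            "type": "Terminate",
            "runAfter": {},
            "inputs": { "runStatus": "Succeeded" }
        }
    },
    "else": {
        "actions": {
            "Set_variable": {
                "type": "SetVariable",
                "runAfter": {},
                "inputs": {
                    "name": "ColumnValue",
                    "value": "@int(body('Execute_a_SQL_query_(V2)')?['resultsets']?['Table1'][0]['XXXX'])"
                }
            }
        }
    }
}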

how to stop the execution of a foreach loop in ADF when an activity fails

I have a scenario where I need to fail the complete pipeline when an activity fails inside the ForEach loop container in ADF V2. I don't want the loop to continue further.
I am using ExecutePipeline to call the pipeline which contains the ForEach loop.
Please advise me on how to do this.
Thanks,
Nandini
In ADF, breaking out of the loop in the ForEach activity is currently not supported. You can add an If Condition activity inside the ForEach activity to skip some steps in the loop.
For a situation where I wanted to break out of a loop and return to the parent flow when a Copy data Activity failed, I set the Copy data failure output path to execute two activities:
Set a condition that would stop looping (in my case # records written less than expected).
Force a failure with an invalid 'Set variable' (set a string value to an integer, or anything else that would trigger a failure).
When the Copy data activity fails, execution picks up in the parent flow on the Failure path of the loop.
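As a rough sketch (activity and variable names are made up, and both variables are assumed to be String type), the two activities on the Copy data failure path could look like this in the pipeline JSON; the second one deliberately calls int() on a non-numeric string so the expression evaluation fails and the failure bubbles up out of the loop:

[
    {
        "name": "Set Stop Flag",
        "type": "SetVariable",
        "dependsOn": [
            { "activity": "Copy data", "dependencyConditions": [ "Failed" ] }
        ],
        "typeProperties": {
            "variableName": "stopLooping",
            "value": "true"
        }
    },
    {
        "name": "Force Failure",
        "type": "SetVariable",
        "dependsOn": [
            { "activity": "Set Stop Flag", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
            "variableName": "forceFail",
            "value": "@int('deliberately not a number')"
        }
    }
]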

Azure data factory: Handling inner failure in until/for activity

I have an Azure data factory v2 pipeline containing an until activity.
Inside the until is a copy activity - if this fails, the error is logged, exactly as in this post, and I want the loop to continue.
Azure Data Factory Pipeline 'On Failure'
Although the inner copy activity’s error is handled, the until activity is deemed to have failed because an inner activity has failed.
Is there any way to configure the until activity to continue when an inner activity fails?
Solution
Put the error-handling steps in their own pipeline and run them from an ExecutePipeline activity. You'll need to pass in all the parameters required from the outer pipeline.
You can then use the completion (blue) dependency from the ExecutePipeline (rather than success (green)) so the outer pipeline continues to run despite the inner error.
Note that if you want the outer to know what happened in the inner then there is currently no way to pass data out of the ExecutePipeline to its parent (https://feedback.azure.com/forums/270578-data-factory/suggestions/38690032-add-ability-to-customize-output-fields-from-execut).
To solve this, use a Stored Procedure activity inside the ExecutePipeline to write data to a SQL table, identified with the pipeline run ID. This can be referenced inside the pipeline with @pipeline().RunId.
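A minimal sketch of such a logging activity inside the inner pipeline (the stored procedure, linked service, parameter names and the 'Copy data' activity it hangs off are all assumptions):

{
    "name": "Log Inner Error",
    "type": "SqlServerStoredProcedure",
    "dependsOn": [
        { "activity": "Copy data", "dependencyConditions": [ "Failed" ] }
    ],
    "linkedServiceName": {
        "referenceName": "LoggingSqlDatabase",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "storedProcedureName": "[dbo].[usp_LogPipelineResult]",
        "storedProcedureParameters": {
            "PipelineRunId": { "value": "@pipeline().RunId", "type": "String" },
            "ErrorMessage": { "value": "@activity('Copy data').error.message", "type": "String" }
        }
    }
}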
Then outside the pipeline you can do a lookup in the SQL table, using the run ID to get the right row.
HEALTH WARNING:
For some weird reason, the output of ExecutePipeline is returned not as a JSON object but as a string. So if you try to select a property of output like this @activity('ExecutePipelineActivityName').output.something then you get this error:
Property selection is not supported on values of type 'String'
So, to get the ExecutePipeline's run ID from outside you need:
@json(activity('ExecutePipelineActivityName').output).pipelineRunId
I couldn't find this documented in Microsoft's documentation anywhere, hence posting gory details here.
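Putting it together, a sketch of an outer Lookup that runs on the completion dependency and reads the logged row back using the workaround above (the dataset, table and column names are assumptions):

{
    "name": "Get Inner Result",
    "type": "Lookup",
    "dependsOn": [
        { "activity": "ExecutePipelineActivityName", "dependencyConditions": [ "Completed" ] }
    ],
    "typeProperties": {
        "source": {
            "type": "AzureSqlSource",
            "sqlReaderQuery": {
                "value": "@concat('SELECT TOP 1 Status, ErrorMessage FROM dbo.PipelineLog WHERE PipelineRunId = ''', json(activity('ExecutePipelineActivityName').output).pipelineRunId, '''')",
                "type": "Expression"
            }
        },
        "dataset": { "referenceName": "PipelineLogDataset", "type": "DatasetReference" },
        "firstRowOnly": true
    }
}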

Using If-Condition ADF V2

I have one Copy activity and two Stored Procedure activities, and I want to update the status of my pipeline as Failed in Logtable, with error message details, if any of these activities fail. Below is the flow of my pipeline.
I wanted to use the If Condition activity and need help in setting the expression for it. For the Copy activity I can use the below expression, but I'm not sure about getting the status of the Stored Procedure activity:
@or(equals(activity('Copy Data Source1').output.executionDetails[0].status, 'Failed'), <expression to get the status of Stored Proc>)
If the above expression is true, then I want to have one common stored procedure activity (set in the Add If True Activity) to log the error details.
Let me know if this is possible.
I think you have overcomplicated this.
A much easier way to do that is to leverage a Failure path for the required activities. Furthermore, the stored procedure would not be executed when Copy Data fails, so checking its execution status doesn't really make sense.
My pipeline would look like this:
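As a sketch of that failure path in pipeline JSON (stored procedure and linked service names are assumptions), the logging activity simply depends on the Copy activity with a Failed condition. Note that ADF treats multiple dependencies on one activity as a logical AND, so rather than wiring several Failed outputs into a single logging activity, each activity that can fail gets its own failure-path branch (or its own call to a shared error-handling pipeline):

{
    "name": "Log Copy Failure",
    "type": "SqlServerStoredProcedure",
    "dependsOn": [
        { "activity": "Copy Data Source1", "dependencyConditions": [ "Failed" ] }
    ],
    "linkedServiceName": {
        "referenceName": "LogDatabase",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "storedProcedureName": "[dbo].[usp_UpdatePipelineStatus]",
        "storedProcedureParameters": {
            "Status": { "value": "Failed", "type": "String" },
            "ErrorMessage": { "value": "@activity('Copy Data Source1').error.message", "type": "String" }
        }
    }
}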

Azure Data Factory v2: Activity execute pipeline output

Is there a way to reference the output of an executed pipeline in the activity "Execute pipeline"?
I.e.: the master pipeline executes 2 pipelines in sequence. The first pipeline generates its own run_id that needs to be forwarded as a parameter to the second pipeline.
I've read the documentation and checked that the master pipeline logs the output of the first pipeline, but it looks like this is not directly possible?
Until now we've used only 2 pipelines without a master pipeline, but we want to reuse the logic more. Currently we have 1 pipeline that calls the next pipeline and forwards the run_id.
ExecutePipeline currently cannot pass anything from inside it to its output. You can only get the run ID or name.
For some weird reason, the output of ExecutePipeline is returned not as a JSON object but as a string. So if you try to select a property of output like this @activity('ExecutePipelineActivityName').output.something then you get this error:
Property selection is not supported on values of type 'String'
I found that I had to use the following to get the run ID:
@json(activity('ExecutePipelineActivityName').output).pipelineRunId
The execute pipeline activity is just another activity with outputs that can be captured by other activities. https://learn.microsoft.com/en-us/azure/data-factory/control-flow-execute-pipeline-activity#type-properties
If you want to use the runId of the pipeline executed previously, it would look like this:
@activity('ExecutePipelineActivityName').output.pipeline.runId
Hope this helped!
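To illustrate the master-pipeline scenario from the question, here is a sketch of the second Execute Pipeline activity forwarding the first child's run ID as a parameter (the pipeline, activity and parameter names are assumptions; this forwards the ADF run ID of the first child pipeline, not a value generated inside it, and it uses the @json() workaround above in case the output comes back as a string):

{
    "name": "Run Pipeline 2",
    "type": "ExecutePipeline",
    "dependsOn": [
        { "activity": "Run Pipeline 1", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "pipeline": { "referenceName": "Pipeline2", "type": "PipelineReference" },
        "waitOnCompletion": true,
        "parameters": {
            "upstream_run_id": "@json(activity('Run Pipeline 1').output).pipelineRunId"
        }
    }
}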
