Azure Data Factory - Capture error details of a dataflow activity

I have a data flow, and my requirement is to capture the error details into a variable when it fails and assign that variable to a parameter in the next data flow. With some help I got as far as the second stage, but I'm unable to get this variable assigned to a parameter in the next data flow. The error I get is: Expression cannot be parsed

To retrieve the data flow error message, connect the data flow activity's failure output to a Set variable activity that stores the error message, using the expression below (a wiring sketch follows the output):
@string(json(activity('Data flow1').error.message).Message)
(Screenshots: error message and Set variable output.)
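A minimal sketch of that wiring as pipeline JSON (activity and variable names are illustrative, not from the original post):

{
  "name": "Set errorMsg",
  "type": "SetVariable",
  "dependsOn": [
    { "activity": "Data flow1", "dependencyConditions": [ "Failed" ] }
  ],
  "typeProperties": {
    "variableName": "errorMsg",
    "value": {
      "value": "@string(json(activity('Data flow1').error.message).Message)",
      "type": "Expression"
    }
  }
}

The next data flow activity can then reference the variable in its parameter value with the pipeline expression @variables('errorMsg').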

Related

How to pass an object from an Azure Data Factory Lookup to a Databricks notebook?

I'm using a Lookup activity in ADF to get the list of tables that I want to pass to a Databricks notebook, which runs the ingestion code.
ForEach Items dynamic content: @activity('Lookup IngestionControl').output.value
The error I'm getting is:
The value type in key 'TABLENAME' is not expected type 'System.String'
Attempted solution: @string(activity('Lookup IngestionControl').output.value)
Warning: Expression of type: 'String' does not match the field: 'items'
I ran it despite the warning and got an error, because the object is of type array and cannot be converted to a string.
You can only pass strings into the Databricks API; ADF uses the Databricks Jobs API when it calls a notebook or jar.
https://docs.databricks.com/dev-tools/api/latest/jobs.html
What I usually do is convert the array into a JSON string. You can do this in SQL or in ADF; it doesn't really matter, though which one you pick changes how I would do it.
@activity('Lookup IngestionControl').output.value tells me it's a Lookup activity, so I would just create the JSON in SQL and pass it through ADF into your notebook.
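For example, the serialized array can be passed as a notebook base parameter (a sketch; the notebook path and parameter name are assumptions):

{
  "name": "Run ingestion notebook",
  "type": "DatabricksNotebook",
  "typeProperties": {
    "notebookPath": "/Repos/ingest/load_tables",
    "baseParameters": {
      "tableList": {
        "value": "@string(activity('Lookup IngestionControl').output.value)",
        "type": "Expression"
      }
    }
  }
}

Inside the notebook, dbutils.widgets.get('tableList') returns the JSON string, which can be parsed back into a list of tables.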

ADF copy activity and data flow behaving differently when writing data to multi lookup field in Dynamics 365

I am trying to import data from a CSV file into a Dynamics 365 Account table. As I need to do some transformations, I am using a data flow rather than a basic copy activity.
I was having difficulty getting a data flow to write to a multi-lookup field, so I tried a copy activity with the exact same source, sink and mappings to see if that worked. I was able to import the data successfully with the copy activity, and I'm confused as to why the data flow does not work with the same source, sink and mappings. Below are screenshots of the various elements I set up and configured. I would appreciate any suggestions to get the data flow working.
I'm using a cut-down version of what will ultimately be my source CSV file, just so I can concentrate on getting the write to the lookup field working.
(Screenshots: source CSV file; copy activity source, sink and mapping; Dynamics 365 sink; data flow source, sink and mapping; copy activity success; data flow failure.)
Dataflow error details:
{"StatusCode":"DFExecutorUserError","Message":"Job failed due to reason: DF-REST_001 - Rest - Error response received from the server (url:https://##############v9.0/accounts,request body: Some({"accountid":"8b0257ea-de19-4aaa-9945-############","name":"A User","ownerid":"7d64133b-daa8-eb11-9442-############","ownerid#EntityReference":"systemuser"}), request method: POST, status code: 400), response body: Some({"error":{"code":"0x0","message":"An error occurred while validating input parameters: Microsoft.OData.ODataException: A 'PrimitiveValue' node with non-null value was found when trying to read the value of the property 'ownerid'; however, a 'StartArray' node, a 'StartObject' node, or a 'PrimitiveValue' node with null value was expected.\r\n at Microsoft.OData.JsonLight.ODataJsonLightPropertyAndValueDeserializer.ValidateExpandedNestedResourceInfoPropertyValue(IJsonReader jsonReader, Nullable1 isCollection, String propertyName, IEdmTypeReference typeReference)\r\n at Microsoft.OData.JsonLight.ODataJsonLightResourceDeserializ","Details":"com.microsoft.dataflow.Issues: DF-REST_001 - Rest - Error response received from the server (url:https://dev-gc.crm11.dynamics.com/api/data/v9.0/accounts,request body: Some({"accountid":"8b0257ea-de19-4aaa-9945-############","name":"A User","ownerid":"7d64133b-daa8-eb11-9442-############","ownerid#EntityReference":"systemuser"}), request method: POST, status code: 400), response body: Some({"error":{"code":"0x0","message":"An error occurred while validating input parameters: Microsoft.OData.ODataException: A 'PrimitiveValue' node with non-null value was found when trying to read the value of the property 'ownerid'; however, a 'StartArray' node, a 'StartObject' node, or a 'PrimitiveValue' node with null value was expected.\r\n at Microsoft.OData.JsonLight.ODataJsonLightPropertyAndValueDeserializer.ValidateExpandedNestedResourceInfoPropertyValue(IJsonReader jsonReader, Nullable1 isCollection, String propertyName, IEdmTypeReference typeReference)\r\n at Microsoft.OData.JsonLight.ODataJsonLightResourceDeser"}
I am running into the same wall, but a temporary workaround is to sink the data flow output to a CSV (or similar) file in ADLS, and then use a Copy activity to pick up those files and upsert them into Dynamics.
Other references: https://vishalgrade.com/2020/10/01/how-to-populate-multi-lookup-attribute-in-ce-using-azure-data-factory/
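A sketch of the second stage of that workaround (dataset and activity names are assumptions; the Dynamics sink supports an upsert write behavior):

{
  "name": "Copy staged accounts to Dynamics",
  "type": "Copy",
  "inputs": [ { "referenceName": "ADLS_StagedAccountsCsv", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "Dynamics_Accounts", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": { "type": "DynamicsSink", "writeBehavior": "upsert" }
  }
}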

ADF Script Activity - Custom SQL Error code

Azure ADF has recently added a Script activity that lets us run multiple SQL statements. I am trying to execute those statements against SQL Managed Instance. Within the script, when I want to throw an exception on a certain error, I have the following piece of code.
declare @ERROR_MSG nvarchar(200);
set @ERROR_MSG = 'My custom error message';
throw 50000, @ERROR_MSG, 1;
The activity output contains the custom error message, but the error code is not customized; it always returns 2001. My question is: how do I customize the error code?
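For reference, a minimal Script activity definition that runs such a statement (a sketch; the linked service name is an assumption):

{
  "name": "Throw custom error",
  "type": "Script",
  "linkedServiceName": { "referenceName": "SqlMI_LS", "type": "LinkedServiceReference" },
  "typeProperties": {
    "scripts": [
      {
        "type": "NonQuery",
        "text": "declare @ERROR_MSG nvarchar(200) = 'My custom error message'; throw 50000, @ERROR_MSG, 1;"
      }
    ]
  }
}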

How to capture the error message of each of multiple activities (3) in a pipeline using a single stored procedure

I have an ADF pipeline with 3 different activities in sequence. I want to capture the error message of whichever activity fails, so I created a stored procedure that takes a parameter which captures it using @activity('Activity name').error.message
But with that expression I can only get the error message for the one specified activity.
How can I capture the error message of any activity (of the three activities in the pipeline)?
This would be the pipeline output, as the activities are in sequence.
As far as I know, you need to create three stored procedure activities, each capturing the error message of the corresponding activity. Alternatively, combine all three in a single expression:
@concat(activity('Activity1')?.error?.message,'|',activity('Activity2')?.error?.message,'|',activity('Activity3')?.error?.message)
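A variant of the same idea (an untested sketch; procedure and activity names are assumptions): since the activities run in sequence, at most one of them fails, so coalesce picks the one non-null error message for a single Stored Procedure activity:

{
  "name": "Log error",
  "type": "SqlServerStoredProcedure",
  "typeProperties": {
    "storedProcedureName": "dbo.LogPipelineError",
    "storedProcedureParameters": {
      "ErrorMessage": {
        "value": "@coalesce(activity('Activity1')?.error?.message, activity('Activity2')?.error?.message, activity('Activity3')?.error?.message)",
        "type": "String"
      }
    }
  }
}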

Azure data factory: Handling inner failure in until/for activity

I have an Azure Data Factory v2 pipeline containing an Until activity.
Inside the Until is a Copy activity; if this fails, the error is logged, exactly as in this post, and I want the loop to continue.
Azure Data Factory Pipeline 'On Failure'
Although the inner copy activity’s error is handled, the until activity is deemed to have failed because an inner activity has failed.
Is there any way to configure the until activity to continue when an inner activity fails?
Solution
Put the error-handling steps in their own pipeline and run them from an ExecutePipeline activity. You'll need to pass in all the parameters required from the outer pipeline.
You can then use the completion (blue) dependency from the ExecutePipeline activity, rather than the success (green) one, so the outer pipeline continues to run despite the inner error.
Note that if you want the outer to know what happened in the inner then there is currently no way to pass data out of the ExecutePipeline to its parent (https://feedback.azure.com/forums/270578-data-factory/suggestions/38690032-add-ability-to-customize-output-fields-from-execut).
To solve this, use a Stored Procedure activity inside the ExecutePipeline's child pipeline to write the data to a SQL table, keyed by the pipeline run ID, which can be referenced inside that pipeline with @pipeline().RunId.
Then, outside the pipeline, you can do a Lookup against the SQL table, using the run ID to get the right row.
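A sketch of both halves (table, procedure, dataset and activity names are all assumptions; note the Lookup uses the json() workaround described under the health warning below):

Inside the child pipeline:
{
  "name": "Log inner error",
  "type": "SqlServerStoredProcedure",
  "typeProperties": {
    "storedProcedureName": "dbo.LogRunError",
    "storedProcedureParameters": {
      "RunId":        { "value": "@pipeline().RunId", "type": "String" },
      "ErrorMessage": { "value": "@activity('Copy data1').error.message", "type": "String" }
    }
  }
}

In the outer pipeline:
{
  "name": "Get inner error",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderQuery": "SELECT ErrorMessage FROM dbo.RunErrors WHERE RunId = '@{json(activity('ExecutePipelineActivityName').output).pipelineRunId}'"
    },
    "dataset": { "referenceName": "AzureSql_RunErrors", "type": "DatasetReference" }
  }
}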
HEALTH WARNING:
For some weird reason, the output of ExecutePipeline is returned not as a JSON object but as a string. So if you try to select a property of the output, like this: @activity('ExecutePipelineActivityName').output.something, then you get this error:
Property selection is not supported on values of type 'String'
So, to get the ExecutePipeline's run ID from outside you need:
@json(activity('ExecutePipelineActivityName').output).pipelineRunId
I couldn't find this documented anywhere by Microsoft, hence posting the gory details here.
