ADF pipeline on-failure not working for throttling exceptions - Azure

Please check the image above, where I have highlighted in yellow that the Get Processed activity's on-failure output is connected to the Drop Table activity, but the Drop Table activity is not working; it is not even triggered when the Get Processed activity fails with an exception such as throttling.
How can I get the Drop Table activity to run on failure of the Get Processed activity?

When an activity in Azure Data Factory depends on several other activities, it runs only when all of its dependency conditions are met. As wired here, the Azure Data Explorer Command activity Drop Table will be executed only when the Azure Data Explorer Command activity Get Alerts from Processed Table has failed, the Lookup activity has failed, and the ForEach activity has succeeded.
The solution is to have the Drop Table activity three times: one on failure of the Get Alerts from Processed Table activity, a second one on failure of the Lookup activity, and a third one on success of the ForEach activity.
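As a rough sketch, one of those extra Drop Table activities could look like the fragment below in the pipeline JSON. The activity names, the table name ProcessedAlerts, and the linked service name are placeholders, so adjust them to your pipeline:

{
    "name": "Drop Table On Get Alerts Failure",
    "type": "AzureDataExplorerCommand",
    "dependsOn": [
        {
            "activity": "Get Alerts from Processed Table",
            "dependencyConditions": [ "Failed" ]
        }
    ],
    "linkedServiceName": {
        "referenceName": "AzureDataExplorerLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "command": ".drop table ProcessedAlerts ifexists"
    }
}

The key part is the dependsOn entry with dependencyConditions set to Failed, which makes this copy of the activity run only on the failure path of its single predecessor.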

Related

How do I get this VM operation (Microsoft.Compute/virtualMachines/stop/action) in the Activity Log?

Every time I stop my VM, whether from the OS or from the Azure portal, I only get a status like deallocate in the Activity Log.
What should I do to get Activity Log entries for a VM operation like 'Microsoft.Compute/virtualMachines/stop/action'?
We have tested this in our environment; the statements below are based on our analysis.
Generally, the value Microsoft.Compute/virtualMachines/stop/action is stored in the Operation column of the activity logs.
There is no way to customize the activity logs widget or to project the operation as a column in the portal, but you can apply an operation filter and pull the logs related to operations such as start, stop, or deallocate.
Alternatively, you can send those activity logs to a Log Analytics workspace and write KQL queries there to pull the VM start or stop operation logs.

Azure Event Grid Topic Logs - Failed to resolve table or column expression named 'AegPublishFailureLogs'

As shown in the picture, when I try to run the built-in query "Publish failures by topic and error" for my Event Grid topic, I get this error:
Failed to resolve table or column expression named 'AegPublishFailureLogs'
It is a built-in query, so does anyone know why it fails?
Below is my diagnostic setting; I have checked "Send to Log Analytics workspace".

Azure Data Factory Copy Data - how to know when it has finished copying data?

We have a bunch of files in Azure Blob Storage in TSV format and we want to move them to a destination, which is ADLS Gen2, in Parquet format. We want this to run on a daily basis, so the ADF pipeline will write a bunch of Parquet files into folders that have the date in their name, for example
../../YYYYMMDD/*.parquet
On the other side we have an API that will access this data. How does the API know whether the data migration has completed for a particular day or not?
Basically, is there a built-in ADF feature to write a done file or _SUCCESS file that the API can rely on?
Thanks
Why not simply call the API from ADF using a Web activity to let it know?
You can use the Web activity to pass the name of the processed file as a URL or body parameter so that the API knows what to process.
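For illustration only, a Web activity wired to run after the copy succeeds might look roughly like this in the pipeline JSON; the activity names, the endpoint URL, and the body are placeholders, and in practice you would build the folder name dynamically from your pipeline parameters or trigger time:

{
    "name": "Notify API",
    "type": "WebActivity",
    "dependsOn": [
        {
            "activity": "Copy TSV To Parquet",
            "dependencyConditions": [ "Succeeded" ]
        }
    ],
    "typeProperties": {
        "url": "https://<your-api>/copy-complete",
        "method": "POST",
        "body": {
            "status": "done",
            "folder": "YYYYMMDD"
        }
    }
}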
I'll provide two ways here for you. From the perspective of the ADF copy activity's execution result, they can be divided into an active way and a passive way.
1. Active way: you could use the waitOnCompletion feature of the Execute Pipeline activity, and after it completes, execute a Web activity to trigger your custom API. Please see this case: Azure Data Factory: How to trigger a pipeline after another pipeline completed successfully.
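A minimal sketch of that active approach, assuming a parent pipeline that calls the copy pipeline (names here are placeholders): the Execute Pipeline activity only completes once the child pipeline finishes because waitOnCompletion is true, so the Web activity that calls your API can simply depend on its success.

{
    "name": "Run Copy Pipeline",
    "type": "ExecutePipeline",
    "typeProperties": {
        "pipeline": {
            "referenceName": "CopyTsvToParquet",
            "type": "PipelineReference"
        },
        "waitOnCompletion": true
    }
}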
2. Passive way: you could use the monitoring feature of ADF. Please see this .NET SDK example, which looks up the run you started (runResponse.RunId) and checks its status:
Console.WriteLine("Checking copy activity run details...");

// Get the pipeline run that was started earlier (runResponse.RunId comes from the create-run call)
PipelineRun pipelineRun = client.PipelineRuns.Get(
    resourceGroup, dataFactoryName, runResponse.RunId);

// Query the activity runs of that pipeline run within a time window
RunFilterParameters filterParams = new RunFilterParameters(
    DateTime.UtcNow.AddMinutes(-10), DateTime.UtcNow.AddMinutes(10));
ActivityRunsQueryResponse queryResponse = client.ActivityRuns.QueryByPipelineRun(
    resourceGroup, dataFactoryName, runResponse.RunId, filterParams);

// Print the copy activity output on success, or the error details otherwise
if (pipelineRun.Status == "Succeeded")
    Console.WriteLine(queryResponse.Value.First().Output);
else
    Console.WriteLine(queryResponse.Value.First().Error);

Console.WriteLine("\nPress any key to exit...");
Console.ReadKey();
Check that the status is Succeeded, then run your custom business logic.

Filter data from Azure Table Storage in Data Factory v2

I am new to Azure Data Factory v2. We have a table in Azure Table Storage, and I am able to load all of its data into an Azure SQL database by using the Copy Data option.
But what I would like to achieve is to filter the data in the table storage on the Status field, which is an integer field. I tried some examples from the Microsoft website, but every time I get a bad syntax error when I run the pipeline.
So what I tried is: in the source tab I chose my data store as the source dataset, with the source table documentStatus. I clicked "Use query" and put in this line:
"azureTableSourceQuery": "$$Text.Format('Status = 2')"
But when I run this I get this error: The remote server returned an error: (400) Bad Request.
Can anybody help me write a correct query so I can filter my source on this Status field?
Thanks
Please set "azureTableSourceQuery": "Status eq 2" instead; Azure Table Storage queries use OData filter syntax, not SQL.
Please refer to this doc on Azure Table Storage filter expressions.
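For reference, a sketch of how the source side of the copy activity could look in the pipeline JSON with that filter applied (the sink and dataset references are omitted here); note that the integer value is written without quotes in the OData filter:

"source": {
    "type": "AzureTableSource",
    "azureTableSourceQuery": "Status eq 2"
}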

How to avoid a failure when the input blob is missing in an Azure Data Factory copy activity

I have a pipeline with three activities:
1. Download a file from an external web site and store it in blob storage (custom activity)
2. Copy data from the blob to an Azure SQL DB (copy activity)
3. Delete the blob file (custom activity)
The file is downloaded once a month, but I don't know on which day it will be made available, so I have scheduled the pipeline to run once a day; if the file is available, it is downloaded. Activity #1 works fine, but activity #2 fails if there is no input file. Is there any way to avoid the Failed result, since it is an expected outcome?
There is a way to handle this situation in Azure Data Factory version 1 by adding a validation policy, but I tried it in ADF v2 and it does not work:
"policy": {
    "validation": {
        "minimumSizeMB": 0.01
    }
}
One approach which I tried out successfully is to add a Get Metadata activity before the Copy activity.
In the Get Metadata activity, point it at the same dataset and add 'Exists' to the field list. You can then consume the response of Get Metadata in an If Condition activity, with the condition '@bool(activity('GetMetadataActivityName').output.exists)'.
If the blob exists, add the Copy activity to the true branch of the If Condition activity; you can ignore the false branch.
Sample pipeline
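As an illustrative sketch only, the two activities described above could look roughly like this in the pipeline JSON, assuming a dataset named InputBlobDataset; the Copy activity would go inside ifTrueActivities, which is left empty here:

{
    "name": "GetMetadataActivityName",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": {
            "referenceName": "InputBlobDataset",
            "type": "DatasetReference"
        },
        "fieldList": [ "exists" ]
    }
},
{
    "name": "If Blob Exists",
    "type": "IfCondition",
    "dependsOn": [
        {
            "activity": "GetMetadataActivityName",
            "dependencyConditions": [ "Succeeded" ]
        }
    ],
    "typeProperties": {
        "expression": {
            "value": "@bool(activity('GetMetadataActivityName').output.exists)",
            "type": "Expression"
        },
        "ifTrueActivities": []
    }
}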
I ran into this exact same issue. From what I understand, the only way to avoid the failure would be to create a blank blob file. The copy activity will no longer fail; it will just not copy any data because the file is blank. I have confirmed this with an Azure Data Factory PM, as I said I ran into the same issue.
Hope this helps.
