How to trigger an email in the event that a Logic App trigger is skipped? - azure

I have the below scenario:
I have a Logic App which is triggered once every day (24 hours).
It basically looks at an SFTP location; if a file has been dropped there, it pulls the file, pushes it into Blob storage, and then deletes it from the source (SFTP).
I need to trigger an email in the following events:
If the trigger is "Skipped", i.e. it ran but could not find any file on the SFTP server.
If it failed to upload to the Blob storage.
Is it possible to send an email in the above scenarios (1 & 2)?
Any guidance will be appreciated as I am new to the IaC space.
Thanks in advance.

Firstly, you can list the files on the SFTP server and pass each file name on to fetch its content, using the List files in folder and Get file content actions of the SFTP connector.
If the trigger is "Skipped", i.e. it ran but could not find any file on the SFTP server.
For this, in the next step you can use a Condition action to check whether a file has been uploaded for that day by comparing its last modified time with the current date. If yes, create a file in the blob storage with the file contents from the Get file content step. Below is the flow of my Logic App.
If it failed to upload to the Blob storage.
For this, you can create another Condition action and check whether the blob has been created by using actions('Create_blob_(V2)')['outputs']['statusCode']. Below is the complete code of my Logic App.
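The flow and the workflow code referred to above were shown as screenshots in the original post and are not reproduced here. Purely as an illustration of the same logic (list today's files on the SFTP server, upload them to Blob Storage, and alert when the run is effectively skipped or an upload fails), here is a minimal Python sketch; the SFTP host and folder, container name, and SMTP settings are placeholder assumptions, not values from the question.

```python
# Illustrative sketch only: mirrors the Logic App's flow outside of Logic Apps.
# All host names, credentials, folder/container names and addresses are placeholders.
from datetime import datetime, timezone
from email.message import EmailMessage
import smtplib

import paramiko                                    # pip install paramiko
from azure.storage.blob import BlobServiceClient   # pip install azure-storage-blob


def send_alert(subject: str, body: str) -> None:
    """Stand-in for the 'send an email' step; SMTP details are placeholders."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "alerts@example.com"
    msg["To"] = "team@example.com"
    msg.set_content(body)
    with smtplib.SMTP("smtp.example.com", 587) as smtp:
        smtp.starttls()
        smtp.login("alerts@example.com", "<smtp-password>")
        smtp.send_message(msg)


# 'List files in folder' equivalent: connect to the SFTP server and list the drop folder.
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="<sftp-user>", password="<sftp-password>")
sftp = paramiko.SFTPClient.from_transport(transport)

today = datetime.now(timezone.utc).date()
todays_files = [
    f for f in sftp.listdir_attr("/outbound")
    if datetime.fromtimestamp(f.st_mtime, tz=timezone.utc).date() == today
]

if not todays_files:
    # Scenario 1: nothing arrived today, i.e. the trigger would have been "Skipped".
    send_alert("SFTP file missing", "No file was dropped on the SFTP server today.")
else:
    blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    container = blob_service.get_container_client("incoming")
    for f in todays_files:
        remote_path = f"/outbound/{f.filename}"
        with sftp.open(remote_path, "rb") as fh:    # 'Get file content' equivalent
            data = fh.read()
        try:
            container.upload_blob(name=f.filename, data=data, overwrite=True)  # 'Create blob'
        except Exception as exc:
            # Scenario 2: the upload to Blob Storage failed.
            send_alert("Blob upload failed", f"Upload of {f.filename} failed: {exc}")
        else:
            sftp.remove(remote_path)                # delete the file from the source SFTP

sftp.close()
transport.close()
```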

Related

When a Particular file is uploaded in blob storage trigger will invoke data pipeline

I want to solve a scenario where, when a particular file is uploaded into the blob storage, a trigger is invoked and the pipeline runs.
I tried event-based triggers but I don't know how to tackle this scenario.
I reproduced the same in my environment and got this output.
First, create a blob event trigger.
Note:
Blob path begins with ('/Container_Name/') – receives events from the Container_Name container.
Blob path begins with ('/Container_Name/Folder_Name') – receives events from the Folder_Name folder inside the Container_Name container.
Data preview:
Continue and click on OK.
If you created a parameter (for example, in my scenario I created a parameter called file_name), you can pass the value into it directly by using @triggerBody().fileName -> Publish the pipeline.
For more information follow this reference.
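The trigger configuration above is shown as portal screenshots in the original post. As a rough programmatic equivalent, here is a hedged sketch of defining the same blob event trigger with the azure-mgmt-datafactory package and its BlobEventsTrigger model; every subscription ID, resource name, and path below is a placeholder.

```python
# Hedged sketch: blob event trigger that passes the blob's file name to a pipeline
# parameter named file_name. All identifiers below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobEventsTrigger, PipelineReference, TriggerPipelineReference, TriggerResource,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

trigger = TriggerResource(
    properties=BlobEventsTrigger(
        events=["Microsoft.Storage.BlobCreated"],
        # ARM form of the "Blob path begins with" setting (includes the literal 'blobs' segment).
        blob_path_begins_with="/Container_Name/blobs/Folder_Name/",
        scope=(
            "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
            "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
        ),
        pipelines=[
            TriggerPipelineReference(
                pipeline_reference=PipelineReference(reference_name="<pipeline-name>"),
                # Map the trigger metadata onto the pipeline's file_name parameter.
                parameters={"file_name": "@triggerBody().fileName"},
            )
        ],
    )
)

adf.triggers.create_or_update("<resource-group>", "<factory-name>", "BlobCreatedTrigger", trigger)
adf.triggers.begin_start("<resource-group>", "<factory-name>", "BlobCreatedTrigger")
```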

How to create a simple dashboard from Azure Storage and Fileshare

I have an Azure storage account.
Inside the container, some files get pushed every morning into a client-specific folder structure.
I have a function app which processes and converts these files and calls some external service to work on these processed files.
I also have a file share, which is basically mounted on a VM.
The external service, after processing the files (#3), generates the resultant success/failure files inside this file share (#4).
Now the ask is:
Create a simple dashboard that monitors the storage account (and in effect the container and the file shares). It should capture and show basic information and should look like the table structure below (with 3 simple variations of data):
FileName | ReceivedDateTime | NumberOfRecords
Original_file.csv | 20221011 5:21 AM | 10
Original_file_Success.csv | 20221011 5:31 AM | 9
Original_file_Failure.csv | 20221011 5:32 AM | 1
Here the first record is captured from the container, and the second and third are both generated in the file share.
Also, whenever a new failure file is generated, i.e. Original_file_Failure, it should send an email with a predefined template, adding the file name, to a predefined recipient list.
Any guidance on the Azure service to use?
I have seen Azure Monitor, Workbooks and other options, but I feel that would be overkill for such a simple requirement.
Thanks in advance.
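No answer is included for this thread. Purely as an illustration of the table described above (not a recommendation of a particular Azure service), here is a hedged Python sketch that builds the FileName / ReceivedDateTime / NumberOfRecords rows for the container side with azure-storage-blob; the connection string and container name are assumptions, and the file-share files would need an equivalent listing (for example via azure-storage-fileshare).

```python
# Illustration only: print the desired table rows from the blobs in one container.
# Connection string and container name are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("<container-name>")

print("FileName | ReceivedDateTime | NumberOfRecords")
for blob in container.list_blobs():
    content = container.download_blob(blob.name).readall().decode("utf-8")
    records = max(len(content.splitlines()) - 1, 0)   # CSV data rows, excluding the header
    print(f"{blob.name} | {blob.last_modified:%Y%m%d %I:%M %p} | {records}")
```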

Azure Data Factory triggers for when any blob files get appended

I am trying to copy the Databricks logs from one folder to another, since I am sending the Databricks logs to a storage account as append blobs. My objective: whenever any new blob appears or any file gets appended, I need to run the copy activity.
I tried a storage events trigger, but it does not run when logs get appended to the same files. Is there any way to run the pipeline immediately if any files get appended or a new folder in dd/mm/yyyy format gets created?
Thanks
Anuj gupta
There is no out-of-the-box method to trigger when a blob is appended. There is a similar ask here; you can log a more precise one to get an official response.
Or you can use "Create a custom event trigger to run a pipeline in Azure Data Factory" with Azure Blob Storage as an Event Grid source, where the event Microsoft.Storage.BlobCreated is "triggered when a blob is created or replaced" (an Append Block operation succeeds only if the blob already exists).
Also, perhaps with Microsoft.Storage.BlobRenamed, Microsoft.Storage.DirectoryCreated and Microsoft.Storage.DirectoryRenamed.
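Not part of the answer above, but since Microsoft.Storage.BlobCreated does not fire when an existing append blob only receives new Append Block data, one possible workaround is to poll the log container on a schedule and start the copy pipeline whenever a blob's ETag changes. A hedged sketch, assuming the azure-storage-blob and azure-mgmt-datafactory packages; every name below is a placeholder.

```python
# Workaround sketch (polling, not event-driven): start the copy pipeline whenever a
# log blob is new or has been appended to. All identifiers are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.storage.blob import BlobServiceClient

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
container = BlobServiceClient.from_connection_string(
    "<storage-connection-string>"
).get_container_client("databricks-logs")

seen: dict[str, str] = {}            # blob name -> last ETag observed (persist this between runs)
for blob in container.list_blobs():  # run on a schedule, e.g. from a timer-triggered function
    if seen.get(blob.name) != blob.etag:
        seen[blob.name] = blob.etag
        # Kick off the copy pipeline for the new or appended log blob.
        adf.pipelines.create_run(
            "<resource-group>", "<factory-name>", "<copy-pipeline-name>",
            parameters={"source_blob": blob.name},
        )
```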

Is it possible to append only the newly created/arrived blob in Azure blob storage to the Azure SQL DB?

Here is my problem in steps:
Uploading a specific CSV file via PowerShell with the Az module to a specific Azure blob container. [Done, works fine]
There is a trigger against this container which fires when a new file appears. [Done, works fine]
There is a pipeline connected with this trigger, which appends the fresh CSV to the specific SQL table. [Done, but not good]
I have a problem with step 3. I don't want to append all the CSVs within the container (how it's working now); I just want to append the CSV which has just arrived - the newest in the container.
Okay, the solution is:
there is a built-in attribute in the pipeline trigger called @triggerBody().fileName.
Since I have the name of the file which fired the trigger, I can pass it to the pipeline.
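As a hedged illustration of that hand-off, the sketch below manually starts a pipeline run with one specific file name, the same way the trigger would supply @triggerBody().fileName to the file_name parameter; it assumes the azure-mgmt-datafactory package, and all names are placeholders.

```python
# Hypothetical smoke test: run the pipeline for one blob name and confirm that only
# that CSV gets appended to the SQL table. All identifiers are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
run = adf.pipelines.create_run(
    "<resource-group>", "<factory-name>", "<append-csv-pipeline>",
    parameters={"file_name": "upload_2022-10-11.csv"},  # what @triggerBody().fileName supplies
)
print(run.run_id)
```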
I think you can use an event trigger and check the Blob created option.
Here is the official documentation about it; you can refer to this.

Copy files from Azure blob to Sharepoint folder using Microsoft Flow

I want to copy files from Azure blob storage to a SharePoint folder using Microsoft Flow. I have tried several times and the flow always fails when it runs.
I have attached the flow that I'm currently trying to execute:
Can someone help me with this?
For your problem, please refer to the logic I post below (I have uploaded a testcsv.csv file to the blob storage):
After the trigger "When a blob is added or modified", we need to use the "Get blob content" action to get the content of the CSV file. Then add the "Create file" action of SharePoint and put the file content which we got from the blob into the "File Content" box.
By the way, as you mentioned it is a CSV file, in my blob storage container I only have one file by default. If there is more than one file in your blob storage, you can use the "List blobs" action and use "For each" to loop over them, and then create each of the files in SharePoint.
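The flow itself is only shown as a screenshot in the original answer. Purely as an illustration of the same "List blobs" -> "Get blob content" -> "Create file" loop outside of Power Automate, here is a hedged Python sketch using azure-storage-blob and a Microsoft Graph simple-upload call; the site ID, target folder, access token, and connection string are assumptions.

```python
# Illustration of the blob-to-SharePoint copy loop. All identifiers are placeholders.
import requests
from azure.storage.blob import BlobServiceClient

GRAPH = "https://graph.microsoft.com/v1.0"
SITE_ID = "<sharepoint-site-id>"
TARGET_FOLDER = "Shared Documents/FromBlob"
token = "<graph-access-token>"          # e.g. acquired with msal or azure-identity

container = BlobServiceClient.from_connection_string(
    "<storage-connection-string>"
).get_container_client("<container-name>")

for blob in container.list_blobs():                          # 'List blobs'
    content = container.download_blob(blob.name).readall()   # 'Get blob content'
    # 'Create file': simple upload of the bytes into the SharePoint document library.
    url = f"{GRAPH}/sites/{SITE_ID}/drive/root:/{TARGET_FOLDER}/{blob.name}:/content"
    resp = requests.put(url, headers={"Authorization": f"Bearer {token}"}, data=content)
    resp.raise_for_status()
```

Blob names that contain folder segments may still need the matching SharePoint folders to be handled separately, which is the situation the comment below describes.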
I tried that; however, it failed, as it does not handle the case where you have a folder structure in the blob container and you want to mirror that structure in SharePoint and copy individual files into the matching folders.
