Is it possible to append only the newly created/arrived blob in Azure blob storage to the Azure SQL DB? - azure

Here is my problem in steps:
1. Uploading a specific CSV file via PowerShell with the Az module to a specific Azure blob container. [Done, works fine]
2. There is a trigger against this container which fires when a new file appears. [Done, works fine]
3. There is a pipeline connected to this trigger, which appends the fresh CSV to a specific SQL table. [Done, but not good]
I have a problem with step 3. I don't want to append all the CSVs in the container (which is how it works now); I only want to append the CSV that just arrived, i.e. the newest one in the container.
Okay, the solution is:
there is a built-in attribute in the trigger called @triggerBody().fileName.
Since it holds the name of the file which fired the trigger, I can pass it to the pipeline.
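As an illustration, a blob event trigger that passes the triggering file's name into a pipeline parameter might look roughly like this. The trigger, pipeline, container, and parameter names are placeholders, and the required `scope` property pointing at the storage account is omitted for brevity:

```json
{
  "name": "CsvArrivedTrigger",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/mycontainer/blobs/",
      "blobPathEndsWith": ".csv",
      "events": [ "Microsoft.Storage.BlobCreated" ]
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "AppendCsvPipeline",
          "type": "PipelineReference"
        },
        "parameters": {
          "file_name": "@triggerBody().fileName"
        }
      }
    ]
  }
}
```

With this mapping, each pipeline run receives only the name of the blob that fired the trigger, so only that CSV gets appended to the SQL table.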

I think you can use an event trigger and check the Blob created option.
Here is the official documentation about it; you can refer to this.

Related

When a Particular file is uploaded in blob storage trigger will invoke data pipeline

I want to solve a scenario where, when a particular file is uploaded into blob storage, a trigger is invoked and the pipeline runs.
I tried event-based triggers, but I don't know how to tackle this scenario.
I reproduced the same in my environment and got this output.
First, create a blob event trigger.
Note:
Blob path begins with ('/Container_Name/') – receives events from the container.
Blob path begins with ('/Container_Name/Folder_Name') – receives events from the folder_name folder inside the container_name container.
Continue and click on OK.
If you created a parameter (for example, in my scenario I created one called file_name), you can pass the value into it directly using @triggerBody().fileName, then publish the pipeline.
For more information follow this reference.
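To sketch how that parameter is consumed (the dataset and linked service names here are hypothetical): once file_name receives @triggerBody().fileName, a parameterized source dataset can point at exactly that blob.

```json
{
  "name": "SourceCsvDataset",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "AzureBlobStorageLS",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "file_name": { "type": "string" }
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "mycontainer",
        "fileName": {
          "value": "@dataset().file_name",
          "type": "Expression"
        }
      }
    }
  }
}
```

The copy activity then passes @pipeline().parameters.file_name into the dataset's file_name parameter.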

How to Trigger an email in the Event that a Logic App Trigger is skipped?

I have the below scenario:
I have a Logic App, which gets triggered once in every day(24hours).
It basically looks at a SFTP location, if there is file dropped in there, pulls it and pushes it into a BLOB storage and then deletes it from the source(SFTP).
I need to trigger an email in the events of:
If the Trigger is "Skipped", i.e. it ran but could not find any file in the SFTP.
If it failed to Upload to the BLOB Storage.
Is it possible to enable Email Trigger in the above scenarios?(1&2)
Any guidance will be appreciated as I am new in the IAC space.
Thanks in advance.
Firstly, you can list the files in the SFTP location and pass each name to retrieve the file contents, using the List files in folder and Get file content actions of the SFTP connector.
If the Trigger is "Skipped", i.e. it ran but could not find any file in the SFTP.
For this, in the next step you can use a Condition action to check whether a file has been uploaded that day, by comparing its last modified time with the current date. If yes, then create a file in blob storage with the file contents from the Get file content step. Below is the flow of my logic app.
If it failed to Upload to the BLOB Storage.
For this you can create another Condition action and check whether the file has been created by using actions('Create_blob_(V2)')['outputs']['statusCode']. Below is the complete code of my logic app.
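As a rough sketch of that check in workflow-definition JSON, assuming the Create blob action returns status 200 on success and an Office 365 mail connection exists in $connections (both are assumptions, not taken from the answer):

```json
{
  "Check_blob_upload": {
    "type": "If",
    "expression": {
      "and": [
        { "equals": [ "@outputs('Create_blob_(V2)')['statusCode']", 200 ] }
      ]
    },
    "actions": {},
    "else": {
      "actions": {
        "Send_failure_email": {
          "type": "ApiConnection",
          "inputs": {
            "host": {
              "connection": {
                "name": "@parameters('$connections')['office365']['connectionId']"
              }
            },
            "method": "post",
            "path": "/v2/Mail"
          }
        }
      }
    }
  }
}
```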

Azure Data Factory triggers for when blob files get appended

I am trying to copy Databricks logs from one folder to another. I am sending the Databricks logs to a storage account as append blobs, and my objective is to run the copy activity whenever a new blob is created or any file gets appended to.
I tried a storage events trigger, but it does not run when logs get appended to the same files. Is there any way to run the pipeline immediately if a file is appended, or a new folder in dd/mm/yyyy format gets created?
Thanks,
Anuj Gupta
There is no out-of-the-box method to trigger when a blob is appended; there is a similar ask here, and you can log a more precise one to get an official response.
Or you can use Create a custom event trigger to run a pipeline in Azure Data Factory with Azure Blob Storage as an Event Grid source where event Microsoft.Storage.BlobCreated is "triggered when a blob is created or replaced." (Append Block succeeds only if the blob already exists.)
Also, perhaps with Microsoft.Storage.BlobRenamed, Microsoft.Storage.DirectoryCreated, and Microsoft.Storage.DirectoryRenamed.
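If you wire the storage account to Event Grid yourself, you can additionally filter on which storage API raised the event via the data.api key from the blob event schema. A sketch of an event subscription follows; the webhook endpoint is a placeholder, and the listed API names are common block/ADLS write operations, not append-specific ones:

```json
{
  "properties": {
    "destination": {
      "endpointType": "WebHook",
      "properties": {
        "endpointUrl": "https://example.com/blob-events"
      }
    },
    "filter": {
      "includedEventTypes": [ "Microsoft.Storage.BlobCreated" ],
      "advancedFilters": [
        {
          "operatorType": "StringIn",
          "key": "data.api",
          "values": [ "PutBlob", "PutBlockList", "FlushWithClose" ]
        }
      ]
    }
  }
}
```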

Azure Data Factory Copy Data dynamically get last blob

I have an Azure Data Factory Pipeline that runs on a Blob Created Trigger, I want it to grab the last Blob added and copy that to the desired location.
How do I dynamically generate the file path for this outcome?
System Variables
Expressions and Functions
"@triggerBody().folderPath" and "@triggerBody().fileName" capture the path of the last created blob in an event trigger. You need to map your pipeline parameters to these two trigger properties. Please follow this link to do the parameter passing and referencing. Thanks.
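A sketch of the pipeline side, assuming parameters named sourceFolder and sourceFile that the trigger fills with @triggerBody().folderPath and @triggerBody().fileName (the dataset names are placeholders):

```json
{
  "name": "CopyLastBlobPipeline",
  "properties": {
    "parameters": {
      "sourceFolder": { "type": "string" },
      "sourceFile": { "type": "string" }
    },
    "activities": [
      {
        "name": "CopyLatestBlob",
        "type": "Copy",
        "inputs": [
          {
            "referenceName": "BlobSourceDataset",
            "type": "DatasetReference",
            "parameters": {
              "folderPath": "@pipeline().parameters.sourceFolder",
              "fileName": "@pipeline().parameters.sourceFile"
            }
          }
        ],
        "outputs": [
          { "referenceName": "SinkDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```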

Azure Data Factory: event not starting pipeline

I've set up an Azure Data Factory pipeline containing a copy activity. For testing purposes both source and sink are Azure Blob Storage.
I want to execute the pipeline as soon as a new file is created in the source Azure Blob Storage.
I've created a trigger of type BlobEventsTrigger. Blob path begins with has been set to //.
I use Cloud Storage Explorer to upload files, but it doesn't trigger my pipeline. To get an idea of what is wrong: how can I check whether the event is fired? Any idea what could be wrong?
Thanks
Reiterating what others have stated:
You must be using a V2 storage account.
The trigger name must only contain letters, numbers, and the '-' character (this restriction will soon be removed).
You must have registered your subscription with the Event Grid resource provider (this will be done for you via the UX soon).
The trigger makes the properties @triggerBody().folderPath and @triggerBody().fileName available. To use these in your pipeline you must map them to pipeline parameters and reference them as @pipeline().parameters.parameterName.
Finally, based on your configuration, setting blob path begins with to // will not match any blob event. The UX will actually show you an error message saying that that value is not valid. Please refer to the Event Based Trigger documentation for examples of valid configurations.
Please reference this. First, it needs to be a V2 storage account. Second, you need to register it with Event Grid.
https://social.msdn.microsoft.com/Forums/azure/en-US/db332ac9-2753-4a14-be5f-d23d60ff2164/azure-data-factorys-event-trigger-for-pipeline-not-working-for-blob-creation-deletion-most-of-the?forum=AzureDataFactory
There seems to be a bug with the blob storage trigger: if more than one trigger is allocated to the same blob container, none of the triggers will fire.
For some reason (another bug, but this time in Data Factory?), if you edit your trigger several times in the Data Factory window, Data Factory seems to lose track of the triggers it creates, and your single trigger may end up creating multiple duplicate triggers on the blob storage. This condition activates the first bug discussed above: the blob storage trigger doesn't fire anymore.
To fix this, delete the duplicate triggers. Navigate to your blob storage resource in the Azure portal and go to the Events blade. From there you'll see all the triggers that the data factories added to your blob storage. Delete the duplicates.
And now, on 20.06.2021, the same for me: the event trigger is not working, although when editing its definition in Data Factory it shows all my files in the folder that match. But when I add a new file to that folder, nothing happens!
If you're creating your trigger via ARM template, make sure you're aware of this bug. The "runtimeState" (aka "Activated") property of the trigger can only be set to "Stopped" via ARM template. The trigger will need to be activated via PowerShell or the ADF portal.
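For illustration, an ARM template fragment for such a trigger might look like this (the names and API version are assumptions). Per the bug above, the runtimeState value is not honored, so the trigger still has to be started afterwards, for example with the Start-AzDataFactoryV2Trigger PowerShell cmdlet or from the portal:

```json
{
  "type": "Microsoft.DataFactory/factories/triggers",
  "apiVersion": "2018-06-01",
  "name": "[concat(parameters('factoryName'), '/BlobCreatedTrigger')]",
  "properties": {
    "type": "BlobEventsTrigger",
    "runtimeState": "Started",
    "typeProperties": {
      "blobPathBeginsWith": "/mycontainer/blobs/",
      "events": [ "Microsoft.Storage.BlobCreated" ],
      "scope": "[parameters('storageAccountResourceId')]"
    }
  }
}
```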
The Event Grid resource provider needs to have been registered within the specific Azure subscription.
Also, if you use Synapse Studio pipelines instead of Data Factory (like me), make sure the Data Factory resource provider is also registered.
Finally, the user should have both the 'Owner' and 'Storage Blob Data Contributor' roles on the storage account.
