Azure Logic App action to run a pipeline in Azure Data Factory

I created an Azure Logic App to run actions from my application. My requirement is to copy a blob from Azure Blob Storage to a shared folder; for that, my team created an Azure Data Factory pipeline.
In the Logic App designer I use the trigger "When a blob is added or modified", and after that I need to run the pipeline in Azure Data Factory. The trigger fires successfully, but the run stops at the blob trigger and never reaches the second action.
Can you please give me guidance on how to resolve this issue?

Azure Data Factory provides a blob (storage event) trigger that does this automatically: when a new blob is added, your pipeline will run.
Data Factory Event Trigger
In this case you don't need a separate Logic App for the triggering event.
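If the trigger is being set up programmatically rather than in the UI, it can also be created through the Data Factory management REST API. The sketch below is only illustrative: the subscription, factory, storage account, container and pipeline names are placeholders, and it assumes the azure-identity library for acquiring the ARM token.

```java
import com.azure.core.credential.AccessToken;
import com.azure.core.credential.TokenRequestContext;
import com.azure.identity.DefaultAzureCredentialBuilder;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CreateBlobEventTrigger {
    public static void main(String[] args) throws Exception {
        // Placeholder identifiers -- replace with your own values.
        String url = "https://management.azure.com/subscriptions/<subscription-id>"
                + "/resourceGroups/<resource-group>/providers/Microsoft.DataFactory"
                + "/factories/<factory-name>/triggers/BlobCreatedTrigger?api-version=2018-06-01";

        // Minimal BlobEventsTrigger definition: run the pipeline when a blob is created under /input/blobs/.
        String body = """
            {
              "properties": {
                "type": "BlobEventsTrigger",
                "typeProperties": {
                  "blobPathBeginsWith": "/input/blobs/",
                  "ignoreEmptyBlobs": true,
                  "events": ["Microsoft.Storage.BlobCreated"],
                  "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
                },
                "pipelines": [
                  { "pipelineReference": { "referenceName": "<pipeline-name>", "type": "PipelineReference" } }
                ]
              }
            }
            """;

        // ARM bearer token via DefaultAzureCredential (env vars, managed identity, or az login).
        AccessToken token = new DefaultAzureCredentialBuilder().build()
                .getToken(new TokenRequestContext().addScopes("https://management.azure.com/.default"))
                .block();

        HttpRequest put = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Bearer " + token.getToken())
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(put, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());

        // The trigger is created in a stopped state; it still has to be started:
        // POST .../triggers/BlobCreatedTrigger/start?api-version=2018-06-01
    }
}
```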

Related

Switch between blobs to get content as an email attachment in an Azure Logic App

I have a Logic App that sends an email with an attachment when it is called from a pipeline in ADF; the attachment is the file the pipeline writes to blob storage.
I created the pipeline in the dev ADF and moved it to the prod ADF. The dev pipeline copies data to the dev blob storage and the prod pipeline copies to the prod blob storage, but the Logic App's "Get blob content" step connects to the dev blob. How can I switch the blob connection between dev and prod as required?
You cannot have two different storage accounts behind the Logic App blob connector based on the input of your HTTP trigger, because there is no way to set the connector's authentication dynamically at run time.
The alternative would be creating two different Logic Apps, one for dev and one for production. If you don't want to maintain two Logic Apps, then you cannot leverage the blob connector; instead, use the native HTTP connector and call the Storage REST API, so you can build the URL dynamically and set the authentication yourself.
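For illustration, the call the HTTP action would make is just a GET against a blob URL assembled from run-time values, which is exactly what the fixed blob connector cannot do. A minimal sketch, assuming SAS-token authentication and placeholder values:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class GetBlobViaRest {
    public static void main(String[] args) throws Exception {
        // These three values would arrive at run time (e.g. from the HTTP trigger payload).
        String account = "<dev-or-prod-storage-account>";
        String blobPath = "<container>/<folder>/<file-name>";
        String sasToken = "<sas-token-for-that-account>"; // e.g. "sv=...&sig=..."

        // "Get Blob" REST operation: a plain GET against the blob URL.
        String url = "https://" + account + ".blob.core.windows.net/" + blobPath + "?" + sasToken;

        HttpResponse<byte[]> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofByteArray());

        System.out.println("Status: " + response.statusCode() + ", bytes: " + response.body().length);
    }
}
```

In the Logic App, the HTTP action would build the same URL with expressions over the trigger inputs and supply the SAS (or an Authorization header) per environment.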

Log Analytics data export to storage account - all tables

I want to use Azure Log Analytics with the data export feature to export all log tables to a storage account. There used to be an '--export-all-tables' option, but annoyingly this has been removed.
Is there a way I can export all tables? Not just the ones that exist at the moment, but any future ones that may be created?
Azure Policy?
Azure Functions?
Azure Logic App?
We can archive the data with the help of a Logic App: run a query from the Logic App and use its output in other actions in the workflow. Here the Azure Blob Storage connector is used to send the query output to blob storage.
All that is needed is access to the Log Analytics workspace and the storage account.
To keep picking up new data, add a recurrence trigger to the Logic App so that it runs once a day (or as often as required).
After setting up the trigger, "Click + New step to add an action that runs after the recurrence action. Under Choose an action, type azure monitor and then select Azure Monitor Logs."
After configuring the query, add a Create blob action and attach it to the workflow.
Then run the Logic App and check the storage account for the exported logs.
See the Microsoft documentation to understand more: Archive data from Log Analytics workspace to Azure storage using Logic App.
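If it helps to see the same flow outside the designer, the sketch below does roughly what the workflow above does: run a query against the workspace and write the output to a blob, once per day. It is only a sketch: the workspace ID, query, storage account and container are placeholders, and it assumes the azure-monitor-query and azure-storage-blob libraries with an identity that has access to both resources.

```java
import com.azure.core.credential.TokenCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.monitor.query.LogsQueryClient;
import com.azure.monitor.query.LogsQueryClientBuilder;
import com.azure.monitor.query.models.LogsQueryResult;
import com.azure.monitor.query.models.QueryTimeInterval;
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobClientBuilder;

import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.time.LocalDate;
import java.util.stream.Collectors;

public class ArchiveLogAnalyticsToBlob {
    public static void main(String[] args) {
        TokenCredential credential = new DefaultAzureCredentialBuilder().build();

        // Run the query -- the equivalent of the "Azure Monitor Logs" action in the Logic App.
        LogsQueryClient logsClient = new LogsQueryClientBuilder().credential(credential).buildClient();
        LogsQueryResult result = logsClient.queryWorkspace(
                "<workspace-id>",                           // placeholder Log Analytics workspace GUID
                "AppRequests | take 1000",                  // placeholder query, one table per run
                new QueryTimeInterval(Duration.ofDays(1))); // last 24 hours, matching a daily recurrence

        // Flatten the rows to text -- the Logic App hands the query output to the blob action the same way.
        String payload = result.getTable().getRows().stream()
                .map(row -> row.getRow().stream()
                        .map(cell -> String.valueOf(cell.getValueAsString()))
                        .collect(Collectors.joining(",")))
                .collect(Collectors.joining("\n"));

        // Write one archive blob per day -- the equivalent of the "Create blob" action.
        BlobClient blob = new BlobClientBuilder()
                .endpoint("https://<storage-account>.blob.core.windows.net") // placeholder account
                .credential(credential)
                .containerName("loganalytics-archive")                       // placeholder container
                .blobName("AppRequests-" + LocalDate.now() + ".csv")
                .buildClient();
        byte[] bytes = payload.getBytes(StandardCharsets.UTF_8);
        blob.upload(new ByteArrayInputStream(bytes), bytes.length, true);
    }
}
```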

Trigger Azure Data Factory pipeline - blob upload to ADLS Gen2 (programmatically)

We are uploading files into Azure Data Lake Storage using the Azure SDK for Java. After a file is uploaded, an Azure Data Factory pipeline needs to be triggered; a Blob Created event trigger is attached to the pipeline.
The main problem is that after each file upload the pipeline is triggered twice.
To upload a file into ADLS Gen2, Azure provides a different SDK than for conventional blob storage.
The SDK uses the package azure-storage-file-datalake:
DataLakeFileSystemClient - to get the container (file system)
DataLakeDirectoryClient.createFile - to create the file // this call may be raising a Blob Created event
DataLakeFileClient.uploadFromFile - to upload the file content // this call may also be raising a Blob Created event
(See the sketch of this sequence below.)
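A minimal sketch of the sequence, with placeholder account, container and path names, assuming shared-key authentication:

```java
import com.azure.storage.common.StorageSharedKeyCredential;
import com.azure.storage.file.datalake.DataLakeDirectoryClient;
import com.azure.storage.file.datalake.DataLakeFileClient;
import com.azure.storage.file.datalake.DataLakeFileSystemClient;
import com.azure.storage.file.datalake.DataLakeServiceClient;
import com.azure.storage.file.datalake.DataLakeServiceClientBuilder;

public class AdlsGen2Upload {
    public static void main(String[] args) {
        DataLakeServiceClient service = new DataLakeServiceClientBuilder()
                .endpoint("https://<storage-account>.dfs.core.windows.net")
                .credential(new StorageSharedKeyCredential("<storage-account>", "<account-key>"))
                .buildClient();

        // Get the container (file system) and target directory.
        DataLakeFileSystemClient fileSystem = service.getFileSystemClient("<container>");
        DataLakeDirectoryClient directory = fileSystem.getDirectoryClient("incoming");

        // Step 1: create the (empty) file -- per the question, this already appears to raise a Blob Created event.
        DataLakeFileClient file = directory.createFile("data.csv", true);

        // Step 2: upload the content -- per the question, this appears to raise a second Blob Created event.
        file.uploadFromFile("/local/path/data.csv", true);
    }
}
```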
I suspect the ADF trigger has not been updated to capture the Blob Created event appropriately from ADLS Gen2.
Is there any option to achieve this? My organisation restricts the use of Azure Functions; otherwise an Azure Function could be triggered from a Storage Queue or Service Bus message and the ADF pipeline could be started using the Data Factory REST API.
You could try Azure Logic Apps with a blob trigger and a data factory action:
Trigger: When a blob is added or modified (properties only):
This operation triggers a flow when one or more blobs are added or modified in a container. This trigger will only fetch the file metadata. To get the file content, you can use the "Get file content" operation. The trigger does not fire if a file is added/updated in a subfolder. If it is required to trigger on subfolders, multiple triggers should be created.
Action: Get a pipeline run
Get a particular pipeline run execution
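The Data Factory side can also be driven through the pipeline-run REST API mentioned in the question. A hedged sketch (placeholder subscription, resource group, factory, pipeline and run IDs; azure-identity assumed for the ARM token) that starts a run and then fetches its status, the REST counterpart of the connector actions above:

```java
import com.azure.core.credential.AccessToken;
import com.azure.core.credential.TokenRequestContext;
import com.azure.identity.DefaultAzureCredentialBuilder;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RunAdfPipeline {
    public static void main(String[] args) throws Exception {
        String factoryBase = "https://management.azure.com/subscriptions/<subscription-id>"
                + "/resourceGroups/<resource-group>/providers/Microsoft.DataFactory/factories/<factory-name>";

        AccessToken token = new DefaultAzureCredentialBuilder().build()
                .getToken(new TokenRequestContext().addScopes("https://management.azure.com/.default"))
                .block();
        HttpClient http = HttpClient.newHttpClient();

        // Start the pipeline: POST .../pipelines/{pipelineName}/createRun
        HttpRequest createRun = HttpRequest.newBuilder(
                        URI.create(factoryBase + "/pipelines/<pipeline-name>/createRun?api-version=2018-06-01"))
                .header("Authorization", "Bearer " + token.getToken())
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{}")) // pipeline parameters, if any, go in this body
                .build();
        String createRunBody = http.send(createRun, HttpResponse.BodyHandlers.ofString()).body();
        System.out.println("createRun response: " + createRunBody); // contains {"runId": "..."}

        // Check the run afterwards: GET .../pipelineruns/{runId} (what a "Get a pipeline run" action reports).
        String runId = "<run-id-from-the-response-above>"; // placeholder: parse it from createRunBody
        HttpRequest getRun = HttpRequest.newBuilder(
                        URI.create(factoryBase + "/pipelineruns/" + runId + "?api-version=2018-06-01"))
                .header("Authorization", "Bearer " + token.getToken())
                .GET()
                .build();
        System.out.println(http.send(getRun, HttpResponse.BodyHandlers.ofString()).body());
    }
}
```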
Hope this helps.

Unable to publish Azure Data Factory pipeline changes

I have created a simple Data Factory pipeline for copying files from Azure Blob Storage to Azure Data Lake. For this I used an event-based trigger, which runs the pipeline automatically when a new blob arrives at the blob storage location. If I publish the pipeline with the trigger in a stopped state, it publishes without any error. But when I publish with the trigger in a started state, I get the following error:
Response with status: 403 Forbidden for URL:
https://management.azure.com/subscriptions/subscriptionId/providers/Microsoft.EventGrid/register?api-version=2018-02-01
As event-based triggers are fairly new in ADF, I have been unable to find any blogs related to this. Please help me; any help will be appreciated.
I think you will have to stop your trigger first; tumbling window and schedule triggers also need to be stopped and then updated.
Make sure that your subscription is registered with the Event Grid resource provider.
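The registration itself can be done from the portal, with the Azure CLI (az provider register --namespace Microsoft.EventGrid), or programmatically by an identity that has permission to register providers on the subscription, which is what the 403 above suggests was missing. A small sketch using the azure-resourcemanager fluent SDK with a placeholder subscription ID:

```java
import com.azure.core.management.AzureEnvironment;
import com.azure.core.management.profile.AzureProfile;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.resourcemanager.AzureResourceManager;

public class RegisterEventGridProvider {
    public static void main(String[] args) {
        AzureResourceManager azure = AzureResourceManager
                .authenticate(new DefaultAzureCredentialBuilder().build(),
                              new AzureProfile(AzureEnvironment.AZURE))
                .withSubscription("<subscription-id>"); // placeholder

        // One-time registration of the resource provider that ADF event triggers depend on.
        azure.providers().register("Microsoft.EventGrid");
        System.out.println("Registration state: "
                + azure.providers().getByName("Microsoft.EventGrid").registrationState());
    }
}
```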

Launch Azure Container Service on Upload to Blob Storage

I have a use case where I'd like to launch a job on an Azure Container Service cluster to process a file being uploaded to Blob storage. I know that I can trigger an Azure Functions instance from the upload, but I haven't been able to find examples in the documentation of starting a job within Functions.
(Diagram omitted: the AWS equivalent of this workflow.)
Thanks!
The Azure Event Grid feature is what you need. It is still in preview, but you can subscribe to the Blob Created event. You can set the subscriber endpoint to an Azure Function that puts a message in a queue to trigger your job, or you can expose a service on your cluster that will accept the request and do whatever you need done.
Microsoft provides a guide at https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-quickstart?toc=%2fazure%2fevent-grid%2ftoc.json#create-a-message-endpoint
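As a sketch of the first option, the Event Grid subscription's endpoint could be a small Azure Function that simply drops the Blob Created event onto a storage queue for the cluster job to pick up. This is illustrative only: the function and queue names are made up, and it assumes the Java Functions library (azure-functions-java-library):

```java
import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.OutputBinding;
import com.microsoft.azure.functions.annotation.EventGridTrigger;
import com.microsoft.azure.functions.annotation.FunctionName;
import com.microsoft.azure.functions.annotation.QueueOutput;

public class BlobCreatedHandler {
    // Invoked when the Event Grid subscription delivers a Blob Created event;
    // forwards the raw event JSON to a storage queue that the cluster job consumes.
    @FunctionName("OnBlobCreated")
    public void run(
            @EventGridTrigger(name = "event") String event,
            @QueueOutput(name = "job", queueName = "acs-jobs", connection = "AzureWebJobsStorage")
            OutputBinding<String> job,
            ExecutionContext context) {
        context.getLogger().info("Blob created event received: " + event);
        job.setValue(event); // "acs-jobs" is a hypothetical queue name
    }
}
```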
