Dynamically send blobs from Azure Storage account to email using Azure Logic Apps

I have files being stored in my Azure Blob Storage account regularly, and I want to send the contents of these blob files as email attachments. I have set up a workflow using Azure Logic Apps.
Here's the workflow:
I am able to send the blob file, but I always have to specify the file name.
Is there any way to get all the file contents dynamically, without having to specify the name manually?

Just add the Get Blob Metadata using path action to get the name of your file:
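If you ever need the same loop outside the Logic Apps designer, here is a minimal C# sketch (an assumption, not part of the original workflow) that uses the Azure.Storage.Blobs SDK to enumerate every blob in a container and read its content without hard-coding names; the connection string and container name are placeholders:

    using Azure.Storage.Blobs;

    // Placeholder connection details; substitute your own.
    string connectionString = "<storage-connection-string>";
    var container = new BlobContainerClient(connectionString, "incoming-files");

    // Enumerate every blob in the container -- no hard-coded file names.
    await foreach (var item in container.GetBlobsAsync())
    {
        var blobClient = container.GetBlobClient(item.Name);
        var download = await blobClient.DownloadContentAsync();
        byte[] bytes = download.Value.Content.ToArray();
        // item.Name and bytes can now be attached to an email
        // (e.g. via SendGrid or an SMTP client).
    }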

Related

How to get a large file into an Azure Storage Account by URL?

I am trying to copy a file from a URL into an Azure Storage Account.
Microsoft has a doc for exactly this use case with Azure Data Factory (ADF) via these instructions. I tried using a test file from here (or anywhere, for that matter).
In ADF, I set up the Copy activity with an HTTP Linked Service pointing to one of the .bin files at the above URL, using a GET request and Binary copy (as-is file copy). I can hit the Test Connection button on both the source URL and the destination Linked Service that points to an Azure Storage Account, and both succeed.
However, after running the ADF pipeline, the resulting file has a name that appears to be a random GUID, and it is corrupted (the contents are garbled).
What am I missing here? Is there some other way to load a giant file by URL into an Azure Storage container?
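No answer is recorded here, but as a point of reference, one code-based alternative to ADF for this (a sketch under assumed names, not a diagnosis of the pipeline issue) is the server-side copy-from-URL API in Azure.Storage.Blobs, which also lets you name the destination blob explicitly instead of getting a GUID:

    using System;
    using Azure.Storage.Blobs;

    // Placeholders: source URL, connection string, container and blob names.
    var sourceUri = new Uri("https://example.com/bigfile.bin");
    var container = new BlobContainerClient("<storage-connection-string>", "downloads");
    await container.CreateIfNotExistsAsync();

    // Naming the destination blob explicitly avoids a random GUID name.
    var destination = container.GetBlobClient("bigfile.bin");

    // Server-side asynchronous copy: the bytes never pass through this
    // machine, which matters for giant files.
    var copy = await destination.StartCopyFromUriAsync(sourceUri);
    await copy.WaitForCompletionAsync();

Note the source URL must be publicly reachable (or carry a SAS) for the service-side copy to work.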

Upload a file with content to Azure Data Lake Storage

The common process to upload a file to the storage is to:
create a new file
append content
flush data
My problem is that the storage raises the create-file event (which Databricks listens for) at the create step, and the files are not "consumed" after the data flush.
Is it possible to create/upload a file together with its content, like the upload-file functionality in the Azure Portal?
You can also achieve this with Azure Logic Apps, using the When a blob is added or modified (properties only) (V2) trigger.
It turns out that the Flush operation has a close parameter, set to false by default.
When all data has been appended to the file, Flush should be performed with close set to true.
That signals that the file has been fully uploaded, and the Storage account event will then be triggered.
More info: https://learn.microsoft.com/en-us/dotnet/api/azure.storage.files.datalake.datalakefileclient.flushasync?view=azure-dotnet#parameters
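In SDK terms the same idea looks roughly like this (a minimal C# sketch with Azure.Storage.Files.DataLake; the connection string, file system, and path are placeholders). Note that DataLakeFileClient.UploadAsync performs create + append + flush-with-close in a single call, which is the closest code equivalent to the portal's upload:

    using System.IO;
    using System.Text;
    using Azure.Storage.Files.DataLake;
    using Azure.Storage.Files.DataLake.Models;

    var fileClient = new DataLakeFileClient(
        "<storage-connection-string>", "my-filesystem", "folder/data.csv");
    byte[] payload = Encoding.UTF8.GetBytes("col1,col2\n1,2\n");

    // Manual three-step flow: create, append, flush.
    await fileClient.CreateAsync();
    using var stream = new MemoryStream(payload);
    await fileClient.AppendAsync(stream, offset: 0);

    // Close = true marks the file as fully uploaded; only then is the
    // "file created" storage event raised for listeners like Databricks.
    await fileClient.FlushAsync(payload.Length,
        new DataLakeFileFlushOptions { Close = true });

    // Alternative: one call that creates, appends, and flushes with close.
    using var stream2 = new MemoryStream(payload);
    await fileClient.UploadAsync(stream2, overwrite: true);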

How to get an Excel file from the web and store it in Azure Blob Storage

I have an ADF pipeline that processes an Excel file in Azure Blob Storage. The Excel file is actually downloaded from
here and then manually uploaded to Azure Blob Storage.
I want to automate this process of downloading the Excel file from the link and then loading it into Azure Blob Storage. Is there any way to do this using ADF or any other Azure service?
The no-code option that comes to mind is Logic Apps.
Your Logic App will look like this: after the trigger, you will need an HTTP action followed by a Create blob action to copy that content into your storage account.
Your Create blob step will look like this: the blob content will be the response body of the previous HTTP request.
You can have this scheduled at a regular interval.
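If you later prefer code over the designer (for example, inside a timer-triggered Azure Function), the same HTTP-then-Create-blob flow is only a few lines of C#; the URL, connection string, and names below are placeholders:

    using System.Net.Http;
    using Azure.Storage.Blobs;

    // Download the file over HTTP and stream it straight into Blob Storage.
    using var http = new HttpClient();
    using var download = await http.GetStreamAsync("https://example.com/report.xlsx");

    var blob = new BlobClient("<storage-connection-string>", "reports", "report.xlsx");
    await blob.UploadAsync(download, overwrite: true);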

How to get and parse a .xlsx file in an Azure Function called from an Azure Logic App?

I am working on a logic app that gets a file from Azure File Shares, adds it to Azure Blob Storage, and then calls an Azure Function that receives that blob (an .xlsx file).
After the blob is created, the Azure Function will parse the data in the blob and insert it into an MS Dynamics CRM entity.
My question is: how can I access the blob in the Azure Function and parse it so I can get the data that will be stored in the entity?
I have successfully created the logic app that performs the steps above:
From your screenshot, it seems your logic app triggers the function over HTTP. So you can use the Azure Functions blob input binding to access the blob in storage:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-input?tabs=csharp
(The blob name comes in as JSON, so the binding can pick up the name directly.)
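A minimal sketch of that binding (in-process C# model; the container name uploads and the JSON property blobName are assumptions about your payload):

    using System.IO;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public static class ParseExcel
    {
        // {blobName} is resolved from the JSON body of the HTTP request,
        // e.g. { "blobName": "myfile.xlsx" } sent by the logic app.
        [FunctionName("ParseExcel")]
        public static IActionResult Run(
            [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
            [Blob("uploads/{blobName}", FileAccess.Read)] Stream excelBlob)
        {
            // excelBlob now holds the .xlsx content; hand it to whatever
            // spreadsheet library you prefer (EPPlus, ClosedXML, Open XML, ...)
            // and push the parsed rows to Dynamics CRM from here.
            return new OkObjectResult($"Received {excelBlob.Length} bytes");
        }
    }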
Regarding parsing, there are plenty of code samples that can be used as a reference, so there is no need to include one here. If you need one, let me know (and please mention the language you are using).

How to move files attached to mail to a VM on Azure

So I'm new to Azure, and I need to create a service that, given an Office 365 email account (a subscriber), will automatically move files attached to the subscriber's new mails
to a VM on Azure and then run some tests on them there (inside the VM).
The only way I have found to implement this so far is creating a Logic App for each subscriber, which is done manually.
Any help would be appreciated!
A few things to get you started:
Create a logic app that stores attachments in a database when a new email is received for a specific user.
Add some parameters to your logic app so the user email/credentials/tenant are not hard-coded:
https://blog.mexia.com.au/preparing-azure-logic-apps-for-cicd
Create an ARM template to deploy this logic app.
Create another logic app that will deploy the previous logic app.
Whenever a new user is created, call the second logic app.
Also, do you really need to store your files in a database? As an alternative, you can use Azure Blob Storage to store all these files.
EDIT
If you need to move the files to a VM, I would suggest you do this:
When you receive an email:
Store the attachments in Blob Storage.
Generate a SAS token (with Read permission).
Put the URL + SAS token of your file into an Azure Service Bus queue.
On the VM, have a service that reads messages from the queue and downloads the files (see the sketch below).
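A rough C# sketch of the SAS generation and the VM-side consumer (queue name, paths, and connection strings are placeholders; the message body is assumed to be the blob's full SAS URL):

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Azure.Messaging.ServiceBus;
    using Azure.Storage.Blobs;
    using Azure.Storage.Sas;

    // Producer side: after saving an attachment to Blob Storage, generate
    // a read-only SAS URL for it and enqueue that URL.
    var attachment = new BlobClient("<storage-connection-string>", "attachments", "invoice.pdf");
    Uri sasUrl = attachment.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddHours(1));
    // ... send sasUrl.ToString() to the Service Bus queue here ...

    // Consumer side, running on the VM: read SAS URLs from the queue and
    // download each file locally before running the tests on it.
    await using var sbClient = new ServiceBusClient("<servicebus-connection-string>");
    ServiceBusProcessor processor = sbClient.CreateProcessor("attachments-queue");

    processor.ProcessMessageAsync += async args =>
    {
        string url = args.Message.Body.ToString();  // the SAS URL
        var source = new BlobClient(new Uri(url));  // SAS carries the read permission
        string localPath = Path.Combine(@"C:\incoming", Path.GetFileName(source.Name));
        await source.DownloadToAsync(localPath);
        await args.CompleteMessageAsync(args.Message);
    };

    processor.ProcessErrorAsync += args =>
    {
        Console.WriteLine(args.Exception);
        return Task.CompletedTask;
    };

    await processor.StartProcessingAsync();
    Console.ReadLine(); // keep the worker alive while tests run
    await processor.StopProcessingAsync();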
