How to move files attached to mail to a VM on Azure

I'm new to Azure and I need to create a service that, given an Office 365 email address (a subscriber), will automatically move the files attached to the subscriber's new emails to a VM on Azure and then run some tests on them there (inside the VM).
The only way I have found to implement this so far is creating a logic app for each subscriber, which has to be done manually.
Any help would be appreciated!

A few things to get started:
Create a logic app that stores attachments in the database when a new email is received for a specific user.
Add some parameters to your logic app so the user email/credentials/tenant are not hard-coded.
https://blog.mexia.com.au/preparing-azure-logic-apps-for-cicd
Create an ARM Template to deploy this logic app.
Create another logic app that will deploy the previous logic app.
Whenever a new user is created, call the second logic app (see the deployment sketch below).
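For illustration, here is a minimal Python sketch of what that "deploy the previous logic app" step could look like, assuming the ARM template is loaded as a dict and exposes a hypothetical userEmail parameter; the resource group and deployment names are placeholders, not something prescribed by the answer above:

```python
# Minimal sketch: deploy the parameterized logic app ARM template for a new
# subscriber. Resource group, template dict, and the "userEmail" parameter
# are assumptions/placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import Deployment, DeploymentProperties

subscription_id = "<subscription-id>"
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

def deploy_logic_app_for(user_email: str, template: dict) -> None:
    deployment = Deployment(
        properties=DeploymentProperties(
            mode="Incremental",
            template=template,
            parameters={"userEmail": {"value": user_email}},
        )
    )
    poller = client.deployments.begin_create_or_update(
        "my-logicapps-rg",                       # assumed resource group
        f"logicapp-{user_email.split('@')[0]}",  # one deployment per subscriber
        deployment,
    )
    poller.result()  # block until the deployment finishes
```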
Also, do you really need to store your files in a database? As an alternative, you can use Azure Blob Storage to store all these files.
EDIT
If you need to move the files to a VM, I would suggest the following:
When you receive an email:
Store the attachments in Blob Storage.
Generate a SAS token (with Read permission).
Put the URL + SAS token of your file into an Azure Service Bus queue.
On the VM, have a service that reads messages from the queue and downloads the files.
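A minimal Python sketch of that VM-side service, assuming the Logic App puts the full blob URL (including the SAS token) as the message body into a queue named "attachments"; the connection string and queue name are placeholders:

```python
# Minimal sketch: read SAS URLs from a Service Bus queue and download the files.
import requests
from azure.servicebus import ServiceBusClient

SERVICEBUS_CONN_STR = "<service-bus-connection-string>"
QUEUE_NAME = "attachments"

def run_worker():
    with ServiceBusClient.from_connection_string(SERVICEBUS_CONN_STR) as client:
        with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
            for message in receiver:
                sas_url = str(message)  # message body is the blob URL + SAS token
                local_path = sas_url.split("?")[0].rsplit("/", 1)[-1]
                response = requests.get(sas_url, timeout=60)
                response.raise_for_status()
                with open(local_path, "wb") as f:
                    f.write(response.content)
                # run your tests on local_path here, then settle the message
                receiver.complete_message(message)

if __name__ == "__main__":
    run_worker()
```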

Related

How to push a file from Azure Data Factory onto a MS Teams Channel

At the moment, I routinely have to upload a file to a Teams channel manually.
I have managed to create a pipeline to upload the file into my Azure Data Lake. I would now like to push the file from my Azure environment to my Teams channel. I have found that webhooks cannot work with files, and that bots can send files in a chat but not upload them into a channel.
Is there a way to upload files from Azure to MS Teams using Data Factory or other alternatives?
Thank you.
The most appropriate way to partially achieve this would be to use a Teams Adaptive Card Connector, but I couldn't find a way to set it up easily; it seems quite complex. A second option would be to use a Teams Post Message Connector from an Azure Logic App, but it might not yet support attachments; you can, however, send links to files stored elsewhere (SharePoint, Blob Storage, etc.). Here's what you can try in the meantime.
In ADF:
Create a Copy Activity that will copy the files to a storage account.
Create a Web Activity that will send a notification to a Logic App when the pipeline runs. You only need to use this as a trigger for the Logic App.
In Logic Apps:
Create an HTTP Connector that will receive the pipeline run information and store the files using the Azure Storage Blob Create blob action.
After that, create a Microsoft Teams Post a message (V3) Teams Connector and choose your team and channel.
Use the Get blob content using path action to get a URL to the file - you might need to construct this URL.
Create your message using variables from the above connector and include a link to the file to be downloaded.
Depending on the content of the file, you could also try to retrieve partial content and display it directly in the message body (if that is an acceptable alternative for you).
I've not tried using the Adaptive Card Connector but I know it does give you a far richer dynamic experience. You would need to spend some time to design a custom card solution. Use this playground to see if it's something you can explore in the future.
The flow is as follows and kinda runs in parallel:
ADF Pipeline runs > ADF Copy Activity saves file in Blob storage
ADF Pipeline runs > ADF Web Activity triggers Logic App HTTP Connector > Logic App retrieves file from Blob storage > Sends a message to a Teams channel with a link to the file.
Here are all the supported Teams Actions.
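For illustration, a minimal Python sketch of what the Web Activity step amounts to: POST the pipeline run information to the Logic App's HTTP (request) trigger. The trigger URL, field names, and blob path below are hypothetical placeholders, not values ADF or the Logic App defines for you:

```python
# Minimal sketch: simulate the ADF Web Activity calling the Logic App trigger.
import requests

LOGIC_APP_TRIGGER_URL = "<logic-app-http-trigger-url>"

payload = {
    "pipelineName": "CopyFileToTeamsStorage",   # e.g. @pipeline().Pipeline in ADF
    "runId": "<pipeline-run-id>",               # e.g. @pipeline().RunId
    "blobPath": "container/folder/report.csv",  # path the Copy Activity wrote to
}

response = requests.post(LOGIC_APP_TRIGGER_URL, json=payload, timeout=30)
response.raise_for_status()
print("Logic App accepted the trigger:", response.status_code)
```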

Is there a way to set up a Snowpipe on multiple Azure Storage Accounts?

My company set up multiple storage accounts (about 30) for an app with the same container name on the same Azure tenant ID (ex. azure://myaccount1.blob.core.windows.net/samecontainername,
azure://myaccount2.blob.core.windows.net/samecontainername,
azure://myaccount3.blob.core.windows.net/samecontainername)
I've created Snowpipes in the past and the process was:
Create a queue on the account and container
Create an Event Grid linked to that account and container and with a destination set as the created queue
Create a Notification Integration in Snowflake (to a queue related to that account and container)
Authenticate the service principal app Snowflake created (if it is the first time on that tenant ID)
Create a stage using the account and blob url (azure://myaccount2.blob.core.windows.net/samecontainername) and using the notification integration created
Create a pipe using the notification integration created and copy from the stage created into a variant table (for Json files)
The question is: is there a way to simplify this so I won't have to do the steps above 30 times? Each account contains the same container name.
It seems not, as a Snowpipe can have only one NOTIFICATION INTEGRATION per queue. However, we have set up a copy from all accounts into a single account not used for PROD, on which we set up a NOTIFICATION INTEGRATION to Snowflake and then a PIPE using that integration.
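For illustration, a minimal Python sketch of one way to do that fan-in, assuming blobs from each of the ~30 source accounts are server-side copied into the single non-PROD account that has the notification integration; account names, container name, and the SAS token are placeholders:

```python
# Minimal sketch: consolidate blobs from many storage accounts into one account.
from azure.storage.blob import BlobServiceClient

SOURCE_ACCOUNTS = [f"myaccount{i}" for i in range(1, 31)]
CONTAINER = "samecontainername"
DEST_CONN_STR = "<destination-account-connection-string>"
SOURCE_SAS = "<sas-token-with-read-and-list-permission>"

dest = BlobServiceClient.from_connection_string(DEST_CONN_STR)
dest_container = dest.get_container_client(CONTAINER)

for account in SOURCE_ACCOUNTS:
    src = BlobServiceClient(
        account_url=f"https://{account}.blob.core.windows.net",
        credential=SOURCE_SAS,
    )
    for blob in src.get_container_client(CONTAINER).list_blobs():
        source_url = (
            f"https://{account}.blob.core.windows.net/{CONTAINER}/{blob.name}?{SOURCE_SAS}"
        )
        # Server-side async copy into the single account the Snowpipe watches,
        # prefixed with the source account name to avoid collisions.
        dest_container.get_blob_client(f"{account}/{blob.name}").start_copy_from_url(source_url)
```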

Dynamically send blobs from azure storage account to email using azure logic apps

I have files getting stored in my Azure Blob storage account regularly. I want to send these blob file contents as attachments via email. I have established a workflow using Azure Logic Apps.
Here's the workflow:
I am able to send the blob file, but I always have to specify the name of the file.
Is there any way I can get all the file contents dynamically without having to specify the name manually?
Just add the Get Blob Metadata using path action to get the name of your file.
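If you would rather enumerate the blobs yourself instead of hard-coding names, here is a minimal Python sketch with a placeholder connection string and container name; it simply lists every blob in the container and downloads its content:

```python
# Minimal sketch: enumerate blobs dynamically so no file name is hard-coded.
from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-account-connection-string>"
CONTAINER = "attachments"

service = BlobServiceClient.from_connection_string(CONN_STR)
container = service.get_container_client(CONTAINER)

for blob in container.list_blobs():
    content = container.download_blob(blob.name).readall()
    print(blob.name, len(content), "bytes")  # attach `content` to the email here
```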

Is there a way to know if a file has been uploaded to a network drive in Azure

I have a network location where a CSV file gets dumped every hour. I need to copy that file to an Azure blob. How do I know that a file has been uploaded to that network drive? Is there something like a file watcher in Azure which monitors this network location? Also, is it possible to copy a file from a network location to an Azure blob through code?
I'm using .net core APIs deployed to an Azure App Service.
Please suggest a possible solution.
You could use Azure Event Grid, but as of today Event Grid does not support Azure file shares.
As your file share is on-prem, the only way I see is to write a custom publisher that runs on-prem and sends the event to Azure Event Grid, plus a subscriber (which can be an Azure Function) that does the work you want it to do.
https://learn.microsoft.com/en-us/azure/event-grid/custom-event-quickstart-portal
But that will only be an event, not the file itself that was added/changed; to process the file you will also have to upload it into Azure. Since the above approach requires you to do two things, I would recommend running custom code on-prem as a cron-like job that looks for new or edited files, uploads them to Azure Blob Storage, and then executes an Azure Function to do your processing task.
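A minimal Python sketch of that cron-like approach, assuming the network share is reachable as a UNC path; the share path, connection string, and container name are placeholders:

```python
# Minimal sketch: poll the network share for new CSV files and upload them to Blob Storage.
import time
from pathlib import Path
from azure.storage.blob import BlobServiceClient

SHARE_PATH = Path(r"\\fileserver\dropfolder")     # placeholder UNC path
CONN_STR = "<storage-account-connection-string>"
CONTAINER = "incoming-csv"

container = BlobServiceClient.from_connection_string(CONN_STR).get_container_client(CONTAINER)
seen = set()

while True:
    for csv_file in SHARE_PATH.glob("*.csv"):
        if csv_file.name not in seen:
            with open(csv_file, "rb") as data:
                container.upload_blob(name=csv_file.name, data=data, overwrite=True)
            seen.add(csv_file.name)
            # trigger your Azure Function / further processing here
    time.sleep(60)  # check roughly once a minute; the file arrives hourly
```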
Since the files are on-prem, you can use PowerShell to monitor a folder for new files and then fire an event to upload the file to an Azure blob.
There is a video showing how to do this here: https://www.youtube.com/watch?v=Usih7UywZYA
The changes you need to make are:
Replace the action with an upload to Azure: https://argonsys.com/microsoft-cloud/library/how-to-upload-files-to-azure-blob-storage-using-powershell-and-azcopy/
Run PowerShell in the context of a user that can upload files.

Can azure event hub ingest json events from azure blob storage without writing any code?

Is it possible to use some ready made construct in azure cloud environment to ingest the events (in json format) that are currently stored in azure blob storage and have it submit those events directly to azure event hub without writing any (however small) custom code? In other words, I would like to use configuration driven approach only.
Sure. You can try to use Azure Logic Apps to meet your needs without any code, or with just some function expressions; please refer to the official documentation for Azure Logic Apps for more details.
The logic flow is as shown in the figure below.
You can refer to my sample below to make it work.
Here is my sample, which receives an event from my Event Hub and creates a new blob in Azure Blob Storage to store the event data.
Create an Azure Logic App instance in the Azure portal; it should be easy for you.
Move to the tab Logic app designer to configure the logic flow.
Click the Save and Run buttons. Then, use ServiceBusExplorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and check whether a new blob was created using Azure Storage Explorer. It worked fine after a few minutes.
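If you ever want to send that test event without ServiceBusExplorer, here is a minimal Python sketch that publishes one JSON event to the Event Hub; the connection string and hub name are placeholders:

```python
# Minimal sketch: send a single JSON test event to an Event Hub.
import json
from azure.eventhub import EventHubProducerClient, EventData

CONN_STR = "<event-hub-namespace-connection-string>"
HUB_NAME = "myeventhub"

producer = EventHubProducerClient.from_connection_string(CONN_STR, eventhub_name=HUB_NAME)
with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps({"id": 1, "status": "test"})))
    producer.send_batch(batch)  # the Logic App trigger should fire on this event
```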
