Is there a way to set up a Snowpipe on multiple Azure Storage Accounts? - azure

My company set up multiple storage accounts (about 30) for an app, all with the same container name and on the same Azure tenant ID (e.g. azure://myaccount1.blob.core.windows.net/samecontainername,
azure://myaccount2.blob.core.windows.net/samecontainername,
azure://myaccount3.blob.core.windows.net/samecontainername).
I've created Snowpipes in the past and the process was:
Create a queue on the storage account
Create an Event Grid subscription on that account and container, with the created queue as the destination
Create a Notification Integration in Snowflake (pointing to the queue for that account and container)
Authenticate the service principal app Snowflake created (if it is the first time on that tenant ID)
Create a stage using the account and blob URL (azure://myaccount2.blob.core.windows.net/samecontainername) and using the notification integration created
Create a pipe using the notification integration created, copying from the stage into a VARIANT table (for JSON files)
The question is: is there a way to simplify this so I won't have to do the steps above 30 times? Each account uses the same container name.

It seems not, as a Snowpipe NOTIFICATION INTEGRATION is tied to a single queue. However, we set up a copy from all accounts into one account that is not used for PROD, created a single NOTIFICATION INTEGRATION to Snowflake on that account, and then a PIPE using that integration.
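Even if the integrations cannot be consolidated, the 30 repetitions of the Snowflake side can at least be scripted. A minimal sketch using the Snowflake Python connector; the integration/stage/pipe naming scheme, the queue name, the credentials and the target table are all hypothetical placeholders:

```python
# Sketch only: loop over the ~30 storage accounts and create the notification
# integration, stage and auto-ingest pipe for each one. All names below
# (queue name, SAS token, raw_json_table, etc.) are placeholders.
import snowflake.connector

ACCOUNTS = [f"myaccount{i}" for i in range(1, 31)]   # the ~30 storage accounts
TENANT_ID = "<azure-tenant-id>"
CONTAINER = "samecontainername"

conn = snowflake.connector.connect(
    account="<snowflake-account>", user="<user>", password="<password>",
    role="<role>", warehouse="<warehouse>", database="<db>", schema="<schema>",
)
cur = conn.cursor()

for acct in ACCOUNTS:
    name = acct.upper()
    # One notification integration per storage queue (the limitation noted above).
    cur.execute(f"""
        CREATE NOTIFICATION INTEGRATION NI_{name}
          ENABLED = TRUE
          TYPE = QUEUE
          NOTIFICATION_PROVIDER = AZURE_STORAGE_QUEUE
          AZURE_STORAGE_QUEUE_PRIMARY_URI = 'https://{acct}.queue.core.windows.net/snowpipequeue'
          AZURE_TENANT_ID = '{TENANT_ID}'""")
    # Stage pointing at the account's container (SAS credentials shown as a
    # placeholder; a storage integration could be used instead).
    cur.execute(f"""
        CREATE STAGE STG_{name}
          URL = 'azure://{acct}.blob.core.windows.net/{CONTAINER}'
          CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')""")
    # Auto-ingest pipe copying the JSON files into a single-VARIANT-column table.
    cur.execute(f"""
        CREATE PIPE PIPE_{name}
          AUTO_INGEST = TRUE
          INTEGRATION = 'NI_{name}'
          AS COPY INTO raw_json_table FROM @STG_{name}
             FILE_FORMAT = (TYPE = 'JSON')""")

cur.close()
conn.close()
```

The Azure-side queue and Event Grid subscription for each account would still need to be created, but that part can be scripted as well (for example with the Azure CLI or an ARM template), and the service principal consent only has to be done once per tenant.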

Related

Switch between blobs to get content as email attachment in Azure Logic App

I have a design in Logic App which sends an email with an attachment when called from a pipeline in ADF. The attachment is the file the pipeline writes to blob storage. Below is the design.
I created the pipeline in dev ADF and moved it to prod ADF. In dev ADF the pipeline copies data to the dev blob and the prod pipeline copies to the prod blob, but the Logic App's Get blob content step connects to the dev blob. How can I switch the blob connection between dev and prod as required?
You cannot target two different storage accounts with the Logic App blob connector based on the input of your HTTP trigger, as there is no way to set the connector's authentication dynamically at run time.
One alternative is to create two different Logic Apps for dev and production. If you don't want to create two Logic Apps, then you cannot use the blob connector; instead, use the native HTTP connector and call the Storage REST API, so you can build the URL dynamically and set the authentication yourself.
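For illustration, a rough Python sketch of the Get Blob REST call that the HTTP action would perform, with the environment chosen at run time; the account names and SAS tokens are placeholders:

```python
# Sketch of the underlying Storage REST call (Get Blob) that the Logic App
# HTTP action would make. Account names and SAS tokens are placeholders; in
# the Logic App these would come from trigger inputs or parameters.
import requests

ACCOUNT_BY_ENV = {"dev": "mydevstorage", "prod": "myprodstorage"}
SAS_BY_ENV = {"dev": "sv=<dev-sas-token>", "prod": "sv=<prod-sas-token>"}

def get_blob(env: str, container: str, blob_name: str) -> bytes:
    """Download a blob via the Blob service REST API, picking the account at run time."""
    url = (f"https://{ACCOUNT_BY_ENV[env]}.blob.core.windows.net/"
           f"{container}/{blob_name}?{SAS_BY_ENV[env]}")
    resp = requests.get(url)
    resp.raise_for_status()
    return resp.content  # this content becomes the email attachment
```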

Log Analytics data export to storage account - all tables

I want to use Azure Log Analytics with the data export feature to export all log tables to a storage account. There used to be an '--export-all-tables' option, but annoyingly this has been removed.
Is there a way I can export all tables? Not just the ones that exist at the moment, but any future ones that may be created?
Azure Policy?
Azure Functions?
Azure Logic App?
We can archive the data with the help of a Logic App: we run a query from the Logic App and use its output in other actions in the workflow. The Azure Blob Storage connector is then used to send the query output to blob storage.
We only need access to the Log Analytics workspace and the storage account to achieve this.
To pick up new data, add a recurrence trigger to the Logic App so that it runs, for example, once a day.
After setting up the trigger, “Click + New step to add an action that runs after the recurrence action. Under Choose an action, type azure monitor and then select Azure Monitor Logs.”
After configuring the query, add a Create blob action and connect it to the workflow.
Then run the Logic App and check the storage account for the logs.
See the Microsoft documentation Archive data from Log Analytics workspace to Azure storage using Logic App to understand more.
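The Logic App route needs no code; purely for comparison, the query-and-archive step looks roughly like this with the Azure SDKs. The workspace ID, storage account, container and the example table are placeholders, and one query (or one Logic App action) is still needed per table:

```python
# Sketch only: run a Log Analytics query and archive the rows to a blob.
# Workspace ID, storage account, container and the AzureActivity query are
# placeholders for your own environment.
import json
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient
from azure.storage.blob import BlobClient

WORKSPACE_ID = "<log-analytics-workspace-id>"
ACCOUNT_URL = "https://<storageaccount>.blob.core.windows.net"
CONTAINER = "loganalytics-archive"

credential = DefaultAzureCredential()
logs = LogsQueryClient(credential)

# Query the last day of data from one table (repeat per table).
response = logs.query_workspace(
    WORKSPACE_ID, "AzureActivity | limit 1000", timespan=timedelta(days=1)
)
rows = [list(row) for table in response.tables for row in table.rows]

# Write the result set to a blob in the archive container.
blob = BlobClient(
    account_url=ACCOUNT_URL,
    container_name=CONTAINER,
    blob_name="azureactivity.json",
    credential=credential,
)
blob.upload_blob(json.dumps(rows, default=str), overwrite=True)
```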

Add Message to Azure Queue when Azure Table Storage is updated

Currently, I have an Azure Function App which runs every hour (timer trigger), pulls data from Azure Table storage, and updates an NSG. I only did it this way because Function Apps currently DON'T support Azure Table triggers; however, Function Apps DO support Azure Queue triggers.
With that said, I'd like a message to be sent to the queue every time my Azure Table is updated. That way, the updates can be processed immediately instead of once an hour. I haven't figured out how to send messages to an Azure Queue from Azure Tables though.
Any help?
There is no change feed, update trigger, etc. on Azure Table storage. You could achieve this by switching to the Table API on Cosmos DB, which does have a change feed.
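For example, once the data lives in Cosmos DB, a change-feed-triggered function can drop messages onto the queue that your existing queue-triggered function already consumes. A minimal sketch using the Azure Functions Python v2 programming model; the database, container, queue and connection-setting names are placeholders:

```python
# Sketch only: react to the Cosmos DB change feed and enqueue one message per
# changed document. Database/container names, the queue name and the two
# connection app settings are placeholders.
import typing

import azure.functions as func

app = func.FunctionApp()

@app.cosmos_db_trigger(
    arg_name="docs",
    database_name="nsgdb",
    container_name="nsgrules",
    connection="CosmosDbConnection",
    create_lease_container_if_not_exists=True,
)
@app.queue_output(
    arg_name="msgs",
    queue_name="nsg-updates",
    connection="AzureWebJobsStorage",
)
def on_table_change(docs: func.DocumentList, msgs: func.Out[typing.List[str]]) -> None:
    # One queue message per changed document, so the existing queue-triggered
    # function can update the NSG immediately instead of once an hour.
    msgs.set([doc.to_json() for doc in docs])
```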

Can Azure Event Hub ingest JSON events from Azure Blob Storage without writing any code?

Is it possible to use some ready made construct in azure cloud environment to ingest the events (in json format) that are currently stored in azure blob storage and have it submit those events directly to azure event hub without writing any (however small) custom code? In other words, I would like to use configuration driven approach only.
Sure. You can use Azure Logic Apps to meet your needs without any code, or with just a few function expressions; please refer to the official Azure Logic Apps documentation for more details.
The logic flow is shown in the figure below.
You can refer to my sample below to make it work.
Here is my sample, which receives an event from my Event Hub and transfers it to Azure Blob Storage, creating a new blob to store the event data.
Create an Azure Logic App instance in the Azure portal; it should be easy for you.
Move to the Logic app designer tab to configure the logic flow.
Click the Save and Run buttons. Then use Service Bus Explorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and check with Azure Storage Explorer whether a new blob was created. It worked fine after a few minutes.
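The question asks for a configuration-only approach, so the Logic App above is the way to go; purely for comparison, the code equivalent of pushing JSON blobs into an Event Hub would look roughly like this (connection strings, container and hub names are placeholders):

```python
# Sketch only: read JSON blobs from a container and send each one as an event.
# Connection strings, the container name and the Event Hub name are placeholders.
from azure.eventhub import EventData, EventHubProducerClient
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    "<storage-connection-string>", container_name="events")
producer = EventHubProducerClient.from_connection_string(
    "<eventhub-namespace-connection-string>", eventhub_name="myhub")

with producer:
    batch = producer.create_batch()
    for blob in container.list_blobs():
        data = container.download_blob(blob.name).readall()
        batch.add(EventData(data))  # a real implementation would start a new
                                    # batch when this one is full
    producer.send_batch(batch)
```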

How to move files attached to mail to a VM on Azure

I'm new to Azure and I need to create a service that, when given an Office 365 email (subscriber), will automatically move files attached to the subscriber's new mails
to a VM on Azure and then run some tests on them there (inside the VM).
The only way I have found to implement this so far is creating a Logic App for each subscriber, which is done manually.
Any help would be appreciated!
A few things to get you started:
Create the Logic App that stores attachments in the database when a new email is received for a specific user.
Add some parameters to your Logic App so the user email/credentials/tenant are not hard-coded:
https://blog.mexia.com.au/preparing-azure-logic-apps-for-cicd
Create an ARM template to deploy this Logic App.
Create another Logic App that will deploy the previous one.
Whenever a new user is created, call that second Logic App (a sketch of this deployment step follows below).
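Whether that deployment is triggered from a second Logic App or from a small script, the underlying call is an ARM deployment of the template with the user-specific parameters. A sketch using the Azure SDK for Python; the subscription, resource group, template file and parameter name are placeholders:

```python
# Sketch only: deploy one instance of the parameterized Logic App template per
# new subscriber. Subscription, resource group, template file and the
# "userEmail" parameter are placeholders.
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "attachment-logicapps"

client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

def deploy_for_user(user_email: str) -> None:
    """Deploy the Logic App ARM template once for a new subscriber."""
    with open("logicapp.template.json") as f:
        template = json.load(f)
    client.deployments.begin_create_or_update(
        RESOURCE_GROUP,
        f"logicapp-{user_email.split('@')[0]}",
        {
            "properties": {
                "mode": "Incremental",
                "template": template,
                "parameters": {"userEmail": {"value": user_email}},
            }
        },
    ).result()  # wait for the deployment to finish
```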
Also, do you really need to store your files in a database? As an alternative, you can use Azure Blob Storage to store all these files.
EDIT
If you need to move the files to a VM, I would suggest the following:
When you receive an email:
Store the attachments in Blob storage.
Generate a SAS token (with Read permission).
Put the URL + SAS token of your file into an Azure Service Bus queue.
On the VM, have a service that reads messages from the queue and downloads the files.
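A minimal sketch of the VM-side service from the last step, assuming each queue message body is the blob URL + SAS token put there by the Logic App; the queue name and connection string are placeholders:

```python
# Sketch only: read "blob URL + SAS token" messages from the Service Bus queue
# and download each file locally so the tests can run on it. The connection
# string and queue name are placeholders.
import requests
from azure.servicebus import ServiceBusClient

CONNECTION_STR = "<servicebus-connection-string>"
QUEUE_NAME = "attachments"

with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    with client.get_queue_receiver(queue_name=QUEUE_NAME) as receiver:
        for message in receiver:                 # blocks, waiting for messages
            sas_url = str(message)               # e.g. https://acct.blob.core.windows.net/files/a.pdf?sv=...
            local_path = sas_url.split("?")[0].rsplit("/", 1)[-1]
            resp = requests.get(sas_url)         # read access comes from the SAS token
            resp.raise_for_status()
            with open(local_path, "wb") as f:
                f.write(resp.content)
            receiver.complete_message(message)   # remove the message from the queue
            # ...run the tests on local_path here...
```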
